The technology management program is designed primarily as a degree-completion transfer program that articulates with accredited AS and AAS programs in industrial and engineering technology related fields. The program focuses on the technology management skills needed to prepare students for career advancement in supervision and management. Graduates of the program are competent in supervision, quality control, production planning, workplace law, project management, and a variety of other skills that prepare them for technology management.

Recent Submissions

  • DETERMINING THE INFLUENCE OF THE NETWORK TIME PROTOCOL (NTP) ON THE DOMAIN NAME SERVICE SECURITY EXTENSION (DNSSEC) PROTOCOL

    Cold, Sherman J. (Indiana State University, 2015-04)
    Recent hacking events against Sony Entertainment, Target, Home Depot, and bank Automated Teller Machines (ATMs) foster a growing perception that the Internet is an insecure environment. While Internet Privacy Concerns (IPCs) continue to grow out of a general concern for personal privacy, the availability of inexpensive Internet-capable mobile devices expands the Internet of Things (IoT), a network of everyday items embedded with the ability to connect and exchange data. The Domain Name Service (DNS) has been an integral part of the Internet for name resolution since the beginning, and it has several documented vulnerabilities, such as cache poisoning. The solution adopted by the Internet Engineering Task Force (IETF) to strengthen DNS is DNS Security Extensions (DNSSEC), which adds support for cryptographically signed name-resolution responses using Public Key Infrastructure (PKI) cryptography. Some researchers have suggested that the time stamp used in the public certificate of the name-resolution response influences DNSSEC's vulnerability to a man-in-the-middle (MiTM) attack. This quantitative study determined the efficacy of using the default relative Unix epoch time stamp versus an absolute time stamp provided by the Network Time Protocol (NTP). Both a two-proportion test and Fisher's exact test were applied to a large sample to show statistically significantly better security behavior when DNSSEC uses NTP absolute time instead of the traditional relative Unix epoch time.
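    The core statistical comparison in this abstract, a two-proportion test, can be sketched in a few lines of standard code; the counts below are hypothetical placeholders, not the study's data.

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sided two-proportion z-test: returns (z, p_value)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # two-sided p-value from the standard normal survival function
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical counts: secure responses under NTP absolute time
# versus under the default relative Unix epoch time
z, p = two_proportion_z(980, 1000, 940, 1000)
```

A p-value below the chosen significance level would, as in the study, indicate a statistically significant difference in security behavior between the two time-stamp schemes.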
  • GENERIC ORGANIZATIONAL STRATEGY INTEGRATION IMPACTS ON PROFIT MARGIN RATIO AND INVENTORY TURNOVER IN PUBLICALLY TRADED OKLAHOMA MANUFACTURING ORGANIZATIONS

    Bell, Christopher (Indiana State University, 2015-05)
    This study sought to determine whether, and to what extent, strategy integration was related to the financial indicators profit margin ratio and inventory turnover for publicly traded manufacturing organizations in Oklahoma. Current strategy theory states that the more thoroughly an organization adopts a given strategy, the greater the effect on these financial indicators; hence the need to understand more fully the extent and rates at which strategy integration affects them. The study examined perceived strategy integration scores for publicly traded Oklahoma manufacturing organizations collected from June to August 2014, together with financial indicators from 2012 and 2013. The perceived strategy integration scores were obtained via survey, while the financial indicators were calculated from Form 10-K filings with the United States Securities and Exchange Commission (SEC). Because reliable financial information is not publicly available for many private organizations, they were excluded from the study. Summary analysis of the data indicated that strategies were not in use in equal proportions, with Niche Differentiation by far the most popular. Market focus appeared to be an indicator of inventory-turnover standard deviation, with the Broad focus and Combination strategy groups having lower standard deviations, while product focus appeared to indicate profit margin ratio range, with Low Cost strategies having lower profit margins. Additional analysis found that performance-enhancing technologies and other complicating factors may have had a larger impact than previously believed, and a correlation could not be established for most strategies. For the Niche Low Cost strategy, a relationship was found in which profit margins decreased 1.634% for each 1-point increase in perceived strategy integration score. For the Broad Differentiation strategy, inventory turns increased 0.7006 turns for every 1-point increase in perceived strategy integration score. No other strategies had correlation coefficients statistically different from the null hypothesis; however, anecdotal evidence was found in support of several of Porter's other theories.
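    Per-point slopes like the figures reported above come from an ordinary least-squares fit; a minimal sketch with made-up data (the scores and margins below are illustrative, not the study's):

```python
def ols_slope_intercept(xs, ys):
    """Ordinary least-squares fit of y = a + b*x; returns (intercept, slope)."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return intercept, slope

# Hypothetical: perceived strategy integration score vs. profit margin (%)
scores  = [1, 2, 3, 4, 5]
margins = [10.0, 8.4, 6.7, 5.1, 3.5]
a, b = ols_slope_intercept(scores, margins)  # b is the change per 1-point score increase
```

The slope `b` is the quantity reported in the abstract: the change in the financial indicator for each 1-point increase in perceived strategy integration score.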
  • A MODEL OF PACKET LOSS CAUSED BY INTERFERENCE BETWEEN THE BLUETOOTH LOW ENERGY COMPONENT OF AN IOS WEARABLE BODY AREA NETWORK AND RESIDENTIAL MICROWAVE OVENS

    Barge, William C. (Indiana State University, 2015-05)
    Cardiovascular diseases are the leading cause of death in the United States. Advances in wireless technology have made possible the remote monitoring of a patient's heart sensors as part of a body area network. Previous studies have suggested that stray wireless transmissions in the industrial, scientific, and medical (ISM) band cause interference resulting in packet loss in Bluetooth piconets. This study investigates the impact that wireless transmissions from residential microwave ovens have on the Bluetooth Low Energy (BLE) component of the body area network. Using a systematic data collection approach, two variables were manipulated: the distance between the microwave oven and the BLE piconet was varied from 0.5 meter to 5.0 meters in half-meter increments, and at each distance the power level of the microwave oven was varied from the lowest to the highest setting. Two variables were collected: the microwave interference generated by channel and the packet loss by channel. The results suggest that packet loss is due more to the microwave oven's power level than to the distance, that the interference caused by the microwave oven affects all BLE channels equally, and that the packet loss by channel is a good predictor of microwave oven interference. The significance of this study lies in providing beneficial information to the medical and digital communication industries concerning the causes of, and solutions to, disruptions in Bluetooth-enabled body area network devices in a very common situation. The results may lend support for improvements and widespread use of body area network medical systems, with the potential benefits of better monitoring, more data, and reduced fatalities due to misdiagnosed heart conditions.
  • DEVELOPMENT OF A METHODOLOGY FOR EVALUATING QUALITY CHARACTERISTICS OF FUSED DEPOSITION MODELING

    Winston Sealy, Dominique (Indiana State University, 2014-12)
    Additive manufacturing rapid reproductive systems are gaining popularity within the manufacturing industry. One of the many benefits of such systems has been the exploration of building practical sacrificial patterns for investment-cast metals. Methods such as CastForm and QuickCast have been developed for selective laser sintering and stereolithography apparatus technologies, respectively. Research has demonstrated significant cost savings when additive manufacturing rapid reproductive systems are utilized for customized or small-batch production of sacrificial patterns. The purpose of this study was to develop a methodology for evaluating quality characteristics of Fused Deposition Modeling. Since Fused Deposition Modeling has been demonstrated by a number of experimental studies to be a viable alternative to wax sacrificial patterns, this study explored the effects of wall thickness and raster resolution on quality characteristics such as diametric accuracy, cylindricity, and concentricity. The results indicated that raster resolution had no effect on the measured quality characteristics; however, the ANOVA and Kruskal-Wallis tests showed statistical significance (α = 0.05) for the effect of wall thickness on the cylindricity of a small-diameter (0.5″) feature and on the concentricity of two cylindrical features of 0.5″ and 1″ diameters. Moreover, the main contributions of this study involved the development of an accurate and robust design-of-experiment methodology. Implications and recommendations for practice are also discussed.
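    The one-way ANOVA reported above reduces to comparing between-group and within-group variance; a minimal sketch, with hypothetical cylindricity deviations grouped by wall thickness (values are illustrative, not the study's measurements):

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA: between-group mean square
    divided by within-group mean square."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    ss_within = sum(
        sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups
    )
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical cylindricity deviations (mm) at three wall thicknesses
thin   = [0.08, 0.09, 0.10]
medium = [0.06, 0.07, 0.08]
thick  = [0.03, 0.04, 0.05]
f_stat = one_way_anova_f([thin, medium, thick])
```

A large F relative to the critical value at α = 0.05 would, as in the study, indicate that wall thickness has a statistically significant effect on the quality characteristic.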
  • ANALYSIS OF IPV6 READINESS OF END-USER ENTERPRISES IN THE NORTH CAROLINA EASTERN REGION

    Pickard, John (Indiana State University, 2014-12)
    On February 3, 2011, the Internet Assigned Numbers Authority (IANA) allocated the last five /8 blocks of IPv4 addresses to the five Regional Internet Registries (RIRs). Since that event, four of the five RIRs have depleted their IPv4 allocations and begun operating under final IPv4 address depletion policies. The exhaustion of the IPv4 address pools maintained by the registries means that IPv4 is now a legacy protocol and that all future Internet growth will be over IPv6. It also means that organizations must take action to accommodate IPv6 adoption or risk compromising business agility and continuity, especially organizations with public-facing content that rely on the Internet. Yet anecdotal evidence and recent published studies indicate that few organizations have moved to adopt IPv6. The evidence suggests a low sense of urgency and a lack of understanding among organizational leaders regarding the potential consequences that IPv4 exhaustion will have on their organization's business model. An understanding of IPv6 adoption readiness within organizations is needed so that programs can be established to raise decision makers' awareness of the risks of not having an IPv6 strategy and to inspire them to take action. This study achieved that objective by investigating the IPv6 readiness of enterprise organizations located in eastern North Carolina through a survey sent to the senior IT decision makers of 463 end-user enterprise organizations. IPv6 readiness was measured across five facets of organizational IPv6 preparedness: training, high-level planning, assessment of the current environment, IPv6 policy, and IPv6 deployment. Statistical analyses identified the significant technology adoption factors associated with IPv6 readiness as measured on a six-stage Guttman scale, ranging from simply being "aware" of IPv6 to general IPv6 deployment.
Results revealed that the majority of organizations have made little to no preparation toward IPv6 adoption and do not see it as an urgent issue. Further, the factors most significantly associated with low levels of IPv6 readiness were a lack of perceived advantages of IPv6 and a lack of perceived pressure from industry partners and customers to adopt it. Based on these findings, a recommended approach to developing an effective IPv6 strategy, as well as a framework for IPv6 adoption planning, is presented for organizational leaders and IT decision makers to use as a guide toward a successful IPv6 transition.
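    The six-stage Guttman scale mentioned above is cumulative: a later stage only counts if every earlier stage has also been completed. A toy scoring function (the stage names are paraphrased from the abstract; the cumulative rule is the defining property of a Guttman scale, not a detail confirmed by the study):

```python
# Stage names paraphrased from the abstract's five facets plus basic awareness
STAGES = ["aware", "training", "high-level planning",
          "environment assessment", "IPv6 policy", "deployment"]

def guttman_stage(milestones):
    """Return the highest consecutive stage reached: iterate in order and
    stop at the first incomplete milestone."""
    stage = 0
    for done in milestones:
        if not done:
            break
        stage += 1
    return stage

# An organization that is aware and trained, but has not begun planning,
# scores stage 2 even if it has (inconsistently) drafted a policy.
readiness = guttman_stage([True, True, False, True, False, False])
```

This is why a single scale value can summarize an organization's position between mere awareness and general deployment.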
  • Analytical Modeling and Feasibility Study for Adoption of Renewable Energy Sources in a Single Family Dwelling

    Linn, Jimmy B. (Indiana State University, 2014-12)
    In the last four or five decades, increasing political and social pressure has been placed on commercial and residential consumers to reduce consumption of fossil fuels and invest in alternative methods of energy production for electricity, heating, and cooling. Commercially, wind turbines and photovoltaic energy production equipment have sprung up all over the country, and other energy production facilities, such as hydroelectric and geothermal plants, have also been built. During this time, however, very few residential 'green energy' investments have been made; only in recent years have homeowners begun to get their feet wet with 'green' energy equipment. Cost has been the major factor. Lately, though, costs have been coming down and efficiency has been going up, leading homeowners to sense that alternative energy may now be entering the realm of economic feasibility. Unfortunately, homeowners have had no reliable or credible tools to assess the economic viability of such systems. The purpose of this research was to develop a tool to assess the potential of alternative energy sources and to test it statistically by surveying subjects in five different 'green' energy categories. Since atmospheric (air-to-air) heat pumps have been around for many years and represent a mature heating and cooling technology, upgrading older inefficient HVAC equipment to new high-efficiency atmospheric heat pumps is the category used to baseline the experiment. Ground-source heat pumps and direct solar heating systems were modeled and compared to the baseline, while wind energy and photovoltaic energy production systems were modeled, surveyed, and compared to using only grid-supplied electricity. Results show that in four of the five cases tested, the less mature 'green' energy equipment (photovoltaic solar, direct solar, and ground-source heat pumps) is in general not economically viable without tax rebates to significantly lower the net investment.
Setback rules and environmental and aesthetic siting ordinances severely restrict the population of wind energy devices, so an effective test of the wind energy category using the model could not be done. The model performed well with the baseline data. Performance with ground-source heat pumps was reasonable, but the model needs improvements to reflect their differing features; performance with photovoltaic energy production equipment was also good. Extending the test population to all fifty states and extending the utility-bill test range from one year to five years would provide much more useful data to test and improve the model. Although the model development and testing done in this work represent only a small contribution to bridging a large gap in consumer confidence in green energy products, they represent a big step into an area that very few have attempted to enter.
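    The feasibility question above ultimately comes down to net investment and payback; a deliberately simplified sketch (this is not the author's model, and all figures are made-up assumptions), showing why tax rebates can tip viability:

```python
def net_investment(gross_cost, tax_rebate_fraction):
    """Rebates lower the net investment, the lever the study's findings
    identify as decisive for less mature 'green' technologies."""
    return gross_cost * (1 - tax_rebate_fraction)

def simple_payback_years(investment, annual_savings):
    """Years to recoup an investment from annual utility-bill savings."""
    if annual_savings <= 0:
        return float("inf")
    return investment / annual_savings

# Hypothetical photovoltaic system: $20,000 installed, 30% rebate,
# $700/year in avoided electricity costs
cost = net_investment(20_000, 0.30)
years = simple_payback_years(cost, 700)
```

A payback period longer than the equipment's expected life signals economic infeasibility, which is the pattern the study reports for most categories absent rebates.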
  • A CASE STUDY OF INFORMATION SYSTEM SECURITY COMPLIANCE OF SMALL MEDICAL AND DENTAL PRACTICES

    Folse, Debra Landry (Indiana State University, 2014-12)
    Small medical and dental practices must comply with the Health Insurance Portability and Accountability Act (HIPAA) of 1996 and with Title XIII, the Health Information Technology for Economic and Clinical Health (HITECH) Act, of the American Recovery and Reinvestment Act (ARRA) of 2009. This case study, utilizing interviews, observations, and existing documentation from two medical and two dental practices, analyzed not only the compliance solution choices made involving procedures and technologies, but also the emotional aspects of fear of non-compliance, perceived confidence in compliance, and the primary and secondary appraisals of the compelled compliance. Although compliance is not an easy process, small medical and dental practices can discover a number of possible options and identify the best-fit solution for their practice in the effort to achieve compliance.
  • IMPACT OF CORPORATE SUSTAINABILITY BEHAVIOR ON FINANCIAL PERFORMANCE IN AEROSPACE COMPANIES

    Blake, Petulia (Indiana State University, 2014-12)
    Sustainability is increasingly becoming an integral part of how organizations communicate their business operations to stakeholders. Since organizations are more inclined to invest in programs that contribute to their bottom line, this study presents an analysis of the relationship between corporate sustainability behaviors and financial performance. The sample comprised 40 United States (U.S.) aerospace companies, selected from the worldwide "Top 100 Aerospace Companies" in a 2012 report by the consulting firm Candesic. Of the 40 U.S. companies, 21 were found to provide some form of sustainability report. Quantitative and qualitative instruments were constructed to identify and measure the following sustainability behaviors: 1) reporting versus non-reporting status, 2) sustainability initiative integration (SII), 3) sustainability strategic integration (SSI), 4) trends in sustainability reporting, and 5) Global Reporting Initiative (GRI) versus non-GRI status. Archival data such as sustainability reports and financial reports were used to compare the relationship between the five independent variables and the companies' 5-year profit-margin-ratio means. All financial information was obtained from Reuters, a financial and business news source. After retrieving and analyzing all the reports, it was found that there is no significant relationship between the sustainability behaviors identified and financial performance. Although the sustainability reporting trends indicate a slight relationship between reporting start date and the 5-year average profit margin, other factors may be involved. Further, there appears to be some relationship among the independent variables SII, SSI, and GRI status.
For instance, organizations that use the GRI metrics tend to send a strong message that sustainability is aligned with their business goals, which influences how they market and innovate products and services. The strength of this study is its qualitative components, which will contribute to further understanding and development of corporate sustainability within a multidisciplinary context. The study created instruments primarily to determine the impact corporate sustainability behavior has on financial performance, while simultaneously providing new insight into changing organizational values and leadership communication. A corporate sustainability report is a comprehensive document that gives external and internal stakeholders information on how an organization is responding to social, economic, and environmental issues. This study illustrates how a sustainability report reflects an organization's level of involvement in environmental, social, and economic issues, which is relevant to any academic environment that seeks to understand how businesses attend to societal demands while striving for competitive advantage in the global market.
  • ENGINEER MANUAL 385 EFFECTIVENESS: A STUDY OF PREDICTIVE ANALYTICS

    Arias, Scott (Indiana State University, 2014-12)
    Under the guidance of the United States Army Corps of Engineers' Engineer Manual 385 (EM 385), the federal government has taken a stringent stance on construction safety. Using the mandated Occupational Safety and Health regulations and Title 29 of the Code of Federal Regulations as a safety foundation, the EM 385 requires project-specific planning, continuous oversight, and direct control of all safety activities. These mandates, required of every Department of Defense entity, impose safety management not found within other federal agencies, in an attempt to reduce the number and severity of mishaps. This study looked for causation between the use of the EM 385 and the number and severity of mishaps using three multiple regression analyses. The research population comprised construction contractors who performed work for various federal government agencies. The data set was compiled by merging 2008 federal construction spending data with mishap rates obtained from the OSHA Data Initiative (ODI). The explanatory variables considered were EM 385 use, contractor size, project size, construction sector, pricing structure, solicitation procedure, OSHA region, disadvantaged business status, and type of federal set-aside. The three dependent variables were the total case rate (TCR); the days away, restricted, and transferred (DART) rate; and the days away from work (DAFWII) rate. Analysis of this data revealed no conclusive results showing a causal relationship between the EM 385 and a reduction in the number and severity of mishaps.
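    The three dependent variables above (TCR, DART, DAFWII) are all OSHA incidence rates, conventionally normalized to 200,000 hours worked (100 full-time employees working 2,000 hours each for one year); a minimal sketch of that normalization:

```python
def osha_incident_rate(cases, hours_worked):
    """OSHA incidence rate: recordable cases per 100 full-time workers
    per year, using the standard 200,000-hour normalization base."""
    return cases * 200_000 / hours_worked

# Hypothetical contractor: 6 recordable cases over 400,000 hours worked
tcr = osha_incident_rate(6, 400_000)  # cases per 100 FTE-years
```

The TCR counts all recordable cases, while DART and DAFWII restrict the numerator to cases with days away, restriction, or transfer; the normalization is the same.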
  • A COMPARATIVE STUDY OF THE IMPACT OF PROJECT DELIVERY METHODS USED ON PUBLIC HIGHWAY-RAIL INTERSECTION PROJECTS IN NEW YORK STATE

    Osipitan, Osileke Odusanya (Indiana State University, 2014-08)
    The improvement of infrastructure by a public agency is intended to ensure the satisfaction of the general public using that infrastructure, based on available funds. In order to implement and sustain the public good, an efficient project delivery method, or an assessment of the existing project delivery methods used to develop such infrastructure, is necessary. A project delivery method is a process that defines the relationship between the parties involved in a specific project, and the method chosen can affect a project's budget, schedule, quality, and the involvement of the project owner. This study investigated the impact of the project delivery methods used by different types/classes of railroad organization, including passenger and freight railroads, on completed public highway-rail intersection (HRI) projects in New York State over a period of 10 years. Two hundred fifty-six (256) projects with similar scope, performed at independent locations, were selected. The research questions were answered through hypotheses tested with nonparametric tests using SPSS version 20. The Mann-Whitney U test was used to determine whether there was a statistically significant difference in the total cost of HRI projects when the Design-Build versus Design-Bid-Build methods were used by railroad companies. The Kruskal-Wallis test was used to determine whether there was a statistically significant difference in the total cost of projects performed by Passenger, Class 1 (large), Class 2 (regional), and Class 3 (short-line) railroad companies operating in New York State, with a post-hoc test identifying which railroad organizations differed. Findings indicated statistically significant differences in total costs both across project delivery methods and across types/classes of railroad organization.
It was recommended that the New York State Department of Transportation partner with the railroad organizations on cost-sharing agreements and develop short- or long-term plans to either close railroad grade crossings or grade-separate crossings along railroad corridors, so that passenger and Class 1 railroad organizations can contribute significantly to HRI improvements. Furthermore, NYSDOT needs to adequately monitor HRI projects performed by the railroad organizations.
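    The Mann-Whitney U statistic used above has a simple counting definition: the number of sample pairs in which one group's value exceeds the other's, with ties counting one half. A brute-force sketch with hypothetical project costs (the figures are illustrative, not the study's data):

```python
def mann_whitney_u(sample_a, sample_b):
    """U statistic for sample_a: count of (a, b) pairs with a > b,
    ties contributing one half. O(n*m), fine for illustration."""
    u = 0.0
    for a in sample_a:
        for b in sample_b:
            if a > b:
                u += 1.0
            elif a == b:
                u += 0.5
    return u

# Hypothetical total costs (in $1000s) under two delivery methods
design_build     = [410, 455, 390, 520]
design_bid_build = [480, 530, 610, 575]
u_db = mann_whitney_u(design_build, design_bid_build)
```

Statistical packages such as SPSS compute the same statistic via ranks and then derive a p-value; the two group U values always sum to the number of pairs, n×m.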
  • ASSESSMENT OF FUTURE EMPLOYMENT AND COMPETENCY SKILLS IN BUILDING INFORMATION MODELING: A DELPHI STUDY

    Raiola, Joseph A., III (Indiana State University, 2014-08)
    Many mainstream architecture, engineering, and construction (AEC) professionals are using Building Information Modeling (BIM). Although more and more firms are using BIM and this trend is forecast to continue, it is unclear what BIM-related skills and competencies a construction manager will need in five years. This research aims to answer that question through a Delphi panel of AEC professionals. Panel members met the following qualifications: a minimum of eight years of industry or academic experience, or a combination of the two; a minimum of three years of BIM experience; and membership in a nationally recognized professional organization. The three-round Delphi study identified skills and competencies in the following areas related to construction management: cost estimating (78 skills and competencies), scheduling and control (85), project administration (71), contract documents (29), and other skills not covered by those categories (20). In addition, the study reached consensus on descriptors that individual firms (24 descriptors) and construction managers (22 descriptors) will need to possess to maintain or increase BIM usage in five years. Although the panel identified many "new" BIM-related construction management skills and competencies, "traditional" skills and competencies were a top response in each category, and within these "traditional" skills was a reinforcement of soft skills. Because BIM is a collaborative project management system, many soft skills are more important than with traditional project management systems; BIM requires efficient communication along with strong soft skills, an area reinforced by the findings of this research.
Furthermore, this research concluded that as BIM diffuses into the construction community, social systems interested in increasing BIM usage should augment "traditional" skill sets with the "new" BIM-related skills and competencies, and any academic programs seeking to implement BIM-related topics in existing courses should do so carefully. This research revealed that over the next five years BIM will continue to enter the mainstream. Building Information Modeling theory suggests that the AEC industry will change completely because of BIM; however, this is not entirely the case, as this research discovered that soft skills become more important with BIM diffusion. The findings will be of particular interest to industry and to academic programs seeking to increase BIM usage or to begin developing curriculum that incorporates BIM. The results include a consensus list of the most important BIM-related skills and competencies for a construction manager to possess, ranked by mean and standard deviation.
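    A ranking "by mean and standard deviation," as described above, can be sketched directly: sort descending by mean rating, breaking ties toward lower standard deviation (i.e., stronger panel consensus). The item names and ratings below are illustrative placeholders, not the panel's actual data:

```python
import statistics

def rank_by_consensus(items):
    """Rank Delphi items by mean rating (descending); ties are broken by
    lower population standard deviation, i.e., stronger consensus."""
    return sorted(
        items,
        key=lambda kv: (-statistics.mean(kv[1]), statistics.pstdev(kv[1])),
    )

# Hypothetical panel ratings on a 1-5 importance scale
ratings = {
    "4D model-based scheduling": [4, 5, 4, 5],
    "clash detection review":    [5, 4, 5, 4],
    "written communication":     [5, 5, 5, 5],
}
ranked = rank_by_consensus(list(ratings.items()))
```

Note how a unanimously rated "traditional" soft skill outranks higher-variance "new" BIM skills under this scheme, echoing the study's finding.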
  • DEVELOPMENT OF AN OPTIMAL LEAN SIX SIGMA MODEL

    Taylor, James (Indiana State University, 2014-08)
    Lean six sigma is a hybrid continuous improvement methodology that is not standardized and is not well understood. A review of the literature found that the spectrum of lean six sigma approaches extends from those that are lean dominant to those that are six sigma dominant. This research illuminated the lean six sigma methodology by methodically assessing the literature via text mining and cluster analysis. Text mining was used to establish the degree to which lean six sigma models, as described in articles published in the International Journal of Lean Six Sigma, are lean dominant versus six sigma dominant, and an iterative cluster analysis was used to identify interpretable clusters of articles. A cluster of lean-dominant lean six sigma articles was identified and statistically validated as distinct from other models. It was determined that characteristics of lean-dominant lean six sigma include the text-mining key words "waste," "value," and "kaizen," and that these articles position lean as the dominant philosophy with six sigma as a subordinate tool used in achieving the lean objectives. The findings of the research, along with extrapolation from the literature, informed a recommended lean six sigma model. The recommended model is lean dominant and incorporates two subordinate methods, six sigma and statistical process control. These three synergistic approaches each manifest process improvements in their own way, and all contribute to organizational learning, which is considered a chief contributor to competitive advantage.
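    The lean-versus-six-sigma dominance classification rests on key-word frequencies such as "waste," "value," and "kaizen." A toy version of that scoring follows; the term lists and the simple count comparison are illustrative assumptions, not the study's actual text-mining and clustering pipeline:

```python
import re
from collections import Counter

# Illustrative term lists; "waste", "value", "kaizen" come from the abstract,
# the rest are assumed for the sketch
LEAN_TERMS = {"waste", "value", "kaizen", "flow", "pull"}
SIX_SIGMA_TERMS = {"dmaic", "defect", "sigma", "variation", "control"}

def dominance(text):
    """Classify an article excerpt as lean- or six-sigma-dominant by
    comparing raw key-word frequencies."""
    words = Counter(re.findall(r"[a-z]+", text.lower()))
    lean = sum(words[t] for t in LEAN_TERMS)
    six = sum(words[t] for t in SIX_SIGMA_TERMS)
    if lean > six:
        return "lean-dominant"
    if six > lean:
        return "six-sigma-dominant"
    return "balanced"
```

In the study itself, such term frequencies feed a cluster analysis rather than a per-document threshold, but the intuition is the same: vocabulary reveals which philosophy dominates.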
  • An Environmental Value Engineering (EVE) Analysis of the Environmental Impacts of Component Production for Traditional Wood-Framed and Structural-Insulated Panel Wall Systems

    Miller, Richard F. (Indiana State University, 2013-12)
    The building industry is continually changing, and for many years the traditional wall-framing approach has been used for residential and small commercial projects. In recent years, the introduction of new processes, procedures, and manufactured components has made an impact on the construction process and project. With the advent of these new technologies come problems in accounting for and determining their environmental impact through analysis. To mitigate these problems and substantiate the environmental impact of manufactured components, it is necessary to implement an analysis strategy that accounts for manufactured components and their impact on the life cycle of a built-environment alternative. The purpose of this study was to investigate the material and component production phase using a traditional analysis methodology to determine the environmental impact and to assess the influence these inputs and phases have on an Environmental Value Engineering (EVE) analysis. The study utilized the EVE methodology to compare a traditional wood-framed wall system with a manufactured product, the structural insulated panel system (SIPs), in order to elucidate the component production phase, compare input impacts, identify the least environmentally intrusive alternative wall system, and quantify a gap that existed in the EVE methodology. The statistical techniques used were comparative analysis, descriptive statistics, input source frequencies, and impact data analysis of known and assumed values. The research findings indicate that accounting for the component production phase of structural insulated panels increased the accuracy of the EVE analysis by 4.1%, and that separating the manufacturing phases (material production, design, and component production) yields an 11.2% more accurate accounting than assumed or combined phases.
The impact analysis indicated that the material production phase exerted the highest influence for both alternative wall systems, with inputs of 80% for traditional wood-framed systems and 84% for structural insulated panels. Input source results revealed that equipment, facilities, and materials had the highest impact for each wall system, while land and services (labor) had the lowest, based on each wall alternative's requirements. The results revealed that the manufactured system, the structural insulated panel system, has the least environmental impact on the built environment. The study reinforced the need to develop strategies that incorporate the component production phase so as to portray environmental impact more accurately in analysis.
  • TECHNOLOGY’S IMPACT ON WHOLESALE DISTRIBUTION BRANCH OPERATIONS

    Angolia, Mark G. (Indiana State University, 2013-12)
    The primary role of a warehouse is to decouple supply from demand, minimize cost, maintain a high degree of inventory control, and assure customer service. To these ends, organizational capabilities, technology, and business practices determine an operation’s effectiveness. This research investigated the impact of technology and warehousing practices on key performance indicators for wholesale distribution branch operations. An online questionnaire gathered objective data from distribution branches on the types of technologies utilized, warehouse best practices employed, and inventory control or customer service metrics used to monitor performance. Correlation analysis, multiple linear regression, analysis of variance, and stepwise regression were used to determine the impact of the individual technologies, as well as interactions between technology and practices. A salient insight of this research was that technology adoption alone did not produce a discernible difference in performance and appeared to require industry best practices to generate improvements. Also, when information technology was adopted, approximately one year of implementation seemed to be required before positive operational results materialized and/or stabilized. The research pointed to warehouse management systems as the predominant information and communication technology (ICT) for discernible differences in inventory-related performance, with improved performance realized when combined with ABC inventory stock analysis and/or physical inventory practices. The use of automatic identification and data capture (AIDC) technologies did not show any effect on inventory or customer service metrics, indicating that they are a support tool rather than an impact technology. Neither ICT nor AIDC technologies demonstrated a predictive value for inventory accuracy or on-time shipping performance. Predictive models were created for fill rate and inventory accuracy, but the generalizability of the models is somewhat limited by the sample size and study population.
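The interaction the study reports, technology paying off only when paired with best practices, can be illustrated as a simple difference-in-differences of group means. All numbers below are hypothetical, chosen only to mimic the reported pattern, not taken from the study's data.

```python
# Hypothetical mean fill-rate improvements (%) for four branch profiles.
# Illustrative values only: technology helps mainly when combined with
# industry best practices, as the study found.
means = {
    ("tech", "practices"): 6.0,     # technology adopted with best practices
    ("tech", "none"): 0.5,          # technology alone
    ("no_tech", "practices"): 1.0,  # best practices alone
    ("no_tech", "none"): 0.0,       # neither
}

# Interaction effect: the gain from adopting technology WITH practices,
# minus the gain from adopting technology WITHOUT practices.
interaction = (means[("tech", "practices")] - means[("no_tech", "practices")]) \
            - (means[("tech", "none")] - means[("no_tech", "none")])
```

A positive interaction is the pattern the study describes: technology's benefit is concentrated in branches that also employ best practices.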
  • DISTRIBUTED COMPUTING IN INTERNET OF THINGS (IoT) USING MOBILE AD HOC NETWORK (MANET): A SWARM INTELLIGENCE BASED APPROACH

    SELVADURAI, JOHN (Cunningham Memorial Library, Terre Haute, Indiana State University, 2017-12)
    The Internet of Things (IoT) is a fast-growing technological trend that is expected to revolutionize the world by changing the way we do things. IoT is a concept in which everyday electronic devices connect to the Internet and interact with each other. By connecting these devices to the Internet, new markets can be created, productivity can be improved, operating costs can be reduced, and many other benefits can be obtained. In IoT architectures, sensors and aggregators often collect data and send it to a cloud server for analysis via the traditional client-server model. This client-server architecture is not adequate to fulfill the growing requirements of IoT applications because it is subject to cloud latency. This research proposed a distributed computing model called Distributed Shared Optimization (DSO) to eliminate the delay caused by cloud latency. DSO is based on swarm intelligence, in which algorithms are built by modeling the behaviors of biological agents such as bees, ants, and birds. A Mobile Ad-hoc Network (MANET) is used as the platform for the distributed computation; the infrastructure-less and leader-less features of MANET make it an ideal candidate for building IoT systems with swarm intelligence. To test the theory, this research also built a simulation program and conducted multiple simulations on both the DSO and client-server models. The simulation data was analyzed using descriptive statistics and a one-way ANOVA. This research found a significant difference in computing time between the DSO and client-server models. Further, multiple regression was performed on the DSO simulation data to identify the effect that the number of sensors and the amount of data had on DSO computing time.
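A minimal from-scratch sketch of the one-way ANOVA comparison described above. The computing times are hypothetical stand-ins; the study's actual simulation output is not reproduced here.

```python
# Sketch of a one-way ANOVA F-statistic, assuming hypothetical data.

def one_way_anova_f(groups):
    """Return the F-statistic for a one-way ANOVA across sample groups."""
    k = len(groups)                      # number of groups
    n = sum(len(g) for g in groups)      # total observations
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-group sum of squares
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    # Within-group sum of squares
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g)
                    for g in groups)
    df_between, df_within = k - 1, n - k
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical computing times (ms) for the two models
dso = [12.1, 11.8, 12.4, 11.9, 12.2]
client_server = [18.5, 19.1, 18.8, 19.4, 18.6]
f_stat = one_way_anova_f([dso, client_server])
```

A large F relative to the F-distribution's critical value for (1, 8) degrees of freedom would indicate a significant difference in mean computing time, which is the kind of result the study reports.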
  • THE EFFECTIVENESS OF CONCURRENT DESIGN ON THE COST AND SCHEDULE PERFORMANCE OF DEFENSE WEAPONS SYSTEM ACQUISITIONS

    Robertson, Randolph B. (Cunningham Memorial Library, Terre Haute, Indiana State University, 2017-12)
    This study investigates the impact of concurrent design on the cost growth and schedule growth of US Department of Defense Major Defense Acquisition Programs (MDAPs). It is motivated by the question of whether employing concurrent design in the development of a major weapon system produces better cost and schedule results than traditional serial development methods. Selected Acquisition Reports were used to determine the cost and schedule growth of MDAPs as well as the degree of concurrency employed. Two simple linear regression analyses were used to determine the degree to which cost growth and schedule growth vary with concurrency. The results were somewhat surprising: for the major weapon systems studied, concurrency as implemented was shown to have no effect on cost performance, and performance to development schedule, one of the purported benefits of concurrency, was actually shown to deteriorate as concurrency increased. These results, while not an indictment of the concept of concurrency, indicate that better practices and methods are needed in the implementation of concurrency in major weapon systems. The findings are instructive to stakeholders in the weapons acquisition process as they consider whether and how to employ concurrent design strategies in planning new weapons acquisition programs.
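Each of the two analyses above amounts to an ordinary least squares fit of a growth measure against a concurrency measure. The sketch below uses hypothetical data chosen only to mimic the reported direction of the schedule result (growth rising with concurrency); the variable definitions are assumptions, not the study's.

```python
def ols_fit(x, y):
    """Ordinary least squares fit of y = a + b*x; returns (intercept, slope)."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    slope = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) / \
            sum((xi - mean_x) ** 2 for xi in x)
    intercept = mean_y - slope * mean_x
    return intercept, slope

# Hypothetical programs: concurrency (fraction of development overlapping
# production) vs. schedule growth (%)
concurrency     = [0.10, 0.25, 0.40, 0.55, 0.70]
schedule_growth = [5.0, 9.0, 16.0, 22.0, 30.0]
intercept, slope = ols_fit(concurrency, schedule_growth)
```

A positive slope corresponds to the study's finding that schedule performance deteriorates as concurrency increases; the significance of such a slope would be judged with a t-test on the regression coefficient.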
  • A STUDY OF THE FACTORS INFLUENCING LAST MILE RESIDENTIAL FIXED BROADBAND PRICING IN KENTUCKY

    Ramage, Michael (Cunningham Memorial Library, Terre Haute, Indiana State University, 2017-12)
    Ever since the first telegraph, a technology management challenge has existed: expanding the availability of communication services farther into rural and unserved areas while maintaining the affordability of those services for residential users. Over the years, that challenge has shifted from the telegraph to broadband communications, or high-speed Internet access. The challenge of affordably expanding broadband services is seen across the United States, including the Commonwealth of Kentucky. This study examined the extent to which community and provider-related supply and demand factors among last mile residential fixed broadband service areas impact the nonpromotional advertised price of last mile broadband service throughout the 120 counties of the Commonwealth of Kentucky. The potential factors included population density, unemployment rate, provider count, broadband availability, middle mile providers, actual broadband speeds, technology deployed, provider type, maximum advertised download speeds, and maximum advertised upload speeds, with the goal of revealing whether any correlate with the actual price of broadband seen by end users. In addition, this study attempted to create a model based on the significantly correlated factors. Utilizing Pearson correlation and multiple regression analysis, this study found five variables with a significant correlation to the dependent variable, price per megabit: a slight negative correlation with the count of middle mile providers, a slight positive correlation with the technology deployed, a slight negative correlation with the provider type, a strong negative correlation with the download speed tier, and a strong negative correlation with the upload speed tier. Finally, a model was created to predict the price per megabit of broadband from three variables: technology used, provider type, and a joint variable representing the download and upload speed tiers.
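A sketch of the Pearson correlation behind the strongest reported relationship, price per megabit against download speed tier. The data points are hypothetical and chosen only to mimic a strong negative correlation of the kind the study found.

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient between two samples."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = math.sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = math.sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)

# Hypothetical service areas: higher download speed tier, lower price per Mbps
speed_tier     = [1, 2, 3, 4, 5, 6]
price_per_mbps = [9.0, 6.5, 4.0, 2.5, 1.5, 1.0]
r = pearson_r(speed_tier, price_per_mbps)
```

An r close to -1 indicates the strong negative relationship reported: as advertised speed tiers rise, the per-megabit price falls sharply.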
  • DEVELOPMENT OF A QUALITY MANAGEMENT ASSESSMENT TOOL TO EVALUATE SOFTWARE USING SOFTWARE QUALITY MANAGEMENT BEST PRACTICES

    Erukulapati, Kishore (Cunningham Memorial Library, Terre Haute, Indiana State University, 2017-12)
    Organizations are constantly in search of competitive advantages in today’s complex global marketplace through improved quality, better affordability, and quicker delivery of products and services. This is especially true for software as a product and service. Other things being equal, the quality of software will impact consumers, organizations, and nations. The quality and efficiency of the process used to create and deploy software can result in cost and schedule overruns, cancelled projects, loss of revenue, loss of market share, and loss of consumer confidence. Hence, it behooves us to constantly explore quality management strategies that deliver high quality software quickly at an affordable price. This research identifies software quality management best practices derived from scholarly literature using bibliometric techniques in conjunction with a literature review; synthesizes these best practices into an assessment tool for industrial practitioners; refines the assessment tool based on academic expert review and a pilot test with industry experts; and undertakes industry expert validation. Key elements of this software quality assessment tool address people, organizational environment, process, and technology best practices. Additionally, weights were assigned to people, organizational environment, process, and technology best practices based on their relative importance, to calculate an overall weighted score that organizations can use to evaluate where they stand with respect to their peers in the business of producing quality software. This research indicates that people best practices carry 40% of the overall weight, organizational best practices carry 30%, process best practices carry 15%, and technology best practices carry 15%. The assessment tool will be valuable to organizations that seek to take advantage of rapid innovations in pursuing higher software quality. These organizations can use the tool to implement best practices based on the latest management strategies, leading to improved software quality and other competitive advantages in the global marketplace. This research contributed to the academic literature on software quality by presenting a quality assessment tool based on software quality management best practices, contributed to the body of knowledge on software quality management, and expanded the knowledge base on quality management practices. It also contributed to professional practice by incorporating software quality management best practices into a quality management assessment tool for evaluating software.
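The weighted scoring described above can be sketched as a simple weighted sum. The 40/30/15/15 weights come from the abstract; the per-category scores and the 0-100 scale are assumptions for illustration, since the tool's actual scoring rubric is not given here.

```python
# Category weights reported in the abstract (people 40%, organizational
# environment 30%, process 15%, technology 15%)
WEIGHTS = {
    "people": 0.40,
    "organizational": 0.30,
    "process": 0.15,
    "technology": 0.15,
}

def weighted_quality_score(scores):
    """Combine per-category scores (assumed 0-100) into an overall score."""
    return sum(WEIGHTS[category] * scores[category] for category in WEIGHTS)

# Hypothetical self-assessment for one organization
example_scores = {"people": 80, "organizational": 70,
                  "process": 60, "technology": 90}
overall = weighted_quality_score(example_scores)
```

The overall score lets an organization compare itself against peers on a single scale while preserving the heavier emphasis the study places on people-related practices.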
  • A STUDY OF THE MATERIAL INSPECTION RECORD AND QUALITY SYSTEMS: A CASE IN THE UNITED STATES DEPARTMENT OF THE NAVY

    Brown, Larry W. Jr. (Cunningham Memorial Library, Terre Haute, Indiana State University, 2017-12)
    Defective products and services are a part of every industry, sector, and organization, and minimizing those defects is essential for business success. The later defects are found, the more they cost the business and the consumer. This study investigated the impact that having an accredited Quality Management System (QMS) had on the acceptance of delivered products. The study focused on products delivered to the Naval Sea Systems Command (NAVSEA) and Naval Supply Systems Command (NAVSUP) organizations, investigating the statistical significance of differences between the means of groups defined by organization size and number of accreditations. The dependent variables were Material Inspection Record (MIR) results, units received, and units rejected for products delivered to the NAVSEA and NAVSUP organizations. The study used PDREP Metric Dashboard data from fiscal year 2012, quarter 1 through fiscal year 2016, quarter 2, resulting in more than 8,000 records analyzed and interpreted using a one-way ANOVA and a General Linear Model. The results indicated no significant differences by organization size or accreditation alone with respect to the number of rejected units and MIR acceptance or rejection. The analysis did suggest statistical significance when size and accreditation together are compared to MIR acceptance or rejection (F-value 3.01, p-value 0.006). Additional within-group analysis identified small organizations as having a statistically disproportionate percentage of units rejected (76.61 percent) compared to their percentage of units received (55.24 percent). Within small organizations, those with one accreditation had the highest ratio of units rejected to units received (2.00 to 1). Further research was recommended to explore other factors that would improve risk assessment and mitigation within the Department of Defense (DoD).
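The disproportionality finding for small organizations can be recomputed directly from the two percentages given in the abstract; the ratio and its interpretation below are a sketch, not the study's own metric.

```python
# Percentages reported in the abstract for small organizations
share_of_rejected = 76.61  # percent of all rejected units from small orgs
share_of_received = 55.24  # percent of all units received by small orgs

# A ratio above 1.0 means small organizations account for a larger share
# of rejections than their share of receipts alone would predict.
disproportionality = share_of_rejected / share_of_received
```

Here the ratio is roughly 1.39, consistent with the abstract's characterization of small organizations as statistically disproportionate in rejections.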
  • OPERATIONALIZING HUMILITY: A MODEL OF SITUATIONAL HUMILITY FOR CHRISTIAN COLLEGE STUDENT LEADERS

    Barrett, Scott T. (Cunningham Memorial Library, Terre Haute, Indiana State University, 2017-12)
    This research study explored how college student leaders operationalize humility in their actions and what leads individuals to act with situational humility. There is a rise in narcissistic tendencies in college students (Twenge, Konrath, Campbell, & Bushman, 2008a, 2008b) and a decline in overall character traits (Burns, 2012; Hunter, 2000; Liddell & Cooper, 2012). Opposite the vice of narcissism sits the virtue of humility (Emmons, 2000; Exline & Geyer, 2004; Peterson & Seligman, 2004; Tangney, 2000). Using a grounded theory approach, the researcher sought to discover the process of humility development. Twenty-six in-depth interviews were conducted at three institutions; each institution was a member of the Council for Christian Colleges and Universities, and each participant identified as having a Christian belief system. Interviews were digitally recorded and transcribed, and transcriptions were coded using the grounded theory method of open, axial, and selective coding. Based on the data collected, three main themes emerged: faith and humility go hand in hand, sense of self impacts humbling experiences, and relationships affect humility. Through this research, the model of situational humility emerged, grounded in the data. The model of situational humility describes what leads an individual to act with humility within a specific humbling experience. For these students, humbling experiences occurred when their sense of self (“I am an athlete,” “I get things done on time,” “I am not a racist”) did not line up with their experience of the world (a physical injury, failing to send necessary emails, making comments that a peer received as racially insensitive). Individuals then move to the point of change, where they must decide whether to reorient their sense of self or actions, or to not reorient and act with pride. At this point of change, individuals were positively influenced toward humility by their Christian belief system, empathy, being in relationship, and interacting with others who were different from them. The implications of this research for institutional leaders who desire to grow humility in students include valuing humility as a virtue, growing empathy in students, and providing opportunities for students to be in relationship with others, specifically those who are different from them.
