Admittedly, this adaptation had not proceeded very far by , although the first jet-powered aircraft were in service by the end of the war. But the construction of a satisfactory gas-turbine engine was delayed for a decade by the lack of resources, and particularly by the need to develop new metal alloys that could withstand the high temperatures generated in the engine. This problem was solved by the development of a nickel-chromium alloy, and, with the gradual solution of the other problems, work went on in both Germany and Britain to seize a military advantage by applying the jet engine to combat aircraft.
The principle of the gas turbine is that of compressing and burning air and fuel in a combustion chamber and using the exhaust jet from this process to provide the reaction that propels the engine forward. In its turbopropeller form, which developed only after World War II, the exhaust drives a shaft carrying a normal airscrew propeller.
Compression is achieved in a gas-turbine engine by admitting air through a turbine rotor. In the so-called ramjet engine, intended to operate at high speeds, the momentum of the engine through the air achieves adequate compression. The gas turbine has been the subject of experiments in road, rail, and marine transport, but for all purposes except that of air transport its advantages have not so far been such as to make it a viable rival to traditional reciprocating engines. As far as fuel is concerned, the gas turbine burns mainly the middle fractions (kerosene, or paraffin) of refined oil, but the general tendency of its widespread application was to increase still further the dependence of the industrialized nations on the producers of crude oil, which became a raw material of immense economic value and international political significance.
The refining of this material itself underwent important technological development. Until the 20th century it consisted of a fairly simple batch process whereby oil was heated until it vaporized, when the various fractions were distilled separately. Apart from improvements in the design of the stills and the introduction of continuous-flow production, the first big advance came in with the introduction of thermal cracking.
This process took the less volatile fractions after distillation and subjected them to heat under pressure, thus cracking the heavy molecules into lighter molecules and so increasing the yield of the most valuable fuel, petrol or gasoline. The discovery of this ability to tailor the products of crude oil to suit the market marks the true beginning of the petrochemical industry. It received a further boost in , with the introduction of catalytic cracking. By the use of various catalysts in the process, means were devised for still further manipulating the molecules of the hydrocarbon raw material.
The development of modern plastics followed directly on this (see below Plastics). So efficient had the processes of utilization become that by the end of World War II the petrochemical industry had virtually eliminated all waste materials. All the principles of generating electricity had been worked out in the 19th century, but by its end these had only just begun to produce electricity on a large scale.
The 20th century witnessed a colossal expansion of electrical power generation and distribution. The general pattern has been toward ever-larger units of production, using steam from coal- or oil-fired boilers.
Our energy analysis service portfolio goes beyond monitoring to provide diagnostics and analysis that improve and maintain desired energy performance. The major features we provide are as follows. Energy Analysis: Energy analysis allows data to be presented in a wide range of graphs and tables. Consumption analysis of fuel, water, heat, and electrical sources will be conducted.
Track and trend changes in energy usage over time. Compare current power utilization and costs against optimal configurations. Obtain the information needed to qualify for power company or government rebates and incentives.
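As a rough illustration of this trend tracking, the sketch below computes month-over-month changes in usage and combines readings with a tariff to estimate costs. All names, readings, and the tariff value are hypothetical, not part of the actual service:

```python
from statistics import mean

def usage_trend(monthly_kwh):
    """Return month-over-month percentage changes in energy usage.

    monthly_kwh: list of consumption readings, oldest first.
    """
    return [
        round(100.0 * (curr - prev) / prev, 1)
        for prev, curr in zip(monthly_kwh, monthly_kwh[1:])
    ]

def average_monthly_cost(monthly_kwh, tariff_per_kwh):
    """Combine utilization with a tariff to estimate average monthly cost."""
    return mean(monthly_kwh) * tariff_per_kwh

readings = [12000, 12600, 11400]            # kWh per month, hypothetical
print(usage_trend(readings))                 # → [5.0, -9.5]
print(average_monthly_cost(readings, 0.12))  # hypothetical $/kWh tariff
```

A real deployment would pull these readings from metered data rather than a hard-coded list.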
This leads to cost optimization. Benchmark Analysis: This feature allows facilities to be benchmarked against targets established by applying local and international standards. The output of this analysis includes recommendations that, when applied, can result in better alignment with targets and associated cost reductions.
Progress can be measured as often as daily. This helps identify the operational effectiveness of the cooling system and fine-tune systems for optimized performance. Base Load Analysis: Base load analysis compares the base load and the active load separately. We use base load analysis to understand whether energy is being used outside of the operating period or occupancy time, identify when energy is used most and whether unnecessary energy is being used during off-peak hours, and drill down to understand unusual energy usage and suggest recommendations.
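The base-versus-active split described above can be sketched as follows. The hour boundaries, demand values, and the 1.5x flag threshold are all assumptions for illustration, not values from the service:

```python
def base_load_report(hourly_kw, occupied_hours):
    """Split hourly demand readings into base (off-hours) and active
    (occupied-hours) load, and flag unusual off-hours usage.

    hourly_kw: dict of hour-of-day (0-23) -> average kW demand
    occupied_hours: set of hours considered the operating period
    """
    base = [kw for h, kw in hourly_kw.items() if h not in occupied_hours]
    active = [kw for h, kw in hourly_kw.items() if h in occupied_hours]
    base_avg = sum(base) / len(base)
    active_avg = sum(active) / len(active)
    # Off-peak hours drawing well above the base-load average are
    # candidates for drill-down (the 1.5x threshold is an assumption).
    unusual = [h for h, kw in hourly_kw.items()
               if h not in occupied_hours and kw > 1.5 * base_avg]
    return base_avg, active_avg, unusual

demand = {h: 40 for h in range(24)}            # hypothetical 40 kW base load
demand.update({h: 180 for h in range(8, 18)})  # occupied hours 08:00-18:00
demand[2] = 90                                 # unusual night-time spike
print(base_load_report(demand, set(range(8, 18))))
```

Hour 2 is flagged here because its 90 kW draw falls outside occupancy yet well above the base-load average.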
Base Parameter Analysis (Boilers): Base parameter analysis compares the base operating parameters (boiler temperature, air flow, water flow, and fuel and electricity consumption) against the actual data. Reports: Pure diagnostic and analytical reports are part of this service. Each report discusses the findings, the reasons, and improvement measures wherever applicable. Dashboards: We have dashboards for different types of users, each with a different profile. The dashboards combine historical and real-time data and provide single-window information on the entire set of assets.
A user can view data across parameters and systems and easily retrieve previously viewed data at the click of a button. Chiller Analysis: Chilled-water-based cooling systems are frequently used to air-condition large office buildings or campuses that encompass multiple buildings. They represent a large investment in terms of initial cost, the physical space they occupy within the building, and energy and maintenance costs. Yet despite these fiscal and spatial impacts, many chiller plants do not reach their potential from the standpoint of energy efficiency.
The development was targeted at the needs of medium-to-large chilled water plants, such as those serving hospitals, shopping malls, and district cooling systems, for handy online performance assessment as well as an effective reporting mechanism for keeping the plant as close to the benchmark as possible. The energy performance of a chilled water plant is an important consideration when it comes to saving energy in building operation.
The chiller tool has a web-based, flexible framework that is capable of communicating with a large variety of plants and their equipment and performs various integrated expert data-mining and analysis routines to deliver important efficiency and diagnostic information to the user. The framework is broadly expected to deliver the following functionality. Two kinds of calculation are expected from the framework: primary and secondary. The primary calculation constitutes the first-hand treatment of incoming data within the hardware itself.
Parameters from the chiller, such as chilled water inlet and outlet temperatures, condenser water inlet and outlet temperatures, and chilled water and condenser water flow rates, are logged and treated via soft-coded algorithms in the hardware.
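A minimal sketch of one such primary calculation, deriving cooling capacity and the kW/ton efficiency metric from the logged flow and temperature parameters. It uses the common water-side rule of thumb tons = GPM x deltaT / 24; the function names and sample readings are illustrative, not the tool's actual algorithms:

```python
def cooling_tons(gpm, t_in_f, t_out_f):
    """Primary (first-hand) calculation: chiller cooling capacity in tons
    of refrigeration, from chilled-water flow (US GPM) and the inlet and
    outlet temperatures (degrees F), using tons = GPM * deltaT / 24.
    """
    return gpm * (t_in_f - t_out_f) / 24.0

def kw_per_ton(power_kw, tons):
    """Efficiency metric logged alongside the raw chiller parameters."""
    return power_kw / tons

tons = cooling_tons(960, 54.0, 44.0)  # 960 GPM at a 10 degF delta-T
print(tons)                            # → 400.0
print(kw_per_ton(240.0, tons))         # → 0.6
```

Values computed at this stage populate the database that the secondary calculations later draw on.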
The purpose of keeping these calculations on the hardware is to ensure that the database populated from these values holds all the data necessary to perform the secondary, more advanced calculations. The secondary calculation routine can be considered the brain of the software. This calculation class includes various subroutines, such as the DOE grey-box modeling approach, integrated part load value (IPLV), and statistical drift-identification techniques. They are empirical in the sense that the polynomial structure is not based on physical relationships.
However, the model is somewhat of a grey-box model, since the final power prediction of the chiller is based on physically meaningful quantities obtained from the polynomial curves. The model assumes that the evaporator and condenser water flow rates remain constant. The first curve describes how the cooling capacity of the chiller varies at different evaporator and condenser water temperatures, in comparison to the cooling capacity at reference conditions. The reference conditions can be any temperatures, so long as they are used consistently.
Q_ref and P_ref are the full-load cooling capacity and power consumption at the reference chilled-water and condenser-water temperatures. The second curve describes how the full-load efficiency, defined as power consumption in kW per ton of cooling, varies with water temperatures. This is also a dimensionless term. The third curve describes how the power consumption varies at part-load conditions; this is again a dimensionless term. The standard also defines a single-number part-load efficiency figure of merit, calculated by the method it describes but referenced to conditions other than the IPLV conditions.
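The three-curve structure and the IPLV figure of merit can be sketched as below. The curve shapes follow the DOE-2-style formulation and the harmonic AHRI 550/590 load weightings (1/42/45/12 % at 100/75/50/25 % load); the function names, coefficients, and sample efficiencies are illustrative assumptions, not the actual DOE or AHRI implementations:

```python
def biquadratic(c, t_chw, t_cnd):
    """Evaluate one empirical biquadratic curve in the two water temps."""
    return (c[0] + c[1] * t_chw + c[2] * t_chw ** 2
            + c[3] * t_cnd + c[4] * t_cnd ** 2 + c[5] * t_chw * t_cnd)

def chiller_power(p_ref, capft, eirft, eirfplr):
    """Grey-box power prediction: each argument after p_ref is the
    dimensionless value of one curve (available capacity, full-load
    efficiency, part-load fraction), relative to reference conditions."""
    return p_ref * capft * eirft * eirfplr

def iplv_kw_per_ton(a, b, c, d):
    """Single-number part-load value for kW/ton efficiencies measured
    at 100/75/50/25 % load, using harmonic AHRI-style weighting."""
    return 1.0 / (0.01 / a + 0.42 / b + 0.45 / c + 0.12 / d)

# At the reference temperatures every curve evaluates to 1.0, so the
# prediction reduces to the reference power draw.
print(chiller_power(300.0, 1.0, 1.0, 1.0))              # → 300.0
print(round(iplv_kw_per_ton(0.60, 0.55, 0.52, 0.56), 3))
```

In practice the three dimensionless factors are obtained by fitting the biquadratic curves to logged plant data before they are passed to the power prediction.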
Drift and Clustering Analysis: The metrics and figures that come out of monitoring and measurement over a specified period are sometimes unrevealing if viewed as time-independent figures, whereas analyzing the trend gives better insight into the building's behaviour. Since it is highly recommended to analyze the various process variables and the calculated or derived variables with respect to time, various statistical methods have been devised to quantitatively assess the change in a parameter over time, either between two similar systems or between two separate durations.
Methods such as drift analysis, statistical deviation, and regression analysis are among the analysis tools we have incorporated into the framework.
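A minimal sketch of one such drift check, flagging efficiency readings that deviate from a baseline period by more than a z-score threshold. The threshold, the kW/ton samples, and the function name are assumptions for illustration:

```python
from statistics import mean, stdev

def drift_points(baseline, current, z_threshold=2.0):
    """Flag indices in `current` whose value deviates from the baseline
    mean by more than z_threshold baseline standard deviations."""
    mu, sigma = mean(baseline), stdev(baseline)
    return [i for i, x in enumerate(current)
            if abs(x - mu) > z_threshold * sigma]

baseline_kw_per_ton = [0.61, 0.60, 0.62, 0.59, 0.60, 0.61]
current_kw_per_ton = [0.60, 0.62, 0.71, 0.61]  # 0.71 suggests drift
print(drift_points(baseline_kw_per_ton, current_kw_per_ton))  # → [2]
```

The same comparison works between two similar chillers, or between two durations on the same plant, as the text describes.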
Inbuilt Diagnosing Capabilities: A very important, but often ignored, aspect of a data acquisition and reporting framework is its iterative diagnostic capability. We get close to a fault by iteratively plotting variables one after another and observing each variable's pattern to converge on a hypothesis. This diagnostic process has to be more flexible and, at the same time, more standardized in how it evaluates faults. This functionality is currently being implemented in the software and will assist users in diagnosing faults with only moderate knowledge of system behaviour.