Science Fair Project Encyclopedia
- See Steel (disambiguation) for other uses.
Steel is a metal alloy whose major component is iron, with carbon being the primary alloying material. Carbon acts as a hardening agent, preventing iron atoms, which are naturally arranged in a lattice, from sliding past one another. Varying the amount of carbon and its distribution in the alloy controls the qualities of the resulting steel. Steel with increased carbon content can be made harder and stronger than iron, but is also more brittle. One classical definition is that steels are iron-carbon alloys with up to about 2.1 percent carbon; alloys with higher carbon content than this are known as cast iron.
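These carbon-content boundaries can be summarized in a short sketch. The cutoff values below (roughly 0.05% for nearly carbon-free wrought iron, about 2.1% for the steel/cast iron boundary, 6.67% for pure cementite) are rounded, illustrative figures, and the helper function is hypothetical:

```python
# Rough classification of iron-carbon alloys by carbon content (wt%).
# The cutoffs (0.05% for wrought iron, 2.1% for the steel/cast iron
# boundary, 6.67% for pure Fe3C) are rounded, illustrative values.
def classify_iron_carbon_alloy(carbon_wt_pct: float) -> str:
    if carbon_wt_pct < 0.0 or carbon_wt_pct > 6.67:
        raise ValueError("carbon content outside the plausible range")
    if carbon_wt_pct < 0.05:
        return "wrought iron"
    if carbon_wt_pct <= 2.1:
        return "steel"
    return "cast iron"

print(classify_iron_carbon_alloy(0.8))  # prints "steel"
print(classify_iron_carbon_alloy(4.3))  # prints "cast iron"
```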
Currently there are several classes of steels in which carbon is replaced with other alloying materials, and carbon, if present, is undesired. A more recent definition is that steels are iron-based alloys that can be plastically formed (pounded, rolled, etc.).
Iron and steel
Iron, like most metals, is not found in the Earth's crust in a native state. Since the rise of the cyanobacteria and their excretion of oxygen into the atmosphere, iron can be found only in oxide form, typically Fe2O3, the form of iron oxide found as the mineral hematite. Iron oxide is a soft sandstone-like material with limited uses on its own. Iron is extracted from ore by removing the oxygen, combining it with a preferred chemical partner such as carbon. This process, known as smelting, was first applied to metals with lower melting points: copper melts at just over 1000 °C and tin at about 230 °C, temperatures that could be reached with ancient methods in use for at least 6000 years (since the Bronze Age), while iron melts at about 1540 °C, beyond the reach of early furnaces. Since the oxidation rate itself increases rapidly beyond 800 °C, it is important that smelting take place in a fairly oxygen-free environment. Unlike copper and tin, liquid iron dissolves carbon quite readily, so that smelting results in an alloy containing too much carbon to be called steel.
Even in the narrow range of concentrations that make up steel, mixtures of carbon and iron can form into a number of different structures, or allotropes, with very different properties; understanding these is essential to making quality steel. At room temperature, the most stable form of iron is the body-centered cubic structure ferrite or α-iron, a fairly soft metallic material that can dissolve only a small concentration of carbon (no more than 0.021 wt% at 910 °C). Above 910 °C, ferrite undergoes a phase transition from body-centered cubic to a face-centered cubic configuration, called austenite or γ-iron, which is similarly soft and metallic but can dissolve considerably more carbon (as much as 2.04 wt% at 1146 °C). As carbon-rich austenite cools, the mixture attempts to revert to the ferrite phase, leaving an excess of carbon. One way for the carbon to leave the austenite is for cementite to precipitate out of the mix, leaving behind iron pure enough to take the form of ferrite and resulting in a cementite-ferrite mixture. Cementite is a stoichiometric phase with the chemical formula Fe3C; it forms in regions of higher carbon content while other areas revert to ferrite around it. Self-reinforcing patterns often emerge during this process, leading to a patterned layering known as pearlite, due to its pearl-like appearance, or the similar but less beautiful bainite.
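The ferrite-cementite split described above can be made quantitative with the lever rule. The following is a minimal sketch, assuming a carbon solubility of about 0.02 wt% in ferrite, 6.67 wt% carbon in cementite (from the Fe3C stoichiometry), and a fully pearlitic composition of about 0.77 wt% carbon; these reference values are not given in the text:

```python
# Lever-rule sketch: mass fractions of ferrite and cementite in a slowly
# cooled ferrite-cementite mixture such as pearlite. Assumed values:
# ferrite holds ~0.02 wt% carbon, cementite (Fe3C) is 6.67 wt% carbon.
C_FERRITE = 0.02    # wt% carbon soluble in ferrite (approximate)
C_CEMENTITE = 6.67  # wt% carbon in Fe3C (from stoichiometry)

def ferrite_fraction(overall_c: float) -> float:
    """Mass fraction of ferrite in a ferrite + cementite mixture."""
    return (C_CEMENTITE - overall_c) / (C_CEMENTITE - C_FERRITE)

# A roughly eutectoid steel (~0.77 wt% C) works out to about
# 89% ferrite and 11% cementite by mass:
print(round(ferrite_fraction(0.77), 2))  # prints 0.89
```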
Perhaps the most important allotrope is martensite, a chemically metastable substance with about four to five times the strength of ferrite. Martensite has the same chemical composition as austenite, and its unit cell is a distorted, body-centered tetragonal version of ferrite's; because the transformation from austenite is diffusionless, it requires extremely little thermal activation energy to occur.
The heat treatment process for most steels involves heating the alloy until austenite forms, then quenching the hot metal in water or oil, cooling it so rapidly that the transformation to ferrite or pearlite does not have time to take place. The transformation into martensite, by contrast, occurs almost immediately, due to its lower activation energy.
Martensite has a lower density than austenite, so the transformation between them results in a change of volume; in this case, expansion occurs. Internal stresses from this expansion generally take the form of compression on the crystals of martensite and tension on the remaining ferrite, with a fair amount of shear on both constituents. If quenching is done improperly, these internal stresses can cause a part to shatter as it cools; at the very least, they cause internal work hardening and other microscopic imperfections.
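The size of this expansion can be roughed out from the densities. In the back-of-the-envelope sketch below, the density values are assumed, carbon-dependent approximations rather than figures from the text:

```python
# Illustrative estimate of the volume change when austenite transforms
# to martensite. The densities are rough assumed values in g/cm^3;
# real values depend on carbon content.
RHO_AUSTENITE = 8.03
RHO_MARTENSITE = 7.75

# For a fixed mass, volume scales inversely with density, so the
# relative volume change is the ratio of densities minus one.
expansion = RHO_AUSTENITE / RHO_MARTENSITE - 1.0
print(f"volume expansion: {expansion:.1%}")  # prints "volume expansion: 3.6%"
```

A few percent of expansion, arriving unevenly as different regions transform at different moments, is enough to account for the internal stresses described above.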
At this point, if its carbon content is high enough to produce a significant concentration of martensite, the metal resembles spring steel: extremely hard, but very brittle. Often, steel undergoes further heat treatment at a lower temperature to destroy some of the martensite (by allowing enough time for cementite, etc., to form) and help settle the internal stresses and defects. This softens the steel, producing a more ductile and fracture-resistant metal. Because time is so critical to the end result, this process is known as tempering, source of the term tempered steel.
Other materials are often added to the iron-carbon mixture to tailor the resulting properties. Nickel in steel adds to the tensile strength and makes austenite more chemically stable, chromium increases the hardness, and vanadium also increases the hardness while reducing the effects of metal fatigue. Large amounts of chromium and nickel (often 18% and 8%, respectively) are added to stainless steel so that a hard oxide forms on the metal surface to inhibit corrosion. Tungsten interferes with the formation of cementite, allowing martensite to form at slower quench rates, resulting in high speed steel. On the other hand, sulfur, nitrogen, and phosphorus make steel more brittle, so these commonly found elements must be removed from the ore during processing.
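As a toy illustration of the stainless recipe just mentioned, the helper below checks a composition against the 18/8 chromium-nickel figures. The 10.5% chromium floor commonly used to define stainless steel, and both function names, are assumptions for this sketch:

```python
# Toy composition checks for the "18-8" stainless recipe (about 18%
# chromium and 8% nickel). The 10.5% chromium minimum is the commonly
# quoted threshold for calling a steel "stainless" (an assumed figure).
def is_stainless(cr_pct: float, ni_pct: float = 0.0) -> bool:
    """True if chromium content meets the assumed stainless threshold."""
    return cr_pct >= 10.5

def is_18_8(cr_pct: float, ni_pct: float, tol: float = 1.0) -> bool:
    """True if the composition is within `tol` points of 18% Cr / 8% Ni."""
    return abs(cr_pct - 18.0) <= tol and abs(ni_pct - 8.0) <= tol

print(is_stainless(18.0, 8.0), is_18_8(18.0, 8.0))  # prints "True True"
```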
When iron is smelted from its ore by commercial processes, it contains more carbon than is desirable. To become steel, it must be melted and re-processed to remove the correct amount of carbon, at which point other elements can be added. Once this liquid is cast into ingots, it usually must be "worked" at high temperature to remove any cracks or poorly-mixed regions from the solidification process, and to produce shapes such as plate, sheet, wire, etc. It is then heat-treated to produce a desirable crystal structure, and often "cold worked" to produce the final shape. In modern steelmaking these processes are often combined, with ore going in one end of the assembly line and finished steel coming out the other. These can be streamlined by a deft control of the interaction between work hardening and tempering.
History of iron and steelmaking
Iron was in limited use long before it became possible to smelt it. About 6% of meteorites are composed of an iron-nickel alloy, and iron recovered from meteorite falls allowed ancient peoples to manufacture small numbers of iron artifacts. The name for iron in several ancient languages means "sky metal" or something similar. In distant antiquity, iron was regarded as a precious metal, suitable for royal ornaments. The Egyptian ruler Tutankhamun died in 1323 BC and was buried with an iron dagger with a golden hilt. A battle axe with an iron blade and a gold-decorated bronze haft found in the excavation of Ugarit has been dated to about 1400 BC. The early Hittites sold iron to Assyria for 40 times its weight in silver.
Meteoric iron was also fashioned into tools in pre-contact North America. Beginning around the year 1000, the Thule people of Greenland began making harpoons and other edged tools from pieces of the Cape York meteorite. These artifacts were also used as trade goods with other Arctic peoples: tools made from the Cape York meteorite have been found in archaeological sites more than 1000 miles (1600 km) away. When the American polar explorer Robert Peary shipped the largest piece of the meteorite to the American Museum of Natural History in New York City in 1897, it still weighed over 33 tons.
The iron age
The oldest known samples of iron that appear to have been smelted from iron oxides are small lumps found at copper-smelting sites on the Sinai Peninsula, dated to about 3000 BC. Some iron oxides are effective fluxes for copper smelting; it is possible that small amounts of metallic iron were made as a byproduct of copper and bronze production throughout the bronze age. In Anatolia, smelted iron was occasionally used for ornamental weapons: an iron-bladed dagger with a bronze hilt has been recovered from a Hattic tomb dating from 2500 BC.
Iron did not, however, replace bronze as the chief metal used for weapons and tools for several centuries. Working iron required more fuel and significantly more labor than working bronze, and the quality of iron produced by early smiths may have been inferior to bronze as a material for tools. Then, between 1200 and 1000 BC, iron tools and weapons displaced bronze ones throughout the near east. This process appears to have begun in Cyprus and southern Greece, where iron artifacts dominate the archaeological record after 1050 BC. Mesopotamia was fully into the iron age by 900 BC, central Europe by 800 BC. The reason for this sudden adoption of iron remains a topic of debate among archaeologists. One prominent theory is that warfare and mass migrations beginning around 1200 BC disrupted the regional tin trade, forcing a switch from bronze to iron. Egypt, on the other hand, did not experience such a rapid transition from the bronze to iron ages: although Egyptian smiths did produce iron artifacts, bronze remained in widespread use there until after Egypt's conquest by Assyria in 663 BC.
Iron smelting at this time was based on the bloomery, a furnace where bellows were used to force air through a pile of iron ore and burning charcoal. The carbon monoxide produced by the charcoal reduced the iron oxides to metallic iron, but the bloomery was not hot enough to melt the iron. Instead, the iron collected in the bottom of the furnace as a spongy mass, or bloom, whose pores were filled with ash and slag. The bloom then had to be reheated to soften the iron and melt the slag, and then repeatedly beaten and folded to force the molten slag out of it. The result of this time-consuming and laborious process was wrought iron, a malleable but fairly soft alloy containing little carbon.
Wrought iron can be carburized into a mild steel by holding it in a charcoal fire for prolonged periods of time. By the beginning of the iron age, smiths had discovered that iron that was repeatedly re-forged produced a higher quality of metal. Quench-hardening was also known by this time. The oldest quench-hardened steel artifact is a knife found on Cyprus at a site dated to 1100 BC.
Developments in China
Archaeologists and historians debate whether bloomery-based ironworking ever spread to China from the West. Around 500 BC, however, metalworkers in the southern state of Wu developed an iron smelting technology that would not be practiced in Europe until late medieval times. In Wu, iron smelters achieved a temperature of 1130 °C, hot enough for the furnace to be considered a blast furnace. At this temperature, iron combines with 4.3% carbon and melts. As a liquid, iron can be cast into molds, a method far less laborious than individually forging each piece of iron from a bloom.
Cast iron is rather brittle and unsuitable for striking implements. It can, however, be decarburized to steel or wrought iron by heating it in air for several days. In China, these ironworking methods spread northward, and by 300 BC, iron was the material of choice throughout China for most tools and weapons. A mass grave in Hebei province, dated to the early third century BC, contains several soldiers buried with their weapons and other equipment. The artifacts recovered from this grave are variously made of wrought iron, cast iron, malleabilized cast iron, and quench-hardened steel, with only a few, probably ornamental, bronze weapons.
During the Han Dynasty (202 BC - AD 220), Chinese ironworking achieved a scale and sophistication not reached in the West until the eighteenth century. In the first century, the Han government established ironworking as a state monopoly and built a series of large blast furnaces in Henan province, each capable of producing several tons of iron per day. By this time, Chinese metallurgists had discovered how to puddle molten pig iron, stirring it in the open air until it lost its carbon and became wrought iron. (In Chinese, the process was called chao, literally, stir-frying.)
Also during this time, Chinese metallurgists had found that wrought iron and cast iron could be melted together to yield an alloy of intermediate carbon content, that is, steel. According to legend, the sword of Liu Bang, the first Han emperor, was made in this fashion. Some texts of the era mention "harmonizing the hard and the soft" in the context of ironworking; the phrase may refer to this process.
Perhaps as early as 300 BC, although certainly by AD 200, high quality steel was being produced in southern India by what Europeans would later call the crucible technique. In this system, high-purity wrought iron, charcoal, and glass were mixed in crucibles and heated until the iron melted and absorbed the carbon. The resulting high-carbon steel, called pulad in Arabic and wootz by later Europeans, was exported throughout much of Asia.
By the 9th century, smiths in the Abbasid caliphate had developed techniques for forging wootz to produce steel blades of unusual flexibility and sharpness (Damascus steel). The secret of forging this kind of steel was lost, even in the Middle East, by around 1600, and only recently have metallurgists found methods for reproducing its properties.
Ironworking in medieval Europe
The middle ages in Europe saw the construction of progressively larger bloomeries. By the 8th century, smiths in northern Spain had developed a style that became known as the Catalan forge, a furnace about 1 meter (3 feet) tall, capable of smelting up to 150 kg (350 lb) of iron in each batch. In succeeding centuries, smiths in the Frankish empire and later the Holy Roman Empire scaled up this basic design, increasing the height of the flue to as much as 5 meters (16 feet) and smelting as much as 350 kg (750 lb) of iron in each batch. These larger furnaces required more draft than human power could provide, and forging the large blooms that resulted was also beyond the capabilities of a single man. To this end, waterwheels were employed to power the bellows and hammers.
Eventually, the scaling up of the bloomery reached a point where the furnace was hot enough to produce cast iron. Although cast iron may initially have been a nuisance to the smith, as it was too brittle to be forged, the spread of the cannon to Europe in the 1300s provided an application for iron casting: cast iron cannonballs.
The oldest known blast furnace in Europe was constructed at Lapphyttan in Sweden, sometime between 1150 and 1350. Other early European blast furnaces were built throughout the Rhine valley: blast furnaces were in operation near Liège (a city in modern-day Belgium) in the 1340s, and at Massevaux in France by 1409.
The first English blast furnace was not built until 1496, when Henry VII commissioned a new ironworks at Newcastle, in a part of Sussex known as the Weald. Despite this late start, the production of English iron foundries rapidly grew, in no small part due to foreign craftsmen hired by Henry to bring the craft of iron casting to England. In 1543, William Levett, a Wealden ironmaster, and Peter Baude, a French craftsman in Henry VIII's employ, cast the Weald's first one-piece iron cannon. English iron cannons gained a reputation for being superior to, and less expensive than, the bronze cannons made elsewhere in Europe, and at least initially, efforts to copy them outside the Weald failed. The superiority of English cannons over Spanish ones has been credited as one factor in England's 1588 defeat of the Spanish Armada.
In 1619, Jan Andries Moerbeck, a Dutch ironmaster, began importing Wealden iron ore for comparison to the ore available on the Continent. One difference he observed was that the English ore contained some calcareous material, and soon after, Dutch ironmasters introduced the use of limestone as a flux in the blast furnace. This practice improved the separation of slag from the cast iron and improved the quality of Continental cast iron.
Ironworking in early modern Europe
Also by the early 1600s, ironworkers in western Europe had found a means (called cementation) to carburize wrought iron without individually forging each piece. Wrought iron bars and charcoal were packed into stone boxes, then held at a red heat for up to a week. During this time, carbon diffused into the iron, producing a product called cement steel or blister steel.
For many years, the best steels could be produced only from expensive iron ore imported from Sweden. Although it was not understood at the time, Swedish ore had a very low phosphorus content compared to most ores (notably those in England), which allowed for a finer and stronger crystal structure. Sales of Swedish ore generated considerable trade income, and the associated local development helped the country become the industrial powerhouse it remains to this day.
By the 18th century, deforestation in western Europe was making ironworking and its charcoal-hungry processes increasingly expensive. In 1709 Abraham Darby began smelting iron using coke, a refined coal product, in place of charcoal at his ironworks at Coalbrookdale in England. Although coke could be produced less expensively than charcoal, coke-fired iron was initially of inferior quality compared to charcoal-fired iron. It was not until the 1750s, when Darby's son refined the coking process to reduce the amount of sulfur in the coke, that coke-fired furnaces became widespread.
Another 18th-century European development was the re-invention of the puddling furnace. In particular, the form of coal-fired puddling furnace developed by the British engineer Henry Cort in 1784 made it possible to convert cast iron into wrought iron in large batches, finally rendering the ancient bloomery obsolete. Wrought iron produced using this method became a major metal in the English midlands' emerging toy industry. The combination of the blast furnace and the puddling furnace allowed iron to be produced at either end of the carbon spectrum, depending on the user's needs.
As for alloys of intermediate carbon content (that is, steel), crucible steel was rediscovered in the 1740s by Benjamin Huntsman in Handsworth in England. In his process, wrought iron and cast iron were heated in small ceramic crucibles, melting together to form steel. While producing steel superior to cement steel, the crucible process remained relatively expensive in both time and fuel, and could not be used on anything like a modern industrial scale. The strong steels produced were, however, in high demand for specialty products such as cutlery and weapons. Sheffield's Abbeydale Industrial Hamlet has preserved a water-wheel-powered, scythe-making works dating from Huntsman's time. It is still operated for the public, several times per year, using crucible steel made on the Abbeydale site.
The problem of mass-producing steel was solved in 1856 by Henry Bessemer, with the introduction of the Bessemer converter at his steelworks in Sheffield, England. (An early converter can still be seen at the city's Kelham Island Museum). In the Bessemer process, molten pig iron from the blast furnace was charged into a large crucible, and then air was blown through the molten iron from below, igniting the dissolved carbon. As the carbon burned off, the melting point of the mixture increased, but the heat from the burning carbon provided the extra energy needed to keep the mixture molten. After the carbon content in the melt had dropped to the desired level, the air draft was cut off: a typical Bessemer converter could convert a 25-ton batch of pig iron to steel in half an hour.
In 1867, the German-British engineer William Siemens introduced an improved puddling furnace that used brick heat exchangers to preheat the incoming air and conserve fuel. The next year, Pierre and Émile Martin, French ironmasters who had licensed Siemens' furnace design, developed a method for measuring the carbon content of molten iron. Thus, decarburization could be stopped at the steel stage rather than proceeding all the way to wrought iron. This open-hearth process coexisted in industrial practice with the Bessemer process for many years, but eventually proved more economical and displaced it. Reasons for this include its ability to recycle scrap metal in addition to fresh pig iron, its greater scalability (up to hundreds of tons per batch, compared to tens of tons for the Bessemer process), and the more precise quality control it permitted.
Initially, only ores low in phosphorus and sulfur could be used for quality steelmaking; ores rich in those elements yielded brittle metals little better than cast iron. This problem was solved in 1878 by Percy Carlyle Gilchrist and his cousin Sidney Gilchrist Thomas at the ironworks at Blaenavon in Wales. Their modified Bessemer process used a converter lined with limestone or dolomite, and additional lime was added to the molten metal as a flux. This added basic material removed phosphorus and sulfur from the steel as insoluble calcium or magnesium phosphates and sulfates. This development expanded the range of iron ores that could be used to make steel, especially in France and Germany, where high-phosphorus ores abounded.
These developments increased the availability and decreased the price of steel; 22 thousand tonnes were produced in 1867, 500 thousand in 1870, 1 million in 1880 and 28 million by 1900. Today, worldwide annual production is around 850 million tonnes. This widespread availability of inexpensive steel powered the industrial revolution and modern society as we know it. It also led to the introduction of newer "niche" steels (such as stainless steel), all of them dependent on the wide availability of inexpensive iron and steel and the ability to alloy it at will.
Types of steel
Alloy steels were known from antiquity: nickel-rich iron from meteorites was hot-worked into useful items. Damascus blades, famous as the weapons the Saracens wielded against the crusaders, were probably made by mating smelted iron wire with wire obtained from meteorites, then heating and working the bundle to impart the properties of the expensive "star metal" to cheaper wrought iron: an early attempt at alloying.
In a modern sense, alloy steels have been made since the advent of furnaces capable of melting iron, into which other metals may be thrown and mixed.
- Carbon steel
- Damascus steel, which was famous in ancient times for its flexibility, was created from a number of different materials (some only in traces); it was essentially a complicated alloy with iron as its main component.
- Stainless steels and surgical stainless steels contain a minimum of 10.5% chromium, often combined with nickel, to resist corrosion (rust). Some stainless steels are non-magnetic.
- Tool steels
- HSLA Steel (High Strength, Low Alloy)
- Advanced High Strength Steels
- Ferrous superalloys
- Crucible technique - the original steelmaking technique, developed in India as wootz, used in the Middle East as Damascus steel, and independently redeveloped in Sheffield by Benjamin Huntsman in 1740 and by Pavel Anosov in Russia in 1837.
- Bessemer process, the first commercial scale steel production process
- Open hearth furnace
- Basic oxygen steelmaking
- Electric arc furnace, a form of secondary steelmaking from scrap, though the process can also use direct-reduced iron
The contents of this article are licensed from www.wikipedia.org under the GNU Free Documentation License.