Digital circuits are electric circuits based on a number of discrete voltage levels. In most cases there are two voltage levels: one near zero volts and one at a higher level that depends on the supply voltage in use. These two levels are often represented as L and H.
The two levels are used to represent the binary integers or logic levels of 0 and 1. In active-high (positive) logic, L represents binary 0 and H represents binary 1. Active-low (negative) logic uses the reverse representation. It is usual to allow some tolerance in the voltage levels used; for example, 0 to 2 volts might represent logic 0, and 3 to 5 volts logic 1. A voltage between 2 and 3 volts would be invalid and would occur only in a fault condition or during a logic level transition, since most circuits are not purely resistive and therefore cannot change voltage levels instantly. However, few logic circuits can detect such a fault, and most will simply interpret the signal unpredictably as either a 0 or a 1.
Examples of binary logic levels:
| Technology | L voltage | H voltage | Notes |
|------------|-----------|-----------|-------|
| CMOS | 0 V to VCC/2 | VCC/2 to VCC | VCC = supply voltage |
| TTL | 0 V to 0.8 V | 2 V to VCC | VCC is 4.75 V to 5.25 V |
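The threshold behavior in the table above can be sketched in software. The sketch below uses the standard TTL input thresholds (0.8 V and 2.0 V); the function name is illustrative, not from any library:

```python
# Sketch: interpreting a measured voltage as a TTL logic level.
# Voltages in the forbidden band between the thresholds are invalid.
def ttl_logic_level(volts):
    if 0.0 <= volts <= 0.8:
        return 0      # valid logic low (L)
    if 2.0 <= volts <= 5.25:
        return 1      # valid logic high (H)
    return None       # forbidden band: undefined / transitional

print(ttl_logic_level(0.3))   # -> 0
print(ttl_logic_level(3.4))   # -> 1
print(ttl_logic_level(1.5))   # -> None
```

A real gate, of course, has no "None" output; an input in the forbidden band simply produces an unpredictable result.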
Electronic logic is often constructed from small electronic circuits called logic gates. Each logic gate represents a function of boolean logic. A logic gate is an arrangement of electrically controlled switches. The output is a current or voltage that can, in turn, control more logic gates. Logic gates are typically the implementation that uses the fewest transistors, so in large volumes they are the least expensive option. They are usually designed by engineers using electronic design automation software.
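Gate outputs feeding the inputs of other gates can be modeled as function composition. A minimal sketch (Python stands in for hardware here; a single NAND primitive is used because NAND is functionally complete):

```python
# Sketch: logic gates as Boolean functions; outputs feed other gates'
# inputs just as voltages do in hardware.
def NAND(a, b):
    return 0 if (a and b) else 1

# Every other gate can be composed from NAND alone:
def NOT(a):    return NAND(a, a)
def AND(a, b): return NOT(NAND(a, b))
def OR(a, b):  return NAND(NOT(a), NOT(b))
def XOR(a, b):
    n = NAND(a, b)
    return NAND(NAND(a, n), NAND(b, n))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", AND(a, b), OR(a, b), XOR(a, b))
```

This composability is what lets automated tools assemble very large systems from a small library of gate types.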
Another form of electronic logic is constructed from lookup tables, usually described as "programmable logic devices". These can perform all the same functions as machines based on logic gates, but a lookup table can be easily reprogrammed without changing the wiring, so a designer can often repair design errors in place. In small volume products, programmable logic devices are therefore often the preferred solution. They are usually designed by engineers using electronic design automation software.
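The reprogrammability is easy to see in a model. Below, a hypothetical 2-input lookup table is "programmed" simply by loading different table contents; the same structure computes AND or XOR with no change of wiring:

```python
# Sketch: a 2-input lookup table as used in programmable logic devices.
# The "program" is just the table contents.
def make_lut(truth_table):               # truth_table: 4 output bits
    def f(a, b):
        return truth_table[(a << 1) | b]  # the inputs form the address
    return f

and_gate = make_lut([0, 0, 0, 1])  # programmed as AND
xor_gate = make_lut([0, 1, 1, 0])  # same structure, reprogrammed as XOR

print(and_gate(1, 1), xor_gate(1, 1))  # -> 1 0
```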
When the volumes are medium to large, and the logic can be slow, or involves complex algorithms or sequences, often a small microcontroller is programmed to make an embedded system. These are usually programmed by software engineers.
When only one logic machine is needed, and its design is totally customized, as for a factory production line controller, the conventional solution is a programmable logic controller, or PLC. These are usually programmed by electricians, using ladder logic.
Structure of Digital Systems
Complicated digital systems, such as computers, are constructed from large numbers of small logic gates or lookup tables. How these are organized is crucial, because the most likely reason for a project to fail is that it is too complex to understand.
Most digital systems divide into "combinatorial systems" and "sequential systems". A combinatorial system always presents the same output when given the same inputs. It is basically a representation of a set of logic functions.
One of the easiest ways to design a combinatorial system is to simply have a memory containing a truth table. The inputs are fed into the address of the memory, and the data outputs of the memory become the outputs. To save money, truth table-style descriptions are often given to computer programs that automatically produce systems of logic gates or lookup tables that produce the desired outputs. Interestingly, the only known way to get a perfect optimization is to enumerate all possible designs. This means that many needed systems are too large to be perfectly optimized with known algorithms on real computers. Larger systems have problems with partitioning the logic system, or run out of memory or time. Most practical algorithms for optimizing large logic systems use algebraic manipulations or binary decision diagrams, and there are promising experiments with genetic algorithms and annealing optimizations.
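The memory-as-truth-table scheme can be sketched directly. Here a full adder (a standard three-input, two-output combinatorial circuit) is "built" by precomputing every input case into an 8-word memory; the input bits form the address:

```python
# Sketch: a combinatorial circuit as a memory holding its truth table.
# Inputs form the address; each stored word holds all the output bits.
memory = []
for addr in range(8):
    a, b, cin = (addr >> 2) & 1, (addr >> 1) & 1, addr & 1
    total = a + b + cin
    memory.append((total & 1, total >> 1))   # (sum, carry-out)

def full_adder(a, b, cin):
    return memory[(a << 2) | (b << 1) | cin]

print(full_adder(1, 1, 0))  # -> (0, 1)
```

Note that the memory grows exponentially with the number of inputs, which is one reason gate-level implementations are preferred for wide functions.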
Engineers use many methods to minimize logic functions, in order to reduce the complexity and expense of digital machines. The most widely used cost-reduction methods include truth tables, Karnaugh maps, and Boolean algebra.
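A minimization can always be checked by exhausting the truth table, which is the same check a Karnaugh map performs graphically. A small sketch, using the identity A·B + A·¬B = A as an example:

```python
# Sketch: verifying a Boolean-algebra simplification by exhaustive
# enumeration of the truth table.
def original(a, b):
    return (a and b) or (a and not b)   # A·B + A·not-B

def minimized(a, b):
    return a                            # simplifies to just A

assert all(original(a, b) == minimized(a, b)
           for a in (0, 1) for b in (0, 1))
print("A·B + A·¬B equals A for all inputs")
```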
A sequential system is a combinatorial system with some of the outputs fed back as inputs. This makes the digital machine perform a "sequence" of operations. The simplest sequential system is probably a flip-flop, a mechanism that stores a single bit.
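The feedback that makes a circuit sequential can be modeled as iterating the loop until it settles. A sketch of a set-reset (SR) latch built from two cross-coupled NOR gates, one common flip-flop-like element, follows; the fixed settle count is a modeling simplification:

```python
# Sketch: two cross-coupled NOR gates form an SR latch.  Its output
# depends on history, not just on the current inputs.
def nor(a, b):
    return 0 if (a or b) else 1

def sr_latch_step(s, r, q, q_bar):
    for _ in range(4):                 # iterate the feedback until stable
        q, q_bar = nor(r, q_bar), nor(s, q)
    return q, q_bar

q, q_bar = sr_latch_step(1, 0, 0, 1)   # set
print(q)                               # -> 1
q, q_bar = sr_latch_step(0, 0, q, q_bar)  # inputs idle: bit is remembered
print(q)                               # -> 1
q, q_bar = sr_latch_step(0, 1, q, q_bar)  # reset
print(q)                               # -> 0
```

The middle step is the point: with both inputs at 0, the output is determined by the stored state, which no combinatorial system can do.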
Sequential systems are often designed as state machines. In this way, engineers can design a system's gross behavior, and even test it in a simulation, without considering all the details of the logic functions.
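A state-machine specification can be written and simulated long before any gates are chosen. The sketch below is an illustrative Moore machine that raises its output after seeing two consecutive 1s on its input:

```python
# Sketch: a sequential system specified abstractly as a state machine.
TRANSITIONS = {
    ("start", 0): "start", ("start", 1): "one",
    ("one",   0): "start", ("one",   1): "two",
    ("two",   0): "start", ("two",   1): "two",
}
OUTPUT = {"start": 0, "one": 0, "two": 1}   # output depends on state only

def run(bits):
    state, outputs = "start", []
    for bit in bits:
        state = TRANSITIONS[(state, bit)]
        outputs.append(OUTPUT[state])
    return outputs

print(run([1, 1, 0, 1, 1, 1]))  # -> [0, 1, 0, 0, 1, 1]
```

Only after this gross behavior is validated would the states be encoded into flip-flops and the transition table turned into logic functions.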
Sequential systems divide into two further subcategories. "Synchronous" sequential systems change state all at once, when a "clock" signal changes state. "Asynchronous" sequential systems propagate changes whenever inputs change. Synchronous sequential systems are made of well-characterized asynchronous circuits such as flip-flops, that change only when the clock changes. Asynchronous systems are very hard to design, because all possible states, with all possible timings, must be considered.
Now (2005), almost all digital machines are synchronous, because synchronous designs are much easier to produce. However, asynchronous logic is thought to be superior, if it can be made to work, because its speed is not constrained by a clock chosen for the worst-case path; in principle, each part of the machine could run as fast as its own logic permitted.
In the 1980s, some researchers discovered that almost all synchronous digital machines could be converted to asynchronous designs by using first-in-first-out synchronization logic. In this scheme, the digital machine is characterized as a set of data flows. In each step of the flow, an asynchronous "synchronization circuit" determines when the outputs of that step are valid, and presents a "grab the data" signal to the stages that consume that step's outputs. It turns out that just a few relatively simple synchronization circuits are needed.
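The idea can be caricatured in software (this is an illustrative model, not the referenced research): each stage holds data only while its slot is occupied, and a result advances only when the downstream slot is empty, so no global clock is involved:

```python
# Sketch: a two-stage self-timed pipeline.  Stage 1 doubles its input,
# stage 2 adds one; data moves forward only when the next slot is free.
stages = [None, None]

def step(new_input=None):
    done = None
    if stages[1] is not None:                       # stage 2 releases
        done, stages[1] = stages[1] + 1, None
    if stages[0] is not None and stages[1] is None: # stage 1 advances
        stages[1], stages[0] = stages[0] * 2, None
    if new_input is not None and stages[0] is None: # accept new work
        stages[0] = new_input
    return done

results = [step(x) for x in [3, 4, 5]] + [step(), step()]
print([r for r in results if r is not None])  # -> [7, 9, 11]
```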
The most general-purpose data flow machine is a computer. This is basically an automatic binary abacus. The control unit of a computer is usually designed as a microprogram running on a microsequencer, and this controls the arithmetic logic unit.
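The microprogram idea can be sketched at a toy scale. Below, each micro-instruction just names an ALU operation and the "microsequencer" steps through the list; all the names are illustrative, and a real control unit would also sequence registers, memory, and branches:

```python
# Sketch: a control unit as a microprogram driving an ALU.
ALU = {
    "ADD": lambda a, b: a + b,
    "SUB": lambda a, b: a - b,
    "AND": lambda a, b: a & b,
}

def run_microprogram(program, a, b):
    acc = a
    for micro_op in program:      # the microsequencer's job
        acc = ALU[micro_op](acc, b)
    return acc

# compute ((10 + 3) - 3) & 3
print(run_microprogram(["ADD", "SUB", "AND"], 10, 3))  # -> 2
```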
Several figures of merit determine the practicality of a system of digital logic. Engineers have explored numerous electronic devices to get an ideal combination of speed, low cost and reliability.
The cost of a logic gate is crucial. In the 1930s, the earliest digital logic systems were constructed from telephone relays because these were inexpensive and relatively reliable. After that, engineers always used the cheapest available electronic switches that could still fulfill the requirements. The earliest integrated circuits were a happy accident. They were constructed not to save money, but to save weight, and permit a computer to fly a spacecraft. The first integrated circuit logic gates cost nearly $50 (in 1960 dollars, when an engineer earned $10,000/year). To everyone's surprise, by the time the circuits were mass-produced, they had become the least-expensive method of constructing digital logic. Improvements in this technology have driven all subsequent improvements in cost.
The "reliability" of a logic gate describes its mean time between failure (MTBF). Digital machines often need thousands, sometimes millions of logic gates. Also, most digital machines are "optimized" to reduce their cost. The result is that often, the failure of a single logic gate will cause a digital machine to stop working. Digital machines first became useful when the MTBF for a switch got above a few hundred hours. Even so, many of these machines had complex, well-rehearsed repair procedures, and would be nonfunctional for hours because a tube burned out or a moth got stuck in a relay. Modern transistorized integrated circuit logic gates have MTBFs of nearly a trillion (1x10^12) hours, and need them because they have so many logic gates.
The "fan out" describes how many logic inputs can be controlled by a single logic output. The minimum practical fan out is about five. Modern electronic logic using CMOS transistors for switches has fanouts near fifty, and can sometimes go much higher.
The "switching speed" describes how many times per second an inverter (an electronic representation of a "logical not" function) can change from true to false and back. Faster logic can accomplish more operations in less time. Digital logic first became useful when switching speeds got above fifty Hertz, because that was faster than a team of humans operating mechanical calculators. Modern electronic digital logic routinely switches at five billion (5x10^9) Hertz, and some laboratory systems switch at more than a trillion (1x10^12) Hertz.
It is possible to construct nonelectronic digital mechanisms. In principle, any technology capable of representing discrete states and representing logic operations could be used to build mechanical logic.
Hydraulic, pneumatic and mechanical versions of logic gates exist and are used in situations where electricity cannot be used. The first two types are considered under the heading of fluidics. One application of fluidic logic is in military hardware that is likely to be exposed to a nuclear electromagnetic pulse (nuclear EMP, or NEMP) that would destroy any electrical circuits.
Mechanical logic is frequently used in inexpensive controllers, such as those in washing machines. Famously, the first computer design, by Charles Babbage, used mechanical logic. Mechanical logic might also be used in very small computers that could be built by nanotechnology.
Logic systems can be constructed from diverse systems including optical, magnetic, chemical, biochemical and quantum systems. In each case, the desired logic function can be found in the interactions of the physical components. For example, if two particular enzymes are required to prevent the construction of a particular protein, this is the equivalent of a biological "NAND" gate. Logic circuits can even be constructed in computer games: David Deutsch (Q3A Electronics) constructed logic circuits for use in the game Quake 3 Arena. Although logic gates are not a built-in feature of the game, as a level designer he found a way of creating them, allowing complex triggering of game events.
Logic circuits can also process digital information without being organized as a computer. Such ad-hoc collections of gates are referred to as "random logic".
The discovery of superconductivity has enabled the development of Rapid Single Flux Quantum (RSFQ) circuit technology, which uses Josephson junctions instead of transistors. Most recently, attempts are being made to construct purely optical computing systems capable of processing digital information using nonlinear optical elements.
See also

Analog circuit | Boolean algebra | Circuit | CMOS | Combinatorial logic | Data strobe encoding | De Morgan's laws | Digital | Electrical network | Electronics | Field effect transistor | Finite state machine | Formal verification | Glitch | Ringing | Hardware description language | Instruction pipelining | Integrated circuit | Sequential logic | Logic analyzer | Logic gate | Microelectronics | Multiplexer | Multiplication ALU | Multivibrator | NMOS | Programmable logic device | Reconfigurable system | Register | Transistor | Transistor-transistor logic | Transparent latch | Ternary logic | Runt pulse | Transmission line | VHSIC
- Claude E. Shannon: applied Boolean algebra to the design of digital circuits.
- List of electrical Input/Output standards
The contents of this article are licensed from www.wikipedia.org under the GNU Free Documentation License.