The fundamental element of a circuit is called a transistor -- a tiny on-off switch that governs the flow of electrical current. Transistors are linked in complicated cascades called logic circuits in which the number 1 represents flowing electrical current and 0 represents no current. Those transistors work together to ensure you can quickly pull up your Instagram app for a well-timed selfie.
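The idea that 1s and 0s emerge from cascades of on-off switches can be sketched in a few lines of code. This is a hedged, highly idealized model, not a real circuit simulation: each transistor is treated as a perfect switch, and the function names are ours, not from the article.

```python
# Idealized sketch: model transistors as on-off switches and build a
# CMOS-style NAND gate, the basic kind of logic circuit described above.

def nand(a: int, b: int) -> int:
    """Output 1 (current flows) unless both inputs are 1."""
    # Pull-down network: two n-type switches in series connect the
    # output to ground only when both inputs are 1.
    pull_down = (a == 1) and (b == 1)
    # Pull-up network: two p-type switches in parallel connect the
    # output to the supply voltage when either input is 0.
    return 0 if pull_down else 1

# Any logic function can be cascaded from NAND gates -- for example:
def not_(a: int) -> int:
    return nand(a, a)

def and_(a: int, b: int) -> int:
    return not_(nand(a, b))

for a in (0, 1):
    for b in (0, 1):
        print(f"a={a} b={b}  NAND={nand(a, b)}  AND={and_(a, b)}")
```

Because NAND is "universal," chaining enough of these switches together really can build anything from an adder to an Instagram-capable processor.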
Frank Wanlass of Fairchild Semiconductor applied for a patent on the CMOS transistor in 1963 and received it in 1967. Transistors are tiny switches that conduct electricity from a source to a drain, but only if a gate in between activates that flow. The same basic design, vastly smaller, is still used in today's computer processors.
Conceptually, the transistor has changed little since Frank Wanlass filed his patent in 1963. But physically, it's changed dramatically -- shrinking so much that Intel's Xeon server chips, released in 2014, are packed with 4.3 billion transistors.
It's the result of Moore's Law, the steady cadence of chip improvement first observed in 1965 by Intel co-founder Gordon Moore, who noted that the number of on-chip transistors doubles, on average, every two years.
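The doubling cadence is easy to see with back-of-the-envelope arithmetic. Starting from the article's figure of 4.3 billion transistors on a 2014 Xeon, a naive projection assuming a clean doubling every two years looks like this (the starting count comes from the article; the projection is just illustration, not a real roadmap):

```python
# Project transistor counts forward, assuming Moore's Law holds exactly:
# a doubling every two years from the 2014 Xeon's 4.3 billion.

transistors = 4.3e9  # Intel Xeon, 2014 (figure from the article)
year = 2014
for _ in range(5):   # look one decade ahead
    year += 2
    transistors *= 2
    print(f"{year}: ~{transistors / 1e9:.0f} billion transistors")
```

Five doublings take the count past 130 billion by 2024 -- which is exactly why the atom-scale limits described below matter.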
The problem is that in a decade or so, transistors won't be able to shrink further because their components will be only a few atoms in size. You can't make things out of half atoms.
Intel and Samsung today use manufacturing processes so fine that more than 10,000 transistors could fit on the side of a red blood cell, which is about 7,000 nanometers in diameter. By comparison, a sheet of paper or a human hair is about 100,000 nanometers thick.
Researchers at UCLA envision tiny transistors made using atom-thick sheets of carbon called graphene, shown here with the hexagonal patterns. Graphene-based chips will pose challenges, though: the material conducts electrical current well but doesn't mirror silicon's semiconductor properties.
Today's chips are made from silicon wafers 300mm (12 inches) in diameter and less than 1mm thick. Each circular slice of silicon crystal is transformed by many steps -- layered with coatings, zapped with carefully patterned light, bathed in solvents, implanted with electrically charged atoms called ions -- until it houses an array of identical rectangular chips. Through careful cutting, the wafer is diced up into individual chips.
Why start with a circular wafer if you're making rectangular chips? Because it's easier to grow the near-perfect silicon crystals in a cylindrical shape, and the cylinder is sliced into the wafers.
Silicon falls into what the chip industry calls group IV of the periodic table of the elements. One way to keep pushing progress will involve elements drawn from columns to either side of the group IV column -- thus the term III-V materials, pronounced simply "three-five."
With III-V chip manufacturing, the basic silicon process stays the same -- but the silicon gets new elements layered on top. Those layers help electrons flow faster, which means less voltage is needed to get them moving. If the chips need less power, transistors can be smaller and switch faster.
One company betting its future on III-V materials is Efficient Power Conversion, a 34-person startup led by Chief Executive Alex Lidow. EPC already is seeing steady revenue growth from devices that incorporate a III-V layer made of gallium nitride (GaN). In 2016 or 2017 he expects to adapt the gallium nitride manufacturing process to work for the logic circuits that do the thinking in computer processors. Because of gallium nitride's electrical properties, "you immediately get a thousand times potential in improvement" over conventional silicon, he said.
IBM is investing big in exotic forms of carbon as a way to recraft chips. Graphene, for example, is a sheet of carbon atoms just a single atomic layer thick, arranged in a hexagonal array that looks like chicken-wire fencing. Another candidate is the carbon nanotube, which is like a tiny straw made from a rolled-up graphene sheet.
Both nanotubes and graphene present challenges. Nanotubes, for example, are 99.99 percent pure, but IBM needs to improve that purity by a factor of 10 or 100, IBM's Guha said.
Graphene is "the wonder material, but it's a lousy transistor," Intel's Mayberry said.
Spintronics is a more radical approach.
Conventional electronics process information based on electrons' negative charge. But the industry has long been interested in using an electron's spin -- conceptually akin to how a planet can rotate clockwise or counterclockwise on its axis -- to process information. You can't see an electron's spin, but you can influence and measure it with a magnetic field. Different spin directions can be represented by the 1s and 0s at the foundation of digital computation.
Spintronics' big potential advantage is energy efficiency -- an important edge because power consumption and heat limit how fast today's silicon chips can run.
Here, too, there are challenges. For example, a computer would use spintronics within its deepest interior but rely on traditional electronics further out to communicate with memory, drives and networks. Translating data and instructions between the two zones takes time.
Still, it could be useful in things like remote sensors that don't need fast processing but do need very low power consumption.
Quantum computing just might be the most mind-bending idea out there. The field explores physics phenomena at ultra-small distances that are profoundly different from what humans experience.
Here's one example of that weirdness. When we flip a coin, it lands as either heads or tails, which in computing terms is described by either 0 or 1. But quantum computers use "qubits" -- quantum bits -- that can be both 0 and 1 at the same time through a quantum mechanics concept called superposition.
Because qubits can represent data in multiple states at the same time, they can be used to explore multiple solutions to a problem at once. In other words, quantum computers can test lots of possibilities in parallel instead of testing one after another the way a conventional computer chip does. Each qubit added to a quantum computer doubles the number of solutions it can explore.
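The doubling is a direct consequence of how qubit registers are described mathematically: an n-qubit state needs 2^n amplitudes. A classical sketch of that bookkeeping -- just a simulation of the state vector, not a quantum computer, and the function name is ours -- looks like this:

```python
# Classical simulation of an n-qubit state vector, illustrating why
# each added qubit doubles the number of basis states in superposition.

import numpy as np

def uniform_superposition(n_qubits: int) -> np.ndarray:
    """Equal superposition over all 2**n basis states."""
    dim = 2 ** n_qubits
    # Every basis state gets the same amplitude; squared magnitudes
    # (the measurement probabilities) sum to 1.
    return np.full(dim, 1 / np.sqrt(dim), dtype=complex)

for n in (1, 2, 9):  # 9 matches the Google machine mentioned below
    state = uniform_superposition(n)
    print(f"{n} qubit(s) -> {len(state)} basis states in superposition")
```

Nine qubits already span 512 basis states at once, and the exponential growth is why even modest qubit counts interest researchers.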
Google's current quantum computers process information using nine qubits.
Quantum computers must be kept extraordinarily cold, though, to keep the qubits still enough to do their processing magic.
Silicon photonics could shuttle data around a computer faster, while re-engineered DNA might enable living cells to perform computation. That wouldn't replace a smartphone chip, but it could expand computing technology to new realms like medical diagnosis and treatment.
"Think of a flock of birds," said IBM's Guha. "When the lead bird tires, it moves to the back, and another bird takes the lead. Moore's Law has carried us fantastically the past 30 or 40 years. I'm not worried that the flock of birds won't keep going."
Ref: CNET