Ovonic Unified Memory

This Electronics Engineering Seminar Topic deals with the following:

We are now living in a world driven by various electronic equipment. Semiconductors form the fundamental building blocks of the modern electronic world, providing the brains and the memory of products all around us, from washing machines to supercomputers. Semiconductors consist of arrays of transistors, with each transistor being a simple switch between electrical 0 and 1. Now often bundled together in their tens of millions, they form highly complex, intelligent, reliable semiconductor chips, which are small and cheap enough to proliferate into products all around us.

Identification of new materials has been, and still is, the primary means of developing next-generation semiconductors. For the past 30 years, relentless scaling of CMOS IC technology to smaller dimensions has enabled the continual introduction of complex microelectronic system functions. However, this trend is not likely to continue indefinitely beyond the semiconductor technology roadmap. As silicon technology approaches its material limit, and as we reach the end of the roadmap, an understanding of emerging research devices will be of foremost importance in identifying new materials to address the corresponding technological requirements.

If scaling is to continue to and below the 65nm node, alternatives to CMOS designs will be needed to provide a path to device scaling beyond the end of the roadmap. However, these emerging research technologies face an uphill technological challenge. For digital applications, the challenges include exponentially increasing leakage currents (gate, channel, and source/drain junctions), short-channel effects, and so on, while for analogue or RF applications, the challenges include sustained linearity, low noise figure, power-added efficiency, and transistor matching. One of the fundamental approaches to managing these challenges is to use new materials to build the next generation of transistors.

PRESENT MEMORY TECHNOLOGY SCENARIO
As stated, revisiting the memory technology field, long ruled by silicon technology, is of great importance. Digital memory is and has been a close companion of every technical advancement in information technology. The current memory technologies have significant limitations. DRAM is volatile and difficult to integrate. SRAM is costly and also volatile. Flash has slower writes and fewer write/erase cycles than the others. When these memories need to expand, they can do so only in two-dimensional space, so the required chip area grows; they do not allow stacking of one memory chip over another. Their storage capacities are also not enough to satisfy the exponentially increasing demand. Hence the industry is searching for a "Holy Grail" among future memory technologies that can provide a good solution. Next-generation memories offer tradeoffs between size and cost, which makes them strong candidates for development.
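
The tradeoffs above can be sketched as a small data structure. This is an illustrative summary only; the attribute values are rough generalizations, not figures from the text.

```python
# Rough, illustrative attributes of the memory technologies discussed above.
memories = {
    "DRAM":  {"volatile": True,  "writes": "fast", "endurance": "high"},
    "SRAM":  {"volatile": True,  "writes": "fast", "endurance": "high"},
    "Flash": {"volatile": False, "writes": "slow", "endurance": "limited"},
}

# A "next generation" candidate should keep data without power
# while still writing quickly -- none of the incumbents does both.
non_volatile = [name for name, attrs in memories.items()
                if not attrs["volatile"]]
print(non_volatile)
```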

EMERGING MEMORY TECHNOLOGIES
Many new memory technologies have been introduced since it became clear that semiconductor memory technology must be replaced, or updated by a successor, because scaling of semiconductor memory has reached its material limit. These memory technologies are referred to as "Next Generation Memories". Next Generation Memories satisfy all of the desirable attributes of memory; the most important among them is their ability to support expansion in three-dimensional space. Intel, the biggest maker of computer processors and also the largest maker of flash-memory chips, is trying to combine processing features with space requirements, and several next-generation memories are being studied from this perspective. They include MRAM, FeRAM, Polymer Memory, Ovonic Unified Memory, ETOX-4BPC, NRAM, etc. One or two of them will become the mainstream.


Optical Switching

This Electronics Engineering Seminar Topic deals with the following:

Explosive demand for information in the internet world is creating enormous needs for capacity expansion in next-generation telecommunication networks. Data-oriented network traffic is expected to double every year.
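
The scale of that growth claim is easy to check: doubling every year compounds quickly. A minimal sketch, assuming a five-year horizon (the horizon is our assumption, not from the text):

```python
# If traffic doubles every year, demand after n years is 2**n times today's.
todays_traffic = 1.0   # normalized to 1x today
years = 5              # assumed planning horizon

future_traffic = todays_traffic * 2 ** years
print(f"after {years} years: {future_traffic:.0f}x today's traffic")
```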

Optical networks are widely regarded as the ultimate solution to the bandwidth needs of future communication systems. Optical fiber links deployed between nodes are capable of carrying terabits of information, but electronic switching at the nodes limits the bandwidth of a network. Optical switches at the nodes will overcome this limitation. With their improved efficiency and lower costs, optical switches provide the key both to managing the new capacity of Dense Wavelength Division Multiplexing (DWDM) links and to gaining a competitive advantage in providing new bandwidth-hungry services. However, in an optically switched network the challenge lies in overcoming signal impairment and network-related parameters. Let us discuss the present status, advantages, challenges, and future trends of optical switches.

OPTICAL FIBERS
A fiber consists of a glass core and a surrounding layer called the cladding. The core and cladding have carefully chosen indices of refraction to ensure that the photons propagating in the core are always reflected at the interface with the cladding. The only way light can enter and escape is through the ends of the fiber. A transmitter, either a light-emitting diode or a laser, sends electronic data that have been converted to photons over the fiber at a wavelength between 1,200 and 1,600 nanometers.
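
The "carefully chosen indices" condition can be made concrete with Snell's law: light striking the core/cladding interface at an angle shallower than the critical angle is totally internally reflected. A quick sketch, using typical illustrative index values (not figures from the text):

```python
import math

# Step-index fiber: the core index is slightly higher than the cladding's.
n_core = 1.48   # assumed refractive index of the glass core
n_clad = 1.46   # assumed refractive index of the cladding

# Critical angle (measured from the normal to the interface):
# rays hitting the interface beyond this angle are fully reflected
# back into the core, so light is trapped until the fiber's end.
theta_c = math.degrees(math.asin(n_clad / n_core))
print(f"critical angle: {theta_c:.1f} degrees")
```

Because the two indices differ by only a percent or so, the critical angle is large, and only rays travelling nearly parallel to the fiber axis are guided.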

Today's fibers are pure enough that a light signal can travel for about 80 kilometers without amplification. But at some point the signal still needs to be boosted. Electronic amplifiers were replaced by stretches of fiber infused with ions of the rare-earth element erbium. When these erbium-doped fibers are zapped by a pump laser, the excited ions can revive a fading signal. They restore a signal without any optical-to-electronic conversion and can do so even for very high-speed signals carrying tens of gigabits per second. Most importantly, they can boost the power of many wavelengths simultaneously.
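
The 80 km figure can be sanity-checked with the standard attenuation arithmetic. Assuming a typical loss of about 0.2 dB/km for modern single-mode fiber (an assumption, not stated in the text):

```python
# Why a signal needs boosting after roughly 80 km of fiber.
loss_db_per_km = 0.2   # assumed typical single-mode fiber loss
distance_km = 80

total_loss_db = loss_db_per_km * distance_km       # accumulated loss in dB
power_fraction = 10 ** (-total_loss_db / 10)       # fraction of power left
print(f"{total_loss_db:.0f} dB loss -> "
      f"{power_fraction:.1%} of the launched power remains")
```

A 16 dB loss leaves only a few percent of the launched power, which is why an erbium-doped amplifier stage is placed at roughly this spacing.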

Now, to increase the information rate, as many wavelengths as possible are jammed down a fiber, with each wavelength carrying as much data as possible. The technology that does this has a name, dense wavelength division multiplexing (DWDM), that is a paragon of technospeak.
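
The DWDM arithmetic is simply channels times per-channel rate. A sketch with illustrative figures (both numbers are assumptions for the example):

```python
# Aggregate DWDM capacity = wavelength channels x rate per channel.
wavelengths = 80              # assumed number of wavelength channels
rate_per_channel_gbps = 10    # assumed data rate per wavelength, Gbit/s

total_gbps = wavelengths * rate_per_channel_gbps
print(f"aggregate capacity: {total_gbps} Gbit/s "
      f"({total_gbps / 1000} Tbit/s) on a single fiber")
```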

Switches are needed to route the digital flow to its ultimate destination. These enormous bit conduits will flounder if the light streams are routed using conventional electronic switches, which require a multi-terabit signal to be converted into hundreds of lower-speed electronic signals. Finally, the switched signals would have to be reconverted to photons and re-aggregated into light channels that are then sent out through a designated output fiber.

The cost and complexity of electronic switching prompted the search for a means of redirecting either individual wavelengths or the entire light signal in a fiber from one pathway to another without the opto-electronic conversion.


Optical Computing Technology

This Electronics Engineering Seminar Topic deals with the following:

With the growth of computing technology, the need for high-performance computers (HPC) has significantly increased. Optics has been used in computing for a number of years, but the main emphasis has been, and continues to be, on linking portions of computers, on communications, or more intrinsically on devices that have some optical application or component (optical pattern recognition, etc.).

Optical computing was a hot research area in the 1980s, but the work tapered off due to materials limitations that prevented optochips from becoming small enough and cheap enough to move beyond laboratory curiosities. Now optical computers are back, with advances in self-assembled conducting organic polymers that promise super-tiny all-optical chips.

Optical computing technology is, in general, developing in two directions. One approach is to build computers that have the same architecture as present-day computers but use optics, that is, electro-optical hybrids. Another approach is to create a completely new kind of computer that can perform all functional operations in the optical domain. In recent years, a number of devices that can ultimately lead us to real optical computers have already been manufactured. These include optical logic gates, optical switches, optical interconnections, and optical memory.

Current trends in optical computing emphasize communications, for example the use of free-space optical interconnects as a potential solution to the bottlenecks experienced in electronic architectures. Optical technology is one of the most promising, and may eventually lead to new computing applications as a consequence of faster processing speed, as well as better connectivity and higher bandwidth.

NEED FOR OPTICAL COMPUTING
The pressing need for optical technology stems from the fact that today's computers are limited by the time response of electronic circuits. A solid transmission medium limits both the speed and volume of signals, and builds up heat that damages components.

One of the theoretical limits on how fast a computer can function is given by Einstein's principle that a signal cannot propagate faster than the speed of light. So to make computers faster, their components must be smaller, thereby decreasing the distance between them. This has resulted in the development of very large scale integration (VLSI) technology, with smaller device dimensions and greater complexity. The smallest VLSI dimensions nowadays are about 0.08 µm. Despite the incredible progress in the development and refinement of the basic technologies over the past decade, there is growing concern that these technologies may not be capable of solving the computing problems of even the current millennium. Increases in computer speed have been achieved by miniaturizing electronic components to a very small, micron-size scale, but they are limited not only by the speed of electrons in matter but also by the increasing density of the interconnections needed to link the electronic gates on microchips.
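
The speed-of-light limit above is easy to quantify: even across a single chip, a signal needs a finite transit time, which caps the rate of any scheme that must cross the chip once per cycle. A sketch, assuming a 2 cm die (the die size is an illustrative assumption):

```python
# Transit-time limit for a signal crossing a chip at the speed of light.
c = 3.0e8            # speed of light in vacuum, m/s
chip_size_m = 0.02   # assumed 2 cm die

transit_time_s = chip_size_m / c
max_rate_hz = 1 / transit_time_s   # upper bound on a chip-crossing clock
print(f"transit time: {transit_time_s * 1e12:.0f} ps "
      f"-> at most {max_rate_hz / 1e9:.0f} GHz for a full chip crossing")
```

Signals in real wires travel well below c, so the practical bound is even tighter, which is part of the motivation for shrinking distances between components.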

The optical computer comes as a solution to the miniaturization problem. Optical data processing can perform several operations in parallel, much faster and more easily than electronics. This parallelism yields staggering computational power. For example, a calculation that would take a conventional electronic computer more than 11 years to complete could be performed by an optical computer in a single hour. In essence, in an optical computer, electrons are replaced by photons, the subatomic bits of electromagnetic radiation that make up light.
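
The speedup implied by that example is worth spelling out. Taking the figures at face value (11 years of electronic computation done in one optical hour):

```python
# Speedup implied by the 11-years-vs-1-hour example above.
hours_per_year = 365 * 24
electronic_hours = 11 * hours_per_year   # time on the electronic computer
optical_hours = 1                        # time on the optical computer

speedup = electronic_hours / optical_hours
print(f"implied speedup: about {speedup:,.0f}x")
```

So the claim corresponds to roughly a factor of one hundred thousand, which is the order of improvement massive optical parallelism would have to deliver.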

SOME KEY OPTICAL COMPONENTS FOR COMPUTING
The major breakthroughs on optical computing have been centered on the development of micro-optic devices for data input.

1. VCSEL (VERTICAL CAVITY SURFACE EMITTING LASER)
VCSEL (pronounced 'vixel') is a semiconductor vertical-cavity surface-emitting laser diode that emits light in a cylindrical beam vertically from the surface of a fabricated wafer, and offers significant advantages compared to the edge-emitting lasers currently used in the majority of fiber-optic communications devices. The principle involved in the operation of a VCSEL is very similar to that of regular lasers.

There are two special semiconductor materials sandwiching an active layer where all the action takes place. But rather than reflective ends, a VCSEL has several layers of partially reflective mirrors above and below the active layer. Layers of semiconductors with differing compositions create these mirrors, and each mirror reflects a narrow range of wavelengths back into the cavity in order to cause light emission at just one wavelength.
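
These layered mirrors are distributed Bragg reflectors: each layer is made a quarter of the emission wavelength thick inside the material, so reflections from successive interfaces add in phase. A minimal sketch of that quarter-wave condition, with assumed values (the wavelength and indices are illustrative, not from the text):

```python
# Quarter-wave condition for a Bragg mirror layer: n * t = wavelength / 4.
wavelength_nm = 850        # assumed VCSEL emission wavelength
n_high, n_low = 3.5, 3.0   # assumed indices of the alternating layers

t_high = wavelength_nm / (4 * n_high)   # thickness of the high-index layer
t_low = wavelength_nm / (4 * n_low)     # thickness of the low-index layer
print(f"layer thicknesses: {t_high:.1f} nm and {t_low:.1f} nm")
```

Stacking tens of such pairs pushes the mirror reflectivity close to 100% over a narrow wavelength band, which is exactly the single-wavelength selectivity described above.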
