Monday 8 August 2011

How is Infinera Different?


Carriers have told us that they want the benefits of digital networks for their optical layer. They want the service flexibility that comes from digital add/drop of any optical service at any network node. They want to simplify network engineering and operations through regular digital cleanup of analog optical impairments, digital grooming and multiplexing, and advanced digital coding. And they want these benefits in a network that costs less than the traditional "analog optical" alternative. In short, these carriers have created an alternative vision for the optical layer that we call the Digital Optical Network.
Digital Optical Networking is the antithesis of the way that most other optical networking vendors have chosen to evolve their network architectures. These vendors have chosen the "all-optical route" with all the drawbacks that this entails.
Infinera took a different approach. We began with the assumption that "digital is good". After all, the vast majority of data being carried over optical networks today is created in digital form. In addition, anything to do with administering or fault-finding the services that carry this data is also digital, and any attempts to mimic these OAM or performance management capabilities in the analog domain tend to be somewhat ineffective.
We decided we had to find a way to make "digital" work. To make it scale, and to make it economical to deploy and operate. The key was to allow economical Optical-Electrical-Optical (OEO) conversions.
OEO conversions are uneconomical in conventional networks because they involve a number of discrete optical components for each wavelength, multiplied by the number of wavelengths that DWDM allows us to transmit in a single fiber.
For example, to create a signal for an individual wavelength we typically need:
  • A source of light - such as a laser
  • A way to put a digital signal onto that light - known as a modulator
  • A means to ensure that the laser stays on the correct wavelength
  • Components that control the output power of the laser
A number of optical component companies have developed small-scale PICs that combine several of these components in the same chip package. Sometimes these PICs are built on the same semiconductor substrate, while sometimes they are simply combined in the same package (see monolithic photonic integration vs. co-packaging).
Infinera has pioneered the technique of large-scale photonic integration. Our PICs integrate the components needed for ten different wavelengths onto a single chip. At the moment that means over fifty discrete optical components on a single transmit PIC, and we'll continue to drive integration of additional components and functions.
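To make the scale of that integration concrete, here is a rough, illustrative tally of discrete parts versus PICs for a fully loaded DWDM fiber. The per-wavelength component count and the 40-channel system are assumptions for the sake of the arithmetic, not Infinera product figures.

```python
# Back-of-envelope component count for a DWDM transmit line card:
# discrete optics per wavelength vs. a 10-channel photonic integrated circuit (PIC).
# The per-wavelength component list follows the article (laser, modulator,
# wavelength locker, power control); the exact counts are illustrative only.

COMPONENTS_PER_WAVELENGTH = 5   # laser, modulator, locker, power control, coupler (assumed)
WAVELENGTHS_PER_FIBER = 40      # a typical DWDM channel count (assumed)
CHANNELS_PER_PIC = 10           # per the article: ten wavelengths per transmit PIC

discrete_parts = COMPONENTS_PER_WAVELENGTH * WAVELENGTHS_PER_FIBER
pics_needed = WAVELENGTHS_PER_FIBER // CHANNELS_PER_PIC

print(f"Discrete approach: {discrete_parts} individually packaged optical parts")
print(f"Large-scale PIC approach: {pics_needed} transmit PICs "
      f"(each integrating ~{COMPONENTS_PER_WAVELENGTH * CHANNELS_PER_PIC} functions on one chip)")
```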

Unique Space Image of Alabama Tornado Tracks

May 16, 2011: NASA has released a unique satellite image tracing the damage of a monster EF-4 tornado that tore through Tuscaloosa, Alabama, on April 27th. It combines visible and infrared data to reveal damage unseen in conventional photographs.
"This is the first time we've used the ASTER instrument to track the wake of a super-outbreak of tornadoes," says NASA meteorologist Gary Jedlovec of the Marshall Space Flight Center in Huntsville, AL.

An ASTER visible-IR image of tornado damage near Tuscaloosa, AL. [larger image]
In the picture, captured just days after the storm, pink represents vegetation and aqua is the absence of vegetation. The tornado ripped up everything in its path, scouring the Earth's surface with its terrible force. The "tearing up" of vegetation makes the tornado's track stand out as a wide swath of aqua.
"This image and others like it are helping us study the torn landscape to determine just how huge and powerful these twisters were and to assess the damage they inflicted," says Jedlovec.
ASTER, short for Advanced Spaceborne Thermal Emission and Reflection Radiometer, orbits Earth onboard NASA's Terra spacecraft. Its data products include digital elevation maps from stereo images; surface temperatures; vegetation maps; cloud and sea ice data; and more. Last spring the instrument helped track the movement of the oil spill in the Gulf of Mexico.

Ground survey teams have a lot to contend with. [Youtube video]
To detect the scars left by the twisters, ASTER senses the visible and infrared energy reflected from the planet's surface. Destruction such as crushed houses, torn and snapped trees, and uprooted crops is evident in the multi-wavelength images.
"A demolished house, debris and soil scattered on vegetated surfaces, and damaged trees and crops all change the pattern of reflected radiation measured by the satellite," explains Jedlovec. "We can analyze these patterns to help storm survey teams evaluate the damage."
Ground teams conducting field surveys of tornado damage must try to pinpoint where the twisters touched down, how long they stayed on the ground, and the force of their winds. But doing this from ground level can be tricky. Some places are nearly impossible to reach by foot or car. Also, in remote areas, damage often goes unreported, so survey teams don't know to look there.
This is where satellites can help.
"To get an accurate picture survey teams need to look everywhere that sustained damage – even unreported areas. Satellite sensors detect damage in rural areas, wilderness areas, and other unpopulated areas. Only with that knowledge can surveyors determine the true track of a tornado."
Otherwise, says Jedlovec, a twister could have flattened a single dwelling in a remote location, killing everyone inside, and no one would know.

Another sample of ASTER tornado data showing three nearly-parallel tracks of destruction. [large image] [annotated composite image]
Less critical but still important are homeowners' insurance issues. To evaluate claims submitted by storm victims, insurance companies rely on National Weather Service storm reports based on the field surveys.
"Let's say you live in a remote area," says Jedlovec. "If there's no record of a storm passing over your area, you could be out of luck."
Jedlovec and colleagues are working now to produce satellite images of other areas ravaged by the historic outbreak of tornadoes.
"We want to help the storm victims any way we can."

Super Storm on Saturn

May 19, 2011: NASA's Cassini spacecraft and a European Southern Observatory ground-based telescope are tracking the growth of a giant early-spring storm in Saturn's northern hemisphere so powerful that it stretches around the entire planet. The rare storm has been wreaking havoc for months and shooting plumes of gas high into the planet's atmosphere.

This false-color infrared image shows clouds of large ammonia ice particles dredged up by the powerful storm. Credit: Cassini. [more]
"Nothing on Earth comes close to this powerful storm," says Leigh Fletcher, a Cassini team scientist at the University of Oxford in the United Kingdom, and lead author of a study that appeared in this week's edition of Science Magazine. "A storm like this is rare. This is only the sixth one to be recorded since 1876, and the last was way back in 1990."
Cassini's radio and plasma wave science instrument first detected the large disturbance in December 2010, and amateur astronomers have been watching it ever since through backyard telescopes. As it rapidly expanded, the storm's core developed into a giant, powerful thunderstorm, producing a 3,000-mile-wide (5,000-kilometer-wide) dark vortex possibly similar to Jupiter's Great Red Spot.
This is the first major storm on Saturn observed by an orbiting spacecraft and studied at thermal infrared wavelengths. Infrared observations are key because heat tells researchers a great deal about conditions inside the storm, including temperatures, winds, and atmospheric composition. Temperature data were provided by the Very Large Telescope (VLT) on Cerro Paranal in Chile and Cassini's composite infrared spectrometer (CIRS), operated by NASA's Goddard Space Flight Center in Greenbelt, Md.
"Our new observations show that the storm had a major effect on the atmosphere, transporting energy and material over great distances -- creating meandering jet streams and forming giant vortices -- and disrupting Saturn's seasonal [weather patterns]," said Glenn Orton, a paper co-author, based at NASA's Jet Propulsion Laboratory in Pasadena, Calif.
The violence of the storm -- the strongest disturbances ever detected in Saturn's stratosphere -- took researchers by surprise. What started as an ordinary disturbance deep in Saturn's atmosphere punched through the planet's serene cloud cover to roil the high layer known as the stratosphere.

Thermal infrared images of Saturn from the Very Large Telescope Imager and Spectrometer for the mid-Infrared (VISIR) instrument on the European Southern Observatory's Very Large Telescope, on Cerro Paranal, Chile, appear at center and on the right. An amateur visible-light image from Trevor Barry, of Broken Hill, Australia, appears on the left. The images were obtained on Jan. 19, 2011. [more]
"On Earth, the lower stratosphere is where commercial airplanes generally fly to avoid storms which can cause turbulence," says Brigette Hesman, a scientist at the University of Maryland in College Park who works on the CIRS team at Goddard and is the second author on the paper. "If you were flying in an airplane on Saturn, this storm would reach so high up, it would probably be impossible to avoid it."
A separate analysis using Cassini's visual and infrared mapping spectrometer, led by Kevin Baines of JPL, confirmed the storm is very violent, dredging up deep material in volumes several times larger than previous storms. Other Cassini scientists are studying the evolving storm and, they say, a more extensive picture will emerge soon.

NASA's Galaxy Evolution Explorer Helps Confirm Nature of Dark Energy

PASADENA, Calif. -- A five-year survey of 200,000 galaxies, stretching back seven billion years in cosmic time, has led to one of the best independent confirmations that dark energy is driving our universe apart at accelerating speeds. The survey used data from NASA's space-based Galaxy Evolution Explorer and the Anglo-Australian Telescope on Siding Spring Mountain in Australia.

The findings offer new support for the favored theory of how dark energy works -- as a constant force, uniformly affecting the universe and propelling its runaway expansion. They contradict an alternate theory, where gravity, not dark energy, is the force pushing space apart. According to this alternate theory, with which the new survey results are not consistent, Albert Einstein's concept of gravity is wrong, and gravity becomes repulsive instead of attractive when acting at great distances.

"The action of dark energy is as if you threw a ball up in the air, and it kept speeding upward into the sky faster and faster," said Chris Blake of the Swinburne University of Technology in Melbourne, Australia. Blake is lead author of two papers describing the results that appeared in recent issues of the Monthly Notices of the Royal Astronomical Society. "The results tell us that dark energy is a cosmological constant, as Einstein proposed. If gravity were the culprit, then we wouldn't be seeing these constant effects of dark energy throughout time."

Dark energy is thought to dominate our universe, making up about 74 percent of it. Dark matter, a slightly less mysterious substance, accounts for 22 percent. So-called normal matter, anything with atoms, or the stuff that makes up living creatures, planets and stars, is only approximately four percent of the cosmos.

The idea of dark energy was proposed during the previous decade, based on studies of distant exploding stars called supernovae. These supernovae explode with a consistent, predictable intrinsic brightness, making them so-called "standard candles" whose distances from Earth can be calculated. Observations revealed dark energy was flinging the objects out at accelerating speeds.
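The standard-candle logic comes down to the distance-modulus relation: compare how bright the object looks with how bright it intrinsically is. The sketch below uses a canonical Type Ia peak brightness and a made-up apparent magnitude, and ignores the cosmological corrections a real analysis would apply.

```python
# "Standard candle" logic in one line: if an object's intrinsic (absolute)
# magnitude M is known and its apparent magnitude m is measured, its distance
# follows from the distance modulus  m - M = 5*log10(d_parsec) - 5.
# Values below are illustrative, not taken from the survey described here,
# and redshift/cosmology corrections are ignored.

def distance_parsec(apparent_mag, absolute_mag):
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

M_TYPE_IA = -19.3        # canonical peak absolute magnitude of a Type Ia supernova
m_observed = 24.0        # hypothetical measured apparent magnitude

d_pc = distance_parsec(m_observed, M_TYPE_IA)
print(f"Implied distance: {d_pc:.3e} parsecs (~{d_pc * 3.26 / 1e9:.1f} billion light-years)")
```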

Dark energy is in a tug-of-war contest with gravity. In the early universe, gravity took the lead, dominating dark energy. At about 8 billion years after the Big Bang, as space expanded and matter became diluted, gravitational attractions weakened and dark energy gained the upper hand. Billions of years from now, dark energy will be even more dominant. Astronomers predict our universe will be a cosmic wasteland, with galaxies spread apart so far that any intelligent beings living inside them wouldn't be able to see other galaxies.

The new survey provides two separate methods for independently checking the supernovae results. This is the first time astronomers performed these checks across the whole cosmic timespan dominated by dark energy. The team began by assembling the largest three-dimensional map of galaxies in the distant universe, spotted by the Galaxy Evolution Explorer. The ultraviolet-sensing telescope has scanned about three-quarters of the sky, observing hundreds of millions of galaxies.

"The Galaxy Evolution Explorer helped identify bright, young galaxies, which are ideal for this type of study," said Christopher Martin, principal investigator for the mission at the California Institute of Technology in Pasadena. "It provided the scaffolding for this enormous 3-D map."

The astronomers acquired detailed information about the light for each galaxy using the Anglo-Australian Telescope and studied the pattern of distance between them. Sound waves from the very early universe left imprints in the patterns of galaxies, causing pairs of galaxies to be separated by approximately 500 million light-years.

This "standard ruler" was used to determine the distance from the galaxy pairs to Earth -- the closer a galaxy pair is to us, the farther apart the galaxies will appear from each other on the sky. As with the supernovae studies, this distance data were combined with information about the speeds at which the pairs are moving away from us, revealing, yet again, the fabric of space is stretching apart faster and faster.

The team also used the galaxy map to study how clusters of galaxies grow over time like cities, eventually containing many thousands of galaxies. The clusters attract new galaxies through gravity, but dark energy tugs the clusters apart, slowing the process down and allowing scientists to measure dark energy's repulsive force.

"Observations by astronomers over the last 15 years have produced one of the most startling discoveries in physical science; the expansion of the universe, triggered by the Big Bang, is speeding up," said Jon Morse, astrophysics division director at NASA Headquarters in Washington. "Using entirely independent methods, data from the Galaxy Evolution Explorer have helped increase our confidence in the existence of dark energy."

Caltech leads the Galaxy Evolution Explorer mission and is responsible for science operations and data analysis. NASA's Jet Propulsion Laboratory in Pasadena manages the mission and built the science instrument. The mission was developed under NASA's Explorers Program, managed by the Goddard Space Flight Center, Greenbelt, Md. Researchers sponsored by Yonsei University in South Korea and the Centre National d'Etudes Spatiales (CNES) in France collaborated on this mission. Caltech manages JPL for NASA.

Gliese 581d: A Habitable Exoplanet?

Source: CNRS press release



Alien Life
Posted: 05/20/11
Summary: A new computer model that simulates possible exoplanet climates indicates that the planet Gliese 581d might be warm enough to have oceans, clouds and rainfall. Gliese 581d is likely to be a rocky planet with a mass at least seven times that of Earth.


Schematic of the global climate model used to study Gliese 581d. Red/blue shading indicates hot/cold surface temperatures, while the arrows show wind velocities at 2 km height in the atmosphere. © LMD/CNRS

Are there other planets inhabited like the Earth, or at least habitable? The discovery of the first habitable planet has become a quest for many astrophysicists, who look for rocky planets in the "habitable zone" around stars, the range of distances in which planets are neither too cold nor too hot for life to flourish.

In this quest, the red dwarf star Gliese 581 has already received a huge amount of attention. In 2007, scientists reported the detection of two planets orbiting not far from the inner and outer edge of its habitable zone (Gliese 581d and Gliese 581c). While the more distant planet, Gliese 581d, was initially judged to be too cold for life, the closer-in planet, Gliese 581c, was thought to be potentially habitable by its discoverers. However, later analysis by atmospheric experts showed that if it had liquid oceans like Earth, they would rapidly evaporate in a 'runaway greenhouse' effect similar to that which gave Venus the hot, inhospitable climate it has today.

A new possibility emerged late in 2010, when a team of observers led by Steven Vogt at the University of California, Santa Cruz, announced that they had discovered a new planet, which they dubbed Gliese 581g, or 'Zarmina's World'. This planet, they claimed, had a mass similar to that of Earth and was close to the centre of the habitable zone. For several months, the discovery of the first potential Earth twin outside the Solar System seemed to have been achieved. Unfortunately, later analysis by independent teams has raised serious doubts on this extremely difficult detection. Many now believe that Gliese 581g may not exist at all. Instead, it may simply be a result of noise in the ultra-fine measurements of stellar 'wobble' needed to detect exoplanets in this system.


Surface temperature maps for simulations of Gliese 581d assuming an atmosphere of 20 bars of CO2 and varying rotation rates. It is currently unknown whether the planet rotates slowly or has permanent day and night sides. In all cases, the temperatures allow for the presence of liquid water on the surface. © LMD/CNRS

It is Gliese 581g's big brother – the larger and more distant Gliese 581d – which has now been shown to be a potentially habitable exoplanet by Robin Wordsworth, François Forget and co-workers from the Laboratoire de Météorologie Dynamique (CNRS/UPMC/ENS/Ecole Polytechnique) at the Institut Pierre Simon Laplace in Paris, in collaboration with a researcher from the Laboratoire d'astrophysique de Bordeaux (CNRS/Université Bordeaux 1). Although it is likely to be a rocky planet, it has a mass at least seven times that of Earth and is estimated to be about twice its size.

At first glance, Gliese 581d is a pretty poor candidate in the hunt for life: it receives less than a third of the stellar energy Earth does and may be tidally locked, with a permanent day and night side. After its discovery, it was generally believed that any atmosphere thick enough to keep the planet warm would become cold enough on the night side to freeze out entirely, ruining any prospects for a habitable climate.

To test whether this intuition was correct, Wordsworth and colleagues developed a new kind of computer model capable of accurately simulating possible exoplanet climates. The model simulates a planet's atmosphere and surface in three dimensions, rather like those used to study climate change on Earth. However, it is based on more fundamental physical principles, allowing the simulation of a much wider range of conditions than would otherwise be possible, including any atmospheric cocktail of gases, clouds and aerosols.

To their surprise, they found that with a dense carbon dioxide atmosphere - a likely scenario on such a large planet - the climate of Gliese 581d is not only stable against collapse, but warm enough to have oceans, clouds and rainfall. One of the key factors in their results was Rayleigh scattering, the phenomenon that makes the sky blue on Earth.

In the Solar System, Rayleigh scattering limits the amount of sunlight a thick atmosphere can absorb, because a large portion of the scattered blue light is immediately reflected back to space. However, as the starlight from Gliese 581 is red, it is almost unaffected. This means that it can penetrate much deeper into the atmosphere, where it heats the planet effectively due to the greenhouse effect of the CO2 atmosphere, combined with that of the carbon dioxide ice clouds predicted to form at high altitudes. Furthermore, the 3D circulation simulations showed that the daylight heating was efficiently redistributed across the planet by the atmosphere, preventing atmospheric collapse on the night side or at the poles.
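The wavelength dependence doing the work here is the familiar 1/λ⁴ law of Rayleigh scattering. The short sketch below compares round-number wavelengths for blue sunlight and the red light of an M dwarf; the values are illustrative and not taken from the study itself.

```python
# Rayleigh scattering strength falls off as 1/wavelength^4, which is why the
# red light of Gliese 581 penetrates a thick CO2 atmosphere far more easily
# than the Sun's bluer light.  The wavelengths below are round illustrative
# values, not parameters from the study.

def relative_rayleigh(wavelength_nm, reference_nm=450.0):
    """Scattering strength relative to blue light at `reference_nm`."""
    return (reference_nm / wavelength_nm) ** 4

for label, wl in [("blue (Sun-like peak)", 450), ("green", 550), ("red dwarf peak", 900)]:
    print(f"{label:22s} {wl:4d} nm -> {relative_rayleigh(wl):.2f}x the scattering of blue light")
```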


This artist's concept illustrates a young, red dwarf star surrounded by three planets. Such stars are dimmer and smaller than yellow stars like our sun. Credit: NASA/JPL-Caltech

Scientists are particularly excited by the fact that, at 20 light years from Earth, Gliese 581d is one of our closest galactic neighbours. For now, this is of limited use for budding interstellar colonists – the furthest-travelled man-made spacecraft, Voyager 1, would still take over 300,000 years to arrive there. However, it does mean that in the future telescopes will be able to detect the planet's atmosphere directly.

While Gliese 581d may be habitable, there are other possibilities: it could have kept some atmospheric hydrogen, like Uranus and Neptune, or the fierce wind from its star during its infancy could even have torn its atmosphere away entirely. To distinguish between these different scenarios, Wordsworth and co-workers came up with several simple tests that observers will be able to perform in the future with a sufficiently powerful telescope.

If Gliese 581d does turn out to be habitable, it would still be a pretty strange place to visit – the denser air and thick clouds would keep the surface in a perpetual murky red twilight, and its large mass means that surface gravity would be around double that on Earth. But the diversity of planetary climates in the galaxy is likely to be far wider than the few examples we are used to from the Solar System. In the long run, the most important implication of these results may be the idea that life-supporting planets do not in fact need to be particularly like the Earth at all.

Local Scientists Produce First Aerogel in Space

First Space-Produced Aerogel Made on Space Sciences Laboratory Rocket Flight
June 19, 1996: Aerogel is the lightest solid known to mankind, with only three times the density of air. A block the size of a human weighs less than a pound. Because of its amazing insulating properties, an inch-thick slab can safely shield the human hand from the heat of a blowtorch. A sugar-cube-sized portion of the material has the internal surface area of a basketball court. The only known transparent insulator, aerogel is a supercritically dried gel sometimes referred to as "frozen smoke".
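The "block the size of a human weighs less than a pound" claim is easy to sanity-check from the stated density; the human body volume used below is a rough assumption.

```python
# Quick sanity check on the "block the size of a human weighs less than a pound"
# claim, taking "three times the density of air" at face value.

AIR_DENSITY_KG_M3 = 1.2        # near sea level
AEROGEL_DENSITY = 3 * AIR_DENSITY_KG_M3
HUMAN_VOLUME_M3 = 0.07         # ~70 kg person at roughly the density of water (assumed)

mass_kg = AEROGEL_DENSITY * HUMAN_VOLUME_M3
print(f"Aerogel block of human volume: {mass_kg:.2f} kg ({mass_kg * 2.205:.2f} lb)")
```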

On April 3, 1996, the first space-produced samples of aerogel were made by NASA on a flight of a Starfire rocket. The production of such materials in space is interesting because of the strong influence of gravity on how a gel forms. Comparisons of gels manufactured in space and on the ground have shown large differences, and production in space can provide a higher-quality product with a more uniform structure.

Chemical Engineering Progress (June 1995, p. 14) noted that "the holy grail of aerogel applications has been developing invisible insulation for use between window panes." The production of insulating and transparent windows through aerogel manufacturing in space can develop into a substantial market for residential and commercial applications. The excellent thermal properties and transparent nature of silica aerogel make it an obvious choice for super-insulating windows, skylights, solar collector covers, and specialty windows.

Space Sciences Laboratory Hosts Bill Nye, the Science Guy

October 16, 1996

This week, the Marshall Space Flight Center and the Space Sciences Laboratory are hosting Bill Nye, The Science Guy, as his crew from Seattle films segments for an upcoming episode of the PBS television series. Taping in SSL will occur on Wednesday, October 16 and Thursday, October 17.
Areas of science from the laboratory that will be featured on an upcoming episode of Bill Nye include Aerogel, "cool telescopes" such as BATSE and the AXAF Calibration Facility, the SSL Solar Vector Magnetograph, and the 105-meter drop tube for microgravity experimentation.
The program will also feature a dive in the Marshall Neutral Buoyancy Simulator, the large tank in which the Hubble Space Telescope repair missions are rehearsed by astronauts, as well as a visit to the Space Station Assembly facility.

First Space-Produced Aerogel Made on Space Sciences Laboratory Rocket Flight

October 8, 1996: Results are now beginning to become available from the April 3, 1996 rocket flight to produce the first space-made Aerogel. As described in the June 19, 1996 Aerogel headline, Aerogel is the lightest solid known to mankind, with only three times the density of air. Because of its appearance, Aerogel is sometimes referred to as "frozen smoke". Aerogel produced on the ground typically displays a blue haze or a slight cloudiness in its appearance. This feature is believed to be caused by impurities and variations in the size of small pores in the Aerogel material. Scientists are trying to eliminate this haze so that the insulator might be used in window panes and other applications where transparency is important.

The Aerogel made aboard the flight of the Starfire rocket in April indicates that gravity effects in samples of the material made on the ground may be responsible for the adverse pore sizes and thus account for the lack of transparency. Both the diameter and volume of the pores in the space-made Aerogel appear to be between 4 and 5 times better than in otherwise identically formulated ground samples. Because Aerogels are the only known transparent insulator, with typical heat conduction properties that are five times better than the next best alternative, a number of novel applications are foreseen for high-performance Aerogels.

Fall Science Meeting Highlights Tethered Satellite Results

October 15, 1996

Scientists attending the Fall 1996 meeting of the American Geophysical Union will be treated to three special sessions covering scientific results obtained from the reflight of the Tethered Satellite System (TSS-1R). The conference will take place on December 18 and 19 in San Francisco, California.
The TSS-1R science mission was conducted on space shuttle flight STS-75 at the end of February 1996. During the flight, the Tethered Satellite was deployed to a distance of 12.3 miles (19.7 km) and science data was collected aboard the satellite, the space-shuttle orbiter, and from a network of ground stations monitoring the earth's ionosphere.
Five hours of tethered operation yielded a rich scientific data set. These data include tether current and voltage measurements, plasma particle and wave measurements, and visual observations for a variety of pre-planned science objectives. During the flight the conducting tether connecting the Orbiter to the satellite was severed, and large currents were observed to be flowing between the satellite and the Orbiter during the break event.
Further scientific data were obtained from the instruments on the satellite after the break, when the science and NASA support teams were able to capture telemetry from the satellite during the overflight of NASA tracking stations.
One important finding from TSS-1R has been the high level of current collected by the satellite at relatively low voltage throughout the deployed phase of the mission. Surprisingly large currents were also observed during the tether break and gas releases, indicating important new physics at play. The three Tethered Satellite sessions at the AGU meeting will cover the results of data analysis from the mission, important supporting physics insights from laboratory experiments, theoretical and numerical modeling of current collection during the mission, and the conclusions of recent studies on the future use of tethers for science in space.

Unique telescope to open the X(-ray) Files

Artist's concept of AXAF in orbit. The nested mirrors are at center behind the dotted circles.
The finest set of mirrors ever built for X-ray astronomy has arrived at NASA's Marshall Space Flight Center for several weeks of calibration before being assembled into a telescope for launch in late 1998.

The High-Resolution Mirror Assembly (HRMA), as it is known, will be the heart of the Advanced X-ray Astrophysics Facility (AXAF), which is managed by Marshall Space Flight Center. HRMA was built by Eastman Kodak and Hughes Danbury Optical Systems. In 1997-98, it will be integrated by TRW Defense and Space Systems into the AXAF spacecraft. AXAF is designed to give astronomers as clear a view of the universe in X-rays as they now have in visible light through the Hubble Space Telescope.

Indeed, one of the Hubble's recent discoveries may move near the top of the list of things to do for AXAF. Hubble recently discovered that some quasars reside within quite ordinary galaxies. Quasars (quasi-stellar objects) are unusually energetic objects which emit up to 1,000 times as much energy as an entire galaxy, but from a volume about the size of our solar system.

More clues to what is happening inside quasars may lie in the X-rays emitted by the most violent forces in the universe.

Before AXAF can embark on that mission, though, its mirrors must be measured with great precision so astronomers will know the exact shape and quality of the mirrors. Then, once the telescope is in space, they will be able to tell when they discover unusual objects, and be able to measure exactly how unusual.

These measurements will be done in Marshall's X-ray Calibration Facility, the world's largest, over the next few weeks.

AXAF will use four sets of mirrors, each set nested inside the other, to focus X-rays by grazing incidence reflection, the same principle that makes sunlight glare off clear windshields. AXAF's smallest mirror, 63 cm (24.8 in.) in diameter, is larger than the biggest mirror, 58 cm (22.8 in.), flown on the Einstein observatory (HEAO-2) in 1978-81.

Mapping the details of the mirror will start with an X-ray source pretty much like what a dentist uses to check your teeth. But that's next week's story.

MSFC Earth-Sun Studies Featured at AGU

AGU
December 13, 1996
Fountains of electrified gases spewing from the Earth into space and pictures of the aurora during the day will be highlighted at the American Geophysical Union (AGU) annual winter conference in San Francisco Dec. 15-19.
AGU is one of the largest scientific bodies in the world and takes in everything from earthquakes to solar flares - including work by scientists at Marshall Space Flight Center's Space Sciences Laboratory (SSL) to understand what drives the aurora borealis and causes space storms that can black out cities.
At three sessions during the AGU meeting, Marshall scientists will present their results in several papers, written with colleagues from other institutions, from the Thermal Ion Dynamics Experiment (TIDE) and the Ultraviolet Imager (UVI), two of several instruments aboard the Polar spacecraft launched in 1996.
TIDE recently confirmed that plasmas in the tail of the magnetosphere come from Earth's outer atmosphere being warmed by a flow of materials from space. The magnetosphere is formed by the Earth's magnetic field and buffers the planet from the constant wind of gases streaming from the sun.
Press briefings scheduled for the AGU Fall Meeting include:
Imaging Space Plasmas - Polar UVI and the Inner Magnetosphere Imager on which MSFC will have an important camera. Tuesday, Dec. 17, 12:45 p.m.
Sun-Earth Connections - the new era of coordinated solar-terrestrial research by scientists using Polar and other craft. Time TBD.
"There's a raging controversy over whether the magnetosphere stores energy to any degree, or just dissipates what the solar wind throws at it," said Dr. Tom Moore, director of the space plasma physics branch at SSL and principal investigator for TIDE.
Pictures from the UVI will help scientists decide whether the magnetosphere is driven directly by the solar wind, or it stores then discharges energy like a thunder cloud building a lightning charge.
"Northern winter traditionally has been the busy season for plasma scientists," said Dr. James Spann, a UVI co-investigator at SSL, "because that's when the aurora borealis is almost all in the night sky and can be viewed in visible as well as ultraviolet light."
UVI, included in three sessions at AGU, extends the busy season by letting scientists see what happens during the day. Doing this has been a challenge because the atmosphere's ozone layer reflects solar ultraviolet light that blinds most sensors. Previous instruments let scientists see parts of the daytime aurora, or the entire nightside aurora. UVI aboard Polar is the first to show both the dayside and nightside auroras in full. It does this with narrow bandpass filters - filters that admit narrowly defined colors - that match the light emitted by the auroras.
UVI lets scientists measure, with precision, the energies flowing into the auroral oval. In addition to striking pictures, UVI reveals the footprint of the Earth's magnetic field lines that may stretch into deep space to several times the distance from the Earth to the Moon.

Free-Floating Planets May Be More Common Than Stars

May 18, 2011: Astronomers have discovered a new class of Jupiter-sized planets floating alone in the dark of space, away from the light of a star. The team believes these lone worlds are probably outcasts from developing planetary systems and, moreover, they could be twice as numerous as the stars themselves.
"Although free-floating planets have been predicted, they finally have been detected," said Mario Perez, exoplanet program scientist at NASA Headquarters in Washington. "[This has] major implications for models of planetary formation and evolution."
The discovery is based on a joint Japan-New Zealand survey that scanned the center of the Milky Way galaxy during 2006 and 2007, revealing evidence for up to 10 free-floating planets roughly the mass of Jupiter. The isolated orbs, also known as orphan planets, are difficult to spot, and had gone undetected until now. The planets are located at average distances of approximately 10,000 to 20,000 light-years from Earth.

This artist's concept illustrates a Jupiter-like planet alone in the dark of space, floating freely without a parent star. [larger image] [video]
This could be just the tip of the iceberg. The team estimates there are about twice as many free-floating Jupiter-mass planets as stars. In addition, these worlds are thought to be at least as common as planets that orbit stars. This adds up to hundreds of billions of lone planets in our Milky Way galaxy alone.
"Our survey is like a population census," said David Bennett, a NASA and National Science Foundation-funded co-author of the study from the University of Notre Dame in South Bend, Ind. "We sampled a portion of the galaxy, and based on these data, can estimate overall numbers in the galaxy."
The study, led by Takahiro Sumi from Osaka University in Japan, appears in the May 19 issue of the journal Nature. The survey is not sensitive to planets smaller than Jupiter and Saturn, but theories suggest lower-mass planets like Earth should be ejected from their stars more often. As a result, they are thought to be more common than free-floating Jupiters.
Previous observations spotted a handful of free-floating planet-like objects within star-forming clusters, with masses three times that of Jupiter. But scientists suspect the gaseous bodies form more like stars than planets. These small, dim orbs, called brown dwarfs, grow from collapsing balls of gas and dust, but lack the mass to ignite their nuclear fuel and shine with starlight. It is thought the smallest brown dwarfs are approximately the size of large planets.

A video from JPL describes the microlensing technique astronomers used to detect the orphan planets.
On the other hand, it is likely that some planets are ejected from their early, turbulent solar systems, due to close gravitational encounters with other planets or stars. Without a star to circle, these planets would move through the galaxy as our sun and other stars do, in stable orbits around the galaxy's center. The discovery of 10 free-floating Jupiters supports the ejection scenario, though it's possible both mechanisms are at play.
"If free-floating planets formed like stars, then we would have expected to see only one or two of them in our survey instead of 10," Bennett said. "Our results suggest that planetary systems often become unstable, with planets being kicked out from their places of birth."
The observations cannot rule out the possibility that some of these planets may be in orbit around distant stars, but other research indicates Jupiter-mass planets in such distant orbits are rare.
The survey, the Microlensing Observations in Astrophysics (MOA), is named in part after a giant wingless, extinct bird family from New Zealand called the moa. A 5.9-foot (1.8-meter) telescope at Mount John University Observatory in New Zealand is used to regularly scan the copious stars at the center of our galaxy for gravitational microlensing events. These occur when something, such as a star or planet, passes in front of another more distant star. The passing body's gravity warps the light of the background star, causing it to magnify and brighten. Heftier passing bodies, like massive stars, will warp the light of the background star to a greater extent, resulting in brightening events that can last weeks. Small planet-size bodies will cause less of a distortion, and brighten a star for only a few days or less.
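The weeks-versus-days scaling falls out of the Einstein-radius crossing time, which grows as the square root of the lens mass. The sketch below plugs in typical assumed distances and a typical relative speed toward the galactic bulge; these are not parameters from the MOA/OGLE analysis.

```python
import math

# Why stellar microlensing events last weeks while planet-mass events last
# only about a day: the event duration scales with the Einstein radius, which
# grows as the square root of the lens mass.  Distances and transverse speed
# below are typical assumed values toward the galactic bulge, not parameters
# from the MOA/OGLE analysis.

G = 6.674e-11            # m^3 kg^-1 s^-2
C = 2.998e8              # m/s
KPC = 3.086e19           # meters per kiloparsec
M_SUN = 1.989e30         # kg
M_JUP = 1.898e27         # kg

D_L, D_S = 6.0 * KPC, 8.0 * KPC      # assumed lens and source distances
V_PERP = 200e3                        # assumed relative transverse speed, m/s

def einstein_crossing_days(lens_mass_kg):
    r_e = math.sqrt(4 * G * lens_mass_kg / C**2 * D_L * (D_S - D_L) / D_S)
    return r_e / V_PERP / 86400.0

print(f"Solar-mass lens:   ~{einstein_crossing_days(M_SUN):.0f} days")
print(f"Jupiter-mass lens: ~{einstein_crossing_days(M_JUP):.1f} days")
```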
A second microlensing survey group, the Optical Gravitational Lensing Experiment (OGLE), contributed to this discovery using a 4.2-foot (1.3 meter) telescope in Chile. The OGLE group also observed many of the same events, and their observations independently confirmed the analysis of the MOA group.

Solar Storm Warning

March 10, 2006: It's official: Solar minimum has arrived. Sunspots have all but vanished. Solar flares are nonexistent. The sun is utterly quiet.
Like the quiet before a storm.
This week researchers announced that a storm is coming--the most intense solar maximum in fifty years. The prediction comes from a team led by Mausumi Dikpati of the National Center for Atmospheric Research (NCAR). "The next sunspot cycle will be 30% to 50% stronger than the previous one," she says. If correct, the years ahead could produce a burst of solar activity second only to the historic Solar Max of 1958.
That was a solar maximum. The Space Age was just beginning: Sputnik was launched in Oct. 1957 and Explorer 1 (the first US satellite) in Jan. 1958. In 1958 you couldn't tell that a solar storm was underway by looking at the bars on your cell phone; cell phones didn't exist. Even so, people knew something big was happening when Northern Lights were sighted three times in Mexico. A similar maximum now would be noticed by its effect on cell phones, GPS, weather satellites and many other modern technologies.
Right: Intense auroras over Fairbanks, Alaska, in 1958

NASA Events

Review: Eee Pad tablet transforms into laptop

(AP) -- The tablet computers that compete with the iPad have mostly been uninspiring. The Eee Pad Transformer stands out with a design that isn't just copied from the iPad: It's a tablet that turns into a ...

Google Music: Definitely beta

Google has been accused of overusing the "beta" tag on products it releases early. But with its new music service - Music - the beta tag is mandatory. It's still pretty raw, judging from my experience with it today.

Microsoft trying to take another bite of the Apple?

It was recently announced that Apple, assessed at $150 billion, surpassed Google as the world’s most valuable brand. This comes a year after overtaking Microsoft as the globe’s most valuable technology ...

Google works to close security loophole in Android

Google is in the process of updating its Android operating system to fix an issue that is believed to have left millions of smartphones and tablets vulnerable to personal data leaks. ..

NASA sees Tropical Storm 04W's thunderstorms grow quickly

This TRMM satellite 3-D image shows that some thunderstorm towers near TS 04W's center of circulation were punching up to heights of over 16 km (~9.9 miles) above the ocean's surface. Credit: NASA/SSAI, Hal Pierce


Tropical Storm 04W formed from the low pressure System 98W this morning in the northwestern Pacific. NASA's Tropical Rainfall Measuring Mission (TRMM) satellite watched the towering thunderstorms in the center of the tropical storm grow to almost 10 miles (16 km) high as it powered up quickly.


"Advanced computer techniques allow us to combine data from the individual telescopes to yield images with the sharpness of a single giant telescope, one nearly as large as Earth itself," said Roopesh Ojha at NASA's Goddard Space Flight Center in Greenbelt, Md.
The enormous energy output of galaxies like Cen A comes from gas falling toward a black hole weighing millions of times the sun's mass. Through processes not fully understood, some of this infalling matter is ejected in opposing jets at a substantial fraction of the speed of light. Detailed views of the jets' structure will help astronomers determine how they form.
The jets strongly interact with surrounding gas, at times possibly changing a galaxy's rate of star formation. Jets play an important but poorly understood role in the formation and evolution of galaxies.

Left: The giant elliptical galaxy NGC 5128 is the radio source known as Centaurus A. Vast radio-emitting lobes (shown as orange in this optical/radio composite) extend nearly a million light-years from the galaxy. Credit: Capella Observatory (optical), with radio data from Ilana Feain, Tim Cornwell, and Ron Ekers (CSIRO/ATNF), R. Morganti (ASTRON), and N. Junkes (MPIfR). Right: The radio image from the TANAMI project provides the sharpest-ever view of a supermassive black hole's jets. This view reveals the inner 4.16 light-years of the jet and counterjet, a span less than the distance between our sun and the nearest star. The image resolves details as small as 15 light-days across. Undetected between the jets is the galaxy's 55-million-solar-mass black hole. Credit: NASA/TANAMI/Müller et al.
NASA's Fermi Gamma-ray Space Telescope has detected much higher-energy radiation from Cen A's central region. "This radiation is billions of times more energetic than the radio waves we detect, and exactly where it originates remains a mystery," said Matthias Kadler at the University of Wuerzburg in Germany and a collaborator of Ojha. "With TANAMI, we hope to probe the galaxy's innermost depths to find out."
Ojha is funded through a Fermi investigation on multiwavelength studies of Active Galactic Nuclei.
The astronomers credit continuing improvements in the Australian Long Baseline Array (LBA) with TANAMI's enormously increased image quality and resolution. The project augments the LBA with telescopes in South Africa, Chile and Antarctica to explore the brightest galactic jets in the southern sky.

Radio telescopes capture best-ever snapshot of black hole jets (w/ video)

Merging X-ray data (blue) from NASA's Chandra X-ray Observatory with microwave (orange) and visible images reveals the jets and radio-emitting lobes emanating from Centaurus A's central black hole. Credit: ESO/WFI (visible); MPIfR/ESO/APEX/A.Weiss et al. (microwave); NASA/CXC/CfA/R.Kraft et al. (X-ray)
(PhysOrg.com) -- An international team, including NASA-funded researchers, using radio telescopes located throughout the Southern Hemisphere has produced the most detailed image of particle jets erupting from a supermassive black hole in a nearby galaxy.

Display Applications

Overcoming the Drawbacks of Fluorescent Lamps

Liquid crystal display (LCD), thanks to continued improvements in resolution, response rates and scalability, has become the pervasive display technology for mobile phones, monitors, notebooks, HDTVs and other consumer electronics. Since LCD panels are transmissive and emit no light of their own, they require a backlight to provide illumination. Commonly, LCD backlighting units (BLUs) employed cold cathode fluorescent lamps (CCFLs), similar to those used for commercial overhead lights, as their light source. However, CCFLs have a number of drawbacks. They require a high voltage power supply and generally are the highest power consuming component in large format displays and HDTVs. CCFLs contain mercury which has special disposal requirements and faces increasing limits on its use in many countries. Also, the space needed by CCFLs constrains how thin an LCD panel can be made. And as CCFLs are a tube-based technology, they are usually the first component to fail in an LCD display.

Light emitting diodes (LEDs) offer a semiconductor-based lighting solution which overcomes the limitations of CCFLs. With continued advancements in brightness and efficiency, LEDs are displacing CCFLs in backlighting applications, and as their price continues to drop, will take their place as a general lighting solution as well. LEDs deliver higher brightness than CCFLs and better power efficiency (more lumens per watt), use a lower-voltage power supply and generate less heat. LEDs can produce a much wider color gamut making movies and images appear more vibrant and lifelike. Because of their compact nature, LED backlights can enable ultra-slim displays and HDTVs less than half an inch thick. As a solid state component, like the other semiconductor devices in mobile phones, computers and HDTVs, LEDs have much longer lifetimes than CCFLs.

Harnessing the Benefits of LEDs

However, harnessing all the benefits of LEDs for backlighting still entails challenges. As point sources of light, LEDs can be used in an array topology in the backlight to directly illuminate the LCD panel. An array requires a high number of LEDs and therefore can be very expensive. In addition, in order to properly diffuse the light, arrays require a greater distance between the LEDs and the LCD panel, resulting in a thicker display. A thinner and more cost-effective solution is to use LEDs in an edge-lit configuration with a light guide panel (LGP) to turn the light into the viewing plane and distribute it across the display. This requires fewer LEDs but introduces the problem of maintaining uniformity of brightness over the entire backlight area. Maintaining uniformity and achieving the full benefits of edge-lit technology necessitates a high-efficiency LGP that can be economically manufactured.
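A toy one-dimensional calculation makes the uniformity problem concrete: as guided light is extracted along the panel, less of it remains, so the extraction features near the far edge must pull out a larger fraction of what is left. This is purely illustrative; real LGP design is a 2-D optical problem with absorption and recycling films.

```python
import numpy as np

# Toy 1-D picture of the edge-lit uniformity problem: light launched at one
# edge of the light guide panel is gradually extracted toward the viewer, so
# the extraction features must become progressively "stronger" toward the far
# edge to keep the emitted brightness flat.  Illustrative assumptions only.

LENGTH_MM = 500.0
N = 10
x = np.linspace(0, LENGTH_MM, N, endpoint=False)

flux_in = 1.0                              # normalized light entering the guide
target_per_segment = 0.9 * flux_in / N     # extract 90% of the light, uniformly

remaining = flux_in - (x / LENGTH_MM) * 0.9 * flux_in   # flux still in the guide at x
extraction_fraction = target_per_segment / remaining    # fraction each segment must pull out

for xi, frac in zip(x, extraction_fraction):
    print(f"x = {xi:5.0f} mm: extract {frac*100:5.1f}% of the remaining guided light")
```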

XDR™ Memory Architecture

XDR™ Memory Architecture

The Rambus XDR™ memory architecture is a total memory system solution that achieves an order of magnitude higher performance than today's standard memories while utilizing the fewest ICs. Perfect for compute and consumer electronics applications, a single, 4-byte-wide, 6.4Gbps XDR DRAM component provides 25.6GB/s of peak memory bandwidth.
Key components enabling the breakthrough performance of the XDR memory architecture are:
XDR DRAM is a high-speed memory IC that turbo-charges standard CMOS DRAM cores with a high-speed interface capable of 7.2Gbps data rates providing up to 28.8GB/s of bandwidth with a single device.
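The bandwidth figures quoted above follow directly from interface width times per-pin data rate; a quick check:

```python
# Peak DRAM bandwidth is interface width times per-pin data rate.  For a
# 4-byte (32-bit) wide device, the arithmetic behind the figures quoted above:

def peak_bandwidth_gb_s(width_bits, per_pin_gbps):
    """Peak bandwidth in GB/s: total bits per second across the interface, divided by 8."""
    return width_bits * per_pin_gbps / 8

print(peak_bandwidth_gb_s(32, 6.4))   # 25.6 GB/s for the 6.4Gbps XDR DRAM quoted above
print(peak_bandwidth_gb_s(32, 7.2))   # 28.8 GB/s for the 7.2Gbps device
```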

HDTV Applications

HDTV Applications

“The year 2010 marks a major transition period for the US LCD TV market, when consumers increasingly are gravitating towards sets with more advanced features.” - Riddhi Patel, iSuppli Principal TV Analyst
Consumer research finds that among advanced features, HDTV buyers' top priority is picture quality. Capabilities such as full HD 1080p resolution, 480Hz frame rates, LED backlighting, 3D display, and advanced image processing and motion compensation create incredibly rich viewing experiences. Each of these capabilities demands higher levels of memory bandwidth.
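A rough back-of-envelope shows why features such as 480Hz motion compensation multiply memory traffic: every additional pass over the frame buffer costs resolution times bytes per pixel times frame rate. The figures below are illustrative assumptions, not an actual HDTV memory budget.

```python
# Rough illustration of why advanced display features push memory bandwidth:
# each read or write pass over a frame buffer costs
# resolution x bytes/pixel x frame rate.  Illustrative numbers only.

def pass_bandwidth_gb_s(width, height, bytes_per_pixel, frames_per_second):
    return width * height * bytes_per_pixel * frames_per_second / 1e9

base = pass_bandwidth_gb_s(1920, 1080, 4, 60)     # plain 1080p at 60Hz
mc_480 = pass_bandwidth_gb_s(1920, 1080, 4, 480)  # 480Hz motion-compensated output
print(f"1080p @  60Hz: {base:.2f} GB/s per frame-buffer pass")
print(f"1080p @ 480Hz: {mc_480:.2f} GB/s per frame-buffer pass")
```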

In the future, consumers will expect even more. With requirements for handling multiple streams of 3D content, Ultra-High Definition (UHD) 4K picture resolution, 16-bit color and more, HDTV designers need a memory architecture that provides the highest bandwidth performance. However, even as functionality increases, OEMs will continue to face strong downward pressure on prices. Consumer focus on pricing is second only to picture quality. For this reason, achieving these advanced features while reducing BOM costs and minimizing the total number of devices used is critical.
As a result of recent government mandates and consumers’ desire to “buy green,” OEMs must also significantly reduce HDTV system power. Typical HDTV power budgets must fall by as much as 50% by 2013 in order to meet the most stringent requirements. Key to addressing power reduction is the move to LED technology for LCD backlights, and continued improvements to power efficiency of electronics components including the image processors and memory subsystem.

Mobile Applications

Consumers have come to expect the entertainment experience of the living room from the mobile devices they carry every day. Advanced mobile devices offer high-definition (HD) resolution video recording, multi-megapixel digital image capture, 3D gaming and media-rich web applications. To pack all that functionality in a form factor that's thin, light and delivered with a pleasing aesthetic presents a tremendous challenge for mobile device designers. Chief among these challenges is the implementation of a high-performance memory architecture that meets the power efficiency constraints of battery-operated products.

In order to support these advanced mobile devices, memory bandwidth will experience significant growth. Over the course of the next 2-3 years, mobile gaming and graphics applications will push memory bandwidth requirements to 12.8 gigabytes per second and beyond. This bandwidth must be achieved within the constraints of the available battery life and cost budget.

Understanding the Energy Consumption of Dynamic Random Access Memories

Energy consumption has become a major constraint on the capabilities of computer systems. In large systems the energy consumed by Dynamic Random Access Memories (DRAM) is a significant part of the total energy consumption. It is possible to calculate the energy consumption of currently available DRAMs from their datasheets, but datasheets don’t allow extrapolation to future DRAM technologies and don’t show how other changes like increasing bandwidth requirements change DRAM energy consumption. This paper first presents a flexible DRAM power model which uses a description of DRAM architecture, technology and operation to calculate power usage and verifies it against datasheet values. Then the model is used together with assumptions about the DRAM roadmap to extrapolate DRAM energy consumption to future DRAM generations. Using this model we evaluate some of the proposed DRAM power reduction schemes.
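For readers unfamiliar with the datasheet-style estimate the abstract refers to, the sketch below shows its general shape: a background term, an activate/precharge energy per row cycle, and a read/write term scaled by bus utilization. The currents, voltage and timing used here are placeholders, not values from any real device or from the paper's model.

```python
# A minimal sketch of a datasheet-style DRAM power estimate: background
# current, activate/precharge energy per row cycle, and read burst current
# scaled by bus utilization.  All electrical values are placeholders.

VDD = 1.5            # volts (placeholder)
IDD3N = 0.035        # A, active standby background current (placeholder)
IDD0 = 0.055         # A, average current of an activate/precharge cycle (placeholder)
IDD4R = 0.180        # A, current while bursting reads (placeholder)
T_RC = 48e-9         # s, row cycle time (placeholder)

def dram_power_watts(act_rate_per_s, read_bus_utilization):
    background = VDD * IDD3N
    activate = VDD * (IDD0 - IDD3N) * T_RC * act_rate_per_s     # energy per ACT x ACT rate
    read = VDD * (IDD4R - IDD3N) * read_bus_utilization
    return background + activate + read

print(f"Idle:            {dram_power_watts(0, 0.0) * 1000:.1f} mW")
print(f"Streaming reads: {dram_power_watts(1e6, 0.6) * 1000:.1f} mW")
```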

Terabyte Bandwidth Initiative

The Rambus Terabyte Bandwidth Initiative reflects Rambus' ongoing commitment to innovation in cutting-edge performance memory architectures to enable tomorrow's most exciting gaming and graphics products. Targeting a terabyte per second (TB/s) of memory bandwidth (1 terabyte = 1,024 gigabytes) from a single System-on-Chip (SoC), Rambus has pioneered new memory technologies capable of signaling at 20 gigabits per second (Gbps) while maintaining best-in-class power efficiency. In order to enable the transition from current generation memory architectures, Rambus has developed innovations that support both single-ended and differential memory interfaces in a single SoC package design with no additional pins.
The patented Rambus innovations that enable this breakthrough performance, unmatched power efficiency and multi-modal functionality include:
32X Data Rate – Enables high data rates while maintaining a low frequency system clock.
Fully Differential Memory Architecture (FDMA) – Improves signal integrity and reduces power consumption at high-speed operation.
FlexLink™ Command/Address (C/A) – Reduces the number of pins required for the C/A link.
FlexMode™ Interface – Provides multi-modal functionality, either single-ended or differential in a single SoC package design with no additional pins.
These innovations offer increased performance, higher and scalable data bandwidth, area optimization, enhanced signal integrity, and multi-modal capability for gaming, graphics and multi-core computing applications. With these innovations and others developed through the Terabyte Bandwidth Initiative, Rambus will provide the foundation for future memory architectures over the next decade.
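Two pieces of arithmetic sit behind the headline numbers, using only figures stated above (a 1 TB/s target, 20Gbps signaling, and the 32X data rate); the implied I/O count is an illustration rather than a published design.

```python
# Arithmetic behind the Terabyte Bandwidth Initiative figures quoted above
# (1 TB/s = 1,024 GB/s target, 20Gbps signaling, 32X data rate).
# The resulting I/O count is illustrative, not a published Rambus design.

TARGET_GB_S = 1024        # 1 TB/s
PER_LINK_GBPS = 20        # per data link
DATA_RATE_MULT = 32       # "32X": bits transferred per I/O per clock cycle

data_ios_needed = TARGET_GB_S * 8 / PER_LINK_GBPS
system_clock_mhz = PER_LINK_GBPS * 1000 / DATA_RATE_MULT

print(f"Data I/Os needed for 1 TB/s at {PER_LINK_GBPS}Gbps each: ~{data_ios_needed:.0f}")
print(f"System clock implied by 32X data rate at {PER_LINK_GBPS}Gbps: {system_clock_mhz:.0f} MHz")
```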
Background

Graphics cards and game consoles continue to be the marquee performance products for consumers. The insatiable demand for photorealistic game play, 3D images, and a richer end-user experience is constantly pushing system and memory requirements higher. Today's high-end graphics processors support as much as 128 gigabytes per second (GB/s) of memory bandwidth, and future generations will push memory bandwidth to upwards of 1 terabyte per second (TB/s).
However, increased data rates will be only one of the challenges for future graphics processors and game consoles. Historically, as performance has increased, so have power consumption and the physical size of the processor, two trends that cannot continue unchecked given the physical limits of both thermals and manufacturing. Future generation gaming and graphics memory systems must be able to deliver ultra-high bandwidth without significantly increasing power consumption or pin count over current solutions.
Innovations

Rambus' Terabyte Bandwidth Initiative incorporates breakthrough innovations to achieve 1TB/s of bandwidth from a single System-on-Chip (SoC). These patented innovations include:
• 32X Data Rate transfers 32 bits of data per I/O on each clock cycle (a quick arithmetic sketch follows this list).
• Asymmetric Equalization improves overall signal integrity while minimizing the complexity and cost of the DRAM device.
• Enhanced Dynamic Point-to-Point (DPP) enables increased scaling of memory system capacity and access granularity.
• Enhanced FlexPhase™ Timing Adjustment enables flexible phase relationships between signals, allowing precise on-chip alignment of data with clock.
• FlexPhase circuit enhancements improve sensitivity and capability for very high performance memory systems operating at data rates of 10Gbps and higher.
• FlexLink C/A is the industry's first full-speed, scalable, point-to-point command/address link implemented through a single, differential, high-speed communications channel.
• FlexMode Interface is a programmable assignment of signaling I/Os as data (DQ) or C/A, for either a single-ended or differential interface.
• FDMA is the industry's first memory architecture that incorporates differential signaling technology on all key signal connections between the memory controller and the DRAM.
• Jitter Reduction Technology improves the signal integrity of very high-speed communications links.
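Here is the quick arithmetic behind the 32X data rate item above: the per-pin data rate is simply the system clock multiplied by 32 transfers per clock. The 625 MHz clock below is an illustrative value chosen to land on 20 Gbps, not a published specification.

```python
# Quick arithmetic for the "32X Data Rate" idea: per-pin data rate equals the
# system clock multiplied by 32 transfers per clock. The 625 MHz clock is an
# illustrative value, not a published specification.

def per_pin_rate_gbps(clock_mhz: float, transfers_per_clock: int = 32) -> float:
    return clock_mhz * 1e6 * transfers_per_clock / 1e9

print(per_pin_rate_gbps(625))   # -> 20.0 Gbps from a 625 MHz clock
```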

XDR™2 Memory Architecture

The XDR™2 memory architecture is the world's fastest memory system solution, capable of providing more than twice the peak bandwidth per device when compared to a GDDR5-based system. Further, the XDR 2 memory architecture delivers this performance at 30% lower power than GDDR5 at equivalent bandwidth.

Designed for scalability, power efficiency and manufacturability, the XDR 2 architecture is a complete memory solution ideally suited for high-performance gaming, graphics and multi-core compute applications. Each XDR 2 DRAM can deliver up to 80GB/s of peak bandwidth from a single, 4-byte-wide, 20Gbps XDR 2 DRAM device. With this capability, systems can achieve memory bandwidth of over 500GB/s on a single SoC.
Capable of data rates up to 20Gbps, the XDR 2 architecture is part of the award-winning family of XDR products. With backwards compatibility to XDR DRAM and single-ended industry-standard memories, the XDR 2 architecture is part of a continuously compatible roadmap, offering a path for both performance upgrades and system cost reductions.
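The headline XDR 2 numbers can be checked with simple arithmetic: a 4-byte-wide device has 32 data lanes, and at 20 Gbps per lane that works out to 80 GB/s per device. The sketch below assumes the usual eight-lanes-per-byte mapping; it is a sanity check, not a description of the actual device.

```python
# Sanity-check arithmetic for the XDR 2 figures quoted above.
DEVICE_WIDTH_BYTES = 4
LANE_RATE_GBPS = 20

lanes = DEVICE_WIDTH_BYTES * 8                 # 8 data lanes per byte of width (assumed)
device_bw_gb_s = lanes * LANE_RATE_GBPS / 8    # bits per second back to bytes per second
print(device_bw_gb_s)                          # -> 80.0 GB/s per device

print(500 / device_bw_gb_s)                    # -> 6.25, so seven such devices exceed 500 GB/s
```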

IBM briefly tops Microsoft in market value

IBM briefly topped Microsoft in market value on Wall Street on Friday to become the second-largest technology company after Apple.

Google

Google Inc. is an American public corporation, earning revenue from advertising related to its Internet search, e-mail, online mapping, office productivity, social networking, and video sharing services, as well as from selling advertising-free versions of the same technologies. Google has also developed an open source web browser and a mobile operating system. The Google headquarters, the Googleplex, is located in Mountain View, California. As of March 31, 2009, the company had 19,786 full-time employees. The company runs millions of servers worldwide, which process about 1 petabyte of user-generated data every hour, and it handles hundreds of millions of search requests every day.
Google was founded by Larry Page and Sergey Brin while they were students at Stanford University and the company was first incorporated as a privately held company on September 4, 1998. The initial public offering took place on August 19, 2004, raising $1.67 billion, implying a value for the entire corporation of $23 billion. Google has continued its growth through a series of new product developments, acquisitions, and partnerships. Environmentalism, philanthropy and positive employee relations have been important tenets during the growth of Google. The company has been identified multiple times as Fortune Magazine's #1 Best Place to Work, and as the most powerful brand in the world (according to the Millward Brown Group).
Google's mission is "to organize the world's information and make it universally accessible and useful". The unofficial company slogan, coined by former employee and Gmail's first engineer Paul Buchheit, is "Don't be evil". Criticism of Google includes concerns regarding the privacy of personal information, copyright, and censorship.

Programming

Computer programming is the iterative process of writing or editing source code. Editing source code involves testing, analyzing, and refining, and sometimes coordinating with other programmers on a jointly developed program. A person who practices this skill is referred to as a computer programmer, software developer or coder. The sometimes lengthy process of computer programming is usually referred to as software development. The term software engineering is becoming popular as the process is seen as an engineering discipline.

Computer program

A computer program (also a software program, or just a program) is a sequence of instructions written to perform a specified task for a computer. A computer requires programs to function, typically executing the program's instructions in a central processor. The program has an executable form that the computer can use directly to execute the instructions. The same program in its human-readable source code form, from which executable programs are derived (e.g., compiled), enables a programmer to study and develop its algorithms.
Computer source code is often written by professional computer programmers. Source code is written in a programming language that usually follows one of two main paradigms: imperative or declarative programming. Source code may be converted into an executable file (sometimes called an executable program or a binary) by a compiler and later executed by a central processing unit. Alternatively, computer programs may be executed with the aid of an interpreter, or may be embedded directly into hardware.
Computer programs may be categorized along functional lines: system software and application software. Many computer programs may run simultaneously on a single computer, a process known as multitasking.

More About Disk Drives

Floppies – Although floppy drives are being phased out in some new computers, there are still millions of them out there and you should know something about them. The floppy drive has a little slot on the face of the computer cabinet, and into this slot you can slide a floppy diskette like the one shown here. One of the reasons floppy drives are still around is that it is very easy to take a floppy diskette from one system to another.
Inside the floppy diskette is a round flat disk coated with iron oxide on each side so that data can be stored on it magnetically. This disk is called a platter, and it spins underneath an electro-magnet called the write head that puts data onto the platter surface. There is another head called the read head that copies data from the platter.
Once the disk has made one complete revolution, data is written all the way around. That is called a track. The head then moves a bit and writes another circle of data to create a second track. Altogether, there are 80 tracks on each side, for a total of 160. Altogether, the floppy can hold 1.44 MB (megabytes) of data.
If we are looking for just a few bytes out of 1.44 million, it’s not enough to know which track it is in. To help narrow the search, the track is divided into 18 pieces, called sectors, which look much like slices of pie. Each sector holds 512 bytes of data, so if we know the track and sector number of the data we want it won’t be hard to find.
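Those floppy numbers multiply out exactly as described: 80 tracks per side, two sides, 18 sectors per track and 512 bytes per sector. The small sketch below just does the arithmetic; the 1024 x 1000 divisor reflects the slightly unusual way 1.44 MB floppies were traditionally labeled.

```python
# Worked arithmetic for the floppy figures above.
tracks_per_side = 80
sides = 2
sectors_per_track = 18
bytes_per_sector = 512

total_bytes = tracks_per_side * sides * sectors_per_track * bytes_per_sector
print(total_bytes)                    # 1,474,560 bytes
print(total_bytes / (1024 * 1000))    # 1.44 -- the "MB" on floppy labels is 1,024,000 bytes
```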
Hard Drives – On a hard drive, data is also organized into tracks and sectors. While each sector still holds 512 bytes, there can be many more tracks and sectors on a platter. There are also multiple platters, one on top of the other like a stack of pancakes. Hard drives can hold much more data than floppies, sometimes into the billions of bytes, called gigabytes (GB).
Multiple platters require multiple read and write heads, all attached to the same arm so they move together. It’s called an actuator arm. When we are reading track number 10 on the top platter, the other heads are also positioned over track 10 of the other platters, and together all of these track 10s make up a cylinder. To specify the location of data on a hard drive it is necessary to say what cylinder, then the track and sector. Moving the heads from one cylinder to another is called a seek, and the amount of time this takes is the average seek time.
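Cylinder, head and sector numbers can be turned into a single linear block number with one formula, which is essentially the bookkeeping the drive electronics perform. The sketch below uses a made-up geometry purely for illustration; sectors are conventionally numbered from 1, while cylinders and heads are numbered from 0.

```python
# A small sketch of cylinder/head/sector (CHS) addressing as described above:
# given a drive geometry, convert a CHS triple to a single linear block number.
# The geometry values are made up for illustration.

def chs_to_lba(cylinder: int, head: int, sector: int,
               heads_per_cylinder: int, sectors_per_track: int) -> int:
    return (cylinder * heads_per_cylinder + head) * sectors_per_track + (sector - 1)

# Example: cylinder 10, head 2, sector 5 on a hypothetical 16-head, 63-sector geometry
print(chs_to_lba(10, 2, 5, heads_per_cylinder=16, sectors_per_track=63))
```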
Although hard drives can hold much more data than floppies, the platters are sealed into a metal case that is fastened inside the computer cabinet, so it’s not an easy matter to move from one system to another like you can with floppies. A hard drive is sometimes called a fixed disk for this reason.
Operating systems use a couple of different methods to keep track of what data is stored where on a drive. One common method uses a table called a File Allocation Table, or FAT, which is a section of the disk with pointers to data locations. There are two versions, called FAT16 and FAT32. Windows NT, XP and 2000 use a similar method called NTFS.
There are two different interfaces commonly used by hard drives to talk to the rest of the system. These are called IDE, for Integrated Drive Electronics, and SCSI, for Small Computer System Interface. The technical differences are not important at this point, but you should know about the two types because they are not interchangeable.
Figuring out where the heads should go next and then moving them there is the job of some electronic circuitry called the disk controller. Every disk drive has its own controller, which may be on the motherboard or inside the drive itself, depending on the type of drive.
There are a few more things you should know about disk drives before we leave the subject. The very first sector of the drive (cylinder 0, head 0, sector 1) is called the boot sector, and it contains a Master Boot Record (MBR) that shows whether the disk contains an operating system and where that code is located. If there is more than one operating system, the drive must be divided into multiple partitions. If not, then the whole drive will be a single partition. All of the disk space assigned to a partition is called a volume.
Another term you will encounter is a disk format. There is a high-level format, which creates a new file allocation table and is done with a FORMAT command. There is also a low-level format that creates a new pattern of sectors. A low-level format must be followed by an FDISK command to create a new Master Boot Record and partitions.
Last, we have the word media. This refers to the actual surface holding the data, which is the platter in the case of a disk drive. Because the floppy platter can be taken out of the drive, it is called removable media, while a hard drive is called fixed media.
Other Drives – Most systems today, especially home systems, have additional storage drives that use CD or DVD discs. The technology for both is similar but DVDs hold much more data. These drives do not store data magnetically but use optical markings that are read with a laser. They are mostly used just to read data and not to write it. The full name for CD in fact is CD-ROM, which stands for Compact Disc - Read Only Memory. However, there are versions that can be used to write also, and these are called CD-RW and DVD-RW. Even so they are mostly used to write just once for permanent storage, and are not practical for constantly changing data.
Like hard drives, CD-ROM drives can use either an IDE or SCSI interface. The version of IDE used by CD-ROM drives is called ATAPI; on the SCSI side, software usually talks to the drive through a driver interface called ASPI.
Because the discs can be removed, CD-ROM and DVD are considered removable media. There are other types of removable media that are not as common, such as tape drives and Zip disks, which are similar to floppies but with a storage capacity of 100 or 250 MB. Many Zip drives and tape drives also use the ATAPI interface.

More About Video

The monitor is a passive device that just displays the video output from the system. However, so much data is needed for the constantly changing screen display that special provisions are made for it.
The video card (or video circuitry on the motherboard) has its own RAM memory just to hold the display information, and its own ROM BIOS to control the output. Some motherboards even have a special high-speed connection between the CPU and the video. It’s called the AGP, or Accelerated Graphics Port.
The important numbers in evaluating a video display are how many distinct colors can be displayed and also the resolution, which is how many pixels the image contains across and from top to bottom. Each dot of color making up the image is one pixel. As video technology evolved there have been a number of standards, and each one has its own set of initials like EGA, CGA or VGA. A common one is SVGA, which stands for Super Video Graphics Array and has a resolution of 800x600 (that’s 800 pixels across and 600 down). Some high-performance monitors use SXGA (1280x1024) or even UXGA with a resolution of 1600x1200.
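Resolution and color depth translate directly into video memory: one frame at 800x600 with 24-bit color needs roughly 1.4 MB just for the framebuffer. The quick calculation below uses the standard resolutions named above and assumes 24 bits per pixel for illustration.

```python
# Rough arithmetic tying resolution and color depth to video memory.
# The 24-bit color depth is an illustrative assumption.

def framebuffer_bytes(width: int, height: int, bits_per_pixel: int = 24) -> int:
    return width * height * bits_per_pixel // 8

for name, (w, h) in {"SVGA": (800, 600), "SXGA": (1280, 1024), "UXGA": (1600, 1200)}.items():
    mb = framebuffer_bytes(w, h) / (1024 * 1024)
    print(f"{name}: {mb:.2f} MB per frame")
```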

CMOS and RTC

There is other start-up information that normally stays the same but that we might want to change once in a while. This includes info about the various pieces of hardware connected to the system, which disk drive to check first for the operating system and that sort of thing. This data can’t be stored on the hard drive because we need it to boot up. It can’t be stored in RAM because it will be lost at power-off, and it can’t be stored in the BIOS because we might need to change it.
The problem is solved by a type of RAM chip that uses very low power, and it is connected to a battery. This type of low-power memory chip is called CMOS. It stands for the type of technology used in the chip, which is Complementary Metal Oxide Semiconductor. This is probably more than you need to know, but I’m a fanatic about defining things. By the way, since batteries don’t last forever, if you leave your computer unplugged for about 5 years you’ll find it needs a bit of trickery to get it to boot again, because the CMOS information will be gone.
There is another feature in the computer that has the same requirements as CMOS, and that is the date and time function. This obviously needs to change every minute, but we don’t want to lose track when the computer is turned off. The circuitry for this is called the RTC or Real Time Clock, and for convenience it is usually included in the same chip with the CMOS. A little trickle of juice from the CMOS battery keeps the clock running, and when you turn the computer on again it knows exactly what time and day it is. Convenient, isn’t it?

The BIOS

As we mentioned earlier, the computer knows what to do by taking instructions from programs stored in RAM. The main instructions come from a program called the operating system, and those instructions direct traffic for other programs called applications.
When the computer is turned off, all the instructions copied into the RAM are gone. When the system is turned on again, it needs to go out to the disk, get the operating system and load it into RAM, but there are no instructions in the RAM to tell it how to do this. The solution to this problem is a set of instructions that stay in memory and don’t get lost when the computer is turned off.
This set of instructions is called the BIOS, for Basic Input Output System. Since the instructions don’t need to change, they can be stored in a different kind of chip than we use for RAM. It’s called ROM, for Read Only Memory. We say that the instructions in the BIOS are hard-wired, and instead of software they are called firmware.
The computer goes through a process called booting up when it is first turned on. This involves executing the BIOS instructions, loading the operating system from disk into RAM, and then turning control of the computer over to the operating system after everything checks out OK. The term refers to somebody pulling themselves up by their own bootstraps (without outside help, in other words). Any computer term that includes ‘boot’ will have something to do with this start-up process.

The perfect gaming computer at the right price: How to find the parts

It can be difficult to find a computer that meets all your gaming needs. Gaming technology is constantly improving, and systems can go from state-of-the-art to obsolete in a couple of years. Buying a pre-built computer can be a hassle as well: with so many components determining the quality of a gaming system, how do you decide which machine to purchase without comparing every piece of hardware?
The answer is simple — decide your computer’s components for yourself and build your own gaming rig.
Building a computer is much simpler than it sounds. You need only to find the ideal components for your rig and assemble them.
Shopping for computer parts might seem intimidating, but it can be worth it. More than that, buying your own parts individually can help you save money.
The internal components you will need for your computer are a motherboard, CPU, hard drive, memory, graphics card and sound card. You will also need a case, monitor, keyboard, mouse and speakers.
You will want your motherboard and case to be compatible. Some motherboards are reduced in size to fit smaller cases. Once you’ve chosen a motherboard, you will want to choose the right CPU chip. For gaming purposes, you want to decide what your priorities are. Do you intend to overclock? Do you want to be able to play the most graphically advanced games for years to come, or just run your current library at decent speeds? Either way, multi-core processors are the way to go.
When choosing a hard drive, don’t skimp on space. Purchase at least 400 GB of space. You might also want to invest in a smaller SSD drive to use as a boot drive, while keeping most of your data on a separate hard drive.
If you are “future-proofing” your computer, you might want to go with a fast quad-core CPU. Dual-core processors, however, can handle most games at a significantly reduced price. A computer with a 3.2 GHz Dual-Core processor, for example, can run most games at advanced graphics settings with a good graphics card.
Selecting your graphics card can also be tricky at first. Remember one thing - graphics card companies release new products every year at high prices. That reduces the prices of their previous lines, which are still capable of running games. Older cards, such as the later cards in Nvidia’s 8 and 9 series, are capable of running most games and can be found at very affordable prices. For future proofing purposes, shell out a bit more money, research the latest lines of graphics cards and buy last year’s releases. They will last you quite a long time.
Choosing RAM is less tricky. Again, if you wish to overclock, make sure you choose a brand designed to do so. Otherwise, peruse customer reviews and find a reliable brand that fits your budget. You will want at least 4 GB of RAM to ensure a quality gaming experience. The more the better.
Sound cards are not a major point of concern, for the most part. Anything that fits your motherboard can work, unless you are going for a home theater experience. This is up to you. If you wish to cut costs, an inexpensive sound card can cost about $30 and give you all the sound you’ll need.
Avoid expensive cases. A good, spacious case shouldn’t cost more than $100.
All told, a quality gaming rig should not cost you more than $1,200 to $1,400. Because individual parts are cheaper when purchased separately, building your own rig will cost you less. You will also get the satisfaction of running all the latest games on your computer and admiring your own handiwork. It’s a win-win situation.

Want to get the most out of your DRAM? Overclock it

Your computer might be running just fine, but don’t kid yourself — it could be running even faster. For those willing to modify their computer’s hardware, overclocking is the best way to improve the performance of your CPU, DRAM and hard drive.
Overclocking is the process of running a piece of computer hardware at a faster clock rate than it is configured to run out of the box. While it carries some risk, especially for those who are inexperienced with computer upgrades, a properly overclocked computer will see significant performance improvements with no negative side effects.
Since RAM is so important to your computer’s speed, it’s a good place to start when considering overclocking. Before you start digging through your computer to figure out how to overclock, you need to consider a few things first.
Not all RAM is alike, though, and some RAM modules are better designed for overclocking than others. Because of this, if you’re looking to overclock, you might want to get some new RAM first. While you might not want to spend extra, think of it as an investment; you will be upgrading your RAM twice over, after all.
One RAM manufacturer recently launched a line of high performance DDR3 DRAM that is aimed specifically at overclockers. The modules, which run at 1.5 volts, are offered in single, double and triple module kits and have capacities ranging from 4 GB to 16 GB. Before overclocking, the RAM runs at speeds ranging from 1600 MHz to 1866 MHz.
Another DRAM manufacturer recently launched a line of DDR2 and DDR3 modules aimed at gamers. These 2GB modules are also designed to be compatible with overclocking. According to Techshout.com, the modules are good for more than just gaming. “Each of the products in this line is also built to serve profession graphic designing and multimedia auditing requirements. With this series, a lot of emphasis has been given to attributes such as clock speed and memory timing,” according to the report.
If you’re unsure about overclocking your RAM module, research the model online.
Once you’re set with RAM you are confident about overclocking, the actual process can begin.
Before tinkering with any of your computer’s processes, it is wise to backup any files you don’t want to risk losing.
Memory is overclocked through the motherboard's BIOS setup utility. On most systems you can access it by pressing the Delete key (or another designated key) during startup, before the operating system loads.
Once you have accessed the menu, change the memory clock’s speed, save the configuration and restart. To test the new clock speed, run an application that requires a lot of memory, such as a game with high graphics requirements. There are also memory benchmark programs that you can download to make sure your computer’s performance is actually increasing, because an improperly overclocked memory module can actually decrease a computer’s performance.
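If you want a rough do-it-yourself number rather than a dedicated benchmark program, timing a large in-memory copy before and after a settings change gives a crude comparison. The sketch below is a minimal example, not a substitute for proper memory benchmarks, and the 256 MB buffer size is an arbitrary choice.

```python
# A minimal do-it-yourself memory timing check: time a large in-memory copy and
# compare the number before and after changing memory settings. This is a crude
# indicator only, not a replacement for dedicated benchmark programs.

import time

def copy_throughput_mb_s(size_mb: int = 256) -> float:
    data = bytearray(size_mb * 1024 * 1024)   # allocate a large buffer
    start = time.perf_counter()
    copy = bytes(data)                        # force a full read + write pass
    elapsed = time.perf_counter() - start
    assert len(copy) == len(data)
    return size_mb / elapsed

if __name__ == "__main__":
    print(f"{copy_throughput_mb_s():.0f} MB/s (approximate, single pass)")
```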
If the overclock fails, you simply have to go back to the boot menu and reset the clock speed. If it is successful, you can stick with the new DRAM speed and enjoy, or continue to overclock. If you choose the latter — which you will want to do if you have DRAM designed for overclocking — be sure to increase the clock speed in small increments until you reach your RAM’s maximum clock speed.
You can also overclock DRAM by increasing its voltage, which is one way to raise your RAM's maximum stable clock speed. To do this, access the motherboard menu and find the voltage control. It will be listed with names such as “DIMM OverVoltage Control” and “DRAM Voltage Regulator.” From here, you will be presented with your voltage options. Raising the voltage in small steps can allow higher stable clock speeds, but stay within the limits the module manufacturer rates as safe.
While overclocking is not recommended for the inexperienced, it can be a great option for improving computer experience without spending a dollar. It’s easy, free and can give you great results quickly.

PC slowing you down? Upgrading is easier than it sounds

If you’re a PC owner, there’s a good chance that you’ve been there — that PC that was so sparkling a year ago and is now just a pain to run. While it would turn on almost instantly when you first got it, it now takes forever just to get to the desktop and even longer before you can run any applications. Games that used to boot up in seconds now take interminable minutes, costing you valuable time during Team Fortress 2 matches and World of Warcraft PvP matches. Everything seems to have slowed down.
Don’t put up with a slow computer. Making it as fast as new is easier than you think.
The first step to speeding up your computer is diagnosing the cause of the slowdown. “There are no magic tricks to speed up a slow computer. One must try to determine the nature of the slowness and what factors are causing it,” wrote David Levine, a tech specialist at Colby-Sawyer College. “Is it always slow, or just sometimes? … Does it take forever for the computer to start up? Are there any error messages or other warnings?”
Slow start ups are not uncommon and are an easy way to diagnose a sluggish computer. These can be caused by a number of issues, from inadequate RAM, hard drive errors and malware, to simply having too many processes running.
The latter of these is quite common and easy to fix. First, run your PC’s system configuration. To do this, type “msconfig” without quotes into the start menu on a Vista or Windows 7 machine. For XP, click Start, then Run and type “msconfig.”
Under the “Startup” tab, you will find a list of all the processes and applications your computer runs when it boots up. Odds are, you will be surprised at how many there are. Simply uncheck the boxes to the left of the applications you don’t want your computer to run at startup and click “Apply.” Restart your computer and you should see some immediate improvements.
If disabling startup applications doesn’t work, you will want to check for malware. Run the antimalware software of your choice, making sure that its definitions are up to date. Run a deep scan if possible; it will take several hours, but it will ensure that every location in your computer is scanned.
If your computer is still slow, odds are your RAM, or lack thereof, is to blame. RAM is your computer’s engine and you will want plenty, especially if you regularly use memory-hogging applications or games. To check how much RAM you have, type “dxdiag” into your Start menu or Run application, depending on your Windows version. This will bring up a table displaying all of your computer’s specifications. Your RAM will be displayed under “Memory.” The total will be displayed in megabytes, so “1024 MB” equates to 1 GB of RAM.
If you purchased your computer on the cheap from a store, there’s a good chance you don’t have enough RAM to meet your needs. However, upgrading RAM is simple and new RAM sticks are inexpensive. You just need to figure out your computer’s model and find a compatible stick of RAM.
Disk defragmentation is one last option for speeding up your machine.
When you use files on your computer, they get scattered in pieces around the hard drive. Defragmentation reorganizes these pieces.
While you will only notice a significant change if your computer is severely fragmented, defragmenting is easy and can’t hurt. Windows PCs come with their own defragmentation applications.
Hopefully with these tips you will get your computer running as fast as before. With luck and a RAM upgrade, it might even run faster.

Go green with SSD upgrades

Whether you are an environmental crusader hoping to change the world one hard drive at a time, or you are just trying to save a bit of energy from your overhead, it can pay to be environmentally conscious. Now, thanks to advancements in SSD and DRAM technology, it’s easier than ever for you to save energy and help the environment.
Computers waste a lot of energy. Emissions from data centers show this on a large scale. According to a 2008 report by the Economist, American data centers are responsible for more carbon dioxide emissions per year than the entirety of Argentina or the Netherlands.
According to the report, data centers could exceed the aviation industry in carbon output by 2020.
If data centers alone consume that much energy and produce that much carbon, imagine what the output of a country’s worth of home and business computers amounts to.
“How many other industries are promoting a clean image that, on more critical examination, makes a significant contribution to growth in energy use? And what more can be done to push them to account fully for their carbon footprints?” the report asked.
The SSD and DRAM industries have since done just that, making it easier than ever for consumers to upgrade their computer’s performance and be green about doing it.
Several companies have launched green SSD and DRAM lines. Micron offers energy efficient DDR2 and DDR3 DRAM that operate at lower voltages than standard DRAM. The company’s Aspen Memory portfolio offers 1 GB DDR3 modules that operate at 1.35 volts and 2 GB DDR2 modules that operate at 1.5 volts. Standard DDR3 memory operates at 1.5 volts and standard DDR2 memory operates at 1.8 volts.
''The trend in energy-efficient technology is especially important for data centers because they are always running 24 hours a day, seven days a week,'' said Brian Shirley, vice president of Micron's Memory Group.
Another manufacturer recently launched a new line of RAM with low voltages.
“The memory features high 2,133MHz speeds and a low-voltage of only 1.5V. Two 2GB modules are included, totaling 4GB, while DHX+ heatsinks and a GT Airflow module are also included to improve cooling,” according to one product review.
Even more so than DRAM, solid state drives are at the forefront of energy efficient computing. According to a 2009 study, SSDs could allow "the world’s data centers to reduce their cumulative electricity consumption by 166,643 megawatt hours from 2008 to 2013." According to the study, that is enough energy savings to power a whole country.
Imagine what that could do for your PC.
SSDs use about half the power of hard disk drives, a fact that many manufacturers have taken advantage of to create environmentally friendly storage drives. “[The] SSDs use 1.9 watts of power in active mode and 0.6 watts in idle mode, minimizing power and heat loads,” reads one description of a line of green SSDs. “These ‘green’ ratings tower over typical 15K HDDs, which consume between 8 to 15 watts in active mode and 1 to 2 watts in idle mode. Servers with high-rpm hard-drive solutions lead to increased power bills and larger carbon footprints.”
It’s not hard to install a new storage drive or DRAM. With the technology that is in place, it’s easier than ever for you to make your computer environmentally sound while increasing its storage capacity and speed. You might save some money while doing it, too.

Crysis games demanding on machines, rewarding for gamers

Crysis drew wide acclaim when it launched, but with the accolades came some reservations about the game's technical requirements. Crysis was one of the most technically demanding games of all time when it was released, and its recommended specs, which include 12 GB of hard drive space, 2 GB of RAM and the equivalent of an NVIDIA GeForce 8800 GTS card, are still demanding by today’s standards. Even these specs don’t allow gamers to run Crysis at its highest graphical settings.
In Gamespot’s 2007 review of the game, the reviewer wrote that “it's doubtful that a system has been built yet that can run the game at ultra-high resolutions with all the graphical sliders maxed out.”
That still seems to be the case. Experts say “Crysis remains the most technologically demanding video game ever made,” adding that “there is still presently no CPU and graphics card combination that can run the game on its highest settings at 1080p and v-synced 60FPS.”
While Crytek has moved away from PC exclusivity, the developer still stresses the technical superiority of PCs over consoles like the PS3 and Xbox 360.
"PC is easily a generation ahead right now,” said Crytek CEO Cevat Yerli to Edge Magazine. “With 360 and PS3, we believe the quality of the games beyond Crysis 2 and other CryEngine developments will be pretty much limited to what their creative expressions is, what the content is.”
While the company has said that Crysis 2 will not be as demanding on this generation of computers as Crysis was to its crop, the game is still likely to require a fairly hefty system to run.
While the fact that Crysis 2 will be a cross-platform game has led to rumors that it will be easier on computers than its predecessor, don’t expect its developers to go soft simply because they are now designing games for consoles as well. These days, you’ll need at least 2 GB of RAM and 25 GB of hard drive space to play World of Warcraft: Cataclysm, a game with a reputation for being easy on gamers’ machines.
Crytek is at the forefront of PC gaming technology and, to be ready for that technology, you will need hardware to match the requirements of the gaming industry. RAM is vital to running high-end games and there has never been a better time to invest in dynamic random access memory. DRAM prices have dropped 55 percent from their peak levels and are expected to continue falling. This should lead to reduced prices for consumers.
According to industry analysts, the drop in prices will lead to better-equipped computers reaching the marketplace - perfect for gamers looking for an upgraded machine but unsure about building their own gaming rig.
“Instead, computer vendors will offer higher quality specs in order for vendors to preserve a certain price point. Your wallet may not see a difference, but when you go to use your system, you will notice improved functionality,” according to one analysis.
Crysis 2 is slated to be released on March 11, 2011. PC gamers in particular will want to be able to take full advantage of a game that is sure to be gorgeous and that will undoubtedly look better on a PC than on the years-old Xbox 360 and PS3 technology. You are going to want your computer to be ready to play this game.

Emerging technologies are impacting the Flash memory market

According to a recent report from New Electronics, DRAM and Flash memory could be the only successful new memory-related technologies in recent years. The report said many technologies have been developed and pushed toward the market, but none of them have established the reliability, cost-efficiency or sustainability of DRAM or Flash.
As a result, if you are purchasing a new computer you can rest assured that your DRAM will remain valuable for the device's entire life cycle without being completely replaced by a new technology. However, you may need to adjust your notions of purchasing a traditional hard disk drive, because Flash memory technology is allowing solid state drives to become an important part of the hardware market.
SSDs use flash memory to read and write data stored on the device, allowing them to operate at faster speeds than traditional hard disks, which depend on mechanical read/write heads and spinning magnetic platters. As a result, SSDs are quickly becoming somewhat standard in laptops. The desktop market for SSDs is not necessarily as responsive because the moving parts of a hard drive are not as much of a factor in desktop models. However, SSDs are becoming popular boot drives to improve system startup and overall speeds.
The New Electronics report estimates Flash memory still has about 10 years remaining in its mainstream life cycle. The architecture behind the technology has become cheap enough to manufacture efficiently and sell to mainstream users, and devices that favor SSDs, such as laptops and mobile platforms, are becoming more popular. However, new technologies have the potential to emerge within the 10-year period that could challenge Flash's current stronghold.
One of those challenges, the report said, is the result of an upgrade to current Flash models that uses vertical 3D construction to improve memory capacity. Flash devices work by sending electrical signals through small transistor cells that use those signals to read and write content. Current Flash architecture places these transistors side by side on a memory chip. This not only limits storage capacity, it also allows the electrical signals to interfere with one another because the transistors are so closely packed.
According to New Electronics, researchers have successfully developed techniques to essentially stack transistors on top of one another in a vertical, 3D arrangement. As a result, the interference between parts of the chip is reduced and the space limitations associated with horizontal construction are removed. The new method of constructing Flash memory is in the early stages of development, but the report said it could end up offering significant benefits for both the speed and reliability of Flash memory devices.
The report said the 3D method of constructing Flash memory could extend the production life of Flash technologies, such as solid state drives, beyond the 10-year period currently predicted for the technology. Furthermore, this manufacturing technique could keep other new memory-related technologies at bay, keeping Flash-based tools relevant.
According to recent research released by the Bedford Report, 3D Flash memory is already becoming an important tool for high-end Flash devices, and could become mainstream within the next few years. The report said 3D Flash construction makes the chips less expensive than current Flash technology, which could provide significant benefits to consumers looking for a new memory solution.
Currently, the new technology is most popular in mobile computing devices, the Bedford Report said, but it could develop into a mainstream memory solution that enhances the benefits of Flash memory.

Add more RAM to boost your World of Warcraft Cataclysm experience

The millions of virtual citizens of Blizzard's hugely popular MMORPG, World of Warcraft, celebrated the recent launch of the game's latest expansion pack — World of Warcraft: Cataclysm. Those of the game's 12 million-plus subscribers who meet only the minimum RAM specifications of the new title, however, might want to look into upgrading their RAM and hard drives in order to maximize their Azeroth experience.
The minimum requirements for Cataclysm are not extreme compared to most games, but they are a jump from the last expansion pack, 2008's Wrath of the Lich King. Cataclysm requires 1GB of RAM, up from the 512MB minimum required by Wrath of the Lich King. However, as many gamers know, meeting only the minimum RAM requirement can lead to a choppy, frustrating gaming experience, especially as systems struggle to keep pace with the game's memory demands in highly populated areas.
Memory is not the only issue facing Cataclysm players, however. The game is also a hard drive hog, taking up 25GB of space. In comparison, Wrath of the Lich King used 15GB of hard drive space. For many gamers, the hard drive requirement will be the most difficult to overcome. "There are no real surprises in store — the requirements are really very low, it's just the hard drive that takes a bit of a kicking," one preview noted when the specifications were announced.
If your hard drive is already filled to the brim and you are in need of an upgrade to play Cataclysm, now might be a good time to consider investing in a new SSD. SSDs are faster and more efficient than traditional hard disk drives.
An upgrade will be worth it. Reviews of the game have been strong, indicating that it is yet another superb entry in Blizzard's flagship franchise.
"Simply put, it is World of Warcraft 2.0. Everything seems to fit in — overhauled graphics, new landscapes, additional quests," Experts.
If you're a WOW fan, you won't want to miss out on Cataclysm, least of all because of a computer that is too slow or out of space. The game is sure to be one of the biggest releases of the year and you owe it to yourself to make sure you have the hardware to run it.