
Science and Technology Updates

Friday, October 10, 2014

Manipulating memory with light: Scientists erase specific memories in mice


Just look into the light: not quite, but researchers at the UC Davis Center for Neuroscience and Department of Psychology have used light to erase specific memories in mice, and proved a basic theory of how different parts of the brain work together to retrieve episodic memories.
During memory retrieval, cells in the hippocampus connect to cells in the brain cortex. Credit: Photo illustration by Kazumasa Tanaka and Brian Wiltgen/UC Davis
Optogenetics, pioneered by Karl Deisseroth at Stanford University, is a new technique for manipulating and studying nerve cells using light. The techniques of optogenetics are rapidly becoming the standard method for investigating brain function.

Kazumasa Tanaka, Brian Wiltgen and colleagues at UC Davis applied the technique to test a long-standing idea about memory retrieval. For about 40 years, Wiltgen said, neuroscientists have theorized that retrieving episodic memories -- memories about specific places and events -- involves coordinated activity between the cerebral cortex and the hippocampus, a small structure deep in the brain.

"The theory is that learning involves processing in the cortex, and the hippocampus reproduces this pattern of activity during retrieval, allowing you to re-experience the event," Wiltgen said. If the hippocampus is damaged, patients can lose decades of memories.

But this model has been difficult to test directly, until the arrival of optogenetics.

Wiltgen and Tanaka used mice genetically modified so that when nerve cells are activated, they both fluoresce green and express a protein that allows the cells to be switched off by light. They were therefore able both to follow exactly which nerve cells in the cortex and hippocampus were activated in learning and memory retrieval, and switch them off with light directed through a fiber-optic cable.

They trained the mice by placing them in a cage where they got a mild electric shock. Normally, mice placed in a new environment will nose around and explore. But when placed in a cage where they have previously received a shock, they freeze in place in a "fear response."

Tanaka and Wiltgen first showed that they could label the cells involved in learning and demonstrate that they were reactivated during memory recall. Then they were able to switch off the specific nerve cells in the hippocampus, and show that the mice lost their memories of the unpleasant event. They were also able to show that turning off other cells in the hippocampus did not affect retrieval of that memory, and to follow fibers from the hippocampus to specific cells in the cortex.

"The cortex can't do it alone, it needs input from the hippocampus," Wiltgen said. "This has been a fundamental assumption in our field for a long time and Kazu’s data provides the first direct evidence that it is true."

They could also see how the specific cells in the cortex were connected to the amygdala, a structure in the brain that is involved in emotion and in generating the freezing response.

Co-authors are Aleksandr Pevzner, Anahita B. Hamidi, Yuki Nakazawa and Jalina Graham, all at the Center for Neuroscience. The work was funded by grants from the Whitehall Foundation, McKnight Foundation, Nakajima Foundation and the National Science Foundation.

Story Source:
The above story is based on materials provided by University of California - Davis. Note: Materials may be edited for content and length.

Journal Reference:
Kazumasa Z. Tanaka, Aleksandr Pevzner, Anahita B. Hamidi, Yuki Nakazawa, Jalina Graham, Brian J. Wiltgen. Cortical Representations Are Reinstated by the Hippocampus during Memory Retrieval. Neuron, 2014 DOI: 10.1016/j.neuron.2014.09.037


Saturday, September 13, 2014

Incredibly light, strong materials recover original shape after being smashed


Materials scientists have developed a method for creating new structural materials by taking advantage of the unusual properties that solids can have at the nanometer scale. They have used the method to produce a ceramic (e.g., a piece of chalk or a brick) that contains about 99.9 percent air yet is incredibly strong and can recover its original shape after being smashed by more than 50 percent.

This sequence shows how the Greer Lab's three-dimensional, ceramic nanolattices can recover after being compressed by more than 50 percent. Clockwise, from left to right: an alumina nanolattice before compression, during compression, fully compressed, and recovered following compression. Credit: Lucas Meza/Caltech

Imagine a balloon that could float without using any lighter-than-air gas. Instead, it could simply have all of its air sucked out while maintaining its filled shape. Such a vacuum balloon, which could help ease the world's current shortage of helium, could only be made if a new material existed that was strong enough to sustain the pressure generated by forcing out all that air while still being lightweight and flexible.

Caltech materials scientist Julia Greer and her colleagues are on the path to developing such a material and many others that possess unheard-of combinations of properties. For example, they might create a material that is thermally insulating but also extremely lightweight, or one that is simultaneously strong, lightweight, and nonbreakable -- properties that are generally thought to be mutually exclusive.

Greer's team has developed a method for constructing new structural materials by taking advantage of the unusual properties that solids can have at the nanometer scale, where features are measured in billionths of meters. In a paper published in the September 12 issue of the journal Science, the Caltech researchers explain how they used the method to produce a ceramic (e.g., a piece of chalk or a brick) that contains about 99.9 percent air yet is incredibly strong, and that can recover its original shape after being smashed by more than 50 percent.

"Ceramics have always been thought to be heavy and brittle," says Greer, a professor of materials science and mechanics in the Division of Engineering and Applied Science at Caltech. "We're showing that in fact, they don't have to be either. This very clearly demonstrates that if you use the concept of the nanoscale to create structures and then use those nanostructures like LEGO to construct larger materials, you can obtain nearly any set of properties you want. You can create materials by design."

The researchers use a direct laser writing method called two-photon lithography to "write" a three-dimensional pattern in a polymer by allowing a laser beam to crosslink and harden the polymer wherever it is focused. The parts of the polymer that were exposed to the laser remain intact while the rest is dissolved away, revealing a three-dimensional scaffold. That structure can then be coated with a thin layer of just about any kind of material -- a metal, an alloy, a glass, a semiconductor, etc. Then the researchers use another method to etch out the polymer from within the structure, leaving a hollow architecture.

The applications of this technique are practically limitless, Greer says. Since pretty much any material can be deposited on the scaffolds, the method could be particularly useful for applications in optics, energy efficiency, and biomedicine. For example, it could be used to reproduce complex structures such as bone, producing a scaffold out of biocompatible materials on which cells could proliferate.

In the latest work, Greer and her students used the technique to produce what they call three-dimensional nanolattices that are formed by a repeating nanoscale pattern. After the patterning step, they coated the polymer scaffold with a ceramic called alumina (i.e., aluminum oxide), producing hollow-tube alumina structures with walls ranging in thickness from 5 to 60 nanometers and tubes from 450 to 1,380 nanometers in diameter.

Greer's team next wanted to test the mechanical properties of the various nanolattices they created. Using two different devices for poking and prodding materials on the nanoscale, they squished, stretched, and otherwise tried to deform the samples to see how they held up.

They found that the alumina structures with a wall thickness of 50 nanometers and a tube diameter of about 1 micron shattered when compressed. That was not surprising given that ceramics, especially those that are porous, are brittle. However, compressing lattices with a lower ratio of wall thickness to tube diameter -- where the wall thickness was only 10 nanometers -- produced a very different result.
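For intuition about why such a lattice is almost entirely air, here is a rough back-of-the-envelope sketch. The 1-micron tube diameter and the thin-wall approximation are assumptions chosen for illustration; only the 50-nanometer and 10-nanometer wall thicknesses come from the comparison above, and the full lattice contains additional open space between the tubes.

```python
# Rough comparison of the two nanolattice regimes described above.
# Assumptions (not from the paper): tube diameter ~1 micron in both cases,
# and a thin-walled circular tube, so the solid fraction of a tube's
# cross-section is approximately 4*t/D for wall thickness t << diameter D.

def wall_to_diameter_ratio(t_nm: float, d_nm: float) -> float:
    return t_nm / d_nm

def shell_solid_fraction(t_nm: float, d_nm: float) -> float:
    """Approximate fraction of the tube's outer volume that is solid material."""
    # Solid cross-section ~ pi*D*t versus total cross-section pi*D^2/4 -> 4t/D
    return 4 * t_nm / d_nm

for t in (50, 10):  # wall thicknesses in nanometers
    d = 1000        # assumed tube diameter in nanometers (~1 micron)
    print(f"t = {t} nm, D = {d} nm: t/D = {wall_to_diameter_ratio(t, d):.3f}, "
          f"shell solid fraction ~ {shell_solid_fraction(t, d):.1%}")
```

Even the thicker-walled tubes are mostly empty, and once the open space between tubes in the lattice is counted as well, a structure that is roughly 99.9 percent air is plausible.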

"You deform it, and all of a sudden, it springs back," Greer says. "In some cases, we were able to deform these samples by as much as 85 percent, and they could still recover."

To understand why, consider that most brittle materials such as ceramics, silicon, and glass shatter because they are filled with flaws -- imperfections such as small voids and inclusions. The more perfect the material, the less likely you are to find a weak spot where it will fail. Therefore, the researchers hypothesize, when you reduce these structures down to the point where individual walls are only 10 nanometers thick, both the number of flaws and the size of any flaws are kept to a minimum, making the whole structure much less likely to fail.

"One of the benefits of using nanolattices is that you significantly improve the quality of the material because you're using such small dimensions," Greer says. "It's basically as close to an ideal material as you can get, and you get the added benefit of needing only a very small amount of material in making them."

The Greer lab is now aggressively pursuing various ways of scaling up the production of these so-called meta-materials.

Story Source: http://www.sciencedaily.com/releases/2014/09/140911135450.htm


Saturday, May 31, 2014

20 Things You Didn't Know About... Time


The beginning, the end, and the funny habits of our favorite ticking force.

1  “Time is an illusion. Lunchtime doubly so,” joked Douglas Adams in The Hitchhiker’s Guide to the Galaxy. Scientists aren’t laughing, though. Some speculative new physics theories suggest that time emerges from a more fundamental—and timeless—reality.
2  Try explaining that when you get to work late. The average U.S. city commuter loses 38 hours a year to traffic delays.
3  Wonder why you have to set your clock ahead in March? Daylight Saving Time began as a joke by Benjamin Franklin, who proposed waking people earlier on bright summer mornings so they might work more during the day and thus save candles. It was introduced in the U.K. in 1917 and then spread around the world.
4  Green days. The Department of Energy estimates that electricity demand drops by 0.5 percent during Daylight Saving Time, saving the equivalent of nearly 3 million barrels of oil.
5  By observing how quickly bank tellers made change, pedestrians walked, and postal clerks spoke, psychologists determined that the three fastest-paced U.S. cities are Boston, Buffalo, and New York.
6  The three slowest? Shreveport, Sacramento, and L.A.
7  One second used to be defined as 1/86,400 the length of a day. However, Earth’s rotation isn’t perfectly reliable. Tidal friction from the sun and moon slows our planet and increases the length of a day by 3 milliseconds per century.
8  This means that in the time of the dinosaurs, the day was just 23 hours long.
9  Weather also changes the day. During El Niño events, strong winds can slow Earth’s rotation by a fraction of a millisecond every 24 hours.
10  Modern technology can do better. In 1972 a network of atomic clocks in more than 50 countries was made the final authority on time, so accurate that it takes 31.7 million years to lose about one second.
11  To keep this time in sync with Earth’s slowing rotation, a “leap second” must be added every few years, most recently this past New Year’s Eve.
12  The world’s most accurate clock, at the National Institute of Standards and Technology in Colorado, measures vibrations of a single atom of mercury. In a billion years it will not lose one second.
13  Until the 1800s, every village lived in its own little time zone, with clocks synchronized to the local solar noon.
14  This caused havoc with the advent of trains and timetables. For a while watches were made that could tell both local time and “railway time.”
15  On November 18, 1883, American railway companies forced the national adoption of standardized time zones.
16  Thinking about how railway time required clocks in different places to be synchronized may have inspired Einstein to develop his theory of relativity, which unifies space and time.
17  Einstein showed that gravity makes time run more slowly. Thus airplane passengers, flying where Earth’s pull is weaker, age a few extra nanoseconds each flight (a rough worked example follows this list).
18  According to quantum theory, the shortest moment of time that can exist is known as Planck time, or 0.0000000000000000000000000000000000000000001 second.
19  Time has not been around forever. Most scientists believe it was created along with the rest of the universe in the Big Bang, 13.7 billion years ago.
20  There may be an end of time. Three Spanish scientists posit that the observed acceleration of the expanding cosmos is an illusion caused by the slowing of time. According to their math, time may eventually stop, at which point everything will come to a standstill.
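For items 8, 10, and 17, the quoted figures can be sanity-checked with a little arithmetic. The sketch below is a rough illustration only; the flight altitude, cruise speed, flight duration, and the roughly 120-million-year age used for "the time of the dinosaurs" are assumptions chosen for the example, not values from the list.

```python
# Quick consistency checks for items 8, 10, and 17 above. The flight altitude,
# cruise speed, flight duration, and "dinosaur era" age used here are assumptions
# chosen for illustration, not values from the original list.

SECONDS_PER_YEAR = 3.156e7
c = 3.0e8        # speed of light, m/s
g = 9.81         # gravitational acceleration, m/s^2

# Item 8: the day grows ~3 ms per century; go back ~120 million years
centuries = 120e6 / 100
shortening = 3e-3 * centuries           # seconds shorter per day back then
print(f"Day ~120 Myr ago: about {24 - shortening/3600:.1f} hours")

# Item 10: losing 1 second in 31.7 million years is a fractional error of ~1e-15
fractional_error = 1 / (31.7e6 * SECONDS_PER_YEAR)
print(f"Atomic-clock fractional error: {fractional_error:.1e}")

# Item 17: gravitational time dilation at ~10 km altitude over an 8-hour flight,
# minus the special-relativistic slowing from cruising at ~250 m/s
flight_s, h, v = 8 * 3600, 10e3, 250.0
gain = g * h / c**2 * flight_s          # seconds gained from weaker gravity
loss = v**2 / (2 * c**2) * flight_s     # seconds lost from motion
print(f"Net aging difference per flight: {(gain - loss)*1e9:.0f} ns")
```

The printed values come out at roughly a 23-hour day, a fractional clock error of about 1e-15, and a net gain of a couple dozen nanoseconds per flight, consistent in order of magnitude with the figures quoted in the list.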


Thursday, May 22, 2014

Engineers build world's smallest, fastest nanomotor: Can fit inside a single cell


Researchers at the Cockrell School of Engineering at The University of Texas at Austin have built the smallest, fastest and longest-running tiny synthetic motor to date. The team's nanomotor is an important step toward developing miniature machines that could one day move through the body to administer insulin for diabetics when needed, or target and treat cancer cells without harming good cells.
Simple nanomotor. Credit: Image courtesy of University of Texas at Austin

With the goal of powering these yet-to-be invented devices, UT Austin engineers focused on building a reliable, ultra-high-speed nanomotor that can convert electrical energy into mechanical motion on a scale 500 times smaller than a grain of salt.

Mechanical engineering assistant professor Donglei "Emma" Fan led a team of researchers in the successful design, assembly and testing of a high-performing nanomotor in a nonbiological setting. The team's three-part nanomotor can rapidly mix and pump biochemicals and move through liquids, which is important for future applications. The team's study was published in a recent issue of Nature Communications.

Fan and her team are the first to achieve the extremely difficult goal of designing a nanomotor with large driving power.

With all its dimensions under 1 micrometer in size, the nanomotor could fit inside a human cell and is capable of rotating for 15 continuous hours at a speed of 18,000 rpm, the speed of a motor in a jet airplane engine. Comparable nanomotors run significantly more slowly, from 14 rpm to 500 rpm, and have only rotated for a few seconds up to a few minutes.
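A quick bit of arithmetic makes the endurance gap concrete. The three-minute runtime assumed below for the slower comparison motors is a placeholder, since the article says only that they ran for a few seconds up to a few minutes.

```python
# Illustrative arithmetic for the performance gap described above. The 3-minute
# runtime assumed for the slower comparison motors is a placeholder.

ut_austin_total = 18_000 * 60 * 15       # rev/min * min/hour * hours
comparison_total = 500 * 3               # 500 rpm for an assumed 3 minutes
print(f"UT Austin nanomotor: ~{ut_austin_total/1e6:.1f} million revolutions")
print(f"Fastest comparison case: ~{comparison_total} revolutions")
```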

Looking forward, nanomotors could advance the field of nanoelectromechanical systems (NEMS), an area focused on developing miniature machines that are more energy efficient and less expensive to produce. In the near future, the Cockrell School researchers believe their nanomotors could provide a new approach to controlled biochemical drug delivery to live cells.

To test its ability to release drugs, the researchers coated the nanomotor's surface with biochemicals and initiated spinning. They found that the faster the nanomotor rotated, the faster it released the drugs.

"We were able to establish and control the molecule release rate by mechanical rotation, which means our nanomotor is the first of its kind for controlling the release of drugs from the surface of nanoparticles," Fan said. "We believe it will help advance the study of drug delivery and cell-to-cell communications."

The researchers' design addresses two major issues that have faced nanomotors so far: assembly and controls. The team built and operated the nanomotor using a patent-pending technique that Fan invented while studying at Johns Hopkins University. The technique relies on AC and DC electric fields to assemble the nanomotor's parts one by one.

In experiments, the researchers used the technique to turn the nanomotors on and off and propel the rotation either clockwise or counterclockwise. The researchers found that they could position the nanomotors in a pattern and move them in a synchronized fashion, which makes them more powerful and gives them more flexibility.

Fan and her team plan to develop new mechanical controls and chemical sensing that can be integrated into nanoelectromechanical devices. But first they plan to test their nanomotors near a live cell, which will allow Fan to measure how they deliver molecules in a controlled fashion.




Sunday, April 20, 2014

Sugar-powered biobattery has 10 times the energy storage of lithium: Your smartphone might soon run on enzymes


As you probably know, from sucking down cans of Coke and masticating on candy, sugar — glucose, fructose, sucrose, dextrose — is an excellent source of energy. Biologically speaking, sugar molecules are energy-dense, easy to transport, and cheap to digest. There is a reason why almost every living cell on Earth generates its energy (ATP) from glucose. Now, researchers at Virginia Tech have successfully created a sugar-powered fuel cell that has an energy storage density of 596 amp-hours per kilo — or “one order of magnitude” higher than lithium-ion batteries. This fuel cell is refillable with a solution of maltodextrin, and its only byproducts are electricity and water. The chief researcher, Y.H. Percival Zhang, says the tech could be commercialized in as soon as three years.


Now, it’s not exactly news that sugar is an excellent energy source. As a culture we’ve probably known about it since before we were Homo sapiens. The problem is, unless you’re a living organism or some kind of incendiary device, extracting that energy is difficult. In nature, an enzymatic pathway is used — a production line of tailor-made enzymes that meddle with the glucose molecules until they become ATP. Because it’s easy enough to produce enzymes in large quantities, researchers have tried to create fuel cells that use artificial “metabolism” to break down glucose into electricity (biobatteries), but it has historically proven very hard to find the right pathway for maximum efficiency and to keep the enzymes in the right place over a long period of time.


A diagram of the enzymatic fuel cell. The little Pac-Man things are enzymes.
Now, however, Zhang and friends at Virginia Tech appear to have built a high-density fuel cell that uses an enzymatic pathway to create a lot of electricity from glucose. There doesn’t seem to be much information on how stable this biobattery is over multiple refills, but if Zhang thinks it could be commercialized in three years, that’s a very good sign. Curiously, the research paper says that the enzymes are non-immobilized — meaning Zhang found a certain battery chemistry that doesn’t require the enzymes to be kept in place… or, alternatively, that it will only work for a very short time.

The Virginia Tech biobattery uses 13 enzymes, plus air (it’s an air-breathing biobattery), to produce nearly 24 electrons from a single glucose unit. This equates to a power output of 0.8 mW/cm², a current density of 6 mA/cm², and an energy storage density of 596 Ah/kg. This last figure is impressive, at roughly 10 times the energy density of the lithium-ion batteries in your mobile devices. [Research paper: doi:10.1038/ncomms4026 - "A high-energy-density sugar biobattery based on a synthetic enzymatic pathway"]

If Zhang’s biobatteries pan out, you might soon be recharging your smartphone by pouring in a solution of 15% maltodextrin. That battery would not only be very safe (it produces water and electricity), but very cheap to run and very green. This seems to fit in perfectly with Zhang’s homepage, which talks about how his main goals in life are replacing crude oil with sugar, and feeding the world.
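The 596 Ah/kg figure can be roughly reproduced with Faraday's law if it is read as charge per kilogram of the 15 percent maltodextrin fuel solution. That interpretation, the 162 g/mol anhydroglucose monomer mass, and the assumption of complete conversion are all assumptions of this sketch, not statements from the paper.

```python
# Rough check of the 596 Ah/kg figure using Faraday's law. Assumptions: the
# figure is per kilogram of a 15% maltodextrin solution, each anhydroglucose
# unit (162 g/mol) yields ~24 electrons, and conversion is complete.

F = 96485            # Faraday constant, C per mole of electrons
MONOMER_MASS = 162.0 # g/mol, anhydroglucose unit of maltodextrin
ELECTRONS = 24       # electrons harvested per glucose unit (from the article)

grams_sugar_per_kg_solution = 150.0            # 15% by weight
moles_units = grams_sugar_per_kg_solution / MONOMER_MASS
charge_coulombs = moles_units * ELECTRONS * F
print(f"Charge: {charge_coulombs/3600:.0f} Ah per kg of solution")
```

Under those assumptions the arithmetic lands on about 596 Ah per kilogram of solution, matching the headline number.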

The other area in which biobatteries might be useful is powering implanted devices, such as pacemakers — or, in the future, subcutaneous sensors and computers. Such a biobattery could feed on the glucose in your bloodstream, providing an endless supply of safe electricity for the myriad implants that futuristic technocrats will surely have.
 


Wednesday, April 16, 2014

Lithium-sulfur batteries last longer with nanomaterial-packed cathode


Electric vehicles could travel farther and more renewable energy could be stored with lithium-sulfur batteries that use a unique powdery nanomaterial.
Pacific Northwest National Laboratory developed a nickel-based metal organic framework, shown here in an illustration, to hold onto polysulfide molecules in the cathodes of lithium-sulfur batteries and extend the batteries' lifespans. The colored spheres in this image represent the 3D material's tiny pores, into which the polysulfides become trapped. Credit: Pacific Northwest National Laboratory
Researchers added the powder, a kind of nanomaterial called a metal organic framework, to the battery's cathode to capture problematic polysulfides that usually cause lithium-sulfur batteries to fail after a few charges. A paper describing the material and its performance was published online April 4 in the American Chemical Society journal Nano Letters.

"Lithium-sulfur batteries have the potential to power tomorrow's electric vehicles, but they need to last longer after each charge and be able to be repeatedly recharged," said materials chemist Jie Xiao of the Department of Energy's Pacific Northwest National Laboratory. "Our metal organic framework may offer a new way to make that happen."

Today's electric vehicles are typically powered by lithium-ion batteries. But the chemistry of lithium-ion batteries limits how much energy they can store. As a result, electric vehicle drivers are often anxious about how far they can go before needing to charge. One promising solution is the lithium-sulfur battery, which can hold as much as four times more energy per mass than lithium-ion batteries. This would enable electric vehicles to drive farther on a single charge, as well as help store more renewable energy. The downside of lithium-sulfur batteries, however, is that they have a much shorter lifespan because they can't currently be charged as many times as lithium-ion batteries.

Energy Storage 101

The reason can be found in how batteries work. Most batteries have two electrodes: one is positively charged and called a cathode, while the second is negative and called an anode. Electricity is generated when electrons flow through a wire that connects the two. To control the electrons, positively charged atoms shuffle from one electrode to the other through another path: the electrolyte solution in which the electrodes sit.

The lithium-sulfur battery's main obstacles are unwanted side reactions that cut the battery's life short. The undesirable action starts on the battery's sulfur-containing cathode, which slowly disintegrates and forms molecules called polysulfides that dissolve into the liquid electrolyte. Some of the sulfur—an essential part of the battery's chemical reactions—never returns to the cathode. As a result, the cathode has less material to keep the reactions going and the battery quickly dies.

New materials for better batteries

Researchers worldwide are trying to improve materials for each battery component to increase the lifespan and mainstream use of lithium-sulfur batteries. For this research, Xiao and her colleagues homed in on the cathode to stop polysulfides from moving through the electrolyte.

Many materials with tiny holes have been examined to physically trap polysulfides inside the cathode. Metal organic frameworks are porous, but the added strength of PNNL's material is its ability to strongly attract the polysulfide molecules.

The framework's positively charged nickel center tightly binds the polysulfide molecules to the cathodes. The result is a coordinate covalent bond that, when combined with the framework's porous structure, causes the polysulfides to stay put.

"The MOF's highly porous structure is a plus that further holds the polysulfide tight and makes it stay within the cathode," said PNNL electrochemist Jianming Zheng.

Nanomaterial is key

Metal organic frameworks—also called MOFs—are crystal-like compounds made of metal clusters connected to organic molecules, or linkers. Together, the clusters and linkers assemble into porous 3-D structures. MOFs can contain a number of different elements. PNNL researchers chose the transition metal nickel as the central element for this particular MOF because of its strong ability to interact with sulfur.

During lab tests, a lithium-sulfur battery with PNNL's MOF cathode maintained 89 percent of its initial power capacity after 100 charge-and-discharge cycles. Having shown the effectiveness of their MOF cathode, PNNL researchers now plan to further improve the cathode's mixture of materials so it can hold more energy. The team also needs to develop a larger prototype and test it for longer periods of time to evaluate the cathode's performance for real-world, large-scale applications.
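As a rough way to read that retention number, the sketch below converts 89 percent after 100 cycles into an average per-cycle fade, under the simplifying assumption that the same fraction of capacity is lost on every cycle.

```python
# Average per-cycle capacity fade implied by 89% retention after 100 cycles,
# assuming (as an idealization) that the same fraction is lost every cycle.
retention, cycles = 0.89, 100
per_cycle = retention ** (1 / cycles)
print(f"Average retention per cycle: {per_cycle:.4%}")
print(f"Average fade per cycle: {(1 - per_cycle):.3%}")
```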

PNNL is also using MOFs in energy-efficient adsorption chillers and to develop new catalysts to speed up chemical reactions.

"MOFs are probably best known for capturing gases such as carbon dioxide," Xiao said. "This study opens up lithium-sulfur batteries as a new and promising field for the nanomaterial."

This research was funded by the Department of Energy's Office of Energy Efficiency and Renewable Energy. Researchers analyzed chemical interactions on the MOF cathode with instruments at EMSL, DOE's Environmental Molecular Sciences Laboratory at PNNL.

In January, a Nature Communications paper by Xiao and some of her PNNL colleagues described another possible solution for lithium-sulfur batteries: developing a hybrid anode that uses a graphite shield to block polysulfides.

Source: http://phys.org/news/2014-04-lithium-sulfur-batteries-longer-nanomaterial-packed-cathode.html


Monday, March 31, 2014

Aging Successfully Reversed in Mice; Human Trials to Begin Next


Scientists have successfully reversed the aging process in mice according to a new study just released. Human trials are to begin next, possibly before the year is over. The study was published in the peer-reviewed science journal Cell after researchers from both the U.S. and Australia made the breakthrough discovery. Lead researcher David Sinclair of the University of New South Wales says he is hopeful that the outcome can be reproduced in human trials. A successful result in people would mean not just a slowing down of aging but a measurable reversal.


The study showed that after administering a certain compound to the mice, muscle degeneration and diseases caused by aging were reversed. Sinclair says the study results exceeded his expectations, explaining:

I’ve been studying aging at the molecular level now for nearly 20 years and I didn’t think I’d see a day when ageing could be reversed. I thought we’d be lucky to slow it down a little bit. The mice had more energy, their muscles were as though they’d been exercising and it was able to mimic the benefits of diet and exercise just within a week. We think that should be able to keep people healthier for longer and keep them from getting diseases of ageing.
The compound the mice ate resulted in their muscles becoming very toned, as if they’d been exercising. Inflammation, a key factor in many disease processes, was drastically reduced. Insulin resistance also declined dramatically and the mice had much more energy overall. Researchers say that what happened to the mice could be compared to a 60-year-old person suddenly having the muscle tone and energy of someone in his or her 20s.

What’s more, say the researchers, these stunning results were realized within just one week’s time. The compound raises the level of a naturally occurring substance in the human body called nicotinamide adenine dinucleotide. This substance decreases as people age, although those who follow a healthy diet and get plenty of exercise do not suffer the same level of reduction in the substance as do people who do not exercise. This may explain why people who remain fit into their senior years often enjoy better health than others.

Scientists who participated in the study say that poor communication between mitochondria and the cell nucleus is to blame for the aging process. The compound the researchers have developed causes the cells to be able to “talk” to each other again. They compared the relationship between the nucleus and the mitochondria to a married couple; by the time the couple has been married for 20 years, “communication breaks down” and they don’t talk to each other as much. Just like a marriage, this relationship and communication within it can be repaired, say the researchers.

Aging has successfully been reversed in mice, but Sinclair says he needs to raise more money before he can commit to a date when trials may begin in humans. The results of this initial study in mice are very promising and may pave the way for similar results in humans.

Sources: ABC News, Science Direct, Huffington Post


Friday, December 13, 2013

Instagram Direct messaging arrives to challenge Snapchat and Whatsapp


HIPSTER PHOTO SHARING SERVICE Instagram announced a messaging service called Instagram Direct on Thursday, as it looks to challenge Snapchat and Whatsapp.

Rumours surfaced at the end of November claiming that Instagram was plotting a messaging service to rival send-and-delete photo service Snapchat, and the firm put an end to the speculation on Thursday by announcing Instagram Direct.

The service does what it says on the tin, allowing Instagram users to send images and videos in direct messages to one another.

Instagram Direct lets users of the service send messages to up to 15 friends at once, much like Whatsapp, and allows groups to talk in real-time chats. Only people you follow can send you images and videos, Instagram said on Thursday, so you shouldn't have to worry about your inbox filling up with spam.

There's now an inbox logo in the top right-hand corner of the app to access Instagram Direct. It takes the place of the previous refresh button, whose job is now handled by a pull-to-refresh gesture.

Instagram said, "There are moments in our lives that we want to share, but that will be the most relevant only to a smaller group of people - an inside joke between friends captured on the go, a special family moment or even just one more photo of your new puppy. Instagram Direct helps you share these moments.

"From how you capture photos and videos to the way you start conversations through likes and comments, we built Instagram Direct to feel natural to the Instagram experience you already know." Instagram Direct arrives in an update to the existing Instagram app, which is available to download for free from the iTunes App Store and Google Play store. The feature is not available yet for Windows Phone devices.


Thursday, December 12, 2013

Bangkok designers draw attention for air-purifying bike idea


Some observers are calling it "the photosynthesis bike." The bike of interest is only a concept, not even a prototype yet, from designers in Bangkok. Nonetheless, in concept alone, it has captured a lot of imaginations and press coverage, and even picked up an award in the 2013 Red Dot competition for design concept. Dubbed the "Air Purifier Bike" by Bangkok-based Lightfog Creative and Design, the bicycle adds next-level functionality to the bicycle as an environmentally sound vehicle—to the point where the rider not only uses a clean mode of transport but also helps to purify the air along with the ride. (The Red Dot Award for design concept is part of a professional design competition for design concepts and prototypes worldwide.)
Silawat Virakul, Torsakul Kosaikul, and Suvaroj Poosrivongvanid are the designers behind the award-winning idea. They said their Air-Purifier Bike incorporates an air filter that screens dust and pollutants from the air, a photosynthesis system (including a water tank) that produces oxygen, an electric motor, and a battery. "While it is being ridden, air passes through the filter at the front of the bike, where it is cleaned before being released toward the cyclist. The bike frame houses the photosynthesis system. When the bike is parked, the air-purifying functions can continue under battery power."

According to a report on the bicycle and the designers behind it on the Fast Company Co.Exist site, the designers presently have mock-ups, but they have not yet built a prototype; they plan to build one soon.

"We want to design products which can reduce the air pollution in the city. So we decided to design a bike because we thought that bicycles are environmentally friendly vehicles for transportation," said creative director Silawat Virakul in an email to Co.Exist.


"Riding a bicycle can reduce traffic jam[s] in a city," said Virakul. "Moreover, we wanted to add more value to a bicycle by adding its ability to reduce the pollution."


If they were to advance their concept, they would be responding to many urban dwellers who are growing increasingly aware that bicycles ease pollution and are taking to bicycles for short-distance transportation. Earlier this year, Lucintel, a consulting and market research firm, analyzed the global bicycle industry in "Global Bicycle Industry 2013-2018: Trends, Profit, and Forecast Analysis." They noted that government initiatives to promote cycling to reduce carbon emissions and noise pollution are a strong growth driver. In addition, bicycles' energy efficiency, coupled with cycling as a fitness activity, will help propel demand during the forecast period.


Tuesday, October 15, 2013

New Device Harnesses Sun and Sewage to Produce Hydrogen Fuel


A novel device that uses only sunlight and wastewater to produce hydrogen gas could provide a sustainable energy source while improving the efficiency of wastewater treatment.

The new hybrid solar-microbial device is self-driven and self-sustained, because the combined energy from the organic matter (harvested by the MFC) and sunlight (captured by the PEC) is sufficient to drive electrolysis of water. (Credit: Image courtesy of University of California - Santa Cruz)
A research team led by Yat Li, associate professor of chemistry at the University of California, Santa Cruz, developed the solar-microbial device and reported their results in a paper published in the American Chemical Society journal ACS Nano. The hybrid device combines a microbial fuel cell (MFC) and a type of solar cell called a photoelectrochemical cell (PEC). In the MFC component, bacteria degrade organic matter in the wastewater, generating electricity in the process. The biologically generated electricity is delivered to the PEC component to assist the solar-powered splitting of water (electrolysis) that generates hydrogen and oxygen.

Either a PEC or MFC device can be used alone to produce hydrogen gas. Both, however, require a small additional voltage (an "external bias") to overcome the thermodynamic energy barrier for proton reduction into hydrogen gas. The need to incorporate an additional electric power element adds significantly to the cost and complication of these types of energy conversion devices, especially at large scales. In comparison, Li's hybrid solar-microbial device is self-driven and self-sustained, because the combined energy from the organic matter (harvested by the MFC) and sunlight (captured by the PEC) is sufficient to drive electrolysis of water.

In effect, the MFC component can be regarded as a self-sustained "bio-battery" that provides extra voltage and energy to the PEC for hydrogen gas generation. "The only energy sources are wastewater and sunlight," Li said. "The successful demonstration of such a self-biased, sustainable microbial device for hydrogen generation could provide a new solution that can simultaneously address the need for wastewater treatment and the increasing demand for clean energy."

Microbial fuel cells rely on unusual bacteria, known as electrogenic bacteria, that are able to generate electricity by transferring metabolically-generated electrons across their cell membranes to an external electrode. Li's group collaborated with researchers at Lawrence Livermore National Laboratory (LLNL) who have been studying electrogenic bacteria and working to enhance MFC performance. Initial "proof-of-concept" tests of the solar-microbial (PEC-MFC) device used a well-studied strain of electrogenic bacteria grown in the lab on artificial growth medium. Subsequent tests used untreated municipal wastewater from the Livermore Water Reclamation Plant. The wastewater contained both rich organic nutrients and a diverse mix of microbes that feed on those nutrients, including naturally occurring strains of electrogenic bacteria.

When fed with wastewater and illuminated in a solar simulator, the PEC-MFC device showed continuous production of hydrogen gas at an average rate of 0.05 cubic meters per day, according to LLNL researcher and coauthor Fang Qian. At the same time, the turbid black wastewater became clearer. The soluble chemical oxygen demand--a measure of the amount of organic compounds in water, widely used as a water quality test--declined by 67 percent over 48 hours.
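To put that production rate in more familiar units, the sketch below converts 0.05 cubic meters of hydrogen per day into mass and chemical energy. The assumption that the volume is reported near room temperature and pressure, and the use of hydrogen's higher heating value, are choices made for this illustration rather than details from the paper.

```python
# Convert 0.05 m^3/day of hydrogen to mass and chemical energy. Assumptions:
# the volume is at ~25 C and 1 atm, and energy is counted at hydrogen's
# higher heating value (~142 MJ/kg).

R, T, P = 8.314, 298.15, 101325     # gas constant (J/mol/K), temperature (K), pressure (Pa)
volume_m3 = 0.05
moles_h2 = P * volume_m3 / (R * T)  # ideal gas law
mass_kg = moles_h2 * 2.016e-3       # molar mass of H2 in kg/mol
energy_mj = mass_kg * 142
print(f"{moles_h2:.1f} mol H2/day ~ {mass_kg*1e3:.1f} g/day ~ "
      f"{energy_mj:.2f} MJ/day ({energy_mj/3.6:.2f} kWh/day)")
```

That works out to only a few grams of hydrogen, on the order of a tenth of a kilowatt-hour per day, which is why the researchers' next step is a larger 40-liter prototype.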

The researchers also noted that hydrogen generation declined over time as the bacteria used up the organic matter in the wastewater. Replenishment of the wastewater in each feeding cycle led to complete restoration of electric current generation and hydrogen gas production.

Qian said the researchers are optimistic about the commercial potential for their invention. Currently they are planning to scale up the small laboratory device to make a larger 40-liter prototype continuously fed with municipal wastewater. If results from the 40-liter prototype are promising, they will test the device on site at the wastewater treatment plant.

"The MFC will be integrated with the existing pipelines of the plant for continuous wastewater feeding, and the PEC will be set up outdoors to receive natural solar illumination," Qian said.

"Fortunately, the Golden State is blessed with abundant sunlight that can be used for the field test," Li added.

Qian and Hanyu Wang, a graduate student in Li's lab at UC Santa Cruz, are co-first authors of the ACS Nano paper. The other coauthors include UCSC graduate student Gongming Wang; LLNL researcher Yongqin Jiao; and Zhen He of Virginia Polytechnic Institute & State University. This research was supported by the National Science Foundation and Department of Energy.
 



Tuesday, August 27, 2013

How the Brain Remembers Pleasure: Implications for Addiction


Key details of the way nerve cells in the brain remember pleasure are revealed in a study by University of Alabama at Birmingham (UAB) researchers published today in the journal Nature Neuroscience. The molecular events that form such "reward memories" appear to differ from those created by drug addiction, despite the popular theory that addiction hijacks normal reward pathways.

Brain activity (artist's rendering). 
(Credit: © James Steidl / Fotolia)
Brain circuits have evolved to encourage behaviors proven to help our species survive by attaching pleasure to them. Eating rich food tastes good because it delivers energy, and sex is desirable because it creates offspring. The same systems also connect, in our minds, environmental cues with actual pleasures to form reward memories.

This study in rats supports the idea that the mammalian brain features several memory types, each using different circuits, with memories accessed and integrated as needed. Ancient memory types include those that remind us what to fear, what to seek out (reward), how to move (motor memory) and navigate (place memory). More recent developments enable us to remember the year Columbus sailed and our wedding day.

"We believe reward memory may serve as a good model for understanding the molecular mechanisms behind many types of learning and memory," said David Sweatt, Ph.D., chair of the UAB Department of Neurobiology, director of the Evelyn F. McKnight Brain Institute at UAB and corresponding author for the study. "Our results provide a leap in the field's understanding of reward-learning mechanisms and promise to guide future attempts to solve related problems such as addiction and criminal behavior."

The study is the first to illustrate that reward memories are created by chemical changes that influence known memory-related genes in nerve cells within a brain region called the ventral tegmental area, or VTA. Experiments that blocked those chemical changes -- a mix of DNA methylation and demethylation -- in the VTA prevented rats from forming new reward memories.

Methylation is the attachment of a methyl group (one carbon and three hydrogens) to a DNA chain at certain spots (cytosine bases). When methylation occurs near a gene or inside a gene sequence, it generally is thought to turn the gene off and its removal is thought to turn the gene on. This back-and-forth change affects gene expression without changing the code we inherit from our parents. Operating outside the genetic machinery proper, epigenetic changes enable each cell type to do its unique job and to react to its environment.

Furthermore, a stem cell in the womb that becomes a bone or liver cell must "remember" its specialized nature and pass that identity to its descendants as they divide and multiply to form organs. This process requires genetic memory, which largely is driven by methylation. Note that most nerve cells do not divide and multiply as other cells do. They can't, according to one theory, because they put their epigenetic mechanisms to work making actual memories.

Natural pleasure versus addiction

The brain's pleasure center is known to operate through nerve cells that signal using the neurochemical dopamine, and it is generally located in the VTA. Dopaminergic neurons exhibit a "remarkable capacity" to pass on pleasure signals. Unfortunately, the evolutionary processes that attached pleasure to advantageous behaviors also accidentally reinforced bad ones.

Addiction to all four major classes of abused drugs -- psychostimulants, opiates, ethanol and nicotine -- has been linked to increased dopamine transmission in the same parts of the brain associated with normal reward processing. Cues that predict both normal reward and the effects of cocaine or alcohol also make dopamine nerve cells fire, as do the experiences they recall. That has led to the idea that drug addiction must take over normal reward-memory nerve pathways.

Along those lines, past research has argued that dopamine-producing neurons in the VTA -- and in a region that receives downstream dopamine signals from the VTA, called the nucleus accumbens (NAC) -- were both involved in natural reward and drug-addiction-based memory formation. While that may be true to some extent, this study revealed that blocking methylation in the VTA with a drug stopped the ability of rats to attach rewarding experiences to remembered cues, but doing so in the NAC did not.

"We observed an important distinction, not in circuitry, but instead in the epigenetic regulation of that circuitry between natural reward responses and those that occur downstream with drugs of abuse or psychiatric illness," said Jeremy Day, Ph.D., a post-doctoral scholar in Sweatt's lab and first author for this study. "Although drug experiences may co-opt normal reward mechanisms to some extent, our results suggest they also may engage entirely separate epigenetic mechanisms that contribute only to addiction and that may explain its strength."

To investigate the molecular and epigenetic changes in the VTA, researchers took their cue from 19th century Russian physiologist Ivan Pavlov, who was the first to study the phenomenon of conditioning. By ringing a bell each day before giving his dogs food, Pavlov soon found that the dogs would salivate at the sound of the bell.

In this study, rats were trained to associate a sound tone with the availability of sugar pellets in their feed ports. This same animal model has been used to make most discoveries about how human dopamine neurons work since the 1990s, and most approved drugs that affect the dopamine system (e.g. L-Dopa for Parkinson's) were tested in it before being cleared for human trials.

To separate the effects of memory-related brain changes from those arising from the pleasure of eating itself, the rats were separated into three groups. Rats in the "CS+" group got sugar pellets each time they heard a sound cue. The "CS-" group heard the sound the same number of times and received as many sugar pellets -- but never together. A third tone-only group heard the sounds but never received sugar rewards.

Rats that always received sugar with the sound cue were found to poke their feed ports with their noses at least twice as often during this cue as control rats after three 25-sound-cue sessions. Nose pokes are an established measure of the degree to which a rat has come to associate a cue with the memory of a tasty treat.

The team found that those CS+ rats (sugar paired with sound) that were better at forming reward memories had significantly higher expression of the genes Egr1 and Fos than control rats. These genes are known to regulate memory in other brain regions by fine-tuning the signaling capacity of the connections between nerve cells. In a series of experiments, the team next revealed the methylation and demethylation pattern that drove the changes in gene expression seen as memories formed.

The study demonstrated that reward-related experiences caused both types of DNA methylation known to regulate gene expression.

One type involves attaching methyl groups to pieces of DNA called promoters, which reside immediately upstream of individual gene sequences (between genes) and tell the machinery that follows genetic instructions to "start reading here." The attachment of a methyl group to a promoter generally interferes with this and silences a nearby gene. However, ancient organisms such as plants and insects have less methylation between their genes, and more of it within the coding regions of the genes themselves (within gene bodies). Such gene-body methylation has been shown to encourage rather than silence gene expression.

Specifically, the team reported that two sites in the promoter for Egr1 gene were demethylated during reward experiences and, to a greater degree, in rats that associated the sugar with the sound cue. Conversely, spots within the gene body of both Egr1 and Fos underwent methylation as reward memories formed.

"When designing therapeutic treatments for psychiatric illness, addictions or memory disorders, you must profoundly understand the function of the biological systems you're working with," Day said. "Our field has learned from experience that attempts to treat addiction with something that globally impairs normal reward perception or reward memories do not succeed. Our study suggests the possibility that future treatments could dial down drug addiction or mental illness without affecting normal rewards."

Along with Sweatt and Day, authors for the study were Daniel Childs, Mikael Guzman-Karlsson, Mercy Kibe, Jerome Moulden, Esther Song and Absar Tahir within the Department of Neurobiology and the Evelyn F. McKnight Brain Institute at University of Alabama at Birmingham. This work is supported by the National Institute on Drug Abuse (DA029419), the National Institute on Mental Health (MH091122 and MH057014), and the Evelyn F. McKnight Brain Research Foundation.


Friday, August 23, 2013

Researchers use mobile phones to measure happiness


Researchers at Princeton University are developing ways to use mobile phones to explore how one's environment influences one's sense of well-being.

Locations of study subjects on world map. Credit: Demography
In a study involving volunteers who agreed to provide information about their feelings and locations, the researchers found that cell phones can efficiently capture information that is otherwise difficult to record, given today's on-the-go lifestyle. This is important, according to the researchers, because feelings recorded "in the moment" are likely to be more accurate than feelings jotted down after the fact.

To conduct the study, the team created an application for the Android operating system that documented each person's location and periodically sent the question, "How happy are you?"

The investigators invited people to download the app, and over a three-week period, collected information from 270 volunteers in 13 countries who were asked to rate their happiness on a scale of 0 to 5. From the information collected, the researchers created and fine-tuned methods that could lead to a better understanding of how our environments influence emotional well-being. The study was published in the June issue of Demography.

The mobile phone method could help overcome some of the limitations that come with surveys conducted at people's homes, according to the researchers. Census measurements tie people to specific areas—the census tracts in which they live—that are usually not the only areas that people actually frequent.

"People spend a significant amount of time outside their census tracks," said John Palmer, a graduate student in the Woodrow Wilson School of Public and International Affairs and the paper's lead author. "If we want to get more precise findings of contextual measurements we need to use techniques like this."

Palmer teamed up with Thomas Espenshade, professor of sociology emeritus, and Frederic Bartumeus, a specialist in movement ecology at the Center for Advanced Studies of Blanes in Spain, along with Princeton's Chang Chung, a statistical programmer and data archivist in the Office of Population Research; Necati Ozgencil, a former Professional Specialist at Princeton; and Kathleen Li, who earned her undergraduate degree in computer science from Princeton in 2010, to design the free, open source application for the Android platform that would record participants' locations at various intervals based on either GPS satellites or cellular tower signals.

Though many of the volunteers lived in the United States, some were in Australia, Canada, China, France, Germany, Israel, Japan, Norway, South Korea, Spain, Sweden and the United Kingdom.

Palmer noted that the team's focus at this stage was not on generalizable conclusions about the link between environment and happiness, but rather on learning more about the mobile phone's capabilities for data collection. "I'd be hesitant to try to extend our substantive findings beyond those people who volunteered," he said.

However, the team did obtain some preliminary results regarding happiness: for example, male subjects tended to describe themselves as less happy when they were farther from their homes, whereas females did not demonstrate a particular trend relating emotion to distance.
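A minimal sketch of the kind of home-distance comparison described above is shown below. The field names, the 1 km near/far threshold, and the use of a haversine great-circle distance are illustrative assumptions for this example, not details taken from the published study.

```python
# Illustrative sketch of a "happiness vs. distance from home" comparison.
# Field names, the 1 km near/far threshold, and the haversine distance are
# assumptions made for this example.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def mean_happiness_by_distance(responses, home, threshold_km=1.0):
    """responses: list of dicts with 'lat', 'lon', and 'happiness' (0-5)."""
    near, far = [], []
    for r in responses:
        d = haversine_km(home[0], home[1], r["lat"], r["lon"])
        (near if d <= threshold_km else far).append(r["happiness"])
    avg = lambda xs: sum(xs) / len(xs) if xs else float("nan")
    return {"near_home": avg(near), "far_from_home": avg(far)}

# Example with made-up data:
home = (40.35, -74.66)
responses = [
    {"lat": 40.351, "lon": -74.659, "happiness": 4},
    {"lat": 40.75, "lon": -73.99, "happiness": 2},
]
print(mean_happiness_by_distance(responses, home))
```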

"One of the limitations of the study is that it is not representative of all people," Palmer said. Participants had to have smartphones and be Internet users. It is also possible that people who were happy were more likely to respond to the survey. However, Palmer said, the study demonstrates the potential for mobile phone research to reach groups of people that may be less accessible by paper surveys or interviews.

Palmer's doctoral dissertation will expand on this research, and his adviser Marta Tienda, the Maurice P. During Professor in Demographic Studies, said she was excited to see how it will impact the academic community. "His applied research promises to redefine how social scientists understand intergroup relations on many levels," she said.


This study involved contributions from the Center for Information Technology Policy at Princeton University, with institutional support from the National Institutes of Health Training Grant T32HD07163 and Infrastructure Grant R24HD047879.


Saturday, August 10, 2013

Report: A Semi-Floating Gate Transistor for Low-Voltage Ultrafast Memory and Sensing Operation


Researchers at Fudan University in China have discovered a way to speed up traditional computer transistors by embedding tunneling field-effect transistors (TFETs) in them. In their paper published in the journal Science, the team describes how embedding TFETs in such transistors allows for them to be run with less power, which in turn causes them to run faster.

Schematic view of an SFG memory cell. A pn junction diode between the FG and D makes the FG semi-floating. The device’s symbolic representation is also shown. Credit: Science 9 August 2013: Vol. 341 no. 6146 pp. 640-643 DOI: 10.1126/science.1240961

Most modern computers are run with either metal-oxide-semiconductor field-effect transistors (MOSFETs) or a variation of them called floating-gate (FG) MOSFETs. Such transistors are now reaching their physical limit as far as how thin they can be—just a few atoms thick. For that reason, researchers have been looking for other ways to get more bang for their buck. In this new effort, the researchers turned to TFETs, which use quantum tunneling to move electrons through very thin material.

TFETs have traditionally been used in very low power devices. In this endeavor, the researchers created a TFET that could be used to control the electrodes that monitor the flow of electricity into a MOSFET—in this case, the floating-gate variety (it has an additional electrode gate that allows a charge to be retained). The idea is that if the gate could be made to open and close faster, the transistor as a whole would operate faster. Current chips require a build-up of charge before the gate can be opened or closed—which takes time. TFETs, because they require less power, don't take as long to do their work, so embedding one in a floating-gate MOSFET would alleviate the necessity of power buildup prior to gate changes, allowing for quicker opening and closing. That's exactly what the team in China has done. Testing thus far has shown that MOSFETs with embedded TFETs have improved transistor speeds as well as reduced power requirements.

The team reports that because of the way their TFETs are constructed, embedding them in current model MOSFETs should not require reconfiguration or the use of any new materials. This means that the new TFET technology could be put into use almost immediately, bumping up the speed of computers and hand held devices while lessening the amount of energy used, resulting in longer battery life.

More information: A Semi-Floating Gate Transistor for Low-Voltage Ultrafast Memory and Sensing Operation, Science 9 August 2013: Vol. 341 no. 6146 pp. 640-643 DOI: 10.1126/science.1240961

ABSTRACT

As the semiconductor devices of integrated circuits approach the physical limitations of scaling, alternative transistor and memory designs are needed to achieve improvements in speed, density, and power consumption. We report on a transistor that uses an embedded tunneling field-effect transistor for charging and discharging the semi-floating gate. This transistor operates at low voltages (≤2.0 volts), with a large threshold voltage window of 3.1 volts, and can achieve ultra–high-speed writing operations (on time scales of ~1 nanosecond). A linear dependence of drain current on light intensity was observed when the transistor was exposed to light, so possible applications include image sensing with high density and performance.


Friday, July 26, 2013

Bad Night's Sleep? The Moon Could Be to Blame



Many people complain about poor sleep around the full moon, and now a report appearing in Current Biology, a Cell Press publication, on July 25 offers some of the first convincing scientific evidence to suggest that this really is true. The findings add to evidence that humans -- despite the comforts of our civilized world -- still respond to the geophysical rhythms of the moon, driven by a circalunar clock.

(Credit: Current Biology, Cajochen et al.)

"The lunar cycle seems to influence human sleep, even when one does not 'see' the moon and is not aware of the actual moon phase," says Christian Cajochen of the Psychiatric Hospital of the University of Basel.

In the new study, the researchers monitored 33 volunteers in two age groups as they slept in the lab, recording their brain activity along with eye movements and hormone secretions.

The data show that around the full moon, brain activity related to deep sleep dropped by 30 percent. People also took five minutes longer to fall asleep and slept twenty minutes less overall. Study participants felt as though their sleep was poorer when the moon was full, and they showed diminished levels of melatonin, a hormone known to regulate sleep and wake cycles.

"This is the first reliable evidence that a lunar rhythm can modulate sleep structure in humans when measured under the highly controlled conditions of a circadian laboratory study protocol without time cues," the researchers say.

Cajochen adds that this circalunar rhythm might be a relic from a past in which the moon could have synchronized human behaviors for reproductive or other purposes, much as it does in other animals. Today, the moon's hold over us is usually masked by the influence of electrical lighting and other aspects of modern life.

The researchers say it would be interesting to look more deeply into the anatomical location of the circalunar clock and its molecular and neuronal underpinnings. And, they say, it could turn out that the moon has power over other aspects of our behavior as well, such as our cognitive performance and our moods.


Tuesday, July 16, 2013

Computer as Smart as a 4-Year-Old? Researchers IQ Test New Artificial Intelligence System


Artificial and natural knowledge researchers at the University of Illinois at Chicago have IQ-tested one of the best available artificial intelligence systems to see how intelligent it really is.

Artificial and natural knowledge researchers IQ-tested 
one of the best available artificial intelligence systems 
and learned that it's about as smart as the average 
4-year-old. (Credit: © Spofi / Fotolia)
Turns out it's about as smart as the average 4-year-old, they will report July 17 at the U.S. Artificial Intelligence Conference in Bellevue, Wash.

The UIC team put ConceptNet 4, an artificial intelligence system developed at M.I.T., through the verbal portions of the Wechsler Preschool and Primary Scale of Intelligence Test, a standard IQ assessment for young children.

They found ConceptNet 4 has the average IQ of a young child. But unlike most children, the machine's scores were very uneven across different portions of the test.

"If a child had scores that varied this much, it might be a symptom that something was wrong," said Robert Sloan, professor and head of computer science at UIC, and lead author on the study.

Sloan said ConceptNet 4 did very well on a test of vocabulary and on a test of its ability to recognize similarities.

"But ConceptNet 4 did dramatically worse than average on comprehension­the 'why' questions," he said.

One of the hardest problems in building an artificial intelligence, Sloan said, is devising a computer program that can make sound and prudent judgments based on a simple perception of the situation or facts -- the dictionary definition of commonsense.

Commonsense has eluded AI engineers because it requires both a very large collection of facts and what Sloan calls implicit facts -- things so obvious that we don't know we know them. A computer may know the temperature at which water freezes; we know, without ever being told, that ice is cold.
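The gap Sloan describes can be made concrete with a toy example. The Python sketch below is purely illustrative and has nothing to do with ConceptNet's actual data model or API: a tiny store of invented explicit (subject, relation, object) facts handles vocabulary-style lookups easily, but a "why" question breaks as soon as the needed implicit fact was never written down.

# A toy illustration (not ConceptNet's real data model) of explicit facts
# versus the implicit facts a "why" question depends on.

FACTS = {
    ("water", "freezes_at", "0 C"),
    ("ice", "is_a", "frozen water"),
    ("ice", "has_property", "cold"),      # the kind of fact people rarely state
}

def lookup(subject, relation):
    """Vocabulary-style question: a direct lookup is enough."""
    return [obj for s, r, obj in FACTS if s == subject and r == relation]

def why(subject, prop):
    """'Why' question: needs the right fact to exist (and usually a chain of them)."""
    matches = [(s, r, o) for s, r, o in FACTS if s == subject and o == prop]
    return matches if matches else "no explanation found"

print(lookup("water", "freezes_at"))  # ['0 C'] -- explicit fact, easy
print(why("ice", "cold"))             # succeeds only because the fact was written down
print(why("snowman", "melts"))        # fails: the "obvious" fact was never stored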

"All of us know a huge number of things," said Sloan. "As babies, we crawled around and yanked on things and learned that things fall. We yanked on other things and learned that dogs and cats don't appreciate having their tails pulled." Life is a rich learning environment.

"We're still very far from programs with commonsense-AI that can answer comprehension questions with the skill of a child of 8," said Sloan. He and his colleagues hope the study will help to focus attention on the "hard spots" in AI research.

Study coauthors are UIC professors Stellan Ohlsson of psychology and Gyorgy Turan of mathematics, statistics and computer science; and UIC mathematical computer science undergraduate student Aaron Urasky.

The study was supported by award N00014-09-1-0125 from the Office of Naval Research and grant CCF-0916708 from the National Science Foundation.


Friday, July 5, 2013

For better batteries, just add water


A new type of lithium battery that uses iodide ions in an aqueous cathode configuration provides twice the energy density of conventional lithium-ion batteries.

Lithium-ion batteries are now found everywhere in devices such as cellular phones and laptop computers, where they perform well. In automotive applications, however, engineers face the challenge of squeezing enough lithium-ion batteries onto a vehicle to provide the desired power and range without introducing storage and weight issues. Hye Ryung Byon, Yu Zhao and Lina Wang from the RIKEN Byon Initiative Research Unit have now developed a lithium-iodine battery system with twice the energy density of conventional lithium-ion batteries.

Byon's team is involved in alternative energy research and, specifically, improving the performance of lithium-based battery technologies. In their research they turned to an 'aqueous' system in which the organic electrolyte in conventional lithium-ion cells is replaced with water. Such aqueous lithium battery technologies have gained attention among alternative energy researchers because of their greatly reduced fire risk and environmental hazard. Aqueous solutions also have other advantages, which include an inherently high ionic conductivity.

For their battery system, the researchers investigated an 'aqueous cathode' configuration (Fig. 1), which accelerates reduction and oxidation reactions to improve battery performance. Finding suitable reagents for the aqueous cathode, however, proved to be a tricky proposition. According to Byon, water solubility is the most important criterion for screening new materials, since this parameter determines the battery's energy density. Furthermore, the redox reaction has to take place in a restricted voltage range in order to avoid water electrolysis. An extensive search led the researchers to produce the first-ever lithium battery involving aqueous iodine—an element with high water solubility and a pair of ions, known as the triiodide/iodide redox couple, that readily undergo aqueous electrochemical reactions.

The team constructed a prototype aqueous cathode device and found its energy density to be nearly double that of a conventional lithium-ion battery, thanks to the high solubility of the triiodide/iodide ions. The battery showed high, near-ideal storage capacities and could be successfully recharged hundreds of times, avoiding a problem that plagues other alternative high-energy-density lithium batteries. Microscopy analysis revealed that the cathode current collector remained intact after 100 charge/discharge cycles, with no observable corrosion or precipitate formation.
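As a rough way to see where the energy-density gain comes from, gravimetric energy density scales with the specific capacity of the cathode couple times the cell voltage. The Python sketch below uses assumed ballpark values purely for illustration -- they are not figures taken from the paper, and they count only the active cathode material.

# Rough illustration: energy density contributed by the cathode couple alone.
# mAh/g * V = mWh/g = Wh/kg. All numbers are assumed ballpark values for
# illustration, not data from the RIKEN paper.

def energy_density_wh_per_kg(specific_capacity_mah_per_g, cell_voltage_v):
    """Gravimetric energy density of the active cathode material."""
    return specific_capacity_mah_per_g * cell_voltage_v

# A typical intercalation cathode (assumed ~150 mAh/g at ~3.7 V vs. lithium)...
print(energy_density_wh_per_kg(150, 3.7))   # ~555 Wh/kg of active material
# ...versus an iodine-based cathode (assumed ~210 mAh/g at ~3.5 V vs. lithium).
print(energy_density_wh_per_kg(210, 3.5))   # ~735 Wh/kg of active material

A full cell-level comparison depends on factors this toy calculation ignores, such as electrolyte, current-collector and packaging mass, and how much of the theoretical capacity is actually usable.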

Byon and colleagues now plan to develop a three-dimensional, microstructured current collector that could enhance the diffusion-controlled triiodide/iodide process and accelerate charge and discharge. They are also seeking to raise energy densities even further by using a flowing-electrode configuration that stores aqueous 'fuel' in an external reservoir—a modification that should make this low-cost, heavy metal-free design more amenable to electric vehicle specifications.

More information: Zhao, Y., Wang, L. & Byon, H. R. High-performance rechargeable lithium-iodine batteries using triiodide/iodide redox couples in an aqueous cathode. Nature Communications 4, 1896 (2013). dx.doi.org/10.1038/ncomms2907


Wednesday, July 3, 2013

Microsoft creates mood sensing software for smartphones


Microsoft Research Asia has been working on software called MoodScope that tracks how a user uses his or her phone, and then uses that information to infer the user's mood. Initial testing showed the software to be 66 percent accurate; when tailored to an individual user, the team reports, the accuracy rate jumped to 93 percent. The research team includes Robert LiKamWa and Lin Zhong of Rice University, and Nicholas Lane and Yunxin Liu from Microsoft Research Asia. They built a prototype and posted their test study results on Microsoft's website.
The circumplex mood model. Credit: Robert LiKamWa et al.

Most people realize that their smartphone has a lot of embedded technology that interacts with the world at large: GPS hardware, accelerometers and other sensors monitor activity and use that data to provide useful functions, such as automatically switching from landscape to portrait mode when the phone is rotated. In this new effort, the researchers sought to discover whether software that monitors phone activities could reveal the user's mood.

To find out, the team wrote code that monitored email, texting, app usage, phone calls, location information, and browsing history, then added algorithms to guess mood based on that data. Next, they enlisted the assistance of 32 volunteers to help them test the accuracy of their code. The volunteers were asked to use the system for two months while also completing mood assessments to provide data for comparison. With no training or tweaking, the software was found to provide answers of happy, tense, calm, upset, excited, stressed, or bored that matched the actual mood reported by the volunteers, on average 66 percent of the time. After optimizing the system for the individual habits of each of the volunteers, the rate increased to 93 percent.
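The overall recipe -- extract usage features, fit one general model, then personalize per user -- can be sketched in a few lines of Python. The code below is illustrative only: it uses synthetic data and a plain logistic-regression classifier as a stand-in, so the feature names, model choice and numbers are assumptions rather than details of the actual MoodScope system. It requires numpy and scikit-learn.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in data: 32 users x 60 days x 6 usage features
# (e.g. call counts, SMS counts, app launches -- all invented here).
n_users, n_days, n_features = 32, 60, 6
X = rng.normal(size=(n_users * n_days, n_features))
users = np.repeat(np.arange(n_users), n_days)
# Invented mood labels (0 = calm, 1 = stressed) loosely tied to the features.
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=1.0, size=len(X)) > 0).astype(int)

# One-size-fits-all model (analogous to the un-personalized regime).
X_tr, X_te, y_tr, y_te, u_tr, u_te = train_test_split(X, y, users, random_state=0)
pooled = LogisticRegression().fit(X_tr, y_tr)
print("pooled accuracy:", pooled.score(X_te, y_te))

# Personalized models: one classifier per user (the higher-accuracy regime).
scores = []
for u in range(n_users):
    tr, te = (u_tr == u), (u_te == u)
    if te.sum() == 0 or len(set(y_tr[tr])) < 2:
        continue  # skip users with too little data in this toy split
    model = LogisticRegression().fit(X_tr[tr], y_tr[tr])
    scores.append(model.score(X_te[te], y_te[te]))
print("mean per-user accuracy:", np.mean(scores))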

The researchers suggest third-party hooks could be added to the software to allow user moods to be transmitted automatically to applications like Facebook. They also acknowledge that privacy concerns could arise if the software were released to the public, but suggest the benefits would likely outweigh those concerns. They note that services like Netflix or Spotify could use data from MoodScope to offer movies or other content based on a specific user's mood.



The team presented their findings at MobiSys 2013 held in Taiwan last month.

 

More information: MoodScope: Building a Mood Sensor from Smartphone Usage Patterns, research.microsoft.com/apps/pubs/default.aspx?id=194498
 

Research paper: www.ruf.rice.edu/~mobile/publications/likamwa2013mobisys2.pdf

