A group of American researchers has created 25 000 individual "invisibility" cloaks. They are just 30 micrometres in diameter and are laid out together on a 25 millimetre gold sheet.
We report the first experimental realization of an array of broadband invisibility cloaks that operates in the visible frequency range. Such an array is capable of cloaking ~20% of an unlimited surface area. The wavelength and angular dependences of the cloak array performance have been studied.
Wider implications.
Building and studying the arrays of invisibility cloaks offers more refined experimental tools to test cloak performance. Compared to the characterization of individual cloaks, the angular performance of cloak arrays appears to be more sensitive to cloak imperfections. These findings may be useful in such related areas as acoustic and surface-wave cloaking, as well as in the potential practical applications listed above.
They could be used to slow down, or even stop, light, creating what is known as a "trapped rainbow".
The trapped rainbow could be utilised in tiny biosensors to identify biological materials based on the amount of light they absorb and then subsequently emit, which is known as fluorescence spectroscopy. Slowed-down light has a stronger interaction with molecules than light travelling at normal speeds, so it enables a more detailed analysis.
Lead author of the study, Dr Vera Smolyaninova, said: "The benefit of a biochip array is that you have a large number of small sensors, meaning you can perform many tests at once. For example, you could test for multiple genetic conditions in a person's DNA in just one go.
"In our array, light is stopped at the boundary of each of the cloaks, meaning we observe the trapped rainbow at the edge of each cloak. This means we could do 'spectroscopy on-a-chip' and examine fluorescence at thousands of points all in one go."
The 25 000 invisibility cloaks are uniformly laid out on a gold sheet, with each having a microlens that bends light around itself, effectively hiding an area in its middle. As the light squeezes through the gaps between each of the cloaks, the different components of light, or colours, are made to stop at ever narrower points, creating the rainbow.
Read more »
Ben Goertzel and Joel Pitt of Novamente LLC have written Nine Ways to Bias Open-Source AGI Toward Friendliness
While it seems unlikely that any method of guaranteeing human-friendliness ("Friendliness") on the part of advanced Artificial General Intelligence (AGI) systems will be possible, this doesn't mean the only alternatives are throttling AGI development to safeguard humanity, or plunging recklessly into the complete unknown. Without denying the presence of a certain irreducible uncertainty in such matters, it is still sensible to explore ways of biasing the odds in a favorable way, such that newly created AI systems are significantly more likely than not to be Friendly. Several potential methods of effecting such biasing are explored here, with a particular but non-exclusive focus on those that are relevant to open-source AGI projects, and with illustrative examples drawn from the OpenCog open-source AGI project. Issues regarding the relative safety of open versus closed approaches to AGI are discussed and then nine techniques for biasing AGIs in favor of Friendliness are presented:
1. Engineer the capability to acquire integrated ethical knowledge.
2. Provide rich ethical interaction and instruction, respecting developmental stages.
3. Develop stable, hierarchical goal systems.
4. Ensure that the early stages of recursive self-improvement occur relatively slowly and with rich human involvement.
5. Tightly link AGI with the Global Brain.
6. Foster deep, consensus-building interactions between divergent viewpoints.
7. Create a mutually supportive community of AGIs.
8. Encourage measured co-advancement of AGI software and AGI ethics theory.
9. Develop advanced AGI sooner not later.
Read more »
Salk scientists say their findings may lead to strategies to treat age-related diseases and improve regenerative medicine.
Stem cells are essential building blocks for all organisms, from plants to humans. They can divide and renew themselves throughout life, differentiating into the specialized tissues needed during development, as well as cells necessary to repair adult tissue.
Therefore, they can be considered immortal, in that they recreate themselves and regenerate tissues throughout a person's lifetime, but that doesn't mean they don't age. They do, gradually losing their ability to effectively maintain tissues and organs.
Now, researchers at the Salk Institute for Biological Studies have uncovered a series of biological events that implicate the stem cells' surroundings, known as their "niche," as the culprit in loss of stem cells due to aging. Their findings, published May 23rd in Nature, have implications for treatment of age-related diseases and for the effectiveness of regenerative medicine.
These fluorescent microscope images of testes from young (left) and old (right) fruit flies show the effect of aging on the stem cell niche (top center). The hub cells (red) that function as part of the stem cells' supporting niche express more of a microRNA known as let-7 (green) in aged flies, which changes the signaling properties of hub cells, leading to fewer stem cells surrounding the hub that are available for tissue maintenance. Image: Courtesy of the Salk Institute for Biological Studies
Nature - The let-7–Imp axis regulates ageing of the Drosophila testis stem-cell niche
Read more »
In 2013, Blueseed will become the first-ever sea-based tech incubator.
After struggling with visa issues while trying to come to Silicon Valley and start his own company, Dascalescu said he was inspired by the notion of creating ocean communities in international waters, so that entrepreneurs would not need a visa to start up essentially 12 miles off the California coast.
Dascalescu, who is an ambassador for nonprofit organization The Seasteading Institute, and his co-founders Max Marty and Dario Mutabdzija, have so far received 240 applications from 800 entrepreneurs hailing from 52 countries. Venture capital firms and angel investors can also recommend startups to Blueseed.
Read more »
A recent report by the African Development Bank projected that, by 2030, much of Africa will attain lower-middle- and middle-class majorities, and that consumer spending will explode from $680 billion in 2008 to $2.2 trillion.
Bank estimates suggest that Africa's GDP could increase to over US$15.7 trillion in 2060, from a base of US$1.7 trillion in 2010. Consequently, income per capita expressed in current US dollar terms should grow from US$1,667 in 2010 to over US$5,600 by 2060. While this would represent a major leap forward in standard of living, it is still less than the current South Korean per capita GDP of US$17,000. However, a less optimistic scenario sees real GDP growth accelerating up to 2020, before decelerating to around 5% per annum. The total GDP would then be $12.2 trillion in 2060 and per capita GDP about US$4600.
In 50 years, Africa is projected to be roughly where China is now on a per capita GDP basis.
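For context, these are the average annual growth rates implied by the endpoints of the projections above. This is just a sanity check on the quoted numbers, not new data.

# Average annual growth rates implied by the African Development Bank projections
# quoted above (computed from the endpoints only).

def cagr(start, end, years):
    return (end / start) ** (1 / years) - 1

print(f"GDP 2010-2060 (optimistic):   {cagr(1.7e12, 15.7e12, 50):.1%} per year")
print(f"GDP 2010-2060 (pessimistic):  {cagr(1.7e12, 12.2e12, 50):.1%} per year")
print(f"GDP per capita 2010-2060:     {cagr(1667, 5600, 50):.1%} per year")
print(f"Consumer spending 2008-2030:  {cagr(0.68e12, 2.2e12, 22):.1%} per year")

The roughly two-percentage-point gap between the total-GDP rate (~4.5% per year) and the per-capita rate (~2.5% per year) reflects the population growth the projections implicitly assume.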
Seven of the world's 10 fastest-growing economies are African. The continent is famously resource rich, which has surely helped, but some recent studies suggest that the biggest drivers are far less customary for Africa, and far more encouraging for its future: wholesale and retail commerce, transportation, telecommunications, and manufacturing.
Read more »
China signaled on Wednesday it wanted to ramp up private investment in its energy sector, in line with recently unveiled government plans to fast-track infrastructure investment to help combat a protracted economic slowdown.
That followed the announced plan to allow private investment into the vast railway sector, which is struggling with mounting debts and a corruption scandal while attempting to resolve infrastructure bottlenecks.
Allowing private firms to pour money into the railways, banking, energy and healthcare sectors will give a boost to the world's second-largest economy as the government shuns fresh fiscal stimulus.
The moves appear designed to avoid stimulus that would worsen investment imbalances, while steering money toward more efficient investments.
Read more »
A novel approach to designing artificial materials could enable magnetic devices with a wider range of properties than those now available.
Luk'yanchuk and the team mathematically modelled a two-dimensional array of metamolecules, each comprising a silicon sphere next to a split copper ring. They studied how both the sphere and the split ring respond to the magnetic component of an incident electromagnetic wave, a response known as magnetization.
"When the two structures were more than one micrometer apart, they both acted to increase the local magnetic field," says Luk'yanchuk. However, they started to interact when moved closer together, and the researchers observed that the magnetization of the split ring decreases and even becomes negative for separations smaller than 0.5 micrometers.
This situation is somewhat analogous to the magnetic ordering in 'natural' materials. When all the atoms contribute in a positive way to a material's magnetic properties, the material becomes a ferromagnet. However, when alternating regions of the material have opposite magnetization, the material is said to be antiferromagnetic.
"We demonstrate that our hybrid lattices of metamolecule exhibit distance-dependent magnetic interaction, opening new ways for manipulating artificial antiferromagnetism with low-loss materials," explains Luk'yanchuk.
An array of metamolecules comprising silicon spheres and copper split-rings can be used to control magnetization waves. © 2012 American Chemical Society
Read more »
NY Times -
PNAS - Improving fluid intelligence with training on working memory
In the Jaeggi (2008) study, the researchers began by having participants complete a test of reasoning to measure their "fluid" intelligence — the ability to draw connections between things, solve novel problems and adapt to new situations. Then some of the participants received up to eight hours of training in a difficult cognitive task that required paying careful attention to two streams of information (a version of this task is now marketed by Lumosity); others were assigned to a control group and received no such training. Then all of the participants took a different version of the reasoning test.
The results were startling. The authors reported that the trained participants showed a larger gain in the reasoning test than the control group did, and despite the relatively brief period of training, this gain was large enough that it would be expected to substantially improve performance in everyday life.
In a University of North Carolina study known as the Abecedarian Early Intervention Project, children received an intensive educational intervention from infancy to age 5 designed to increase intelligence. In follow-up tests, these children showed an advantage of six I.Q. points over a control group (and as adults, they were four times more likely to graduate from college). By contrast, the increase implied by the findings of the Jaeggi study was six I.Q. points after only six hours of training — an I.Q. point an hour.
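Spelling out the rate comparison behind that last sentence: the Abecedarian hours figure below is a rough placeholder for years of preschool-age intervention, used only to show the scale of the contrast, not a number from either study.

# Rate comparison implied by the two studies above. The Abecedarian hours are an
# illustrative placeholder (~5 years x ~250 days x ~6 hours/day), not a reported figure.

jaeggi_points, jaeggi_hours = 6, 6
abecedarian_points = 6
abecedarian_hours  = 5 * 250 * 6

print(f"Jaeggi training: {jaeggi_points / jaeggi_hours:.2f} IQ points per hour")
print(f"Abecedarian:     {abecedarian_points / abecedarian_hours:.4f} IQ points per hour")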
Read more »
A 19-page study predicts a nuclear famine if a previous study of a nuclear autumn is correct. The nuclear autumn article is not correct.
1. I will repeat my case on why nuclear winter does not happen
2. They then try to build upon a slight drop in temperature (which will not happen) in order to say there will be a 10% drop in agricultural production
3. The 10% drop in agricultural production is assumed to hit everyone who gets marginal food, so that they all drop into starvation and die.
Looking at the unique conditions in Hiroshima
I have examined the case for climate effects of an exchange of nuclear weapons in detail.
A nuclear winter is predicated on current cities all reacting to nuclear weapons the way Hiroshima did, and having a firestorm in order to put a lot of soot into the stratosphere. I will summarize the case I made then in the next section. There are significant additions based on my further research and email exchanges that I had with Prof. Alan Robock and Prof. Brian Toon, who wrote the nuclear winter research.
The steps needed to prove nuclear winter:
1. Prove that enough cities will have firestorms or big enough fires (the claim here is that this does not happen)
2. Prove that when enough cities in a sufficient area have big fires, enough smoke and soot gets into the stratosphere (there is trouble with this claim because of the Kuwait fires)
3. Prove that the condition persists and affects climate as per the models (others have questioned this, but that issue is not addressed here)
The nuclear winter case is predicated on getting 150 million tons (the 150-teragram case) of soot and smoke into the stratosphere and having it stay there. The assumption seemed to be that the cities will be targeted and that the cities will burn in massive firestorms. Alan Robock indicated that they only included a fire based on the radius of ignition from the atmospheric blasts. However, in the Scientific American article and in their 2007 paper the stated assumptions are:
assuming each fire would burn the same area that actually did burn in Hiroshima and assuming an amount of burnable material per person based on various studies.
The implicit assumption is that all buildings react the way the buildings in Hiroshima reacted on that day.
Therefore, the results of Hiroshima are assumed in the Nuclear Winter models.
* 27 days without rain
* with breakfast burners that overturned in the blast and set fires
* mostly wood and paper buildings
* Hiroshima had a firestorm and burned five times more than Nagasaki, yet Nagasaki was not an especially fire-resistant city: it had the same wood and paper buildings and a similarly high population density.
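Putting the pieces together, here is a rough sketch of the soot bookkeeping that the 150-teragram case depends on. Every number in it is an illustrative placeholder rather than a value from the Robock/Toon papers; the point is only that the headline figure stands or falls with the assumed fuel loading, the soot yield of the fires, and the fraction lofted into the stratosphere by Hiroshima-style firestorms.

# Back-of-the-envelope sketch of the soot bookkeeping behind the 150 Tg case.
# All numbers are illustrative placeholders, NOT values from the nuclear winter papers.

n_targets          = 3000        # hypothetical number of urban detonations
people_per_target  = 300_000     # hypothetical exposed population per target
fuel_per_person_kg = 10_000      # assumed burnable material per person (placeholder)
soot_yield         = 0.02        # assumed fraction of burned fuel converted to soot
fraction_lofted    = 0.8         # assumed fraction lofted into the stratosphere,
                                 # which requires Hiroshima-style firestorms

soot_kg = n_targets * people_per_target * fuel_per_person_kg * soot_yield * fraction_lofted
soot_tg = soot_kg / 1e9          # 1 teragram = 1e9 kg

print(f"Soot reaching the stratosphere: {soot_tg:.0f} Tg (the modeled case assumes 150 Tg)")

Cut any one of those placeholder factors substantially, for example because modern concrete and steel cities do not firestorm, and the stratospheric soot total falls well short of the modeled scenario.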
Read more »
Researchers sponsored by Semiconductor Research Corporation (SRC), the world's leading university-research consortium for semiconductors and related technologies, today announced that they have successfully created contact hole patterns for a wide variety of practical logic and memory devices using a next-generation directed self-assembly (DSA) process. Applying a relatively simple combination of chemical and thermal processes to create their DSA method for making circuits at 22 nanometers (nm), the research team at Stanford University projects that the nanofabrication technique will enable pattern etching for next-generation chips down to 14nm.
In contrast to the current state-of-the-art lithography methods that rely on increasingly less-accurate steps to shrink transistor and circuit sizes, the achievement at Stanford provides both a more affordable and more environmentally friendly path to fabricating smaller semiconductor devices. The advancement can be utilized for enhancements not only to the electronics industry, but possibly for other nanoscale devices as well.
"This is the first time that the critical contact holes have been placed with DSA for standard cell libraries of VLSI chips. The result is a composed pattern of real circuits, not just test structures," said H.-S. Philip Wong, lead researcher at Stanford for the SRC-guided research. "This irregular solution for DSA also allows you to heal imperfections in the pattern and maintain higher resolution and finer features on the wafer than by any other viable alternative."
EETimes - By solving one of the outstanding lithographic problems facing further scaling—the tiny contact holes that connect semiconductors to their substrate—researchers at Stanford University have demonstrated working circuits at 22-nanometer and a clear path to 14-nanometers, as well as a bee-line on the chemistry developments needed to scale to single digit sizes.
Advanced Materials - Flexible Control of Block Copolymer Directed Self-Assembly using Small, Topographical Templates: Potential Lithography Solution for Integrated Circuit Contact Hole Patterning
Read more »
NASA's Marshall Space Flight Center in Huntsville, Ala., completed wind tunnel testing for Space Exploration Technologies (SpaceX) of Hawthorne, Calif., to provide Falcon 9 first stage re-entry data for the company's advanced reusable launch vehicle system.
If SpaceX gets a reusable first stage (with insignificant maintenance costs), they could lower the cost of their launches by about half. If they get all stages reusable with very low maintenance costs, they could lower costs by roughly one hundred times.
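As a rough illustration of that arithmetic, the sketch below amortizes hardware cost over repeated flights. The cost split, flight counts, and refurbishment fraction are hypothetical placeholders rather than SpaceX figures, and propellant and operations costs, which set a real-world floor, are ignored.

# Illustrative launch-cost arithmetic for stage reuse. All numbers are placeholders,
# not SpaceX figures; propellant and operations costs are ignored.

launch_cost       = 54e6    # hypothetical expendable launch price, USD
first_stage_share = 0.5     # assume roughly half the vehicle cost sits in the first stage
flights_per_stage = 200     # assumed number of flights per reused stage
refurb_fraction   = 0.005   # assumed refurbishment cost per flight (fraction of stage cost)

def reused_cost(stage_cost):
    """Per-flight cost of a stage that is amortized and refurbished over many flights."""
    return stage_cost / flights_per_stage + stage_cost * refurb_fraction

first_stage = launch_cost * first_stage_share
upper_stack = launch_cost * (1 - first_stage_share)

partial = reused_cost(first_stage) + upper_stack    # only the first stage reused
full    = reused_cost(launch_cost)                  # everything reused

print(f"Expendable:           ${launch_cost/1e6:5.1f}M")
print(f"Reusable first stage: ${partial/1e6:5.1f}M  (~{launch_cost/partial:.1f}x cheaper)")
print(f"Fully reusable:       ${full/1e6:5.2f}M  (~{launch_cost/full:.0f}x cheaper)")

With these placeholders, reusing only the first stage saves roughly half, while full reuse lands near the hundredfold figure; the real limit is set by how low refurbishment and propellant costs can actually be driven.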
Tests were conducted at several orientations and speeds ranging from Mach 0.3, or 228 miles per hour at sea level, to Mach 5, or 3,811 miles per hour at sea level, to gauge how the first stage reacts during the descent phase of flight.
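A quick sanity check of those conversions, assuming the standard sea-level speed of sound of about 340.3 m/s (roughly 761 mph):

# Mach-to-mph conversion for the figures quoted above, at standard sea-level conditions.

SPEED_OF_SOUND_MPH = 340.3 * 3600 / 1609.344   # sea-level speed of sound in mph (~761)

for mach in (0.3, 5.0):
    print(f"Mach {mach}: {mach * SPEED_OF_SOUND_MPH:.0f} mph at sea level")

# Mach 0.3 -> ~228 mph and Mach 5 -> ~3,806 mph; the small difference from the quoted
# 3,811 mph comes from the assumed atmospheric conditions.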
In addition to wind tunnel testing, Marshall is providing propulsion engineering support to SpaceX in the development of the SuperDraco Launch Abort System (LAS) and on-orbit propulsion systems. Marshall is supplying SpaceX with Reaction Control Systems lessons learned that will be incorporated into the Dragon spacecraft's design for steering and attitude control. Marshall engineers also are providing technical insight in the development of materials and processes to support future improvements of the Falcon 9 and Dragon to be used in the SpaceX Commercial Crew Development Program.
The reusable first stage is shown on the left after landing
Read more »
Wired Danger Room - DARPA's Living Foundries project was first announced by the agency last year. DARPA has handed out seven research awards worth $15.5 million to six different companies and institutions, including the University of Texas at Austin and the California Institute of Technology. Two contracts were also issued to the J. Craig Venter Institute. Dr. Venter was among the first scientists to sequence a human genome, and his institute was, in 2010, the first to create a cell with an entirely synthetic genome.
"Living Foundries" aspires to turn the slow, messy process of genetic engineering into a streamlined and standardized one. Of course, the field is already a burgeoning one: Scientists have tweaked cells in order to develop renewable petroleum and spider silk that's tough as steel. And a host of companies are investigating the pharmaceutical and agricultural promise lurking — with some tinkering, of course — inside living cells.
Read more »
Lockheed appears to be on track for deploying combat versions of the HULC exoskeleton into Afghanistan in early 2013 or even late in 2012.
The deployment of exoskeletons in commercial sectors will probably remain quite limited for another decade or so, due to their high cost (more than $25,000 per suit). There should be about 11,000 exoskeletons by 2020.
The HULC can assist speed marching at up to 7 mph, though this reduces battery life somewhat; a battery-draining "burst" at 10 mph is the maximum speed.
A soldier with a pack would normally go at 3 mph maximum and cover 10-12 miles in a day. Exoskeleton soldiers could also carry lightweight foldable electric scooters on their exoskeleton that would enable 60-100 mph on roads. If the bike had motocross-like capabilities it could still go about 30-60 mph on rougher terrain.
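For a rough sense of what those speeds mean over a day of movement, the sketch below simply multiplies them out; the hours of actual movement per day is an assumed placeholder, not a Lockheed figure.

# Daily distance implied by the speeds above. The hours of movement per day is an
# assumption chosen to match the 10-12 miles/day quoted for a soldier with a pack.

march_hours_per_day = 4
speeds_mph = {
    "soldier with pack":      3,
    "HULC-assisted march":    7,
    "HULC burst":             10,
    "folding scooter (road)": 80,   # within the 60-100 mph road claim above
}

for mode, mph in speeds_mph.items():
    print(f"{mode:24s}: {mph * march_hours_per_day:5.0f} miles in {march_hours_per_day} h of movement")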
* Lockheed Martin's (LM) Squad Mission Support System (SMSS) has passed a final round of tests at Fort Riley, Kansas, before scheduled deployment to Afghanistan in 2011. The system, which turns a six-wheeled amphibious ATV into a robotic packhorse and charging station, has been subjected to a variety of simulated warzone environments in both remote controlled and fully autonomous modes.
The SMSS can carry a squad's food supplies, water, batteries, heavy weapons, ammunition, survival gear and can even accommodate casualties. Besides transporting up to 600lbs (272 kg) of gear, the SMSS also provides two to four kilowatts of power, and is capable of charging 146 batteries within ten hours.
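A quick check shows the charging claim is roughly self-consistent, assuming a typical soldier battery (such as a BB-2590) holds about 0.2 kWh; the per-battery capacity and charger efficiency below are assumptions, not Lockheed numbers.

# Sanity check: 146 batteries in ten hours from a 2-4 kW supply. Battery capacity and
# charger efficiency are assumed values, not figures from Lockheed.

power_kw          = 3.0      # midpoint of the quoted 2-4 kW
charge_hours      = 10
batteries         = 146
battery_kwh       = 0.2      # assumed usable capacity per battery (roughly a BB-2590)
charge_efficiency = 0.9      # assumed charger efficiency

energy_available = power_kw * charge_hours * charge_efficiency   # kWh delivered
energy_needed    = batteries * battery_kwh                       # kWh required

print(f"Energy available: {energy_available:.0f} kWh, energy needed: {energy_needed:.0f} kWh")
# ~27 kWh delivered vs ~29 kWh needed: the claim is plausible for ~0.2 kWh batteries.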
The HULC exoskeleton-equipped soldiers could carry foldable dirtbikes to enable speeds of up to 80 mph. They would be better served with squad mission support systems that could operate at up to 80 mph and with several times the cargo capacity. The exoskeleton soldiers could swap out different mission modules against their 200-pound carrying capacity from a faster and larger exo-squad SMSS.
Read more »