Now declassified & available online! Russian Quantum Leap technology enhances RNA, DNA & health, cures diseases (e.g. diabetes, cancer 2), stops TI targeting.
By Alfred Lambremont Webre
WATCH QUANTUM LEAP PANEL INTERVIEW
Advances in neuroscience offer the military the potential of mind-controlled weapons and performance enhancement. Are mind-controlled weapons and extra-sensory enhanced warfare techniques mere science fiction? Recent developments in neuroscience suggest not, with a new Royal Society report claiming that research in areas such as neuropharmacology, functional neuroimaging and neural interface systems could create a new breed of super soldier and diminish enemy ability.
Neuroscience is one of the most rapidly advancing fields in medicine, with highly-detailed imaging offering new insights into the way the brain works and direct brain interfaces enabling weapons to be targeted and fired with just a thought. The technology is not speculative – just last week scientists unveiled an implant called BrainGate that enabled a woman who had lost the use of her limbs after a devastating stroke to control a robot arm using thought processes alone.
The new report, “Neuroscience, Conflict and Security”, formed part of a series that examined the impact of neuroscience on society, dealing specifically with the potential application of advances in neuroscience to the armed forces and security personnel.
It was chaired by Prof Rod Flower FRS, professor of biochemical pharmacology at the William Harvey Research Institute, Queen Mary University of London, and brought together international experts to discuss new developments in the field and the laws and ethics that apply to their application in a military and civil context. According to Flower, it was the first time the Ministry of Defence’s (MoD) Defence Science and Technology Laboratory (Dstl) had worked with academia in neuroscience.
“The people we were in contact with there were the horizon scanning team whose job it is to look out for potential new applications of research,” Flower said. “We relied on what was publicly available, and while the US is extremely open about what its military does, the UK is not so open, and countries like Russia and China are a complete cipher.”
A key advance in neuroscience has been improvements in real-time neuro-imaging, which can indicate in great detail which parts of the brain “light up” when undertaking certain activities. One of its applications could be to screen potential recruits for a specific role, for example to see if they are temperamentally suited to be a commander, pilot or diver.
“At the moment it’s very much a case of taking people on and subjecting them to high-stress exercises and choosing the ones who make it,” says Flower. “If they could be subjected to imaging during assessment you could identify who has good risk-taking behaviour, strategy and planning ability, or 3D analytical skills.”
Brain scanning could also speed up and improve target recognition or identify changes in surveillance satellite images by recognising subconscious objective identification rather than an operator having to process and actively react.
“It has been discovered that when you show the brain different images, it spots the differences between them even though they may not reach conscious awareness,” says Flower. “Wearing a helmet like a hairnet can pick up a spike in brain activity which you can correlate to differences identified between two images, even if they were flashed up too quickly to process consciously.”
That potentially has the ability not only to speed up the process of target selection but also improve accuracy. It could also reduce problems associated with fatigue, which is a big issue facing people whose job involves scanning images for a long time, especially in the dark, such as surveillance UAV operators.
Mind-controlled weapons and aircraft
Technologies such as the BrainGate implant have already shown that machinery can be controlled with the mind alone, and games manufacturers have brought out low-cost helmet controllers that enable wearers to play by mind power alone. The obvious application for the military is mind-controlled weaponry and remotely-piloted aircraft, which could make operation and reactions far faster.
“If you couple that with your subconscious mind being much faster at dealing with information you can see a situation sometime in the future where you’re not thinking about flying the aircraft, but your subconscious is doing it without interfering in any way,” says Flower. “You would probably have a much better appreciation of an incoming threat and fire off a couple of missiles without having to consciously think.”
The report also examines evidence that certain drugs can improve the performance of personnel performing certain military tasks. Among these, drugs developed to relieve the symptoms of Attention Deficit Hyperactivity Disorder (ADHD) in children, such as Ritalin, have shown great promise on unaffected adults who want to focus their attention on a specific task.
“It could help when flying a long mission where you may become fatigued and your attention begins to drift off,” says Flower. “It could also help you focus when you have a lot of information to process, like being a fighter pilot in a particularly tense situation when you’re trying to get a missile lock on a target while the aircraft and radio are bombarding you with information and you have to communicate back.”
Another approach that could improve the way the brain works is known as transcranial electrical stimulation, where electrodes attached to a 9V battery are clamped to the head. Controlled studies showed it can improve the rate at which things are learnt, and possibly result in better memory formation.
One controversial subject the report touches on is that of neuropsychology-inspired chemical weapons, discussing the fact that although the international Chemical Weapons Convention (CWC) bans the use of chemical weapons on the battlefield, they are allowed for civil law-enforcement purposes.
“One of the problems is as far as anyone in our field can find, it’s not possible to find a totally safe drug that you could use,” says Flower, citing the example of the Moscow theatre siege in which 150 civilians died alongside their Chechen rebel captors.
“It’s partly because everyone’s unique and responds in different ways. If you start spraying it around you may affect children, women, men, pregnant women, old men, people taking other drugs, and people with heart disease. It won’t just be the 70kg healthy young men on which these drugs are tested.”
Flower is also keen to bust some myths about chemicals that were reportedly tested for their effects on enemy troops.
“Oxytocin is a hormone that’s produced in pregnancy that produces a feeling of emotional closeness and trust,” says Flower. “There was a lot of talk that you may be able to use this as an interrogation tool to make your captive trust you and tell you all his secrets. But as far as we can tell that’s all nonsense.”
Like automated weaponry and battlefield robotics, however, these new techniques could require an overhaul of ethical guidelines, especially with regards to civilian casualties. Currently the last person who gave the order to fire is responsible, but if it came from the operator’s subconscious, the line becomes blurred.
With advances in neuroscience holding such great potential for military applications, Flower would like to see the MoD work more closely with academia. One approach would be a two-way intern exchange between the MoD and academia.
“It’s not rocket science, the research is all out there, and most of it gets published,” says Flower. “It’s just a question of them being aware of it and able to pick up the ideas and exploit them before they read about it in Nature.”
The NSA is building a data center to house a 512-qubit quantum computer capable of learning, reproducing the brain’s cognitive functions, and programming itself.
The National Security Agency is building a heavily fortified, top-secret $2 billion complex blandly named the “Utah Data Center”, which will soon be home to the hydrogen bomb of cybersecurity: a 512-qubit quantum computer that would revive the “total information awareness” program originally envisioned under George Bush in 2003.
The news of the data center comes after Department of Defense contractor Lockheed Martin secured a $10 million contract with D-Wave for a 512-qubit quantum computer code-named Vesuvius.
Vesuvius is capable of executing a massive number of computations at once, more than 100,000,000,000,000,000,000,000,000,000,000,000,000, a workload that would take millions of years on a standard desktop.
The computer will be able to crack even the most secure encryption and will give the US government a quantum leap into technologies once only dreamed of, including the rise of the world’s very first all-knowing, omniscient, self-teaching artificial intelligence.
The D-Wave quantum computer boasts a wide array of features.
D-Wave Defies World of Critics With ‘First Quantum Cloud’
The quantum computer is the holy grail of tech research. The idea is to build a machine that uses the mind-bending properties of very small particles to perform calculations that are well beyond the capabilities of machines here in the world of classical physics. But it’s still not completely clear that a true quantum computer can actually be built.
But Rose keeps fighting. In May, D-Wave published a paper in the influential journal Nature that backed up at least some of its claims. And more importantly, it landed a customer. That same month, mega defense contractor Lockheed Martin bought a D-Wave quantum computer and a support contract for $10 million.
The critics have been so vociferous in large part because Rose isn’t shy about promoting his company. But that’s just the way he is. Rose likens D-Wave’s quantum computers to the Large Hadron Collider, the world’s biggest particle accelerator. “They’re the largest programmable quantum systems that have ever been built by a long shot,” he says. And his latest pitch is that D-Wave is on the verge of unveiling the world’s first quantum cloud. That’s right, quantum-computing-as-a-service.
D-Wave’s computer is designed to solve what are called combinatorial optimization problems. The classic example is figuring out the most efficient route for a traveling salesman going to multiple destinations. There’s no mathematical shortcut that computers can take to solve combinatorial optimization problems. They have to use brute force: Simply check all possible combinations. The trouble is, the number of possibilities explodes exponentially with the problem size. For example, if you have six destinations, there are 64 possible combinations. If you have 20 destinations, there are 1,048,576 possible combinations.
D-Wave’s next-generation computer is designed to handle problems with as many as 512 variables. In theory, that lets you solve problems involving 2^512 possible combinations, and a problem of that size is beyond the reach of any classical computer that could ever be built. “It’s bigger than the number of atoms in the universe,” Rose says. “It doesn’t matter how big a supercomputer you make.”
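The exponential blow-up described above is easy to verify. The short sketch below (assuming, as the article does, that each destination or variable is a binary yes/no decision, so a brute-force search must check 2^n candidates) reproduces the figures quoted for 6, 20, and 512 variables:

```python
# Exponential growth of a binary combinatorial search space:
# each of n variables doubles the number of candidate solutions.

def search_space(n_variables: int) -> int:
    """Number of candidates a brute-force search must check."""
    return 2 ** n_variables

print(search_space(6))    # 64, as quoted for six destinations
print(search_space(20))   # 1048576, as quoted for twenty
# 512 variables exceed the ~10^80 atoms in the observable universe
print(search_space(512) > 10 ** 80)  # True
```

The last comparison is the substance of Rose’s claim: no classical machine could enumerate a space that large, which is why D-Wave targets these problems with quantum annealing instead.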
He then convinced Lockheed Martin’s management to buy a D-Wave computer and install it in a lab at USC’s Information Sciences Institute. Lockheed Martin and USC split time on the machine, and Lockheed Martin’s access is via a secure network. The machine came online at noon on December 23, and the company now has 50 people working on it.
OK, so quantum computing may sound all very theoretical (and indeed at present a lot of it actually is!). However, practical quantum computing research is now very much under way. Perhaps most notably, back in 2007 a Canadian company called D-Wave announced what it described as “the world’s first commercially viable quantum computer”. This was based on a 16 qubit processor — the Rainer R4.7 — made from the rare metal niobium supercooled into a superconducting state. Back in 2007, D-Wave demonstrated their quantum computer performing several tasks including playing Sudoku and creating a complex seating plan.
Many people at the time were somewhat sceptical of D-Wave’s claims. However, in December 2009, Google revealed that it had been working with D-Wave to develop quantum computing algorithms for image recognition purposes. Experiments had included using a D-Wave quantum computer to recognise cars in photographs faster than possible using any conventional computer in a Google data centre. Around this time, there was also an announcement from IBM that it was rededicating resources to quantum computing research in the “hope that a five-year push [would] produce tangible and profound improvements”.
In 2011, D-Wave launched a fully-commercial, 128-qubit quantum computer. Called the D-Wave One, this is described by the company as a “high performance computing system designed for industrial problems encountered by fortune 500 companies, government and academia”. The D-Wave One‘s super-cooled 128 qubit processor is housed inside a cryogenics system within a 10 square meter shielded room. Just look at the picture here and you will see the sheer size of the thing relative to a human being. At launch, the D-Wave One cost $10 million. The first D-Wave One was sold to US aerospace, security and military giant Lockheed Martin in May 2011.
D-Wave aside, other research teams are also making startling quantum computing advances. For example, in September 2010, the Centre for Quantum Photonics in Bristol in the United Kingdom reported that it had created a new photonic quantum chip. This is able to operate at normal temperatures and pressures, rather than under the extreme conditions required by the D-Wave One and most other quantum computing hardware. According to the guy in charge — Jeremy O’Brien — his team’s new chip may be used as the basis of a quantum computer capable of outperforming a conventional computer “within five years”.
Another significant quantum computing milestone was reported in January 2011 by a team from Oxford University. Here strong magnetic fields and low temperatures were used to link — or “quantumly entangle” — the electrons and nuclei of a great many phosphorous atoms inside a highly purified silicon crystal. Each entangled electron and nucleus was then able to function as a qubit. Most startlingly, ten billion quantumly entangled qubits were created simultaneously. If a way can be found to link these together, the foundation will have been laid for an incredibly powerful computing machine. In comparison to the 128-qubit D-Wave One, a future computer with even a fraction of a 10 billion qubit capacity could clearly possess a quite literally incomprehensible level of processing power.
Here is a recap of the work the NSA is doing, followed by recent technology breakthroughs in quantum physics and a detailed overview of what quantum computing is.
Cryptogon reports:
Well, it has been the $64,000 question for a couple of decades: Can NSA break something like PGP?
While there might be other black world technologies that could be up to the task (there’s no way to know), what we do know is that a practical quantum computing capability would be, for all intents and purposes, the master key.
I’m pretty confident that NSA has this capability and here’s why: IBM Breakthrough May Make Practical Quantum Computer 15 Years Away Instead of 50. There is no hard constant that one can point to when considering how much more advanced black world technologies are than what we think of as state of the art, but if IBM is 15 years away from building a useful quantum computer, it’s not a stretch to assume NSA has that capability already, or is close to having it.
Bamford lays out a narrative below about the “enormous breakthrough,” but, at the end of the day, it’s conventional computers. There’s no mention of quantum computers, or even the far less “out there” photonic systems.
Is Bamford’s piece a limited hangout?
Maybe, but it makes for interesting reading in any event.
Note: For some reason, Bamford refers to Mark Klein as, “A whistle-blower,” without naming him. Because of Mark Klein, we know, for sure, that the mass intercepts are happening, how NSA is doing it, the equipment involved, etc. So, thanks, Mark Klein. Heroes have names on Cryptogon.
The NSA Is Building the Country’s Biggest Spy Center (Watch What You Say)
Under construction by contractors with top-secret clearances, the blandly named Utah Data Center is being built for the National Security Agency. A project of immense secrecy, it is the final piece in a complex puzzle assembled over the past decade. Its purpose: to intercept, decipher, analyze, and store vast swaths of the world’s communications as they zap down from satellites and zip through the underground and undersea cables of international, foreign, and domestic networks. The heavily fortified $2 billion center should be up and running in September 2013. Flowing through its servers and routers and stored in near-bottomless databases will be all forms of communication, including the complete contents of private emails, cell phone calls, and Google searches, as well as all sorts of personal data trails—parking receipts, travel itineraries, bookstore purchases, and other digital “pocket litter.” It is, in some measure, the realization of the “total information awareness” program created during the first term of the Bush administration—an effort that was killed by Congress in 2003 after it caused an outcry over its potential for invading Americans’ privacy.
But “this is more than just a data center,” says one senior intelligence official who until recently was involved with the program. The mammoth Bluffdale center will have another important and far more secret role that until now has gone unrevealed. It is also critical, he says, for breaking codes. And code-breaking is crucial, because much of the data that the center will handle—financial information, stock transactions, business deals, foreign military and diplomatic secrets, legal documents, confidential personal communications—will be heavily encrypted. According to another top official also involved with the program, the NSA made an enormous breakthrough several years ago in its ability to cryptanalyze, or break, unfathomably complex encryption systems employed by not only governments around the world but also many average computer users in the US. The upshot, according to this official: “Everybody’s a target; everybody with communication is a target.”
In the process—and for the first time since Watergate and the other scandals of the Nixon administration—the NSA has turned its surveillance apparatus on the US and its citizens. It has established listening posts throughout the nation to collect and sift through billions of email messages and phone calls, whether they originate within the country or overseas. It has created a supercomputer of almost unimaginable speed to look for patterns and unscramble codes. Finally, the agency has begun building a place to store all the trillions of words and thoughts and whispers captured in its electronic net. And, of course, it’s all being done in secret. To those on the inside, the old adage that NSA stands for Never Say Anything applies more than ever.
The data stored in Bluffdale will naturally go far beyond the world’s billions of public web pages. The NSA is more interested in the so-called invisible web, also known as the deep web or deepnet—data beyond the reach of the public. This includes password-protected data, US and foreign government communications, and noncommercial file-sharing between trusted peers. “The deep web contains government reports, databases, and other sources of information of high value to DOD and the intelligence community,” according to a 2010 Defense Science Board report. “Alternative tools are needed to find and index data in the deep web … Stealing the classified secrets of a potential adversary is where the [intelligence] community is most comfortable.” With its new Utah Data Center, the NSA will at last have the technical capability to store, and rummage through, all those stolen secrets. The question, of course, is how the agency defines who is, and who is not, “a potential adversary.”
According to Binney—who maintained close contact with agency employees until a few years ago—the taps in the secret rooms dotting the country are actually powered by highly sophisticated software programs that conduct “deep packet inspection,” examining Internet traffic as it passes through the 10-gigabit-per-second cables at the speed of light.
The software, created by a company called Narus that’s now part of Boeing, is controlled remotely from NSA headquarters at Fort Meade in Maryland and searches US sources for target addresses, locations, countries, and phone numbers, as well as watch-listed names, keywords, and phrases in email. Any communication that arouses suspicion, especially those to or from the million or so people on agency watch lists, are automatically copied or recorded and then transmitted to the NSA.
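At its core, the kind of watch-list matching described above is simple string scanning over message payloads. The toy sketch below illustrates the idea only; the watch list, the messages, and the function name are all invented for illustration and bear no relation to the actual Narus software:

```python
# Toy sketch of keyword-based traffic filtering: scan each message
# payload for watch-listed terms and flag any match for copying.

WATCH_LIST = {"target-alpha", "+1-555-0100", "project-x"}

def flag_suspicious(messages):
    """Return the messages containing any watch-listed term."""
    flagged = []
    for msg in messages:
        payload = msg.lower()  # case-insensitive matching
        if any(term in payload for term in WATCH_LIST):
            flagged.append(msg)
    return flagged

traffic = [
    "routine status update",
    "meeting notes re Project-X budget",
]
print(flag_suspicious(traffic))  # only the Project-X message is flagged
```

Real deep packet inspection operates at line rate on raw packets and reassembled sessions, but the selection logic — match against a target list, copy on hit — is the same in outline.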
The scope of surveillance expands from there, Binney says. Once a name is entered into the Narus database, all phone calls and other communications to and from that person are automatically routed to the NSA’s recorders. “Anybody you want, route to a recorder,” Binney says. “If your number’s in there? Routed and gets recorded.” He adds, “The Narus device allows you to take it all.” And when Bluffdale is completed, whatever is collected will be routed there for storage and analysis.
According to Binney, one of the deepest secrets of the Stellar Wind program—again, never confirmed until now—was that the NSA gained warrantless access to AT&T’s vast trove of domestic and international billing records, detailed information about who called whom in the US and around the world. As of 2007, AT&T had more than 2.8 trillion records housed in a database at its Florham Park, New Jersey, complex.
Verizon was also part of the program, Binney says, and that greatly expanded the volume of calls subject to the agency’s domestic eavesdropping. “That multiplies the call rate by at least a factor of five,” he says. “So you’re over a billion and a half calls a day.” (Spokespeople for Verizon and AT&T said their companies would not comment on matters of national security.)
After he left the NSA, Binney suggested a system for monitoring people’s communications according to how closely they are connected to an initial target. The further away from the target—say you’re just an acquaintance of a friend of the target—the less the surveillance. But the agency rejected the idea, and, given the massive new storage facility in Utah, Binney suspects that it now simply collects everything. “The whole idea was, how do you manage 20 terabytes of intercept a minute?” he says. “The way we proposed was to distinguish between things you want and things you don’t want.” Instead, he adds, “they’re storing everything they gather.” And the agency is gathering as much as it can.
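Binney’s proposal amounts to scoping surveillance by graph distance: a breadth-first search over a contact graph assigns each person a hop count from the target, and collection intensity falls off with distance. The sketch below is a minimal illustration under that reading; the graph and the names in it are hypothetical:

```python
# Minimal sketch of "degrees of separation" scoping: compute each
# person's hop distance from an initial target via breadth-first search.

from collections import deque

def hops_from_target(contacts, target):
    """Map each reachable person to their hop distance from the target."""
    dist = {target: 0}
    queue = deque([target])
    while queue:
        person = queue.popleft()
        for peer in contacts.get(person, []):
            if peer not in dist:       # first visit = shortest distance
                dist[peer] = dist[person] + 1
                queue.append(peer)
    return dist

contacts = {
    "target": ["friend"],
    "friend": ["target", "acquaintance"],
    "acquaintance": ["friend"],
}
print(hops_from_target(contacts, "target"))
# {'target': 0, 'friend': 1, 'acquaintance': 2}
```

A policy layer could then collect fully at distance 0–1, sample at distance 2, and ignore anyone further out — the “things you want” versus “things you don’t want” distinction Binney describes, as opposed to storing everything.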
Once the communications are intercepted and stored, the data-mining begins. “You can watch everybody all the time with data- mining,” Binney says. Everything a person does becomes charted on a graph, “financial transactions or travel or anything,” he says. Thus, as data like bookstore receipts, bank statements, and commuter toll records flow in, the NSA is able to paint a more and more detailed picture of someone’s life.
The NSA also has the ability to eavesdrop on phone calls directly and in real time. According to Adrienne J. Kinne, who worked both before and after 9/11 as a voice interceptor at the NSA facility in Georgia, in the wake of the World Trade Center attacks “basically all rules were thrown out the window, and they would use any excuse to justify a waiver to spy on Americans.” Even journalists calling home from overseas were included. “A lot of time you could tell they were calling their families,” she says, “incredibly intimate, personal conversations.” Kinne found the act of eavesdropping on innocent fellow citizens personally distressing. “It’s almost like going through and finding somebody’s diary,” she says.
Sitting in a restaurant not far from NSA headquarters, the place where he spent nearly 40 years of his life, Binney held his thumb and forefinger close together. “We are, like, that far from a turnkey totalitarian state,” he says.
Meanwhile, over in Building 5300, the NSA succeeded in building an even faster supercomputer. “They made a big breakthrough,” says another former senior intelligence official, who helped oversee the program. The NSA’s machine was likely similar to the unclassified Jaguar, but it was much faster out of the gate, modified specifically for cryptanalysis and targeted against one or more specific algorithms, like the AES. In other words, they were moving from the research and development phase to actually attacking extremely difficult encryption systems. The code-breaking effort was up and running.
The breakthrough was enormous, says the former official, and soon afterward the agency pulled the shade down tight on the project, even within the intelligence community and Congress. “Only the chairman and vice chairman and the two staff directors of each intelligence committee were told about it,” he says. The reason? “They were thinking that this computing breakthrough was going to give them the ability to crack current public encryption.”
Cryptome further quotes the four-page Wired article.
The NSA Is Building the Country’s Biggest Spy Center (Watch What You Say)
By James Bamford
March 15, 2012
[Excerpts of excellent NSA overview to focus on the MRF decryption facility.]
When Barack Obama took office, Binney hoped the new administration might be open to reforming the program to address his constitutional concerns. He and another former senior NSA analyst, J. Kirk Wiebe, tried to bring the idea of an automated warrant-approval system to the attention of the Department of Justice’s inspector general. They were given the brush-off. “They said, oh, OK, we can’t comment,” Binney says.
There is still one technology preventing untrammeled government access to private digital data: strong encryption. Anyone—from terrorists and weapons dealers to corporations, financial institutions, and ordinary email senders—can use it to seal their messages, plans, photos, and documents in hardened data shells. For years, one of the hardest shells has been the Advanced Encryption Standard, one of several algorithms used by much of the world to encrypt data. Available in three different strengths—128 bits, 192 bits, and 256 bits—it’s incorporated in most commercial email programs and web browsers and is considered so strong that the NSA has even approved its use for top-secret US government communications. Most experts say that a so-called brute-force computer attack on the algorithm—trying one combination after another to unlock the encryption—would likely take longer than the age of the universe. For a 128-bit cipher, the number of trial-and-error attempts would be 340 undecillion (10^36).
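The “longer than the age of the universe” claim is a straightforward back-of-the-envelope calculation. The sketch below assumes a generously fast hypothetical machine testing 10^15 keys per second (the key-testing rate is an illustrative assumption, not a figure from the article):

```python
# Brute-force feasibility check for AES-128: keyspace size divided by
# an assumed key-testing rate, compared to the age of the universe.

AES_128_KEYSPACE = 2 ** 128       # ~3.4e38 candidate keys ("340 undecillion")
KEYS_PER_SECOND = 10 ** 15        # assumed: one petakey/s testing machine
AGE_OF_UNIVERSE_S = 4.3e17        # ~13.8 billion years, in seconds

seconds_needed = AES_128_KEYSPACE / KEYS_PER_SECOND
print(f"{seconds_needed:.2e} seconds")        # on the order of 1e23 seconds
print(seconds_needed > AGE_OF_UNIVERSE_S)     # True
```

Even at that rate the exhaustive search takes roughly a million times the age of the universe, which is why the article frames Bluffdale’s role as collecting enough ciphertext for cryptanalytic shortcuts rather than pure brute force.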
Breaking into those complex mathematical shells like the AES is one of the key reasons for the construction going on in Bluffdale. That kind of cryptanalysis requires two major ingredients: super-fast computers to conduct brute-force attacks on encrypted messages and a massive number of those messages for the computers to analyze. The more messages from a given target, the more likely it is for the computers to detect telltale patterns, and Bluffdale will be able to hold a great many messages. “We questioned it one time,” says another source, a senior intelligence manager who was also involved with the planning. “Why were we building this NSA facility? And, boy, they rolled out all the old guys—the crypto guys.” According to the official, these experts told then-director of national intelligence Dennis Blair, “You’ve got to build this thing because we just don’t have the capability of doing the code-breaking.” It was a candid admission. In the long war between the code breakers and the code makers—the tens of thousands of cryptographers in the worldwide computer security industry—the code breakers were admitting defeat.
So the agency had one major ingredient—a massive data storage facility—under way. Meanwhile, across the country in Tennessee, the government was working in utmost secrecy on the other vital element: the most powerful computer the world has ever known.
The plan was launched in 2004 as a modern-day Manhattan Project. Dubbed the High Productivity Computing Systems program, its goal was to advance computer speed a thousandfold, creating a machine that could execute a quadrillion (10^15) operations a second, known as a petaflop—the computer equivalent of breaking the land speed record. And as with the Manhattan Project, the venue chosen for the supercomputing program was the town of Oak Ridge in eastern Tennessee, a rural area where sharp ridges give way to low, scattered hills, and the southwestward-flowing Clinch River bends sharply to the southeast. About 25 miles from Knoxville, it is the “secret city” where uranium-235 was extracted for the first atomic bomb. A sign near the exit read: what you see here, what you do here, what you hear here, when you leave here, let it stay here. Today, not far from where that sign stood, Oak Ridge is home to the Department of Energy’s Oak Ridge National Laboratory, and it’s engaged in a new secret war. But this time, instead of a bomb of almost unimaginable power, the weapon is a computer of almost unimaginable speed.
In 2004, as part of the supercomputing program, the Department of Energy established its Oak Ridge Leadership Computing Facility for multiple agencies to join forces on the project. But in reality there would be two tracks, one unclassified, in which all of the scientific work would be public, and another top-secret, in which the NSA could pursue its own computer covertly. “For our purposes, they had to create a separate facility,” says a former senior NSA computer expert who worked on the project and is still associated with the agency. (He is one of three sources who described the program.) It was an expensive undertaking, but one the NSA was desperate to launch.
Known as the Multiprogram Research Facility, or Building 5300, the $41 million, five-story, 214,000-square-foot structure was built on a plot of land on the lab’s East Campus and completed in 2006. Behind the brick walls and green-tinted windows, 318 scientists, computer engineers, and other staff work in secret on the cryptanalytic applications of high-speed computing and other classified projects. The supercomputer center was named in honor of George R. Cotter, the NSA’s now-retired chief scientist and head of its information technology program. Not that you’d know it. “There’s no sign on the door,” says the ex-NSA computer expert.
At the DOE’s unclassified center at Oak Ridge, work progressed at a furious pace, although it was a one-way street when it came to cooperation with the closemouthed people in Building 5300. Nevertheless, the unclassified team had its Cray XT4 supercomputer upgraded to a warehouse-sized XT5. Named Jaguar for its speed, it clocked in at 1.75 petaflops, officially becoming the world’s fastest computer in 2009.
1 Geostationary satellites
Four satellites positioned around the globe monitor frequencies carrying everything from walkie-talkies and cell phones in Libya to radar systems in North Korea. Onboard software acts as the first filter in the collection process, targeting only key regions, countries, cities, and phone numbers or email addresses.
2 Aerospace Data Facility, Buckley Air Force Base, Colorado
Intelligence collected from the geostationary satellites, as well as signals from other spacecraft and overseas listening posts, is relayed to this facility outside Denver. About 850 NSA employees track the satellites, transmit target information, and download the intelligence haul.
3 NSA Georgia, Fort Gordon, Augusta, Georgia
Focuses on intercepts from Europe, the Middle East, and North Africa. Codenamed Sweet Tea, the facility has been massively expanded and now consists of a 604,000-square-foot operations building for up to 4,000 intercept operators, analysts, and other specialists.
4 NSA Texas, Lackland Air Force Base, San Antonio
Focuses on intercepts from Latin America and, since 9/11, the Middle East and Europe. Some 2,000 workers staff the operation. The NSA recently completed a $100 million renovation on a mega-data center here—a backup storage facility for the Utah Data Center.
5 NSA Hawaii, Oahu
Focuses on intercepts from Asia. Built to house an aircraft assembly plant during World War II, the 250,000-square-foot bunker is nicknamed the Hole. Like the other NSA operations centers, it has since been expanded: Its 2,700 employees now do their work aboveground from a new 234,000-square-foot facility.
6 Domestic listening posts
The NSA has long been free to eavesdrop on international satellite communications. But after 9/11, it installed taps in US telecom “switches,” gaining access to domestic traffic. An ex-NSA official says there are 10 to 20 such installations.
7 Overseas listening posts
According to a knowledgeable intelligence source, the NSA has installed taps on at least a dozen of the major overseas communications links, each capable of eavesdropping on information passing by at a high data rate.
8 Utah Data Center, Bluffdale, Utah
At a million square feet, this $2 billion digital storage facility outside Salt Lake City will be the centerpiece of the NSA’s cloud-based data strategy and essential in its plans for decrypting previously uncrackable documents.
9 Multiprogram Research Facility, Oak Ridge, Tennessee
Some 300 scientists and computer engineers with top security clearance toil away here, building the world’s fastest supercomputers and working on cryptanalytic applications and other secret projects.
10 NSA headquarters, Fort Meade, Maryland
Analysts here will access material stored at Bluffdale to prepare reports and recommendations that are sent to policymakers. To handle the increased data load, the NSA is also building an $896 million supercomputer here.
Russia Today Reports:
NSA Utah ‘Data Center’: Biggest-ever domestic spying lab
Overview of Camp Williams site before the construction works began. UDC will be located on the west side of the highway, on what was previously an airfield (Image from http://www.publicintelligence.net)
The biggest-ever data complex, to be completed in Utah in 2013, may take American citizens into a completely new reality where their emails, phone calls, online shopping lists and virtually entire lives will be stored and reviewed.
US government agencies grow warier of their own citizens with every month. First, paying with cash, shielding your laptop screen and a whole list of other commonplace habits were proclaimed suspicious – and if you see something, you are prompted to say something. Then, reports emerged that surveillance drones are being procured for police forces. Now, the state of Utah is making way for what could be the largest cyber shield in the history of American intelligence. Or is it a cyber-pool?
Utah sprang to media attention when the Camp Williams military base near the town of Bluffdale sprouted a vast, 240-acre construction site. American outlets say that what’s hiding under the modest nameplate of the Utah Data Complex is a prospective intelligence facility ordered by the National Security Agency.
Cyber-security vs. Total awareness
The NSA maintains that the data center, to be completed by September 2013, is a component of the Comprehensive National Cyber-security Initiative. The facility is to provide technical assistance to the Department of Homeland Security, collect intelligence on cyber threats and carry out cyber-security objectives, reported Reuters.
But both ordinary Americans and their intelligence community were quick to dub it “a spy center.”
The Utah Data Center will be built on a 240-acre site near Camp Williams, Utah. Once completed in September 2013, it will be twice as large as the US Capitol. The center will provide 100,000 square feet of computer space, out of a total one million square feet. The project, launched in 2010, is to cost the National Security Agency up to $2 billion.
The highly-classified project will be responsible for intercepting, storing and analyzing intelligence data as it zips through both domestic and international networks. The data may come in all forms: private e-mails, cell phone calls, Google searches – even parking lot tickets or shop purchases.
“This is more than just a data center,” an official source close to the project told the online magazine Wired.com. The source says the center will actually focus on deciphering the accumulated data, essentially code-breaking.
This means not only exposing Facebook activities or Wikipedia requests, but compromising “the invisible” Internet, or the “deepnet.” Legal and business deals, financial transactions, password-protected files and inter-governmental communications will all become vulnerable.
Once communication data is stored, a process known as data-mining will begin. Everything a person does – from traveling to buying groceries – is to be displayed on a graph, allowing the NSA to paint a detailed picture of any given individual’s life.
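The “graph” described here is, at bottom, just a linked record of events keyed to an identity. A toy sketch (every name, identifier and event below is invented for illustration) shows how disparate records become a per-person timeline:

```python
from collections import defaultdict

# Toy illustration of linking disparate records to one identity.
# All identifiers and events here are invented for the example.
activity = defaultdict(list)

records = [
    ("alice", "2013-01-02", "phone call", "555-0100"),
    ("alice", "2013-01-02", "card purchase", "grocery store"),
    ("alice", "2013-01-05", "flight booking", "JFK-LHR"),
]

for person, date, kind, detail in records:
    activity[person].append((date, kind, detail))

# A chronological picture of one person's life is then just a sort away.
timeline = sorted(activity["alice"])
print(timeline)
```

The unsettling point of the article is precisely how little machinery this requires once the raw records are in one place.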
With this in mind, the agency now indeed looks to be “the most covert and potentially most intrusive intelligence agency ever,” as Wired.com puts it.
William Binney, NSA’s former senior mathematician turned whistleblower, holds his thumb and forefinger close together and tells the online magazine:
“We are that far from a turnkey totalitarian state.”
‘Everybody is a target’
Before the data can be stored it has to be collected. That groundwork was laid years ago, when the NSA created a net of secret monitoring rooms in major US telecom facilities – a practice exposed by people like William Binney in 2006.
The program allowed the monitoring of millions of American phone calls and emails every day. In 2008, Congress granted near-blanket legal immunity to telecom companies cooperating with the government on national security issues.
By then, the NSA network had long outgrown a single room in the AT&T building in San Francisco, says Binney:
“I think there are ten to twenty of them. This is not just San Francisco; they have them in the middle of the country and also on the East Coast.”
Binney suspects the new center in Utah will simply collect all the data there is to be collected. Virtually no one can escape the new surveillance, created in the US for the War on Terror.
Some data, of course, would be crucial in the anti-terrorism battle: exposing potential adversaries. The question is how the NSA defines who is and who is not a potential adversary.
“Everybody is a target; everybody with communication is a target,” remarks another source close to the Utah project.
Breaking the unbreakable
Now, the last hurdle in the NSA’s path seems to be the Advanced Encryption Standard (AES) cipher algorithm, which guards financial transactions, corporate mail, business deals, and diplomatic exchanges globally. It is so effective that the National Security Agency even recommended it for the US government.
Here, the Utah data complex may come in handy for two reasons. First: what cannot be broken today can be stored for tomorrow. Second: a system to break AES would pair a super-fast computer with vast storage capabilities, to save as many intercepted samples for analysis as possible.
The data storage in Utah, with its 1 million square feet of enclosed space, is virtually bottomless, given that a terabyte can now be stored on a tiny flash drive. Wired.com argues that the US plan to break the AES is the sole reason behind the construction of the Utah Data Center.
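Why “store now, break later”? Because brute-forcing AES outright is hopeless even at petaflop speed, as a quick keyspace calculation shows (assuming, very generously, one key trial per operation):

```python
# Keyspace arithmetic for a brute-force attack on AES-128.
# Assumes, generously, one key trial per operation on a petaflop machine.
keys = 2 ** 128                   # size of the AES-128 keyspace
trials_per_second = 1e15          # a full petaflop spent on nothing but key trials

seconds = keys / trials_per_second
years = seconds / (365.25 * 24 * 3600)

print(f"{years:.2e} years")       # on the order of 10^16 years
```

Hence the logic the article attributes to the NSA: archive the ciphertext indefinitely and wait for mathematics, engineering, or stolen keys to catch up.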
The eavesdropping issue has been rocking the US since the Watergate scandal in the 1970s, when domestic spying was eventually outlawed. Nowadays, a lot of questions are still being asked about the secret activities of the US government and whether it could be using the Patriot Act and other national security legislation to justify potentially illegal actions. The NSA’s former employees, who decided to go public, wonder whether the agency – which is to spend up to $2 billion on the heavily fortified facility in Utah – will be able to restrict itself to eavesdropping only on international communications.
Source: Russia Today
By Rob Waugh 09:26 GMT, 14 December 2011
The field of ‘synthetic biology’ is in its infancy. We can ‘tweak’ the genetics of life forms – but billionaire entrepreneur Craig Venter only created ‘artificial life’ for the first time last year, christening his life form ‘Synthia’.
But experts working within the field believe that our expertise is out-accelerating natural evolution by a factor of millions of years – and some warn that synthetic biology could spin out of control.
It could lead, says Andrew Hessel of Singularity University, on Nasa’s research campus, to a world where hackers could engineer viruses or bacteria to control human minds.
Hessel believes that genetic engineering is the next frontier of computing.
‘This is one of the most powerful technologies in the world,’ says Hessel. ‘Synthetic biology – the writing of life.’
‘I advocate that cells are living computers and DNA is a programming language.’
‘I want to see life programmed and used to solve global challenges so that humanity can achieve a sustainable relationship within the biosphere,’ he says. ‘It’s growing fast. It will grow faster than computer technologies.’
He predicts a world where we can ‘print’ DNA, and even ‘decode’ it. But he warned, in a speech at technology conference TXM, that viruses and bacteria send chemicals into human brains – and could be used to influence, or even ‘control’ their host.
A literal virus – injected into a ‘host’ in the guise of a vaccine, say – could be used to control behaviour.
Hessel warns that we ‘may have to learn how to counterattack’ against such weapons.
Security expert Marc Goodman said, ‘Synthetic biology will lead to new forms of bioterrorism,’ adding: ‘Bio-crime today is akin to computer crime in the early Eighties. Few initially recognised the problem – but it grew exponentially.’
When billionaire entrepreneur Craig Venter ‘created life’ last year by adding synthetic DNA to a bacteria cell, Professor Julian Savulescu, an Oxford University ethicist, said: ‘Venter is creaking open the most profound door in humanity’s history, potentially peeking into its destiny. This could be used in the future to make the most powerful bioweapons imaginable. The challenge is to eat the fruit without the worm.’
Hessel, however, is generally optimistic about the future of synthetic biology.
The scientist – who had a vasectomy because he ‘never trusted the process’ of natural reproduction – says, ‘We are going to make synthetic genomes – human genomes. It will make cloning look organic. It will make human reproduction look quaint.’
Computer World blogger Darlene Storm says, ‘I know people who can’t even keep their computers protected, updated and patched – I wonder if they would be more security minded when the hacking could be lethal?’
There has been some activity in recent weeks on the development of avatars, as in the film – or at least some agreement on feasibility and an intention to develop them, with real funding.
The concept is that you could inhabit another body and feel it is yours. I have written many times about direct brain links, superhuman AIs, shared consciousness and so on, since 1992, and considered a variety of ways of connecting. It has been fun exploring the possibilities and some of the obvious applications and dangers. For a few years it seemed to be just Kurzweil and me, but gradually a number of people joined in, often labelling themselves transhumanists. Now that it is more obvious how the technology might spin out, the ideas are becoming quite mainstream and no longer considered the realm of cranks. Many quite respectable scientists are now involved.
Google DARPA and avatar and you’ll see a lot of recent commentary on the DARPA project to create surrogate soldiers, just like we see them in the film. Not tomorrow, but by around 2045. Why then? Well, 2045 is the date when some of us expect to be able to do a full direct brain link, at least in prototype. I think with a lot of funding and the right brains involved, it is entirely achievable then.
But DARPA won’t have it all to themselves. The Russians are also looking at it, and hosted a recent conference. Dmitry Itskov, founder of Russia 2045, has been given permission to develop his own avatar program. Check this out:
From their conference press release:
The first Global Future Congress 2045 (GF2045) was held on Feb. 17-20 in Moscow, where 56 world-leading physicists, biologists, anthropologists, sociologists, psychologists and philosophers met to discuss breakthroughs in life extension technologies and draft a resolution to the United Nations setting the radical lengthening of human lifespan and the creation of Avatars as a priority for preservation of humankind.
About 500 people attended the three-day event featuring presentations by over 50 scientists including inventor Ray Kurzweil, Microsoft Research Director Rane Johnson-Stempson, and Astronaut Sergey Krichevskiy. The event was focused on breakthrough technologies that could create a synthetic body-vessel for the mind, offering humans unlimited prolongation of life to the point of immortality…
Among the featured life-extension projects is “2045” a Russia-based Avatar project consisting of three phases. First, to create a humanoid robot named “Avatar”, and a state-of-the-art brain-computer interface system. Next, to create a life support system between the “Avatar” and the human brain. The final step is creating an artificial brain in which to transfer the original individual consciousness.
Development of a cybernetic body is about as advanced as it gets currently. You can link to nerves, and transmit signals to and from them to capture and relay sensations. But this will progress quickly over coming years as we start seeing strong positive feedback among the nano-bio-info-cogno disciplines. I’m just annoyed that I am not starting my career about now; it would be an excellent time to do so. But at least I’ll get pleasure from saying ‘I told you so’ a few times.
I won’t repeat all the exciting possibilities for the military, sex and games industries, or electronic immortality, I’ve blogged enough on these. For now, it’s just great to see the field moving another important step further from sci-fi into the realms of reality.
In a scene right out of a George Orwell novel, a team of scientists working in the fields of “neural engineering” and “Biomimetic MicroElectronic Systems” have successfully created a chip that controls the brain and can be used as a storage device for long-term memories. In studies the scientists have been able to record, download and transfer memories into other hosts with the same chip implanted. The advancement in technology brings the world one step closer to a global police state and the reality of absolute mind control.
More terrifying is the potential for implementation of what was only a science fiction fantasy – the “Thought Police” – where the government reads people’s memories and thoughts and can then rehabilitate them through torture before they ever even commit a crime based on a statistical computer analysis showing people with certain types of thoughts are likely to commit a certain type of crime in the future.
The Matrix reality: Scientists successfully implant artificial memory system
It seems the sci-fi industry has done it again. Predictions made in novels like Johnny Mnemonic and Neuromancer back in the 1980s of neural implants linking our brains to machines have become a reality.
Back then it seemed unthinkable that we’d ever have megabytes stashed in our brain as Keanu Reeves’ character Johnny Mnemonic did in the movie based on William Gibson’s novel. Or that The Matrix character Neo could have martial arts abilities uploaded to his brain, making famous the line, “I know Kung Fu.” (Why Keanu Reeves became the poster boy of sci-fi movies, I’ll never know.) But today we have macaque monkeys that can control a robotic arm with thoughts alone. We have paraplegics given the ability to control computer cursors and wheelchairs with their brain waves. Of course this is about the brain controlling a device. But what about the other direction where we might have a device amplifying the brain? While the cochlear implant might be the best known device of this sort, scientists have been working on brain implants with the goal to enhance memory. This sort of breakthrough could lead to building a neural prosthesis to help stroke victims or those with Alzheimer’s. Or at the extreme, think uploading Kung Fu talent into our brains.
Decade-long work led by Theodore Berger at University of Southern California, in collaboration with teams from Wake Forest University, has provided a big step in the direction of artificial working memory. Their study is finally published today in the Journal of Neural Engineering. A microchip implanted into a rat’s brain can take on the role of the hippocampus—the area responsible for long-term memories—encoding memory brain wave patterns and then sending that same electrical pattern of signals through the brain. Back in 2008, Berger told Scientific American that if the brain patterns for the sentence, “See Spot Run,” or even an entire book could be deciphered, then we might make uploading instructions to the brain a reality. “The kinds of examples [the U.S. Department of Defense] likes to typically use are coded information for flying an F-15,” Berger is quoted in the article as saying.
In the current study the scientists had rats learn a task: pressing one of two levers to receive a sip of water. The scientists inserted a microchip into each rat’s brain, with wires threaded into the hippocampus. There the chip recorded electrical patterns from two specific areas, labeled CA1 and CA3, that work together to learn and store the new information of which lever to press to get water. The scientists then shut down CA1 with a drug, built an artificial hippocampal component that could duplicate the electrical patterns passing between CA3 and CA1, and inserted it into the rat’s brain. With this artificial part, rats whose CA1 had been pharmacologically blocked could still encode long-term memories. And in rats with a normally functioning CA1, the new implant extended the length of time a memory could be held.
Source: Smart Planet
USC: Restoring Memory, Repairing Damaged Brains
Biomedical engineers analyze—and duplicate—the neural mechanism of learning in rats
LOS ANGELES, June 17, 2011 /PRNewswire-USNewswire/
Scientists have developed a way to turn memories on and off—literally with the flip of a switch.
For stroke or Alzheimer’s victims, the promise of Dr. Theodore Berger’s recent breakthrough is enormous: imagine a prosthetic chip inserted in the brain that imitates the function of a brain’s damaged hippocampus (the region associated with long-term memory). The current successful laboratory tests on rats, restoring long-term memory at the flick of a switch, will next be duplicated in primates (monkeys) and eventually humans. (PRNewsFoto/USC Viterbi School of Engineering)
Using an electronic system that duplicates the neural signals associated with memory, they managed to replicate the brain function in rats associated with long-term learned behavior, even when the rats had been drugged to forget.
“Flip the switch on, and the rats remember. Flip it off, and the rats forget,” said Theodore Berger of the USC Viterbi School of Engineering’s Department of Biomedical Engineering.
Berger is the lead author of an article that will be published in the Journal of Neural Engineering. His team worked with scientists from Wake Forest University in the study, building on recent advances in our understanding of the brain area known as the hippocampus and its role in learning.
In the experiment, the researchers had rats learn a task, pressing one lever rather than another to receive a reward. Using embedded electrical probes, the experimental research team, led by Sam A. Deadwyler of the Wake Forest Department of Physiology and Pharmacology, recorded changes in the rat’s brain activity between the two major internal divisions of the hippocampus, known as subregions CA3 and CA1. During the learning process, the hippocampus converts short-term memory into long-term memory, the researchers’ prior work has shown.
“No hippocampus,” says Berger, “no long-term memory, but still short-term memory.” CA3 and CA1 interact to create long-term memory, prior research has shown.
In a dramatic demonstration, the experimenters blocked the normal neural interactions between the two areas using pharmacological agents. The previously trained rats then no longer displayed the long-term learned behavior.
“The rats still showed that they knew ‘when you press left first, then press right next time, and vice-versa,’” Berger said. “And they still knew in general to press levers for water, but they could only remember whether they had pressed left or right for 5-10 seconds.”
Using a model created by the prosthetics research team led by Berger, the teams then went further and developed an artificial hippocampal system that could duplicate the pattern of CA3-CA1 interactions.
Long-term memory capability returned to the pharmacologically blocked rats when the team activated the electronic device programmed to duplicate the memory-encoding function.
In addition, the researchers went on to show that if a prosthetic device and its associated electrodes were implanted in animals with a normal, functioning hippocampus, the device could actually strengthen the memory being generated internally in the brain and enhance the memory capability of normal rats.
“These integrated experimental modeling studies show for the first time that with sufficient information about the neural coding of memories, a neural prosthesis capable of real-time identification and manipulation of the encoding process can restore and even enhance cognitive mnemonic processes,” says the paper.
Next steps, according to Berger and Deadwyler, will be attempts to duplicate the rat results in primates (monkeys), with the aim of eventually creating prostheses that might help the human victims of Alzheimer’s disease, stroke or injury recover function.
The paper is entitled “A Cortical Neural Prosthesis for Restoring and Enhancing Memory.” Besides Deadwyler and Berger, the other authors are, from USC, BME Professor Vasilis Z. Marmarelis and Research Assistant Professor Dong Song, and from Wake Forest, Associate Professor Robert E. Hampson and Post-Doctoral Fellow Anushka Goonawardena.
Berger, who holds the David Packard Chair in Engineering, is the Director of the USC Center for Neural Engineering, Associate Director of the National Science Foundation Biomimetic MicroElectronic Systems Engineering Research Center, and a Fellow of the IEEE, the AAAS, and the AIMBE.
SOURCE USC Viterbi School of Engineering
Following the link to the University website we find the following research centers and programs associated with the school.
National Research Centers
» Biomimetic MicroElectronic Systems
» Center for Energy Nanoscience
» Integrated Media Systems Center
» DHS Center for Risk and Economic Analysis of Terrorism Events
» The National Center for Metropolitan Transportation Research
» Listing of Viterbi School Research Centers and Labs
This technology has potential for a wide array of applications. It could even be the breakthrough needed to create the first long-imagined artificial intelligence network.
However, given the association between the University and the Federal Government’s Department of Homeland Security, and related studies on terrorism, which is constantly being used as an excuse to chip away at the civil liberties and constitutional rights of US citizens, my bets are the Feds will use this in the war on terror before they try using it for good.
That means the potential for misuse to enact a true Orwellian-style “thought police” and even the ability to implement complete mind control among hosts.
From cosmetics to cars, many products we use on a daily basis already utilise nanotechnology – but are you aware of the implications? Penny Sarchet and David Adam explain all you need to know
What is nanotechnology?
Nanotechnology is technology that operates on the nanoscale, about one billionth of a metre. If a living cell were a large city, then a nanometre would be about the size of a car. Nanotechnology is the art of engineering down at this hard-to-fathom scale.
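The cell-as-city analogy can be sanity-checked with rough figures (the cell, city and car sizes below are assumed, typical values, not taken from the article):

```python
# Checking the scale analogy: a nanometre is to a cell as a car is to a city.
# The cell, city and car sizes are assumed, typical values.
nanometre = 1e-9    # metres
cell = 10e-6        # a typical living cell, ~10 micrometres across (assumed)
city = 10e3         # a large city, ~10 km across (assumed)
car = 4.0           # a car, ~4 metres long

print(nanometre / cell)   # 1e-4
print(car / city)         # 4e-4: same order of magnitude, so the analogy holds
```

Both ratios come out around one part in ten thousand, which is why the comparison works.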
The idea started in 1959 when famous physicist Richard Feynman suggested we could manipulate individual atoms and use them to build tiny machines. However, the term “nanotechnology” was not coined until the 1980s and lumps together different and varied ideas.
All that unites these different technologies is that they use nano-sized building blocks. While other technologies make machines out of bulk materials – microchips out of silicon, wires out of copper, cars out of steel – nanotechnology makes machines out of large, complex molecules. Because nanotechnology works at such an extreme and unexplored scale, it opens up a world of new possibilities. Many nanomaterials possess special properties, such as great strength or high ability to conduct electricity.
Scientists working in the field of nanotechnology often look to nature to provide ideas for “smart” ways to solve complex problems. For example, spider silk and lotus leaves have both been studied in order to replicate their special properties, ie their tensile strength or ability to repel water, in engineered materials.
Why should I care about nanotechnology?
Nanotechnology is not just one technology, it is a whole new toolkit. It has the potential to change almost everything, from unimaginably small computer chips to tiny machines that find and fix damaged arteries inside our bodies. Nanotechnology could make our energy cleaner, our lives longer, and all of our existing technologies better.
This is not just a distant, science fiction-like dream – nanomaterials are already present in more than 1,000 consumer products, from cosmetics to cars. However, because nanotechnology works on such a new scale, it can be difficult to assess its dangers.
Nanotechnology is a powerful tool for answering some of our most difficult questions. Scientists say we cannot afford to ignore the new medical, agricultural and environmental technologies it could provide as we search for solutions to an expanding global population.
Consumers will face different risks than the workers who manufacture these products, since it is the workers who are exposed to high doses. Various agencies are involved in making sure protection measures are adequate for both groups.
Where is nanotechnology used?
Nanotechnology is used in every sector you can think of, from the cars we drive to the clothes we wear.
It is easy to see why ultra-lightweight materials with special electrical properties are useful, in electronics and computing for example, but nanomaterials are also present in scratch-resistant car bumpers, self-cleaning glass and anti-odour socks. Nanotech materials are being added to health and fitness products, from anti-ageing creams to skis.
Silver is a powerful anti-microbial agent and more than 300 products use nanoscale silver to make anti-bacterial surfaces, clothing and even condoms.
Nanotechnology can also address environmental concerns. Nanotech catalytic converters and water filters remove environmental pollutants from our exhaust fumes and waste water. Wind turbines with nanomaterials are more efficient and cheaper, and nanocrystal solar panels are just around the corner. Nanotechnology is already providing fresh solutions wherever it is applied.
The first use of nanomedicine was approved back in 1995 to treat cancer; since then researchers have continued to find new ways for nanotechnology to combat diseases.
Nanosilver, if over-used in consumer products that have a short lifespan, might cause harm to the environment.
How could it be used in the future?
Today, many of our products are improved by nanomaterials. In the future, however, nanotechnology aims to use these nanomaterials to construct tiny nano-engineered machines, computers and medicines.
In energy research, we may see a shift from using nanomaterials to improve existing technologies to using nanotechnology to develop entirely new ways of harnessing energy and, in the future, cars might be powered cleanly by hydrogen, stored safely in a solid form thanks to nanotechnology.
One day, nanotechnology may allow us to build any kind of structure we want from atomic building blocks – for example, powerful computers capable of processing as much information as a DNA molecule.
Researchers are already developing nanomedicines that target the proteins that accumulate in the brains of Alzheimer’s patients, and nanoparticles that bind to tumour cells and treat them with antibodies. The big step will be using them to treat human diseases.
As with the adoption of all technologies, it is important to understand the motivation for doing so (eg money, public benefit) and its impacts, particularly risks.
Should nanoproducts be labelled?
In November 2009, the European Union passed a law that will soon force manufacturers of cosmetics to state on the label if their products contain nanoparticles.
Will more products follow? The issue of labelling has divided campaigners and politicians, just as it has with whether foods should be labelled as containing genetically modified ingredients.
Supporters of labels on nanotech say they will encourage consumer choice. Opponents say they will be meaningless without extra information on whether the nano-ingredients pose any risk. Without context, such labels could be misinterpreted as warnings, they say.
Lipsticks and face creams that contain nanoparticles will soon be labelled as such. Perhaps “nano-free” could then start to appear on products too?
Labels might be a shortcut for industry to appear “transparent” without really informing the public.
Is the use of nanomaterials in food safe?
The food industry has been slow to exploit nanotechnology, perhaps because of public attitudes towards “non-natural” foods. But there are numerous ways the technology could be used, from stronger plastic films to keep sandwiches fresh for longer, to adding flavours to foods.
In most cases, the development of such techniques has proceeded faster than the safety checks needed to assess them – could nanoparticles escape from plastic wrapping to enter a ham baguette? Would that packaging need to be disposed of differently, and could it be recycled?
The European Food Safety Authority is keeping a close watch on developments.
Nanoparticles could indicate the presence of harmful bacteria in foods.
The safety of nanomaterials used in food and food packaging must be fully assessed to avoid unwanted side effects.
What is the real added value of nanotech?
Nanotechnology is such a broad term that it is difficult to generalise about the value it could bring.
There will be clear social value, in the form of cleaner energy and improved medicines; and individual value, in that nanotechnology used in sporting equipment, for example, would improve an individual’s performance. On the other hand, having hundreds of antibacterial products will most likely not bring many additional benefits to society.
Value, to a certain extent, is in the eye of the user of the technology. A patient who can benefit from one type of tiny particle injected into their bloodstream to test for heart disease and so avoid surgery, for example, would probably see more added value than an average tennis player who started playing with a racket made and improved by using nanomaterials. A mis-hit still goes into the net after all, nanotech racket or not.
Nanotechnology is being used as a marketing tag to sell everything from socks and t-shirts to sports equipment. The true benefit comes from the way it is used by the consumer.
Do nanoproducts require special disposal?
One of the concerns over the widespread use of nanotechnology is that tiny particles could escape from where they are intended to be used and end up polluting the wider environment.
This is not a problem confined to nanotechnology, of course, and a number of chemicals that we know to be toxic are used in everyday products such as light bulbs and batteries. Consumers are asked to dispose of these products carefully, but there is no way to be sure that they will do so. And some escapes are impossible to prevent – much of the mercury in the atmosphere, for example, comes from the cremation of people with dental fillings.
The effects of nanoparticles on the wider environment are largely unknown, so perhaps a cautious approach is the best one?
Factories and research laboratories could release nanoparticles in their waste streams. Many international bodies are looking at ways in which accidental release of nanoparticles can be monitored and prevented.
Toxic particles could pollute the natural environment and kill wildlife.
KiPnews related articles:
Neotame is officially marketed as an inexpensive artificial sweetener made by NutraSweet, which is a former division of Monsanto and original manufacturer of aspartame.
Eighty percent of all Food and Drug Administration (FDA) complaints concern adverse reactions to aspartame.
These reports include grand mal seizures, brain tumors, blindness and other health-related problems, including deaths. Monsanto’s Nick Rosa stated in 1998 that Neotame is “based on the aspartame formula.”
It is up to 13,000 times sweeter than sucrose (table sugar). The product is very attractive to food manufacturers, as its use greatly lowers the cost of production compared to using sugar or high fructose corn syrup (due to the lower quantities needed to achieve the same sweetening).
Neotame is aspartame combined with 3,3-dimethylbutyraldehyde, which can be found on the EPA’s list of most hazardous chemicals.
The aspartame formula comprises phenylalanine (50%), which caused seizures in lab animals, and aspartic acid (40%), which caused “holes in the brains” of lab animals, bonded by methyl alcohol, or methanol (10%), which is capable of causing blindness, liver damage and death.
Methanol, or wood alcohol, in aspartame breaks down further in heat and in the body into formaldehyde (embalming fluid), formic acid (the venom in ant stings) and, most deadly of all, diketopiperazine (DKP), a brain tumor agent.
When it comes to human health, neotame is in the same dangerous category as aspartame, but it is a deadlier neurotoxin, immunotoxin and excitotoxin. The long-term effects are essentially cell death.
Even Monsanto’s own pre-approval studies of neotame revealed adverse reactions. Unfortunately, Monsanto only conducted a few one-day studies in humans rather than encouraging independent researchers to obtain NIH funding to conduct long-term human studies on the effects of neotame.
There were NO independent studies that found neotame to be safe. All industry-funded studies are now being found to be based on very poorly designed, deceptive and fraudulent research.
This is no surprise given all of the problems with aspartame industry research and scientific abuse. It is clear that any neotame research conducted by Monsanto, industry groups, or consultants of Monsanto should be rejected until such time as more trustworthy, independent research can be conducted. Such experiments should include independent animal studies and especially long-term (e.g., 4-5+ years) human studies in various susceptible population groups.
Approval and Labeling
Neotame was approved by the FDA for general use in July 2002, and has since been approved by the EU. It is also approved for use in Australia, New Zealand and Canada.
The FDA loosened all labeling requirements for Neotame as part of a large-scale effort to make it a near-ubiquitous artificial sweetener, to be found on the tabletop, in all prepared foods, even in organics. It simply does not have to be included in the ingredient list. How’s that for stealth?
If you purchase processed foods, whether USDA Certified Organic or not, that food may well contain Neotame, because it is cost-effective and, since no one knows it is there, there is no public backlash.
The USDA states that their National Organic Program (NOP) does not permit the use of neotame in products labeled certified organic, however this is likely a deceptive ploy to soothe the public’s concerns about this toxic sweetener.
Since the USDA is controlled by politicians and lobbyists, it cannot be trusted to follow through on any of its regulatory policies. The NOP is the division within the USDA in charge of regulating USDA Certified Organic products, labeling, enforcement and so on. Considering the size of this division compared to the amount of organic food it regulates, NOP standards are arguably as lax and useless as the USDA’s standards for conventional foods. The employees who enforce NOP standards know this very well.
Bottom Line: Don’t trust USDA organic foods; instead, rely on local farms with reputable practices.
Where Do We Go From Here?
Due to corporate greed, it is becoming quite apparent that the entire food supply is becoming one toxic wasteland that none of us can rely on. We need to support local farms and move our sustenance back to sustainable farming practices that benefit the population rather than harm it.
If you’re still consuming processed foods with artificial sweeteners, you are gambling with your long-term well being.
There are no corporations that serve agribusiness that can be trusted to safeguard public health, and the regulatory agencies that are officially in charge of that mandate are in bed with them. Where does that leave the safety of the food industry? I think you can figure that one out.
Marco Torres is a research specialist, writer and consumer advocate for healthy lifestyles. He holds degrees in Public Health and Environmental Science and is a professional speaker on topics such as disease prevention, environmental toxins and health policy.