Friday, July 7, 2017

Examples of Using Augmented Glasses and Contact Lenses For Spy Purposes

Watch the video below. The sound is very bad; turn down your speakers and just read the subtitles provided in the video. To understand this better, see this post, which covers the topic of using augmented glasses and contact lenses. For more about the DARPA artificial intelligence grid, see here.

Thursday, July 6, 2017

IBM Builds A Scalable Computer Chip Inspired By The Human Brain

See here for another article on this topic. See here for more about DARPA brain projects, and see here for more about the B.R.A.I.N. Initiative.


By Alex Knapp


"I’m holding in my hand a chip with one million neurons, 256 million synapses, and 4096 cores. With 5.4 billion transistors, it's the largest chip IBM has built."

Dr. Dharmendra S. Modha sounds positively giddy as he talks to me on the phone. This is the third time I've talked to him about his long-term IBM project, SyNAPSE, whose goal is to create an entirely new type of computer chip with an architecture inspired by the human brain. This new chip is a major success for that project.

"Inspired" is the key word, though. The chip's architecture is based on the structure of our brains, but greatly simplified. Still, within that architecture lie some amazing advantages over computers today. For one thing, despite this being IBM's largest chip, it draws only a tiny amount of electricity - about 63 mW - a fraction of the power drawn by the chip in your laptop.

What's more, the new chip is also scalable - making possible larger neural networks of several chips connected together. The details behind the research have been published today in Science.

"In 2011, we had a chip with one core," Modha told me. "We have now scaled that to 4096 cores, while shrinking each core 15x by area and 100x by power."

Each core of the chip is modeled on a simplified version of the brain's neural architecture. The core contains 256 “neurons” (processors), 256 “axons” (memory) and 64,000 “synapses” (communications between neurons and axons). This structure is a radical departure from the von Neumann architecture that's the basis of virtually every computer today (including the one you're reading this on).
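As a rough sketch of the structure described above - illustrative only, not IBM's actual TrueNorth implementation - a single core can be modeled as a crossbar connecting axons (inputs) to neurons (outputs), with 256 × 256 = 65,536 possible connections (the "64,000 synapses" figure rounds this down):

```python
# Hypothetical model of one neurosynaptic core: a 256x256 crossbar
# connects 256 axons (inputs) to 256 neurons (outputs). Not IBM's
# actual design - just an illustration of the architecture.
import numpy as np

AXONS, NEURONS = 256, 256
rng = np.random.default_rng(0)

# Binary crossbar: synapses[i, j] == 1.0 means axon i connects to neuron j.
synapses = (rng.random((AXONS, NEURONS)) < 0.1).astype(float)

def core_step(spikes_in, potentials, threshold=3.0, leak=0.5):
    """One tick: integrate spikes through the crossbar, leak, fire, reset."""
    potentials = potentials + spikes_in @ synapses   # deliver synaptic input
    potentials = np.maximum(potentials - leak, 0.0)  # leak toward rest
    fired = potentials >= threshold                  # threshold crossing
    potentials = np.where(fired, 0.0, potentials)    # reset fired neurons
    return fired.astype(float), potentials

spikes = (rng.random(AXONS) < 0.2).astype(float)    # random input spikes
out_spikes, pots = core_step(spikes, np.zeros(NEURONS))
```

The threshold, leak, and connection density here are arbitrary choices for the sketch; the point is the memory (crossbar) sitting right next to the compute (neurons), rather than across a von Neumann bus.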

Work on this project began in 2008 as a collaboration between IBM and several universities. The project has received $53 million in funding from the Defense Advanced Research Projects Agency (DARPA). The first prototype chip was developed in 2011, and a programming language and development kit were released in 2013.

"This new chip will provide a powerful tool to researchers who are studying algorithms that use spiking neurons," Dr. Terrence J. Sejnowski told me. Sejnowski heads the Computational Neurobiology Laboratory at the Salk Institute. He's unaffiliated with IBM's project but is familiar with the technology. "We know that such algorithms exist because the brain uses spiking neurons and can outperform all existing approaches, with a power budget of 20 watts, less than your laptop."

It's important to note, though, that the SyNAPSE system won't replace the computers of today - rather, it's intended to supplement them. Modha likened it to the co-processors used in high performance computers to help them crunch data faster. Or, in a more poetic turn as he continued talking to me, he called SyNAPSE a "right-brained" computer compared to the "left-brained" architecture used in computers today.

"Current von Neumann machines are fast, symbolic, number-crunchers," he said. "SyNAPSE is slow, multi-sensory, and better at recognizing sensor data in real-time."

So to crunch big numbers and do heavy computational lifting, we'll still need conventional computers. Where these "cognitive" computers come in is in analyzing and discerning patterns in that data. Key applications include visual recognition of patterns - something that Dr. Modha notes would be very useful for applications such as driverless cars.

As Sejnowski told me, "The future is finding a path to low power computing that solves problems in sensing and moving -- what we do so well and digital computers do so awkwardly."

And that's what IBM is looking to do with SyNAPSE - finding the patterns that normal computers can't. As Modha put it, "Google Maps can plot your route, but SyNAPSE can see if there's a pothole."

What gives the SyNAPSE an advantage in pattern recognition is that, unlike a traditional computer, which crunches data sequentially, its brain-inspired architecture allows for more parallel processing. For example, in a facial recognition app, one core of the chip might be focused on nose shape, one on hair texture and color, one on eye color, etc. Each individual core is slower than a traditional processor, but since they run simultaneously in parallel, the chip as a whole can perform this type of operation much more quickly and accurately.
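The divide-the-features-across-cores idea in the facial recognition example can be sketched in a few lines. The feature scorers and the face data here are entirely hypothetical - this is a toy picture of the parallelism, not IBM's software:

```python
# Toy illustration: each "core" judges one facial feature independently,
# and the per-core results are combined into an overall match score.
# The scorers and attributes are made up for the example.
from concurrent.futures import ThreadPoolExecutor

def score_nose(face):
    return 1.0 if face.get("nose") == "aquiline" else 0.0

def score_hair(face):
    return 1.0 if face.get("hair") == "brown" else 0.0

def score_eyes(face):
    return 1.0 if face.get("eyes") == "green" else 0.0

CORES = [score_nose, score_hair, score_eyes]

def match(face):
    # Run every scorer concurrently, mimicking cores working in parallel.
    with ThreadPoolExecutor(max_workers=len(CORES)) as pool:
        scores = list(pool.map(lambda core: core(face), CORES))
    return sum(scores) / len(scores)

suspect = {"nose": "aquiline", "hair": "brown", "eyes": "blue"}
print(round(match(suspect), 2))  # 2 of 3 features match -> 0.67
```

On a conventional CPU the threads still share a few fast sequential cores; the hardware described in the article instead gives every feature its own slow, cheap core, which is why the whole-chip throughput stays high at tiny power.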

Other potential applications for the chip include use in cameras to automatically identify interesting items in cluttered environments. Modha's team also believes that the chip could be quite useful in natural language processing - being able to parse out and obey commands from people. Kind of like the computers on Star Trek that understood when they were in use and when people were just talking among themselves.

It probably won't be long before we see more of these applications in action. The scalable chip that IBM developed was built using the same conventional fabrication techniques used for other chips - it just requires a somewhat different workflow.

Already over 200 programs have been developed for the chip, thanks to a simulation of the architecture running on supercomputers at the Lawrence Livermore and Lawrence Berkeley National Laboratories. Those simulations allowed IBM to develop a programming language for the chip even before it existed.

"We've been working with IBM for the last 18 months and are extremely impressed with their achievement," Prof. Tobi Delbruck of the Institute of Neuroinformatics at UZH-ETH Zurich told me. "Applications like real time speech and vision that run continuously on battery power may finally be within reach."

"It's too soon to say who will win the race to implement practical realizations of brain-like computing in silicon," Delbruck added, "but IBM's solution is a serious contender."

Now that this new chip architecture has been developed and a fabrication technique set up, Modha said that the technology is "like the 4 minute mile. Now that someone's done it, a lot of people can do it."

To help facilitate the development of the chip, both on the hardware and software side, IBM has developed a teaching curriculum for universities, its customers, its employees, and more.

On the hardware end, Modha's next goal is the development of what he calls a "neurosynaptic supercomputer." This would be a traditional supercomputer that uses both traditional and SyNAPSE chips - a computer with both a left and right brain, as it were - enabling it both to crunch numbers and quickly analyze real-time patterns as the data's crunched.

One question that Modha couldn't answer, though, was what the new chip means for video games - nobody's programmed one for SyNAPSE yet.
"That's an interesting question," he laughed. "But we're too busy for games!"

More Examples of Classified Technology That Can Be Used In Contact Lenses and Sunglasses


Watch The Videos Below!

See here for more about mind control, light, and synthetic biology. See here for more about the world's smallest robots and nanotechnology. 

See here for a more complete list of nanotechnology. See here for more about mind control and brain chips. See here for changing brain structure and Transhumanism. 


See here for more about classified technology. See here for a clip from the movie Gamer about using what is called "synthetic telepathy" to influence others. See this Cisco ad referring to synthetic telepathy, see here for Mark Zuckerberg talking about it. See here and here for more about connecting brains: merging two brains to become one --- or the ethical implications of when "I" becomes "We."  
What can you do when you change the EEG of another person? Effectively, two brains become one. All of this manipulation of humans can be done by computers (computer to human) or by other human beings (from one human to another when both are hooked up to a computer).

See this previous post about Microsoft HoloLens and this one for more about something similar in contact lenses. As the movie "They Live" shows, what helps the good guys to see the villains is special sunglasses or contact lenses... now you know why. (Of course, some of the organized stalkers just use their cell phones to receive messages from those in higher places with access to more information or more advanced technology.) The CSIS videos literally show intelligence agents stalking and following people.


Imagine being able to "dial" into a person and listen to their thoughts, not so hard to believe once you realize that everyone can easily be tied to something similar to a phone number or an I.P. address. This is also what they talk about in the movie Gamer. Meaning, they can listen to your thoughts through an earpiece or a pair of headphones they are wearing, or on a phone app. Just like a radio station. Once they have the technology inside your body, they can look out your eyes, (see this old clip of a scientist looking out of a cat's eyes, what they have now is way more advanced. They can see CLEARLY OUT YOUR EYES NOW.) This is like in the movie Gamer or Surrogates and when you are talking to them....they can EVEN SEE what you are thinking. 


Once you know that everything is light and resonance, and that light and sound can be measured, that means what you are thinking (photons) can be seen, read and sent. If you can think it, you can send it. Just like writing an email with your mind. Once they can decode your EEG, it doesn't take much more effort for them to use augmented cognition, augmented reality and holography (like in the videos below) so they can see what you are thinking. (See here, where the physicist Dr. John Morgan says, 2 minutes and 10 seconds into the video, that all natural things emit radio waves. See this video here for how Michio Kaku admits the brain radiates radio. Also, watch this video with Michael Persinger where he talks about how light and photons can be measured.)

Sound nuts? When this technology is finally revealed, people will see I am telling the truth: it is real. Those with access to it can see almost anything they want. Just as a thought experiment, you already know that Google Earth can see into your backyard or see your house; that technology is ancient and was sold to Google by the CIA. On May 25, 2007, the U.S. Director of National Intelligence Michael McConnell authorized the National Applications Office (NAO) of the Department of Homeland Security to allow local, state, and domestic federal agencies to access imagery from military intelligence reconnaissance satellites and reconnaissance aircraft sensors, which can now be used to observe the activities of U.S. citizens. The satellites and aircraft sensors will be able to detect chemical traces and identify objects in buildings and "underground bunkers," and will provide real-time video at much higher resolutions than the still images produced by programs such as Google Earth.


But that is not the worst of it. Right now there are mini nanotechnology cameras floating around everywhere. (See here for a quick introduction.)  There could also be a bunch of people acting like your friends who are really spies and know all of this. They could have access to your brain and your house. They could be watching you, your family or your kids. EVEN WORSE, IT COULD BE A FAMILY MEMBER OR YOUR SPOUSE! They will not tell you the truth because they are compulsive lying weasels. They are trained to lie. YES, IT IS CREEPY AS HELL. It is absolutely amazing that our governments and the media are covering this up from the population and guess what? WE PAY FOR IT! This is what our tax dollars go to in the intelligence agencies. 

Think of all the people using Google or who are on Facebook and Twitter. Now you know one way they can find people. If you are an interesting person, smart or attractive, there is probably someone watching you. See here and here for counterintelligence against political dissidents. 







Physicist Talks About Mind Reading and Brain Imaging

Michio Kaku talks about reading minds, recording dreams and brain imaging. To see an old clip of scientists looking out a cat's eyes, go here. See here for more about nanofibers in the brain. See here for more about nanotechnology. See here for the world's smallest robots. You'll also notice that he mentions that the brain "radiates" radio. Just as the physicist Dr. John Morgan says, 2 minutes and 10 seconds into this video, all natural things emit radio waves. Also, watch this video with Michael Persinger where he talks about how light and photons can be measured. Now, see this video about how eyes emit energy. This plays into what I am saying here with classified technology.


Wednesday, July 5, 2017

Radio Telescope: See a Book on the Moon From Earth

Watch the video below, listen to what he says 1 minute and 24 seconds in. What do you think they have for intelligence agencies?  Imagine something even more powerful than the Hubble Telescope turned towards the Earth and used for surveillance. (See here for more about this.)

See here, where the physicist Dr. John Morgan says 2 minutes and 10 seconds into the video, that all natural things emit radio waves. Watch this video with Michael Persinger where he talks about how light and photons can be measured. Also, take notice in this video when the physicist mentions that the brain "radiates" radio. 


On May 25, 2007, the U.S. Director of National Intelligence Michael McConnell authorized the National Applications Office (NAO) of the Department of Homeland Security to allow local, state, and domestic federal agencies to access imagery from military intelligence reconnaissance satellites and reconnaissance aircraft sensors, which can now be used to observe the activities of U.S. citizens. The satellites and aircraft sensors will be able to detect chemical traces and identify objects in buildings and "underground bunkers," and will provide real-time video at much higher resolutions than the still images produced by programs such as Google Earth. See here and here for more.

But that's not all: these people have access to technology that can find oil and mineral deposits underground with up to 99% accuracy. That includes diamonds! (See this ad from Shell as an example. Also, see here for an example of synthetic aperture radar satellites.)

IBM Develops a New Chip That Functions Like a Brain


See here for more about DARPA brain projects, and see here for more about the B.R.A.I.N. Initiative.


By John Markoff


Inspired by the architecture of the brain, scientists have developed a new kind of computer chip that uses no more power than a hearing aid and may eventually excel at calculations that stump today’s supercomputers.

The chip, or processor, is named TrueNorth and was developed by researchers at IBM and detailed in an article published on Thursday in the journal Science. It tries to mimic the way brains recognize patterns, relying on densely interconnected webs of transistors similar to the brain’s neural networks.

The chip’s electronic “neurons” are able to signal others when a type of data — light, for example — passes a certain threshold. Working in parallel, the neurons begin to organize the data into patterns suggesting the light is growing brighter, or changing color or shape.
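The threshold-and-fire behavior described above can be shown with a single simulated neuron. This is a minimal integrate-and-fire sketch, not TrueNorth's actual neuron model; the threshold and leak values are arbitrary:

```python
# One electronic "neuron": it accumulates an input signal (e.g. a light
# reading) each tick, leaks a little, and emits a spike when its
# potential crosses a threshold, resetting afterward. Illustrative only.
def spiking_neuron(signal, threshold=0.3, leak=0.05):
    potential, spikes = 0.0, []
    for sample in signal:
        potential = max(potential + sample - leak, 0.0)  # integrate + leak
        if potential >= threshold:
            spikes.append(1)   # fire: signal downstream neurons
            potential = 0.0    # reset after firing
        else:
            spikes.append(0)
    return spikes

# A steadily brightening light eventually drives the neuron to fire;
# dim input below the leak rate never produces a spike.
brightening = [0.01 * i for i in range(1, 21)]
print(spiking_neuron(brightening))
```

Downstream neurons receiving these spikes can, in aggregate, encode patterns such as "the light is growing brighter," which is the organizing idea the article describes.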

The processor may thus be able to recognize that a woman in a video is picking up a purse, or control a robot that is reaching into a pocket and pulling out a quarter. Humans are able to recognize these acts without conscious thought, yet today’s computers and robots struggle to interpret them.

The chip contains 5.4 billion transistors, yet draws just 70 milliwatts of power. By contrast, modern Intel processors in today’s personal computers and data centers may have 1.4 billion transistors and consume far more power — 35 to 140 watts.

Today’s conventional microprocessors and graphics processors are capable of performing billions of mathematical operations a second, yet the new chip’s system clock makes its calculations barely a thousand times a second. But because of the vast number of circuits working in parallel, it is still capable of performing 46 billion operations a second per watt of energy consumed, according to IBM researchers.
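The quoted figures can be sanity-checked with a little arithmetic, using the 70-milliwatt power draw reported earlier in the article:

```python
# Back-of-the-envelope check of the figures quoted above.
ops_per_watt = 46e9   # 46 billion synaptic operations per second per watt
power_watts = 70e-3   # the chip draws about 70 milliwatts

total_ops_per_sec = ops_per_watt * power_watts
print(f"{total_ops_per_sec / 1e9:.1f} billion ops/sec")  # 3.2 billion ops/sec
```

So the chip's absolute throughput is modest next to a desktop CPU; the headline number is efficiency, not raw speed.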

The TrueNorth has one million “neurons,” about as complex as the brain of a bee.

“It is a remarkable achievement in terms of scalability and low power consumption,” said Horst Simon, deputy director of the Lawrence Berkeley National Laboratory.

He compared the new design to the advent of parallel supercomputers in the 1980s, which he recalled was like moving from a two-lane road to a superhighway.

The new approach to design, referred to variously as neuromorphic or cognitive computing, is still in its infancy, and the IBM chips are not yet commercially available. Yet the design has touched off a vigorous debate over the best approach to speeding up the neural networks increasingly used in computing.

[Photo: A silicon chip relies on webs of transistors similar to the brain’s neural networks. Credit: IBM]

The idea that neural networks might be useful in processing information occurred to engineers in the 1940s, before the invention of modern computers. Only recently, as computing has grown enormously in memory capacity and processing speed, have they proved to be powerful computing tools.

In recent years, companies including Google, Microsoft and Apple have turned to pattern recognition driven by neural networks to vastly improve the quality of services like speech recognition and photo classification.

But Yann LeCun, director of artificial intelligence research at Facebook and a pioneering expert in neural networks, said he was skeptical that IBM’s approach would ever outpace today’s fastest commercial processors.

“The chip appears to be very limited in many ways, and the performance is not what it seems,” Mr. LeCun wrote in an email sent to journalists. In particular, he criticized as inadequate the testing of the chip’s ability to detect moving pedestrians and cars.

“This particular task,” he wrote, “won’t impress anyone in computer vision or machine learning.” Mr. LeCun said that while special-purpose chips running neural networks might be useful for a range of applications, he remained skeptical about the design IBM has chosen.

Several neuroscience researchers and computer scientists disputed his critique.

“The TrueNorth chip is like the first transistor,” said Terrence J. Sejnowski, director of the Salk Institute’s Computational Neurobiology Laboratory. “It will take many generations before it can compete, but when it does, it will be a scalable architecture that can be delivered to cellphones, something that Yann’s G.P.U.s will never be able to do.”

G.P.U. refers to graphics processing unit, the type of chip being used today to deliver graphics and video to computer screens and for special processing tasks in supercomputers.

IBM’s research was funded by the Defense Advanced Research Projects Agency, a research arm of the Pentagon, under a program called Systems of Neuromorphic Adaptive Plastic Scalable Electronics, or SyNapse. According to Gill Pratt, the program manager, the agency is pursuing twin goals in its effort to design ultralow-power biological processors.

The first, Dr. Pratt said, is to automate some of the surveillance done by military drones. “We have lots of data and not enough people to look at them,” he said.

The second is to create a new kind of laboratory instrument to allow neuroscientists to quickly test new theories about how brains function.

Correction: August 7, 2014

Because of an editing error, an earlier version of this article misstated the day on which the report of a new computer chip was published. It was Thursday, not Wednesday.

Correction: August 11, 2014

An article on Friday about a new IBM computer chip that is said to mimic the way a human brain works omitted the last word in the name of a program known by the acronym SyNapse, which funded IBM’s research. It is Systems of Neuromorphic Adaptive Plastic Scalable Electronics.