Who needs tech specs when you could have a $10 HUD on your visual cortex

"Image from the glassbrain project, neuroscapelab.com, UCSF

The Defense Advanced Research Projects Agency (DARPA) is developing a brain interface it hopes could inject images directly into the visual cortex.

News of the “Cortical Modem” project has emerged in transhumanist magazine Humanity Plus, which reports the agency is working on a direct neural interface (DNI) chip that could be used for human enhancement and motor-function repair.

Project head Dr Phillip Alvelda, Biological Technologies chief with the agency, told the Biology Is Technology conference in Silicon Valley last week that the project had a short-term goal of building a US$10 device the size of two stacked nickels that could deliver images without the need for glasses or similar technology.

The project was built on research by Dr Karl Deisseroth, whose work in neuroscience describes how brain circuits create behaviour patterns.

Specifically, the work drew on Deisseroth’s field of optogenetics, in which light-sensitive proteins from algae are inserted into neurons so that the neurons can subsequently be controlled with pulses of light.

“The short term goal of the project is the development of a device about the size of two stacked nickels with a cost of goods on the order of $10 which would enable a simple visual display via a direct interface to the visual cortex with the visual fidelity of something like an early LED digital clock,” the publication reported.

“The implications of this project are astounding.”

The seemingly dreamy research was limited to animal studies, specifically the real-time imaging of a zebrafish brain with some 85,000 neurons, because of the need to alter neuron DNA, and the ‘crude device’ would be a long way off high-fidelity augmented reality, the site reported.

DARPA’s Biological Technologies Office was formed last April to cook up crazy ideas born at the intersection of biology and physical science. Its mind-bending research fields are geared to improve soldiers’ performance, craft biological systems to bolster national security, and further the stability and well-being of humanity.

The project follows DARPA’s upgrading of the heavy-set Atlas robot, which was granted a battery allowing it to move about free of its electrical umbilical cord.

The agency also revealed biometric tracking that could identify users based on how they moved a mouse in what was dubbed a ‘cognitive fingerprint’ and slated as a possible replacement for password authentication. ®

By Tiberio Caetano, NICTA

Much of big data comes from people. Web logs, mobile phone usage, financial transactions, insurance claims, you name it: it’s being recorded for potential further analysis to generate business value and improved customer experience.

It goes by the name of customer analytics, and large retailers and service providers, at least in the US, are obsessed with it.

Online businesses are significantly ahead of traditional bricks and mortar businesses when it comes to leveraging data to drive business value. The major reasons are cultural, social and operational.

These online businesses are much closer to a truly scientific culture in which every idea or proposition is automatically considered a hypothesis subject to testing rather than a heavenly insight for which the burden of evidence can be waived. They have an obsession not only with measurement but also with experimentation.

A scientific approach to business

The design of experiments, data collection, analysis and understanding are what characterise scientific enterprise. So, in order to embrace big data, it’s necessary to embrace science, meaning its values, culture and new methods based on machine learning, which is the automation of hypothesis generation (from data) and testing (against data).
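As a rough illustration of that loop of hypothesis generation and testing, the sketch below fits a model on one slice of a dataset and then scores it on a held-out slice it has never seen. The synthetic data and the scikit-learn logistic-regression model are assumptions made purely for the example, not anything prescribed by the article.

```python
# A minimal sketch of "hypothesis generation from data, testing against data":
# fit a model on one slice of the data (generate the hypothesis), then measure
# how well it holds up on a slice it has never seen (test the hypothesis).
# The synthetic data and logistic-regression model are illustrative assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))          # e.g. five made-up customer attributes
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=1000) > 0).astype(int)  # e.g. bought / didn't buy

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

model = LogisticRegression().fit(X_train, y_train)         # hypothesis generated from data
print("held-out accuracy:", model.score(X_test, y_test))   # hypothesis tested against data
```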

Yet, a scientific culture is not what you will find in a typical bricks and mortar business.

The top online companies have researchers and scientists who seriously understand science and its new machine learning method. Google recently hired Geoffrey Hinton, the father of neural networks and deep learning (instances of machine learning). It has also just reportedly acquired, for US$500 million, a startup made up of deep learning experts.

Facebook followed suit by hiring Yann LeCun, who pioneered the use of neural networks to solve large-scale real-world problems.

Extracting value from data requires not only the right tools but also the right leaders to build the right teams to use these tools (and build the ones that still don’t exist).

Bricks and mortar businesses in general do not have such people on board, although those who are ahead are desperately trying to hire them. The bad news is that the demand for those people is way, way beyond the supply.

Another crucial point is that those giant online properties have an operational model in which the results of the science can make their way into every decision that results in some intervention, with relatively small cost. This is in contrast to traditional businesses that are burdened with a range of channels each with legacy IT systems and human processes.

Data analysis is, by itself, inconsequential unless it drives some form of action. Internet companies have mastered this trade through computational advertising. The causal business effect of an intervention such as displaying an ad on a webpage is quantified precisely by how much an advertiser has bid to have the ad displayed or clicked on.

The user’s feedback (generally a click, or the lack of one) is then automatically fed back to a machine learning algorithm that learns how profitable that ad is (per customer). The loop is then closed: the system that determines the intervention allocation policy monitors the business outcome of every intervention and updates the policy automatically so as to maximise the business value of future allocations.
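One common way to implement such a closed loop is a simple bandit-style policy. The sketch below is only an illustration under that assumption: an epsilon-greedy policy allocates ads, observes simulated clicks, and updates its value estimates so that future allocations favour the more profitable ads. The click rates and per-click bids are invented for the simulation and do not describe any particular company's system.

```python
# Minimal closed-loop sketch: an epsilon-greedy policy allocates ads, observes
# clicks, and updates its estimate of each ad's value so that future
# allocations favour the more profitable ads. Click probabilities and
# per-click bids below are invented purely for the simulation.
import random

ads = ["ad_A", "ad_B", "ad_C"]
true_click_rate = {"ad_A": 0.02, "ad_B": 0.05, "ad_C": 0.03}   # hidden from the policy
bid_per_click   = {"ad_A": 1.00, "ad_B": 0.40, "ad_C": 0.80}   # what the advertiser pays

value_estimate = {ad: 0.0 for ad in ads}   # learned expected revenue per impression
impressions    = {ad: 0   for ad in ads}
epsilon = 0.1                              # fraction of traffic reserved for exploration

for _ in range(100_000):
    # Intervention: pick an ad (mostly exploit the current best, sometimes explore).
    if random.random() < epsilon:
        ad = random.choice(ads)
    else:
        ad = max(ads, key=value_estimate.get)

    # Feedback: did the user click, and how much revenue did that produce?
    clicked = random.random() < true_click_rate[ad]
    revenue = bid_per_click[ad] if clicked else 0.0

    # Close the loop: update the policy's estimate of this ad's value.
    impressions[ad] += 1
    value_estimate[ad] += (revenue - value_estimate[ad]) / impressions[ad]

print(value_estimate)   # converges towards click_rate * bid for each ad
```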

The offline world

What can be said of existing bricks and mortar businesses in this regard? Josh Wills, director of data science at Cloudera, a leading big data solutions provider for the enterprise, claims that no one apart from the giant online properties is running this automated closed-loop revenue generation mechanism.

Maybe he is right, maybe not. But even if there are others doing this, there is certainly a long way to go. Granted, there are existing data-driven policies for marketing, credit scoring, pricing and other activities in big service providers like banks, telecoms and insurance companies.

But even in the US such large corporations suffer from the operational issues of legacy systems, as well as cultural and technological silos, which simply make it too hard to integrate data science and intervention policy in a closed loop across a variety of business areas.

So, what’s the solution? I don’t think there is any silver bullet. The best bet I would place is simply to follow what has worked for online businesses: work as fast as possible on acquiring the right culture, people and operations model. In the US and Europe, some large retailers and service providers have been moving fast.

Walmart has for years had a large team dedicated to data science, leveraging historical purchase data to better tailor offers to its customers. Retailer Target made headlines two years ago when New York Times reporter Charles Duhigg brought to the public’s attention the now famous incident of one of Target’s analytics models predicting a teenager’s pregnancy before her father knew of it.

One of the world’s largest mobile carriers, Spain’s Telefonica, established a scientific research group in machine learning several years ago.

Although Australian companies are in general significantly behind, in the past two years a few large corporations have started to make moves on the people side by succeeding in hiring data scientists. A notable domestic event was Woolworths recently acquiring a 50% stake in data analytics company Quantium.

Whether these and other large retailers and service providers will go a step further and realise that a cultural and operational shift is also required remains to be seen.


Tiberio Caetano is affiliated with NICTA and its subsidiary, Ambiata Pty Ltd, as well as with the Australian National University. His role at Ambiata focusses on growing data-rich businesses through use of large-scale machine learning systems.

•• This article was originally published on The Conversation. Read the original article.

NewSell

(Boardroom Books, New York 1984) by Michael Hewitt-Gleeson

Chapter Nine, Page 78

“… The cognos is the larger universe within which, at some point, exists the cosmos. Before something “pops” into existence in the cosmos, it exists in the cognos. We could say that the cognos is the “before” and the “cosmos” is the after.

“… Just as the cosmos is made out of atoms, electrons, quarks, etc., the cognos itself is made out of cogns (pronounced “cones”). A cogn is simply the smallest unit, or particle, that makes up the cognos. Cogns “pop” into existence in the cognos when we simply focus on them. When we think about an area of the cognos, it comes into existence as a result of our thinking about it. This is just like when we look at something in the cosmos, it comes into existence, to our attention, as a result of our looking at it. Look over your shoulder now and, by looking, whatever is there to be seen will pop into your attention.

Example: If I think about “socks made of glass” that idea has popped into existence in the cognos simply because I thought about it. The cogns that make up that part of the cognos which can be described as “glass socks” now exist simply because I thought about them. An object pops into existence in the cosmos as a result of it already existing in the cognos. Thus, if I want to think more about the idea of “glass socks”, I can examine it even more, focus on it to a greater depth, and eventually make a pair of “glass socks”. When I have done that, the glass socks not only exist in the cognos but also have popped into existence in the cosmos. Just as before America popped into existence in the cosmos, it first existed in the cognos through Columbus’ mind.

The cognos could be described as that larger universe, or meta-universe, within which exists the cosmos. How big is the cognos? Well, it’s hard to say, of course, but it’s certainly much, much bigger than the cosmos, which may only be a tiny corner of the cognos. Of what is the cognos made? The cognos is made of cogns which pop into existence when we think about them.”

 

DFQ: In cognitive physics the Theory of the Cognos is not a theory of everything but a theory of anything. So, what is an important question that you would like to see explored in cognitive physics and by cognitive physicists? What is it about your question that makes it important?

Please add your contribution below …

New Scientist Life:

The brain’s power will turn out to derive from data processing within the neuron rather than from activity between neurons, suggests University of Cambridge research biologist Brian J. Ford.

“Each individual neuron is itself a computer, and the brain a vast community of microscopic computers… the human brain may be a trillion times more capable than we imagine,” he adds.

•• Click to read the original article …

“Scientists and engineers change the world. I’d like to tell you about a magical place called DARPA where scientists and engineers defy the impossible and refuse to fear failure!”

THOUGHT EXPERIMENT:

What would happen if you removed the fear of failure?

DFQ: What if you knew you couldn’t fail?