A definition of CLOUD ROBOTICS for cheap, popular cloud-robots with general intelligence indistinguishable from, or beyond, human intelligence.
In 1989 the WWW was "invented": a high-level application layer for humans, running over the low-level TCP/IP protocols that grew out of DARPA's defense networking work begun in the 1960s.
Now it is time to create the new high-level application layer for robots and machines of all kinds, with thousands (millions?) of specialized tools, which I call "AI bricks", hosted in the cloud (Internet). Standard protocols will be used to structure mission information, independent of the technology and OS used, such as XML or the more compact JSON, uploading multimedia and sensor magnitudes to the cloud and obtaining a response "on demand" and Just In Time.
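As a minimal sketch of such a mission message in the more compact JSON form (the field names and URL here are invented for illustration, not a defined standard):

```python
import json

# Hypothetical mission a cloud-robot might upload; field names are
# illustrative only, not part of any agreed standard.
mission = {
    "robot_id": "robot-0042",
    "task": "identify_object",
    "attachments": [  # URLs of multimedia already uploaded to cloud repositories
        {"type": "image/jpeg", "url": "https://repo.example/robot-0042/frame-001.jpg"},
    ],
    "sensors": {"temperature_c": 21.5, "battery_pct": 87},
}

# Serialize for transmission; any robot OS that can emit JSON can participate.
payload = json.dumps(mission)

# The cloud side parses it back, independent of the robot's technology.
received = json.loads(payload)
print(received["task"])
```

The same structure could be expressed in XML; JSON simply carries less overhead on the wire.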
- 1.- Mobile technologies (smartphones) are mature and improving exponentially in features, latency and bandwidth, with Wi-Fi (indoor robots) or 3G, 4G, … (outdoor robots). Smartphones are already mass-produced commodities, so prices will keep falling. (Client-side brain = cheap robot = AI cloud-robots become popular.)
- 2.- Cloud computing technologies are mature and growing exponentially with social media, social networks and search engines. (Server-side brain = millions of expert tools, learning tools, NLP, etc.: AI tools or "bricks" for high-speed processing, fast calculations and data access, correlation of patterns for artificial vision, voice and face recognition, etc.)
- Key players for image, video, face and voice recognition in the cloud, with the computing power to correlate patterns and the storage to hold patterns for all real objects, serving millions of concurrent real cloud-robots (machines): Google, Apple, Microsoft, Samsung…
- Startups and entrepreneurs with deep know-how in a specific field of knowledge or of human behavior.
- Mid-sized companies with some experience, expertise or know-how in human behavior.
- The Open Source / Open Hardware community.
- Academic community.
- AI institutes and departments (they have existed since 1956 and have succeeded in many fields over the years; their results are ready to be integrated as cloud services for concurrent robots and machines to use).
- RTS (Robot Tools Server) for redirecting each cloud-robot (a machine in real life) to the optimal specialized tool to complete a mission, much as DNS does in the WWW. (*)
- Expert tools: chess (the Deep Blue machine beat the human world champion Kasparov in 1997), psychology, emotional intelligence, law, languages, empathy, medicine, history, mycology, weather, philosophy, feelings, etc.
- Prediction tools.
- High-power processing tools: artificial vision for pattern recognition of thousands of objects against millions of patterns hosted in the cloud; voice recognition; face recognition; etc. High storage space for billions of patterns of reality, as our brain stores them.
- Storage tools: the personalized experience of each robot; a memory of the robot's life.
- Downloading specialized programs and firmware "on demand" for robots. (Remember the MATRIX movie, downloading a program to pilot helicopters?)
- Learning tools: robots can learn from their own past experience, depending on whether a past mission failed or succeeded, from memory, know-how, etc. A human learns from his own experience from birth, and it takes 18 years to reach a bit of maturity and more than 30 years to work professionally with experience... let's give strong, generalist AI cloud-robots the same time to learn! Learning by Doing!
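The storage and learning tools above can be sketched as a per-robot mission log that the robot consults before retrying a task. This is a minimal illustration under invented names (`RobotMemory`, the tool names), not a defined design:

```python
from collections import defaultdict

class RobotMemory:
    """Personalized experience of one robot: a memory of its 'life'."""

    def __init__(self):
        # task -> list of (tool_used, succeeded) records
        self.history = defaultdict(list)

    def record(self, task, tool, succeeded):
        self.history[task].append((tool, succeeded))

    def best_tool(self, task, default=None):
        """Learn from past experience: prefer the tool with the best success rate."""
        stats = defaultdict(lambda: [0, 0])  # tool -> [successes, attempts]
        for tool, ok in self.history[task]:
            stats[tool][1] += 1
            if ok:
                stats[tool][0] += 1
        if not stats:
            return default  # no experience yet for this task
        return max(stats, key=lambda t: stats[t][0] / stats[t][1])

memory = RobotMemory()
memory.record("open_door", "grip-tool-A", False)
memory.record("open_door", "grip-tool-B", True)
memory.record("open_door", "grip-tool-B", True)
print(memory.best_tool("open_door"))  # the tool that succeeded most often
```

Learning by Doing, in its simplest form: each recorded mission outcome shifts which tool the robot tries next.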
(*) The way to integrate these AI bricks in an orchestrated fashion, so that they are not "a large number of airplane parts flying in formation", is key. So I introduce the concept of the RTS (Robot Tools Server), which in the human high-level application layer, the WWW over TCP/IP, is conceptually similar to DNS (the Domain Name System). DNS was designed for humans because we do not remember IP numbers, but we do remember alphanumeric names, words and domains. In worldwide DNS, thousands of machines are automatically updated in less than 24 hours when a single DNS server registers a new domain name, and this extraordinary network of machines is widely distributed to guarantee the service for humans.
The RTS (Robot Tools Server) acts as a router. It is similar in concept to DNS but conceived for robots and machines (not humans), and more complex than DNS: a robot or machine sends the RTS a mission in a predefined XML format, with optional attached URLs (information, multimedia, magnitudes captured by sensors, etc., uploaded and hosted in cloud repositories), and the RTS redirects the robot to the optimal machine (IP) to complete its mission, just as DNS returns an IP when it receives a domain name. A regulated registry of AI bricks (tools) and tasks is required.
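A minimal sketch of the RTS lookup, analogous to a DNS resolution but keyed on the mission's task rather than a domain name. The registry contents, addresses and mission shape here are invented for illustration:

```python
# Hypothetical RTS registry: task -> IP of the best specialized AI brick.
# In the real ecosystem this registry would be regulated and distributed,
# like the worldwide DNS network.
REGISTRY = {
    "face_recognition": "203.0.113.10",
    "voice_recognition": "203.0.113.11",
    "chess": "203.0.113.12",
}

def rts_resolve(mission):
    """Redirect a robot's mission to the optimal tool's address,
    as DNS maps a domain name to an IP."""
    task = mission["task"]
    if task not in REGISTRY:
        raise LookupError(f"no AI brick registered for task {task!r}")
    return REGISTRY[task]

# A robot submits a mission (XML in the text; a dict stands in here)
# and is redirected to the machine that can complete it.
print(rts_resolve({"task": "face_recognition", "attachments": []}))
```

The analogy holds: the robot never needs to know where a tool lives, only how to describe its mission.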
AI bricks registered in the RTS will survive by natural selection, depending on feedback from millions of cloud-robots worldwide, or from their human owners, who score the successes and failures of the robots' ordered missions and tasks; each AI brick's score decides which cloud tool is used next time in similar circumstances.
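This selection loop can be sketched as a global score per brick, updated by success/failure reports; the brick names and scoring rule are assumptions for the sake of the example:

```python
# Hypothetical feedback loop: each AI brick accumulates a score from the
# success/failure reports of the cloud-robots (or their human owners).
bricks = {
    "vision-brick-A": {"successes": 0, "failures": 0},
    "vision-brick-B": {"successes": 0, "failures": 0},
}

def report(brick, succeeded):
    """One robot (or owner) reports a mission outcome for a brick."""
    key = "successes" if succeeded else "failures"
    bricks[brick][key] += 1

def score(brick):
    s = bricks[brick]
    total = s["successes"] + s["failures"]
    return s["successes"] / total if total else 0.0

def select_brick():
    """'Natural selection': the best-scored brick is chosen next time."""
    return max(bricks, key=score)

report("vision-brick-A", True)
report("vision-brick-A", False)
report("vision-brick-B", True)
print(select_brick())
```

Poorly scored bricks simply stop being selected, which is the "extinction" half of the natural selection.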
This is the definition of an ecosystem for CLOUD ROBOTICS, evolving by natural selection and exponential growth toward strong, general intelligence equal to or beyond human, using a critically large number of specialized AI tools or "AI bricks" to build interdisciplinary intelligence in the cloud, for cheap and popular cloud-robots that will change our human civilization, which is close to collapse: