Advanced Stealth – Part 2
by, Concept Activity Research Vault ( CARV )
April 21, 2009
CALIFORNIA, Los Angeles – April 21, 2009 – The Los Angeles Times article ( below ) mentions NORTHROP GRUMMAN prototype development and testing of lighter-than-air stealth-type dirigibles in the Washington, D.C. Metroplex Area ( New Jersey, Pennsylvania, Maryland, Virginia, and Washington, D.C. ).
Might central Virginia’s NORTHROP GRUMMAN SPERRY MARINE DIVISION ( U.S. Route 29 / South Seminole Trail, Charlottesville, Virginia ) test site monitor new stealth technology equipped aerial vehicles?
Imagine airship vessels and craft whose surface skin material can be remotely controlled to illuminate like an artificial star, or like an approaching aircraft with landing lights shining brightly downward, in reaction to ground-based InfraRed-like invisible light pumped onto a stealth dirigible ( a lighter-than-air, pressurized helium gas aerial platform and LEO vehicle ). Such a surveillance vessel’s external surface skin would have the technological capability to gather invisible light and convert that hidden light into a refracted visible white-light spectrum, like an extremely large floodlight, landing light, or strobe light emanating from a buoyant airborne surveillance platform. Imagine the western quadrant foothills of Greene County in central Virginia – near Rockingham County – with an artificial star overhead: a snooping blimp that doesn’t look like a blimp, just an innocent-looking star in the night sky.
Conventional technologies would normally consist of bulky and heavy items – arrays of refractive mirrors, heat diffusers, high-energy collectors / converters, high-energy power supplies, and more – that could not possibly be launched inside a lighter-than-air dirigible airship platform; at least not a blimp that still wished to remain invisible or unobservable from the ground.
If the craft, vehicle, or vessel surface skin were made from a new electronically controlled sensitive material – like the electronic textile materials the U.S. Department of Defense ( DoD ) DARPA MEMS Biofutures Program has been working on for over 10-years – it might be a material so advanced that it could, on demand or command, ‘react’ to any spectrum of invisible light ( such as a blue-green laser communication beam ) and be controlled from a long-distance wireless control station, which might make anything possible. What about super-secret electronic textiles and hidden super characteristic capabilities?
What about ‘transparent airships’ and the other new advanced stealth technologies?
– – – –
Scientists Create Observable Fiber Webs That See
July 2006
In a radical departure from conventional optical lens materials, MIT scientists developed a sophisticated optical system composed of mesh-like webs of light-detecting fibers.
Having a number of advantages over its conventional lens predecessors, the new fiber construct is currently capable of measuring the direction, intensity and phase of light ( a property used to describe a light wave ) without the lenses, filters, or detector arrays that are the classic elements of conventional optical systems such as eyes or cameras.
Researchers expect the new system will be capable of much more.
Potential applications range from ‘improved space telescopes’ to ‘clothing that provides situational awareness’ to soldiers or the visually impaired.
The transparent fiber-webs could even enable huge computer screens to be activated with beams of light instead of the touch of a finger. “We could use light to enhance interaction with computers and even gaming systems,” said Professor Yoel Fink of the Department of Materials Science and Engineering and the Research Lab of Electronics, leader of the team. “It’s intriguing–the idea of touching with light.”
The scientists report the work in the June 25 online edition of Nature Materials, and it is featured on the cover of the July print issue of the magazine.
The human eye, digital and film cameras, and even the Hubble space telescope rely on lenses and detector surfaces ( like the retina ) to create images. But while these systems deliver excellent images, they are constrained by their size, weight, fragility and limited field of view.
In contrast, the fiber webs are flexible and lightweight. Plus, a fiber web in the shape of a sphere can sense the entire volume of space around it, according to Fink.
“When you’re looking at something with your eyes, there’s a particular direction you’re looking in,” says Ayman Abouraddy, a research scientist in Fink’s lab. “The field of view is defined around that direction. Depending on the lens, you may be able to capture a certain field of view around that direction, but that’s it. Until now, most every optical system was limited by an optical axis or direction.”
In addition to having an unlimited field of view, the fiber sphere can also detect the direction of incoming light. Light enters the transparent sphere at one point and exits at another, providing a directional reference back to the light source.
Fink’s team has also created a flat, two-dimensional web of fibers and placed two such webs in parallel. These constructs, which can measure the intensity of incoming light, are capable of generating rough images of objects placed near them, such as the shape of a letter “E” cut stencil-like from paper and lit from behind. The image shows up on a computer screen, reconstructed from a light intensity distribution measured by the webs.
The fibers used in the webs are about 1 millimeter in diameter. They consist of a photoconductive glass core with metal electrodes that run along the length of the core, all surrounded by a transparent polymer insulator.
The fibers can detect light anywhere along their length, producing a change in current in an external electrical circuit. While one fiber on its own cannot detect the exact location of an incoming beam of light, when many fibers are arrayed in a web, their points of intersection provide the exact coordinates of the beam. A computer assimilates the data generated by the web and translates it for the user.
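As a rough illustration of the intersection principle described above – not the MIT team’s actual reconstruction software – the following Python sketch assumes a hypothetical grid of horizontal and vertical fibers, each reporting a single photocurrent value, and locates a light spot at the crossing of the strongest responses.

```python
# Hypothetical sketch: locating a light beam on a crossed-fiber web.
# Each fiber reports one photocurrent for its whole length; the beam
# position is taken as the crossing of the strongest horizontal fiber
# and the strongest vertical fiber. Spacing and values are illustrative.

def locate_beam(row_currents, col_currents, fiber_spacing_mm=1.0):
    """Return the (x_mm, y_mm) of the strongest fiber crossing."""
    row = max(range(len(row_currents)), key=lambda i: row_currents[i])
    col = max(range(len(col_currents)), key=lambda j: col_currents[j])
    return col * fiber_spacing_mm, row * fiber_spacing_mm

# Example: a beam striking near column index 3 and row index 2.
rows = [0.02, 0.03, 0.91, 0.05]   # photocurrents from horizontal fibers
cols = [0.01, 0.04, 0.06, 0.88]   # photocurrents from vertical fibers
print(locate_beam(rows, cols))     # -> (3.0, 2.0), in millimetres
```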
If the fibers were woven into a textile, for instance, an embedded computer could provide information on a small display screen or even audibly.
Improving the imaging power of the fiber webs will require reducing the diameter of the fibers and creating denser webs. Fink says he’s not certain whether the new technology will one day replicate human vision. “Just the idea of imaging with a transparent object is a true eye opener,” he said.
Reference(s)
– – – –
SOURCE: Los Angeles Times ( Los Angeles, California, USA )
Pentagon Plans Blimp To Spy From New Heights
by, Julian E. Barnes
March 13, 2009
USA, Washington, D.C. – The Pentagon said Thursday that it intends to spend $400,000,000.00 ( USD ) to develop a giant dirigible that will float 65,000 feet above the Earth for 10-years, providing unblinking and intricate radar surveillance of the vehicles, planes and even people below.
 “It is absolutely revolutionary,” Werner J.A. Dahm, chief scientist for the U.S. Air Force, said of the proposed unmanned airship – describing it as a cross between a satellite and a spy plane.
 The 450-foot-long craft would give the U.S. military a better understanding of an adversary’s movements, habits and tactics, officials said. And the ability to constantly monitor small movements in a wide area – the Afghanistan and Pakistan border, for example – would dramatically improve military intelligence.

“It is constant surveillance, uninterrupted,” Dahm said. “When you only have a short time view – whether it is a few hours or a few days – that is not enough to put the picture together.”

The project reflects a shift in Pentagon planning and spending priorities under Defense Secretary Robert M. Gates, who has urged the military services to improve intelligence and surveillance operations while cutting high-tech weaponry costs.

If successful, the dirigible – brainchild of the Air Force and the Pentagon research arm [ DARPA ] – could pave the way for a fleet of spy airships, military officials said.

However, it marks the return to a form of flight that has stirred anxiety and doubt since the 1937 Hindenburg disaster – 36 people were killed when that airship went up in flames in New Jersey.

The military has used less sophisticated tethered blimps, called ‘aerostats’, to conduct surveillance around military bases in Iraq.

But flying at 65,000 feet, the giant airship would be nearly impossible to see, would be beyond the range of any hand-held missile, and would be safe from most fighter planes.

And its range would be such that the spy craft could operate at the distant edges of any military theater, probably out of the range of surface-to-air [ SAM ] missiles as well.
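For a rough sense of what 65,000 feet buys in coverage, a standard geometric line-of-sight estimate, d ≈ √( 2 · R · h ) with R the Earth’s radius and h the altitude, gives the distance to the horizon; the back-of-the-envelope Python below is illustrative arithmetic, not a figure from the article.

```python
# Back-of-the-envelope horizon distance from 65,000 ft, using the
# geometric line-of-sight approximation d = sqrt(2 * R * h).
import math

EARTH_RADIUS_M = 6_371_000           # mean Earth radius, metres
altitude_m = 65_000 * 0.3048         # 65,000 ft is about 19,812 m

horizon_m = math.sqrt(2 * EARTH_RADIUS_M * altitude_m)
print(f"Geometric horizon: {horizon_m / 1000:.0f} km "
      f"({horizon_m / 1609.344:.0f} miles)")   # roughly 500 km / 310 miles
```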

The Air Force’s intelligence, surveillance and reconnaissance abilities have improved dramatically in the last 5-years with the expansion of the Predator and other drones [ Unmanned Aerial Vehicles ( UAV ) ]. Although such craft can linger over an area for a long time, they do not watch constantly.

The giant airship’s military value would come from its radar system, whose giant antenna would allow the military to see farther and in more detail than it can now.

“Being able to observe threats [and] understand what is happening is really the game-changing piece here,” Dahm said.

The dirigible will be filled with helium and powered by an innovative system that uses solar panels to recharge hydrogen fuel cells. Military officials said those underlying technologies, plus a very lightweight hull, were critical to making the project work.
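The article gives no numbers for the power system, but the underlying constraint of a solar-recharged hydrogen fuel cell airship is simple: daytime solar collection must cover the daytime load plus the energy returned by the fuel cells at night, minus round-trip conversion losses. The sketch below uses purely illustrative values ( none from the ISIS program ) to show that balance.

```python
# Illustrative day/night energy balance for a solar + hydrogen fuel cell
# airship power system. Every number here is made up for the example;
# none come from the ISIS program.

def solar_area_needed(load_kw, day_hours=12, night_hours=12,
                      solar_kw_per_m2=0.1, round_trip_eff=0.45):
    """Panel area (m^2) needed to carry a constant load around the clock."""
    day_energy_kwh = load_kw * day_hours                        # used directly
    night_energy_kwh = load_kw * night_hours / round_trip_eff   # stored as H2, with losses
    total_kwh = day_energy_kwh + night_energy_kwh
    return total_kwh / (solar_kw_per_m2 * day_hours)

print(f"{solar_area_needed(load_kw=50):.0f} m^2 of panels")   # ~1,600 m^2 for a 50 kW load
```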

“The things we had to do here were not trivial; they were revolutionary,” said Jan Walker, a spokeswoman for the Defense Advanced Research Projects Agency [ DARPA ], the Pentagon research arm.

The U.S. Air Force has signed an agreement with DARPA to develop a demonstration dirigible by 2014. The prototype, which will be one-third ( 1/3rd ) as long as the planned 450-foot surveillance craft ( roughly 150 feet ), is known as ISIS ( Integrated Sensor Is Structure ) because the radar system will be built into the structure of the ship.

While the military says the craft is closer to a blimp than a zeppelin, which has a rigid external structure, officials usually call the project an airship. Blimps get their shape from pressurized helium gas.

The Pentagon has not yet awarded a contract to build the prototype. Earlier work was done by NORTHROP GRUMMAN in Redondo Beach, California; Baltimore, Maryland; and other locations – and by LOCKHEED MARTIN in Palmdale, California; Akron, Ohio; and Denver, Colorado.

Reference

http://www.latimes.com/news/nationworld/nation/la-na-spyblimp13-2009mar13,0,4608400.story

– – – –

 

Submitted for review and commentary by,

 

Concept Activity Research Vault ( CARV ), Host
E-MAIL: ConceptActivityResearchVault@Gmail.Com
WWW: http://ConceptActivityResearchVault.WordPress.Com

 


[ PHOTO ( above ): W-88 miniature nuclear bomb property of USA ( click to enlarge ) ]

Robot Combat Intelligence
by, Concept Activity Research Vault ( CARV )

January 18, 2011 21:08:42 ( PST ) ( Originally Published: February 1, 2002 )

DISTRICT OF COLUMBIA, Washington – January 18, 2011 – Over 12-years ago, the United States realized too late that its ‘miniature nuclear weapons technology delivery system’ ( W-88 ) secrets had already been stolen ( from the vault of its insurance carrier ) when the People’s Republic of China ( PRC ) rapidly produced its own version. Only a select few realized that a secret U.S. decision then took futuristic concepts into development for U.S. global military applications, deploying technologies that seemed to have been conceived only in science-fiction motion picture films ( e.g. STAR TREK, STAR WARS, MATRIX, and more ) that shocked audiences worldwide.

In 1999, secret U.S. defense endeavors forged with several universities and U.S. government contract private-sector organizations – led by the U.S. Department of Defense ( DoD ) Defense Advanced Research Projects Agency ( DARPA ) – created even newer and more advanced multiple Program stratagems employing various forms of ‘combinatoric’ technologies, developed for globally deploying U.S. military dominance with various and sundry secret-sensitive devices and systems far beyond many imaginations.

DARPA SIMBIOSYS Program –

The DARPA SIMBIOSYS Program entails, amongst other things, multi-functional microbiological nanotechnology robot ( android ) devices, primarily for military applications, where the work remained until just a few years ago, when it began being applied in some medical arenas.

To understand what is ‘current’, one must first look briefly at DARPA Programs ‘past’ ( 1999 – 2002 ), which alone is enough to still send chills down many people’s spines today. Once one realizes what DARPA was doing 12-years ago, it is not hard to comprehend where DARPA has taken many – and where it will continue taking them.

SIMBIOSYS ( 1999 – 2002 ) –

In 1999, DARPA SIMBIOSYS developed a combined quantitative understanding of various biological phenomena characteristics, opening the DARPA door to what amounts to MicroElectroMechanical Systems ( MEMS ) integrating microphotonics in, amongst many things, electro-optic spatial light modulators ( SLM ) combined with very short pulse solid-state lasers. Together these provide powerful new capabilities: secure communication up-links ( multi-gigabits per second ); ‘aberration free’ 3-D imaging and targeting performed at very long ranges ( greater than 1,000 kilometers away ); innovative design-system integration of MEMS spatial light modulators ( SLM ) providing quantum leaps in photonic wavefront control and high-speed electronics; and even ‘flexible cloth-like smart materials’. DARPA wants such hardware placed into production devices and systems applications, optimizing both the U.S. and its specially selected few foreign-nation U.S. friendlies ( Israel ) to hold future warfaring battlespace management superiority over other foreign-nation threats.

DARPA SIMBIOSYS includes classes of biological molecules ( e.g. antigens, antibodies, DNA, cytokines, enzymes, etc. ) for analysis and diagnosis studies, from:

1. Biochemical sensors, sensing ‘details from environments’; and,

2. Biochemical sensors, sensing ‘details from human body fluids’.

Specific examples under each of those two ( 2 ) groups are left to the discretion of the Principal Investigator ( PI ).

Bio-molecule importance selection criteria include:

1. Extendibility to microsystem sensors for automated sampling and analyses;

2. The extent to which the bio-molecule simulant is relevant to the U.S. Department of Defense ( DoD ); and,

3. Bio-detection processing with a high degree of sensitivity and specificity ( see the sketch below ), etc.
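Sensitivity and specificity are standard detection metrics rather than anything unique to SIMBIOSYS; for readers unfamiliar with the terms, the sketch below computes both from hypothetical assay counts.

```python
# Standard definitions of detection sensitivity and specificity,
# computed here from hypothetical assay counts (not SIMBIOSYS data).

def sensitivity(true_pos, false_neg):
    """Fraction of genuine targets the detector actually flags."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    """Fraction of non-targets the detector correctly ignores."""
    return true_neg / (true_neg + false_pos)

# Example: 95 of 100 real samples detected, 4 false alarms in 200 blanks.
print(sensitivity(true_pos=95, false_neg=5))     # 0.95
print(specificity(true_neg=196, false_pos=4))    # 0.98
```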

DARPA SIMBIOSYS emphasis is at the ‘molecular level’ for ‘sensing’ and ‘detection’.

The SIMBIOSYS Program precludes human-cell and human-tissue based sensing because other DARPA programs currently address those issues.

SIMBIOSYS Goals –

SIMBIOSYS Program ‘stimulates multi-disciplinary research’ – bringing together biologists, chemists, engineers, physicists, computer scientists and others to address difficult and pressing challenges in advancing micro and nano-biotechnology.

The SIMBIOSYS Program goal is to ‘utilize phenomena’ in ‘bio-fluidic transport’, ‘molecular recognition’ and ‘signal transduction’ from joint studies in modeling and experiments.

The SIMBIOSYS Program joint effort expects to result in new hardware devices, new hardware processes and new hardware production communities that will begin utilizing new models, new rules, new methods and new processes, together enabling the design and development of enhanced-performance, next-generation bio-microdevices.

DARPA Advanced Projects –

DARPA is focusing on, amongst many, these advanced projects:

1. Bioengineering artificial intelligence ( AI ) systems sized from nanometers and meters up to large-scale robotic systems deployed globally;

2. Biological hybrid devices and systems, inspired from computational algorithms and models;

3. Biosynthesized composite materials incorporating synthetic enzymes and pathways from biochemical cellular engineered concepts for application productions;

4. Neural phenomena control over system science computation measurement application interfaces addressing humans;

5. Micro-scale reagents biochemically engineered;

6. Biosynthesis signal processing control platform studies;

7. Molecular biological population level behavior dynamic simulation modeling complexes; and,

8. Subcellular device physics effects and cellular device physics effects within biological component systems, using real-time non-destructive observation study techniques.

[ PHOTO ( above ): legacy MicroFlyer, only a Microelectronic Aerial Vehicle – MAV ( click to enlarge ) ]

Bioengineered MicroBots Developed & Deployed –

Battlefields now require ‘unmanned combat aerial vehicles’ ( UCAV ) and ‘advanced weapons’ that self-navigate and self-reconfigure, with autonomous communication systems accomplishing time-critical commands; however, while many use Commercial Off The Shelf ( COTS ) products, such is not the case for developed and deployed bioengineered microrobots.

MicroBot AMR Control By MARS –

The DARPA Mobile Autonomous Robot Software ( MARS ) Project is designed to develop and transition currently unavailable software technologies for programming the operations of autonomous mobile robots ( AMR ) in partially known, changing and unpredictable environments.

The DARPA SIMBIOSYS Program aims to provide new software removing humans from combat, conveyance, reconnaissance, and surveillance processes by:

1. Extending military hardware range;

2. Lowering manpower costs;

3. Removing human physiology for swifter concepts, designs, engineering, development, and deployment successes; and,

4. Researchers demonstrating autonomous navigation of humanoid robots, unmanned military vehicles and autonomous vehicles, and their interactions with humans ( a minimal navigation sketch follows this list ).
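MARS software itself is not public; purely as a hypothetical illustration of what autonomous navigation in a partially known, changing environment involves, the sketch below plans a grid path and replans whenever the robot discovers an obstacle its map did not contain.

```python
# Hypothetical sketch of navigation in a partially known environment:
# plan a path on the robot's map, follow it, and replan whenever a
# previously unmapped obstacle is discovered along the way.
from collections import deque

def plan(grid, start, goal):
    """Breadth-first path on a 2-D occupancy grid; returns a path or None."""
    prev, queue = {start: None}, deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        x, y = cell
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if grid.get(nxt) == 0 and nxt not in prev:
                prev[nxt] = cell
                queue.append(nxt)
    return None

def navigate(known_map, true_map, start, goal):
    """Walk toward the goal, updating the map and replanning as needed."""
    pos = start
    while pos != goal:
        path = plan(known_map, pos, goal)
        if path is None:
            return None                     # goal unreachable with current map
        nxt = path[1]                       # next step on the planned route
        if true_map.get(nxt) == 1:          # sensor reveals a real obstacle
            known_map[nxt] = 1              # record it and replan
        else:
            pos = nxt
    return pos

# Example: a 4x4 grid with an unmapped obstacle at (3, 1), discovered en route.
known = {(x, y): 0 for x in range(4) for y in range(4)}
true = dict(known)
true[(3, 1)] = 1
print(navigate(known, true, start=(0, 0), goal=(3, 3)))   # -> (3, 3)
```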

DARPA indicates that robots – to be meaningful – must be fully integrated into human lives in military, commercial, educational and domestic usages, and must be capable of interacting in more natural human ways.

DARPA funded research and development of robots given similar bodies and human-like intelligence, with humanoid interaction providing new ways of engaging the human world.

COG Robot –

DARPA funded Massachusetts Institute of Technology ( MIT ) researchers, employing a set of sensors and actuators ( with small microcontrollers for joint level control, up to larger audio-visual digital signal network pre-processors for controlling different levels of its heterogeneous hierarchy network ) approximating human body sensory and motor dynamics, created the robot named COG that eventually allowed DARPA further development of deployable, modular, reconfigurable and autonomous robots.

[ PHOTO ( above ): legacy Biomorphic Explorers – Snakes and Bats ( click to enlarge ) ]

CONRO Robots –

CONRO robots, developed through DARPA, employed autonomous capabilities of:

1. Self-repair; and,

2. Morphogenesis ( changing shapes ).

Examples, amongst many, included designs styled as:

Snake robots, able to move ‘into’ and ‘out of’ tight spaces; and,

Insect robots, able to move faster ( covering more ground to meet swifter military mission needs ).

[ PHOTO ( above ): legacy Spider, and Payload biochemical delivery simulation ( click to enlarge ) ]

CONRO robots were designed and equipped to perform two ( 2 ) missions:

1. Reconnaissance ( activity detection, monitoring, and reporting – surveillance ); and,

2. Deliver small ‘military payloads’ ( bio-chemical weapons, etc. ) into ‘enemy occupied remote territory locations’ ( away-from friendly warfighters ).

CONRO robots are composed of multiple SPIDERLINK modules.

In 1999, DARPA built both ‘snakes’ and ‘hexapods’ as ‘initially tethered’ prototypes termed 1-DOF, equipped with abilities to both ‘dock’ and ‘gait ambulate’ based on applied computational algorithms.
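The actual CONRO gait algorithms are not reproduced here; as a hypothetical sketch of what computational ‘gait ambulation’ can look like, the Python below drives a chain of identical modules with the same sinusoidal joint cycle shifted by a fixed phase per module, producing a travelling-wave ( serpentine ) gait.

```python
# Hypothetical phase-offset gait for a chain of identical modules: every
# module follows the same sinusoidal joint cycle, delayed by a fixed phase
# per module, so the body ambulates as a travelling wave.
import math

def joint_angles(t, num_modules=8, amplitude_deg=30.0,
                 period_s=2.0, phase_step=math.pi / 4):
    """Commanded joint angle (degrees) for each module at time t (seconds)."""
    omega = 2 * math.pi / period_s
    return [amplitude_deg * math.sin(omega * t - i * phase_step)
            for i in range(num_modules)]

# Example: commanded angles a quarter of the way through one gait cycle.
print([round(a, 1) for a in joint_angles(t=0.5)])
```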

In 2000, DARPA had twenty ( 20 ) autonomous self-sufficient ‘modules’ – without mentioning what those resembled – built and designated 2-DOF, after:

1. Hormone-based control theory was developed and tested;

2. Hormone-controlled hexapod and snake motions were implemented ( for 2-DOF );

3. Quadruped, hexapod and snake locomotion was implemented with centralized control for 2-DOF;

4. Morphing self-repair ‘modules’ delivering small payloads and using ‘miniature cameras’ were designed and tested; and,

5. Snake head-to-tail docking capabilities were configured and implemented in laboratory two-dimensional ( 2-D ) testing.

CONRO DARPA Near-Term Milestones:

1. Modules’ reconfigurability ( morphogenesis ) robust automation designed and demonstrated ( for 2-DOF );

2. Topology ‘discovery’ ( automatic topography recognition ) demonstration;

3. Gait reconfiguration ( morphogenesis ) automation for ambulating a ‘given’ ( programmed instruction ) topology designed and demonstrated;

4. Gait reconfiguration ( morphogenesis ) automation for ambulating a ‘discovery’ ( automatic topography recognition ) designed and demonstrated;

5. Wireless ( radio frequency, infra-red, etc. ) control of miniature cameras demonstrated;

6. Pointing ( waving, mousepad, etc. ) control of miniature cameras demonstrated; and,

7. Large scale deployment of CONRO robots demonstrated.

[ PHOTO ( above ): DARPA BioBot named Blaberus ( click to enlarge ) ]

Deployer Robot ( DR ) –

Deployer Robots ( DR ) ‘support’ and ‘deploy’ distributed ‘teams of other smaller robots’ termed “Joeys” ( singular, “Joey” ) that perform either ‘hazardous tasks’ or ‘tedious tasks’.

Deployer Robots ( DR ) have two ( 2 ) roles, that:

1. Carry and launch given numbers of smaller Joey robots ( Joeys ); and,

2. Command and control ( C2 ) – after launching – Joey robots ( Joeys ).

[ PHOTO ( above ): legacy CyberLink HID test USAF personnel with DARPA robots ( click to enlarge ) ]

Robot Loop Pyramid –

The Robot-in-the-Loop ( RIL ) concept augments Human-in-the-Loop ( HIL ), building a ‘pyramid of robots’ supervised by one ( 1 ) person.

‘Launch’ and ‘Command and Control’ ( C2 ) of the various Joey robots are two ( 2 ) goals handled independently, as:

1. ‘Launch’ of robots, via grenade-sized clusters of multiple Joey robots, was developed under the DARPA Deployer Robot ( DR ) Program with the availability of smaller Joeys; and,

2. ‘Command and Control’ ( C2 ) is investigated using ‘larger robots’ developed for the DARPA ITO sister Software for Distributed Robotics ( SDR ) Program, enabling full leverage of both the Deployer Robot ( DR ) Program and the Software for Distributed Robotics ( SDR ) Program in developing algorithms that exploit heterogeneous interaction between a ‘smart’, highly mobile ‘Deployer Robot’ ( DR ) and a ‘team’ of Joey robots that are less powerful, less computational and less mobile ( see the sketch below ).
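None of the DR or SDR control software is public; as a purely illustrative sketch of the ‘one deployer, many simple robots’ hierarchy described above, the Python below has a hypothetical Deployer object broadcast a task to a team of minimal Joey objects and collect their status reports.

```python
# Hypothetical command-and-control sketch: one deployer broadcasts a task
# to a team of simple "Joey" robots and collects their reports. This
# illustrates the hierarchy only; it is not DARPA's DR / SDR software.

class Joey:
    def __init__(self, ident):
        self.ident = ident

    def execute(self, task):
        # A real Joey would drive motors and read sensors here.
        return {"id": self.ident, "task": task, "status": "done"}

class Deployer:
    def __init__(self, num_joeys):
        self.team = [Joey(i) for i in range(num_joeys)]

    def broadcast(self, task):
        """Send one task to every Joey and gather their status reports."""
        return [joey.execute(task) for joey in self.team]

print(Deployer(num_joeys=4).broadcast("survey-room"))
```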

[ PHOTO ( above ): legacy Virtual Combiman digital glove waving battlespace management ( click to enlarge ) ]

DARPA examined key universal elements of robot deployment:

1. Emplacement – Launching and dynamically situating the Joeys for mission goals;

2. Operations – Maintaining the infrastructure to support the distributed front, including communications and error detection and recovery ( e.g., getting back on course after positional drift ); and,

3. Recovery – Collecting Joey robot data after delivery and assembling it into a format useful for the human operator.

DARPA Deployer Robot ( DR ) Program development acquired and refitted two ( 2 ) Urban Robot Upgrades ( URU ) as new Deployer Robot ( DR ) types.

DARPA investigated five ( 5 ) alternate launch strategies, but selected only one ( 1 ): grenade barrel launch, delivering robots into a three ( 3 ) story building.

The grenade barrel launcher was designed, equipped and developed with:

1. Grenade magazine – contains ‘multiple Joey robots’ for ejection and supports full mobility integrity of the Deployer Robot ( DR );

2. Sensor mast ( collapsible ) – for Deployer Robot ( DR ) interaction with Joey robots launched on arrival at the destination location; and,

3. Communication ( 916 MHz ) link between Deployer Robot ( DR ) and Joey robots.

DARPA SDR Program –

DARPA Software for Distributed Robotics ( SDR ) Program development designed and built Joey robot prototypes ( approximately 3-1/2 inch cube ) for ultimate fabrication in a production lot quantity of 120 Joey robot units.

The DARPA Software for Distributed Robotics ( SDR ) Program leverages and adapts controls for swarms of Joey robots.

DARPA Near-Term Milestones:

1. Launch propulsion mechanisms ( CO2 cartridge, .22 caliber shell, or other ) deployment testing of Joey robots into battlefield areas;

2. Launcher mechanism ( for multiple Joey robot deployment ) built on-board the first ( 1st ) Deployer Robot ( DR ), named Bandicoot;

3. Sensor mast ( collapsible ) built and installed on-board the second ( 2nd ) Deployer Robot ( DR ), named Wombat;

4. Radio Frequency ( RF ) development protocols for interaction between Deployer Robot ( DR ) and Joey robots;

5. Infra-Red ( IR ) deployment protocols for interaction mechanisms between Deployer Robot ( DR ) and Joey robots; and,

6. Human Interface Device ( HID ) operator remote control unit ( ORCU ) development for Deployer Robot ( DR ).

DARPA SIMBIOSYS began over 12-years ago. All the photographs ( above ) are almost one decade ( 10-years ) old.

Current careful research on this subject further provides more information about where the U.S. stands today.

Submitted for review and commentary by,

Concept Activity Research Vault ( CARV ), Host
E-MAIL: ConceptActivityResearchVault@Gmail.Com
WWW: http://ConceptActivityResearchVault.WordPress.Com

Reference

http://web.archive.org/web/20021214110038/http://groups.msn.com/AnExCIA/rampdintell.msnw