Continental Plates Split Continues Elsewhere

A diagram of the shifts in the San Andreas fault.

by, Kentron Intellect Research Vault ( KIRV )

April 30, 2013 ( Originally Published: September 27, 2012 )

CALIFORNIA, Palos Verdes Peninsula – April 30, 2013 – Earthquake magnitudes and the frequency of earthquake swarms are increasing, and according to a series of three ( 3 ) recent reports written by teams of geophysicists, it appears continental plates are splitting open, with the spreading now reaching other tectonic plates elsewhere and growing.

Drawing on mega earthquakes that unfortunately went publicly unrecognized for several months, the reports indicate a growing trend of ultra-deep oceanic earthquakes spreading elsewhere.

The December 26, 2004 magnitude 9.1 – 9.3 oceanic earthquake ( the “Boxing Day Earthquake” ) produced an Indian Ocean tsunami that killed roughly 230,000 people across several nations ( e.g. Indonesia, Sri Lanka, Thailand, and India ).

These ultra-deep oceanic mega earthquakes crack open continental plates, displacing land and sending ocean waters traveling across entire basins.

The March 2011 Japan undersea Pacific Ocean magnitude 9.0 earthquake split open an entire undersea mountain range roughly 300 miles long, and the resulting tsunami killed more than 15,000 people along Japan’s northeast coast.
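To put magnitude numbers in perspective, radiated seismic energy scales steeply with magnitude: log10(E) grows as roughly 1.5 × Mw, so each whole magnitude step releases about 32 times more energy. A minimal Python sketch of that standard Gutenberg-Richter scaling (the specific magnitudes compared are just illustrative):

```python
def energy_ratio(m1: float, m2: float) -> float:
    """Ratio of radiated seismic energy between two magnitudes.

    Standard Gutenberg-Richter scaling: log10(E) ~ 1.5 * Mw + const,
    so each whole magnitude step is about a 31.6x jump in energy.
    """
    return 10 ** (1.5 * (m1 - m2))

# A magnitude 9 event versus a magnitude 7 event:
print(round(energy_ratio(9.0, 7.0)))  # -> 1000
```

This is why a single magnitude 9 event dwarfs even a large swarm of magnitude 6 and 7 earthquakes.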

On April 11, 2012, back-to-back earthquakes of magnitude 8.6 and 8.2 released an unusual ‘energy factor’ of “Seismic Radiation” travelling at a speed of roughly 7,000 miles per hour. With so much power delivered in rapid succession, the multiple strikes ‘cracked open a major continental plate’ in the Indian Ocean and triggered ‘other tectonic plates’ toward the Pacific Ocean to begin splitting open as well.
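As a quick sanity check on the quoted figure, 7,000 miles per hour converts to roughly 3.1 km/s, which is in the same range as shear-wave speeds in crustal rock. A one-line conversion sketch:

```python
MPH_TO_KM_S = 1.609344 / 3600.0   # miles/hour -> kilometres/second

speed_km_s = 7000 * MPH_TO_KM_S
print(f"{speed_km_s:.2f} km/s")   # -> 3.13 km/s
```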

From continental plates in the Indian Ocean, the activity is growing toward the immense Pacific Ocean geographic region known as the “Asia-Pacific Ring of Fire,” where nations from Australia up to Japan and Russia, over to Alaska, and down to California and Argentina have become activated with an unusual continental plate energy.

Unexpected surprises, which also went publicly unrecognized, occurred a few months ago in Southern California.

During August 2012, strange anomalies began occurring: more than 400 earthquakes in a single day centered around a little-known ‘volcano’ buried in the Salton Sea near Bombay Beach and Brawley, California, right on top of the southern end of the San Andreas Fault Zone. A “Great Earthquake” has not occurred there in hundreds of years, a quiet spell U.S. Geological Survey ( USGS ) geophysicists now believe is far too long overdue.
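For scale, 400 earthquakes in one day works out to one event roughly every 3.6 minutes on average. A trivial arithmetic sketch:

```python
quakes = 400
seconds_per_day = 24 * 60 * 60   # 86,400 seconds in a day

mean_interval_s = seconds_per_day / quakes
print(mean_interval_s)           # -> 216.0 (about one event every 3.6 minutes)
```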

To read more about the Indian Ocean continental plate cracking and the splitting spreading toward the Pacific Ocean, review the three ( 3 ) brief reports, global charts, graphs, and films at these official websites ( below ):

Research References

#1. REPORT:

http://www.nature.com/nature/journal/vaop/ncurrent/full/nature11492.html

ADOBE ADDITIONAL DATA:

http://www.ncbi.nlm.nih.gov/pubmed/23060185

MOVIE CLIP DATA ( ADDITIONAL ):

http://es.ucsc.edu/~thorne/NATURE_SUMATRA/YLK_SI_Movies.html

http://www.nature.com/nature/journal/vaop/ncurrent/extref/nature11492-s2.mov

#2. REPORT:

http://www.nature.com/nature/journal/vaop/ncurrent/full/nature11504.html

ADOBE ADDITIONAL DATA:

SUPPLEMENTARY DATA ( PDF ): nature11504-s1.pdf

#3. REPORT:

http://www.nature.com/nature/journal/vaop/ncurrent/full/nature11520.html

ADOBE ADDITIONAL DATA:

SUPPLEMENTARY DATA ( PDF ): nature11520-s1.pdf

Submitted for review and commentary by,

 

Kentron Intellect Research Vault
E-MAIL: KentronIntellectResearchVault@Gmail.Com
WWW: KentronIntellectResearchVault.WordPress.Com

– –

AntiMatter Technology Problems


by, Paul Collin, host of Concept Activity Research Vault ( CARV ) on May 15, 2011 ( Originally Published: May 10, 2011 )

CALIFORNIA, Los Angeles – May 15, 2011 – The global scientific community is eyeing suspiciously a 1952 ‘experimental projects’ organization known as the Conseil Européen pour la Recherche Nucléaire ( CERN ) [ also known as the European Organization for Nuclear Research ], whose Large Hadron Collider ( LHC ) – a huge high-energy particle collider 27 kilometres ( about 17 miles ) in circumference – is conducting some extremely serious experiments. These involve what scientists and physicists call “CP-violation,” along with the creation of a variety of new subatomic particles believed never to have existed anywhere on Earth.

There is quite a bit of controversy concerning something called a “strangelet” ( strangelets ) and other particle creations within the CERN experiments. Because of conjectures in scientific theories, some professionals fear the experiments could create a ‘new subatomic particle’ that may upset the balance of Earth as we know it. Even more frightening, if something goes out of control it may take anywhere between 1 and 5 years before anyone notices that a chain reaction has already begun – one some have already identified as a ‘micro-blackhole’ that could theoretically begin consuming Earth from within its own magnetic iron core. It may sound like ‘science fiction’, but apparently CERN experiments are ‘definitely not’ something to be taken lightly.

This serious and highly controversial subject amongst scientists and physicists around the world is touched on in this report, along with other related information, including video clips ( below ) for a better understanding of the many aspects of public knowledge not being addressed by mainstream news broadcasts.

CERN went even further, though, expanding its deep underground experiments into outer space with what it calls the Alpha Magnetic Spectrometer ( AMS / AMS-02 ), now scheduled for launch aboard the U.S. Space Shuttle Endeavour STS-134 mission set for May 16, 2011. The AMS-02 is to be delivered to the International Space Station ( ISS ), where it will continue CERN-designated experiments.

Interestingly, during July 2010 the Alpha Magnetic Spectrometer ( AMS / AMS-02 ) was ‘not’ launched as the video clip ( above ) depicted. The AMS-02, often equated to the Hubble space telescope, actually holds far more technological advancement from CERN and is designed chiefly to focus on the subatomic particles surrounding antimatter issues.

U.S. Space Shuttle Endeavour mission STS-134 was scheduled to launch in mid-April 2011 but was delayed to the end of that month, and then delayed yet again until May 16, 2011. Why so many delays and reschedulings?

Earth antimatter issues are rarely addressed with the public by the mainstream news media. However, in light of the recent NASA public warning that it is expecting a ‘significant’ “solar flare” to erupt and come bound for Earth – something “we all need to be concerned about” – the Alpha Magnetic Spectrometer ( AMS ), recently placed aboard the U.S. Space Shuttle Endeavour mission scheduled to deliver it to the International Space Station ( ISS ), is something the public really needs to take a closer look at.

AMS-02 onboard ISS

[ PHOTO ( above ): Alpha Magnetic Spectrometer ( AMS / AMS-02 ) in U.S. Space Shuttle Endeavour cargo bay April 2011 ( click to enlarge ) ]

– –

Source: Nature.Com

AntiUniverse Here We Come by, Eugenie Samuel Reich

May 4, 2011

A controversial cosmic ray detector destined for the International Space Station will soon get to prove its worth.

The next space-shuttle launch will inaugurate a quest for a realm of the Universe that few believe exists.

Nothing in the laws of physics rules out the possibility that vast regions of the cosmos consist mainly of anti-matter, with anti-galaxies, anti-stars, even anti-planets populated with anti-life.

“If there’s matter, there must be anti-matter. The question is, where’s the Universe made of antimatter?” says Samuel C.C. Ting, a Nobel prize-winning physicist at the Massachusetts Institute of Technology ( MIT ) in Cambridge, Massachusetts. But most physicists reason that if such antimatter regions existed, we would have seen the light emitted when particles annihilated each other along the boundaries between the antimatter and matter realms. No wonder Ting’s brainchild – a $2-billion space mission sold ‘partly on the promise of looking for particles emanating from anti-galaxies’ – is fraught with controversy.

Ting’s project, however, has other, ‘more mainstream scientific goals’, so most critics held their tongues last week as the U.S. Space Shuttle Endeavour STS-134 mission – prepared to deliver the Alpha Magnetic Spectrometer ( AMS version known as the AMS-02 ) to the International Space Station ( ISS ) – was delayed ( because of problems ) until later this month ( May 2011 ).

Pushing The Boundaries –

Seventeen ( 17 ) years in the making, the Alpha Magnetic Spectrometer ( AMS ) is a product of former NASA administrator Dan Goldin’s quest to find remarkable science projects for the International Space Station ( ISS ) and of Ting’s fascination with antimatter.

Funded by NASA, the U.S. Department of Energy ( DOE ), plus a sixteen ( 16 ) country consortium of partners, the Alpha Magnetic Spectrometer ( AMS ) has prevailed – despite delays and technical problems – over the doubts of many high-energy and particle physicists.

“Physics is not about doubt,” says Roberto Battiston, deputy spokesman for the Alpha Magnetic Spectrometer ( AMS ) and physicist at the University of Perugia, Italy. “It is about precision measurement.”

As the Alpha Magnetic Spectrometer ( AMS ) experiment headed to the Space Shuttle Endeavour launch pad, Battiston and other scientists were keen to emphasize its ‘unprecedented sensitivity’ to the gamut of cosmic rays that rain down on Earth, which should allow the AMS to do two ( 2 ) things:

1. Measure high-energy charged Cosmic Ray ‘particles’ and their ‘properties’, sent from:

– the Sun; – distant Supernovae; and, – Gamma ( γ ) Ray Bursts ( GRB ).

AND,

2. Detect errant chunks of AntiMatter, sent from the far-away Universe.

Cosmic rays can only be detected indirectly on Earth, via the showers of ‘secondary particles’ produced when they slam into molecules high in the atmosphere; the Alpha Magnetic Spectrometer ( AMS ) in space will get an undistorted view.

“We’ll be able to measure ( solar ) Cosmic Ray Flux very precisely,” says collaboration member physicist Fernando Barão of the Laboratory of Instrumentation and Experimental Particle Physics ( in Lisbon, Portugal ). “The best place ( for detecting this ) is to be in ‘space’ because you don’t have Earth’s atmosphere that is going to destroy those cosmic rays.”

No matter what happens with the more speculative search for antimatter, the Alpha Magnetic Spectrometer ( AMS ) should produce a definitive map of the cosmic ray sky – helping to build a kind of ‘astronomy not dependent on light’.

The Alpha Magnetic Spectrometer ( AMS ) consists of a powerful permanent magnet surrounded by a suite of particle detectors.

Over the 10 years ( or more ) that the Alpha Magnetic Spectrometer ( AMS ) experiment will run, its magnet will bend the paths of cosmic rays by an amount that reveals their energy and charge, and thereby their identity.
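The bending described here is ordinary magnetic deflection: a charged particle of momentum p in a field B follows an arc of radius r = p / (qB), and a spectrometer measures the tiny bow ( sagitta ) of that arc across its tracker. A hedged Python sketch; the 0.15 T field and 1 m track length are illustrative assumptions for a permanent-magnet spectrometer, not published AMS-02 specifications:

```python
E_CHARGE = 1.602176634e-19   # elementary charge, C
C_LIGHT = 299792458.0        # speed of light, m/s

def bend_radius_m(p_gev_per_c: float, charge_e: float, b_tesla: float) -> float:
    """Radius of curvature r = p / (qB) for a charged particle in a field."""
    p_si = p_gev_per_c * 1e9 * E_CHARGE / C_LIGHT   # momentum, kg*m/s
    return p_si / (abs(charge_e) * E_CHARGE * b_tesla)

def sagitta_mm(p_gev_per_c: float, charge_e: float,
               b_tesla: float, track_m: float) -> float:
    """Bow of the curved track over a straight chord: s = L^2 / (8r), in mm."""
    return track_m ** 2 / (8.0 * bend_radius_m(p_gev_per_c, charge_e, b_tesla)) * 1e3

# A 10 GeV/c proton in an assumed 0.15 T field, measured over a 1 m track:
print(round(bend_radius_m(10.0, 1, 0.15)))       # bend radius in metres (~222)
print(round(sagitta_mm(10.0, 1, 0.15, 1.0), 2))  # sagitta in mm (~0.56)
```

A sub-millimetre sagitta is why the tracker must resolve particle positions to microns, and an antiparticle of the same momentum bows the same amount in the opposite direction.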

Some will be ‘heavy atomic nuclei’, while others ( made from antimatter ) will reveal themselves by ‘bending in the opposite direction’ from their ‘matter’ counterparts ( see, e.g. cosmic curveballs ).

By ‘counting positrons’ ( i.e. antimatter ‘electrons’ ), the Alpha Magnetic Spectrometer ( AMS ) could also ‘chase a tentative signal of dark matter’, the so-far ‘undetected stuff’ thought to account for ‘much of the mass of the Universe’.

In 2009, Russian and Italian researchers – with the Payload for Antimatter Matter Exploration and Light-nuclei Astrophysics ( PAMELA ) onboard a Russian satellite – published evidence of an ‘excess of positrons in the space environment surrounding Earth’ ( O. Adriani et al. Nature 458, 607–609; 2009 ). One potential source is the ‘annihilation of dark-matter particles’ within the ‘halo enveloping our Galaxy’.

Another speculative quest is to follow up on hints of ‘strange matter’, a ‘hypothetical substance’ thought to be found in ‘some collapsed stars’, containing ‘strange quarks’ in addition to the ‘up quarks’ and ‘down quarks’ found within ordinary nuclei.

NASA Alpha Magnetic Spectrometer ( AMS ) program manager Mark Sistilli says hints of ‘strange matter’ were seen during a 1998 pilot flight of the first detector version ( AMS-01 ) aboard the Space Shuttle; however, NASA determined the results ‘too tentative to publish’.

Because the Alpha Magnetic Spectrometer ( AMS / AMS-02 ) was given the status of an “exploration mission,” it ‘did not need to follow’ the “peer review” NASA would ‘normally have required’ for a “science mission.”

But Sistilli emphasizes the Alpha Magnetic Spectrometer ( AMS ) passed with flying colors before committees convened by the U.S. Department of Energy ( DOE ), which is supplying $50 million of the funding.

Now their ( DOE ) confidence will be put to the test.

Reference

http://www.nature.com/news/2011/110504/full/473013a.html

While for some it may appear that strangelet and subatomic antimatter particle research advances our knowledge of unlocking the secrets of life in the Universe, others are still asking NASA what it really knows about ‘why’ an ‘expected significant’ Solar Energetic Particle Event ( SEPE ) is something “we all need to be concerned about” on Earth.

With Solar Energetic Particle Event ( SEPE ) high-energy effects capable of disrupting Earth’s ground-based and space-based electrical components and electricity grid infrastructure systems for up to 10 years, many wonder why billions upon billions of dollars were – and still are – being pumped into the CERN project studying ‘strangelets’. People want to know why we do not instead have ‘more immediate information detection capabilities’ for high-energy solar flare proton and electron ejections coming toward Earth soon, which NASA and other agencies ‘know far more about’ than they are willing to tell the public.

How advanced have government authorities grown from private-sector science and technology knowledge? The United States has already mapped internal plasma flows of the Sun.

How could the U.S. government possibly ‘see inside the Sun’ to know when a ‘direct or near-direct Earth-facing’ Coronal Mass Ejection ( CME ) from a solar flare would occur in the future?

In layman’s terms, for government it was like looking through a clear glass Pyrex bowl atop a stove burner, watching as water starts to boil inside it, and then predicting – based on the flame heating the water – when bubbles will come to the surface. The capability rests on a government ‘ground-based’ observatory telescope ( it does ‘not’ require ‘space-based placement’ ) equipped with a “super lens” used for imaging ( observing ) ‘objects at great distances inside matter’ – a “superlens” that now even ‘defies light-speed’ and ‘matter’. ( Read Below )

– –

[ PHOTO ( above ): Antimatter photon ‘optic’ substrate structure material for ‘subsurface solar imaging plasma flows’ inside Sun enables plotting Coronal Mass Ejections ‘before solar surface eruptions’ ( click to enlarge ) ]

Source: U.S. Department of Energy, Lawrence Berkeley National Laboratory, Operated by the University of California

Optical Antimatter Structure Shows The Way For New Super Lens by, Aditi Risbud

April 21, 2009

A device made from alternating layers of ‘air’ and ‘silicon photonic crystal’ behaves like a ‘super lens’ – providing the first experimental demonstration of optical antimatter.

Scientists at Berkeley Lab ( Berkeley, California, USA ) and the Institute for Microelectronics and Microsystems ( CNR ) in Naples, Italy have experimentally demonstrated – for the first time – the ‘concept of optical antimatter’ by ‘light traveling through a material without being distorted’.

By engineering a material that focuses light through its own internal structure, a beam of light can enter and exit ( unperturbed ) after traveling through millimeters of material.

For years, optics researchers have struggled to bypass the ‘diffraction limit’, a physical law restricting imaging resolution to about 1/2 the wavelength of light used to make the image.

If a material with a negative index of refraction ( a property describing how light bends as it enters or exits a material ) could be designed, this diffraction hurdle could be lowered.

Such a material could also behave as a superlens, letting imaging equipment observe objects with ‘details finer than allowed by the diffraction limit’.
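The diffraction limit mentioned above is commonly stated as the Abbe criterion, d = λ / (2 · NA), where NA is the numerical aperture of the optics. A minimal sketch; the 1550 nm wavelength and the ideal NA of 1.0 are illustrative choices, not figures from the experiment:

```python
def abbe_limit_nm(wavelength_nm: float, numerical_aperture: float = 1.0) -> float:
    """Smallest resolvable feature for conventional optics: d = lambda / (2 * NA)."""
    return wavelength_nm / (2.0 * numerical_aperture)

# Near-infrared light at 1550 nm through an ideal NA = 1.0 system:
print(abbe_limit_nm(1550.0))  # -> 775.0 (nanometres)
```

A negative-index superlens is interesting precisely because it is not bound by this λ/2 floor.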

Despite the intriguing possibilities posed by a substance with a negative index of refraction, ‘this property is inaccessible through naturally occurring ( positive index ) materials’.

In the late 1990s, English theoretical physicist Sir John Pendry proposed his clever ‘sleight of light’ using so-called metamaterials – engineered ‘materials’ whose underlying structure ‘can alter overall responses’ to ‘electrical fields’ and ‘magnetic fields’.

Inspired by the Sir John Pendry proposal, scientists have made progress in scaling metamaterials from microwave to infrared wavelengths while illuminating the nuances of light-speed and direction-of-motion in such engineered structures.

“We’ve shown a ‘completely new way to control and manipulate light’, ‘using a silicon photonic crystal’ as a ‘real metamaterial’ – and it works,” said Stefano Cabrini, Facility Director of the Nanofabrication Facility at the Molecular Foundry, a U.S. Department of Energy ( DOE ) User Facility located at Lawrence Berkeley National Laboratory ( LBNL ) that provides support to nanoscience researchers around the world.

“Our findings will open-up an easier way to make structures and use them effectively as a ‘super-lens’.”

Through the Molecular Foundry user program, Cabrini and post-doctoral researcher Allan Chang collaborated with Vito Mocella, a theoretical scientist at the Institute of Microelectronics and Microsystems ( CNR ) in Naples, Italy to fabricate a 2 X 2 millimeter device consisting of alternating layers of air and a silicon based photonic crystal containing air holes.

Using high precision nanofabrication processes, the team designed the spacing and thicknesses of each layer to behave like the metamaterial Sir John Pendry had envisioned.

This device was then used to focus a beam of near-infrared ( IR ) light, essentially ‘annihilating’ 2 millimeters of ‘space’.
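The ‘annihilated space’ intuition: equal thicknesses of positive-index ( n = +1 ) and negative-index ( n = −1 ) material contribute equal and opposite optical path, so the stack’s thickness-weighted average index is zero and no net phase accumulates across it. A toy sketch under that idealization ( the layer thicknesses below are arbitrary illustrative values, not the device’s actual geometry ):

```python
def average_index(layers):
    """Thickness-weighted mean refractive index of a layer stack.

    layers: iterable of (refractive_index, thickness_m) pairs.
    """
    total_thickness = sum(t for _, t in layers)
    return sum(n * t for n, t in layers) / total_thickness

# Alternating air (n = +1) and an idealized negative-index crystal (n = -1),
# equal 0.5 mm thicknesses, 2 mm of material in total:
stack = [(+1.0, 0.5e-3), (-1.0, 0.5e-3)] * 2
print(average_index(stack))  # -> 0.0
```

A zero average index is what the paper’s title calls a “quasi zero average index metamaterial”: light exits as if the 2 mm of material were not there.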

“Now that we have a prototype to demonstrate the concept, our next step will be to find the geometry and material that will work for visible light,” said Cabrini.

Along with possibilities in imaging, the researchers’ findings could also be used to develop hybrid negative-index and positive-index materials, Cabrini added, which may lead to novel ‘devices’ and ‘systems’ unachievable through either material alone.

“Self-collimation of light over millimeter-scale distance in a quasi zero average index metamaterial,” by Vito Mocella, Stefano Cabrini, Allan S.P. Chang, P. Dardano, L. Moretti, I. Rendina, Deirdre Olynick, Bruce Harteneck and Scott Dhuey, appears in Physical Review Letters and is available online.

Portions of this work were supported by the U.S. Department of Energy ( DOE ) Office of Science, Office of Basic Energy Sciences under Contract No. DE-AC02-05CH11231.

The Molecular Foundry is one ( 1 ) of five ( 5 ) U.S. Department of Energy ( DOE ) Nanoscale Science Research Centers ( NSRC ), premier national user facilities for interdisciplinary research at the nanoscale. Together, the NSRCs comprise a suite of complementary facilities providing researchers with state-of-the-art capabilities to fabricate, process, characterize and model nanoscale materials, and constitute the ‘largest infrastructure investment’ of the National Nanotechnology Initiative ( NNI ).

U.S. Department of Energy ( DOE ) Nanoscale Science Research Centers ( NSRC ) are located at these six ( 6 ) locations:

– Argonne National Laboratory ( ANL ); – Brookhaven National Laboratory ( BNL ); – Lawrence Berkeley National Laboratory ( LBNL ); – Oak Ridge National Laboratory ( ORNL ); – Sandia National Laboratory ( SNL ); and, – Los Alamos National Laboratory ( LANL ).

For more information about the DOE NSRCs, please visit http://nano.energy.gov.

Berkeley Lab is a U.S. Department of Energy ( DOE ) National Laboratory located in Berkeley, California conducting ‘unclassified scientific research’ managed by the University of California.

References

http://www.lbl.gov
http://foundry.lbl.gov
http://newscenter.lbl.gov/feature-stories/2009/04/21/optical-antimatter

– –

If the public could keep its eyes open for one second, it would see what is coming before it hits them as a surprise that only government knows anything about. Governments, however, have conveyed their double-speak vernacular to citizens over a very long period of decades. Perhaps a mere fact ‘known today’ may eventually come as no surprise to the many who would otherwise have been kept in the dark while only a few knew far more about what awaits the masses.

Perhaps people may begin asking more questions of their country’s agencies, which are spending so much money so quickly for some apparently ‘mysterious emergency purpose’ – and if not for a ‘mysterious emergency purpose’, why is so much money being spent on science and space projects while the general public is told about ‘serious government budget cutbacks’ and so many people are left to suffer?

If there is no ‘emergency’, then people should know why they are suffering financially even more – just for the sake of ‘growing science experiment budgets’.

It might be a good idea for everyone to begin keeping their eyes open a little more often, trained on something more than light-hearted mainstream media news entertainment broadcasts.

If people get serious about ‘what they know’ as told on television news broadcasts, imagine how much more serious they will become when they learn what they ‘were not told’.

Think about it. How Fast Is Technology Growing?

Just beginning to grasp something ‘new’?

Now think about something even newer than the Large Hadron Collider ( LHC ) at CERN.

Reference: https://web.archive.org/web/20120921064312/http://conceptactivityresearchvault.wordpress.com/2011/05/15/antimatter-technology-problems/

BNL Time Chamber

Running Time Backwards –

The Relativistic Heavy Ion Collider ( RHIC ), which added its Solenoidal Tracker At RHIC ( STAR ), claims to “reverse time”: ultra-super computers reconstruct the subatomic particle interactions that produced the particles emerging from each collision, so STAR is believed able to “run time backward” – a process equated to examining the final products coming out of a factory when scientists and physicists have no idea ‘what kinds of machines produced the products’. Basically, they are producing things so fast they do not know how the things were formed, much less what their capabilities are. The fact is, ‘they could easily produce a monster’ and ‘not know what it is until after they are eaten by it’. Scary, really – like kids being given matches to play with.

They are being educated beyond their own intelligence, so much so that scientists and physicists ‘cannot even grasp what “it” is they are looking at’ – much less know what they are trying to manipulate to ‘see what it does next’ – and nevertheless they are conducting experiments like children playing with dynamite.

Think this is science fiction? Think they are mad scientists at play? Check the research reference links ( below ).

Think antimatter technology has advanced a lot since you began reading this report? Calculate ‘more’, because the public does not know even half of it.

Reference: https://unwantedpublicityintel.wordpress.com/2015/09/22/time-foolery/ and, Research ( 26MAY05 ): https://web.archive.org/web/20120925015521/http://www.bnl.gov/rhic/news2/news.asp?a=2647&t=today

These collider programs have been operating for roughly a decade, and the “SuperLens” was worked on ‘before’ 2002 – making ‘both’ roughly 10 years old today.

Want newer ‘news’?

Superlenses – created from perovskite oxides – are simpler and easier to fabricate than ‘metamaterials’.

Superlenses are ideal for capturing light travelling in the mid-infrared ( IR ) spectral range, opening the way to newer, highly sensitive imaging devices; and this superlensing effect can be selectively turned ‘on’ and ‘off’, opening yet another technology of ‘highly dense data storage writing’ for ‘far more advanced capability computers’.

Plasmonic whispering gallery microcavities, consisting of a silica interior coated with a thin layer of silver, ‘improve quality by better than an order of magnitude’ over current plasmonic microcavities and pave the way for ‘plasmonic nanolasers’.

Expand your knowledge: begin researching the six ( 6 ) reference links ( below ) so that the next time you watch the ‘news’ you’ll begin to realize just how much you’re ‘not being told’ about what is ‘actually far more important’ – far more than you’re used to imagining.

Submitted for review and commentary by,

Paul Collin ( Concept Activity Research Vault ) E-MAIL: UnwantedPublicity@GMAIL.com  WWW: http://ConceptActivityResearchVault.WordPress.Com

References

http://www.bnl.gov/rhic/
http://www.bnl.gov/rhic/STAR.asp
http://www.bnl.gov/bnlweb/pubaf/pr/PR_display.asp?prID=1075&template=Today
http://newscenter.lbl.gov/news-releases/2011/03/29/perovskite-based-superlens-for-the-infrared/
http://newscenter.lbl.gov/news-releases/2009/01/22/plasmonic-whispering-gallery/
http://www.nsf.gov/awardsearch/showAward.do?AwardNumber=1018060

Earth Event Alerts


[ IMAGE ( above ): IBM Stratus and Cirrus supercomputers analyze Global Environmental Intelligence ( click to enlarge ) ]


by, Kentron Intellect Research Vault [ E-MAIL: KentronIntellectResearchVault@Gmail.Com ]

August 17, 2012 19:00:42 ( PST ) Updated ( Originally Published: March 23, 2011 )

MARYLAND, Fort George G. Meade – August 17, 2012 – IBM Stratus and IBM Cirrus supercomputers, as well as CRAY XK6m and CRAY XT5 ( “Jaguar” ) massively parallel and vector supercomputers, are securely controlled via the U.S. National Security Agency ( NSA ) for analyzing Global Environmental Intelligence ( GEI ) data extracted from ground-based ( terrestrial ) monitoring stations and space-based ( extraterrestrial ) spaceborne platforms studying Earth Event ( Space Weather ) effects via High-Performance Computing ( HPC ), as well as for:

– Weather Forecasting ( including: Space Weather ); – U.S. Government Classified Projects; – Scientific Research; – Design Engineering; and, – Other Research.

[ IMAGE ( above ): CRAY XK6m supercomputers analyze Global Environmental Intelligence ( click to enlarge ) ]

CRAY INC.’s largest customers are U.S. government agencies, e.g. the U.S. Department of Defense ( DoD ) Defense Advanced Research Projects Agency ( DARPA ) and the U.S. Department of Energy ( DOE ) Oak Ridge National Laboratory ( ORNL ), which together account for about 3/4 of CRAY INC. revenue – as well as other supercomputers used worldwide by academic institutions ( universities ) and industrial companies ( private-sector firms ).

CRAY INC. additionally provides maintenance, support services and sells data storage products from partners ( e.g. BlueArc, LSI and Quantum ).

Supercomputer competitors, of CRAY INC., are:

– IBM; – HEWLETT-PACKARD; and, – DELL.

On May 24, 2011 CRAY INC. announced its new CRAY XK6 supercomputer, a hybrid supercomputing system combining its Gemini interconnect, AMD Opteron™ 6200 Series processors ( code-named: Interlagos ) and NVIDIA Tesla 20-Series GPUs into a tightly integrated, upgradeable supercomputing system capable of scaling to more than 50 petaflops ( i.e. ‘quadrillions of computing operations’ per ‘second’ ) – a multi-purpose supercomputer designed for the next generation of many-core High Performance Computing ( HPC ) applications.
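For a feel of what 50 petaflops means: a quadrillion is 10^15, so 50 petaflops is 5 × 10^16 operations per second. A back-of-the-envelope Python sketch ( the 7-billion-person, one-operation-per-second baseline is purely illustrative ):

```python
PETA = 10 ** 15

def hand_arithmetic_seconds(flops: float, population: float = 7e9,
                            ops_per_person_per_s: float = 1.0) -> float:
    """Seconds the whole population would need, at one operation per person
    per second, to match ONE second of the machine's work."""
    return flops / (population * ops_per_person_per_s)

secs = hand_arithmetic_seconds(50 * PETA)
print(secs / 86400)   # roughly 82.7 days of global hand arithmetic
```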

The SWISS NATIONAL SUPERCOMPUTING CENTRE ( CSCS ) – located in Manno, Switzerland – is CRAY INC.’s first ( 1st ) customer for the new CRAY XK6 system. CSCS promotes and develops technical and scientific services in the field of High-Performance Computing ( HPC ) for the Swiss research community, and is upgrading its CRAY XE6m system ( nicknamed: Piz Palu ) into a multiple-cabinet new CRAY XK6 supercomputer. CSCS supports scientists working in:

– Weather Forecasting; – Physics; – Climatology; – Geology; – Astronomy; – Mathematics; – Computer Sciences; – Material Sciences; – Chemistry; – Biology; – Genetics; and, – Experimental Medicine.

Data additionally analyzed by these supercomputers, include:

– Ultra-Deep Sea Volcanoes located in continental plate fracture zones several miles beneath ocean basins ( e.g. the Asia-Pacific Rim, also known as the “Pacific Ring of Fire,” where a circum-Pacific seismic belt of earthquakes frequently impacts areas far across the Pacific Ocean in the Americas ).

Global geoscience recognizes that ‘ground-shaking’ earthquakes hide a lot of what people are actually walking on top of: large geographic land masses known as ‘continental shelves’ or “continental plates” that move ( tectonics ) because of superheated, pressurized ‘extrasuperconducting’ magnetic energy properties released from molten magma material violently exploding beneath the surface of the Earth in the ultra-deep seas.

[ IMAGE ( above ): Global Tectonic Plate Boundaries & Major Volcano HotSpots ( click to enlarge ) ]

Significant volcanoes are positioned like dots along this global 25,000-mile circular region known as the “Pacific Ring of Fire,” extending from south of Australia up the entire east coast of Japan, China and the Kamchatka Peninsula of Russia, across the Aleutian Islands of Alaska, and then south down the entire west coasts of North America and Latin America.

[ IMAGE ( above ): Ultra-Deep Sea Pacific Ocean Basin ( click to enlarge ) ]

The March 11, 2011 Tohoku-chiho Taiheiyo-oki Japan 9.0 earthquake held several secrets, including U.S. government contractors simultaneously monitoring a significant “moment magnitude” ( Mw ) Earth Event occurring parallel to the east coast of Japan beneath the Western Pacific Ocean, where an entire suboceanic mountain range was split in half ( south to north ) 310 miles long and split open 100 feet wide ( east to west ) – details the public was neither aware of nor told about.
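The “moment magnitude” referred to here follows from the seismic moment M0 = μ · A · D ( rigidity × rupture area × average slip ) via Mw = (2/3) · log10(M0) - 6.07. A hedged sketch: the 500 km × 200 km rupture ( 310 miles ≈ 500 km ), 20 m slip, and 30 GPa rigidity below are illustrative round numbers of the kind published for great subduction earthquakes, not figures taken from this article:

```python
import math

def moment_magnitude(length_m: float, width_m: float, slip_m: float,
                     rigidity_pa: float = 3.0e10) -> float:
    """Mw from a rectangular fault rupture: M0 = rigidity * area * slip (N*m),
    then Mw = (2/3) * log10(M0) - 6.07."""
    m0 = rigidity_pa * (length_m * width_m) * slip_m
    return (2.0 / 3.0) * math.log10(m0) - 6.07

# Illustrative Tohoku-like rupture: 500 km x 200 km, 20 m average slip:
print(round(moment_magnitude(500e3, 200e3, 20.0), 1))  # -> 9.1
```

The calculation shows why a rupture hundreds of miles long is exactly what a magnitude-9 event implies.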

Interestingly, the March 11, 2011 Japan earthquakes have not yet stopped: the swarm of magnitude 4.0, 5.0, 6.0 and 7.0 earthquakes continues as a direct and proximate result of erupting ‘suboceanic volcanoes’ moving these large “plates,” which are beginning to force yet others to slam into one another thousands of miles away.

Japan’s Western Pacific Ocean east coast sits on a ‘continental plate’ whose slamming point meets the west coast of North America near the Cascade mountain range ‘plate’, which can react in one ( 1 ) of two ( 2 ) ways: ‘seaward’ ( the plate thrusting toward Japan ) or ‘landward’ ( the plate thrusting toward the Pacific Northwest of the United States and/or Canada ).

What The Public Never Knew

Government leadership, globally, is acutely familiar with these types of major Earth Events, including ‘monstrous plate tectonic pushing matches’, which usually collapse one or more ‘national infrastructures’ and typically spell ‘death’ and ‘serious injuries’ for populations in developed areas.

Extremely familiar with the mass public panic that follows Earth Event catastrophes, governments hold ‘contingency actions’ pre-approved by ‘governing bodies’ and/or ‘national leadership’ Executive Order Directives. Although not advertised, these are a matter of public record: they ‘immediately call’ upon ‘all military forces’ to carry out “risk reduction” ( ‘minimization of further damages and dangers’ ) through what is referred to as “mitigation” ( ‘disaster management’ ) within “National Disaster Preparedness Planning” ( ‘national contingency measures’ ), details citizens are unaware of. Government decision-makers know a national emergency can bring temporary suspension of Constitutional Rights and a loss of freedoms, a volatile subject few care to discuss because ‘any significant natural disaster’ will result in government infringement on many civil liberties most populations are accustomed to enjoying.

Before 1 minute and 40 seconds had passed into the March 11, 2011 Tohoku, Japan earthquake ( Richter scale: M 9.0 ), key U.S. government decision-makers were discussing the major Earth Event unfolding off Japan’s east coast Plate-Boundary subduction zone beneath the ultra-deep Western Pacific Ocean, where Japan’s monstrous volcanic mountain range had split open at least 100 feet wide and cracked 310 miles long in a northern direction, headed straight for the Aleutian Islands of Alaska in the United States.

U.S. Military Contingent Standby “Red Alert” Notification

The U.S. Air Force ( USAF ) subordinate organization Air and Space Operations ( ASO ) Communications Directorate ( A6 ) provides support for daily operations, contingency actions and general Command, Control, Communication and Computer Intelligence ( C4I ) for the U.S. Air Force Weather Agency ( USAFWA ). Its 1st Weather Group ( 1ST WXG ) Directorate readied the 25th Operational Weather Squadron ( OWS ) at Davis-Monthan Air Force Base ( Tucson, Arizona ), responsible for conjunctive communication notification. An Earth Event “Red Alert” was immediately issued directly to U.S. Army Western Command ( WESTCOM ) with a “Standby-Ready” clause pausing Western Region ( CONUS ) mobilization of Active, Reserve and National Guard military forces at specific installations, based on the Japan Earth Event “moment magnitude” ( Mw ) Plate-Boundary rebound expected to strike the North America west coast Plate-Boundary of the Cascadia Range, reactively triggering its subduction zone into a Cascadia ‘great earthquake’.

CALTECH Public News Suppression Of Major Earth Event

Officials, attempting to diminish any clear public understanding of the facts by a public that only knows Richter scale earthquake ‘magnitude’ ( and has never heard what a major Earth Event “moment magnitude” ( Mw ) entails ), served up officially designed double-speak terms as ‘creative attention distraction’, announcing that the “Japan earthquake experienced,” a:

– “Bilateral Rupture;” and,

– “Slip Distribution.”

The facts are that the Japan ‘earthquake’ would ‘never have occurred’ unless:

1ST – a “Bilateral Rupture” ( ‘suboceanic subterranean tectonic plate split wide open’ ) occurred; followed by,

2ND – “Slip Distribution” ( ‘tectonic plate movement’ ); then finally,

3RD – a “Ground Shaking” ( ‘earthquake’ ) response. Officials failed to give the public any notification that a major Earth Event “moment magnitude” ( Mw ) on the “Pacific Ring of Fire” ( circum-Pacific seismic belt ) in the Western Pacific Ocean had seen a huge:

1. Continental Plate Break-Off;

2. Undersea Plate Mountain Range Crack Wide Open; plus,

3. Mountain Range Split Open 310 Miles Long.

Only those laid to rest might not consider the aforementioned three ( 3 ) major Earth Event occurrences significant; everyone ‘still living’ on Earth should.

Asia-Pacific Rim

This huge western Pacific Ocean ‘undersea mountain range’ moved east, crushing the smaller portion of its tectonic plate toward the continent of Asia. That commenced continuous streams of significant earthquakes, day and night, still registering 5.0+ and 6.0+ Richter scale magnitudes now for over 12 days throughout the area surrounding Japan, the point nearest where the tectonic plate meets the continent of Asia within the western Pacific Ocean, from where this monstrous undersea mountain range suddenly split, sending its eastern half, with the tectonic plate broken beneath it, slamming into the continent of Asia.

Simultaneously, the split-off western half of this monstrous undersea mountain range ( 310 miles / 500 kilometers long ) was pushed outward away from the Asia continent with even greater force ( note: explosive blasts, as from a cannon or a shaped charge, project outward when the initial blast is blunted by a back-stop ) and slammed up against the Americas’ western tectonic plates.

This is the ‘major’ “Earth Event” that will have consequential global repercussions, officially minimized by focusing public attention on a surface Earth Event earthquake of 9.0 Richter scale magnitude ( once ), while further diminishing the hundreds of significant earthquakes still occurring 12 days after the initial earthquake.

Asia-Pacific Rim “Ring Of Fire”

Many are unaware that the “Asia-Pacific Rim” is also known as the “Ring of Fire,” under which the ultra-deep Pacific Ocean holds numerous gigantic volatile volcanoes positioned in an incredibly large circle ( “ring” ) around a huge geographic land mass comprised of tectonic plates that connect the Eastern Asias to the Western Americas.

Yellowstone National Park Super Volcano

Many people are still wondering why the Japan earthquakes have not yet stopped, and why Japan is plagued by such a long swarm of significant earthquakes to this very day, nearly 60 days later. The color video clips ( below ) provide information on unusual earthquake swarm patterns and reversals observed while studying the world’s largest supervolcano in Wyoming ( USA ), located at Yellowstone National Park, a global public attraction for viewing natural underground volcanic steam vents known as geyser eruptions:

[ PHOTO ( above ): Major HotSpot at Yellowstone displays Half Dome cap of granite rock above unerupted volcano magma. ]

Ultra-Deep Sea Volcanoes

When huge undersea volcanoes erupt, they dynamically force incredibly large land mass plates to move; the movement simultaneously experienced on surface land areas is what people know as ‘earthquakes’, with aftermath measurements provided on the “Richter scale,” which most do not understand. These Richter scale measurements are only officially provided estimates; officials are never presented with totally accurate measurements, many of which are not obtained with any great precision for up to 2 years after the initial earthquake.

Rarely are precise measurements publicly provided, and at any time during that 2-year interim the public may hear that a previously reported earthquake Richter scale measurement was either “officially upgraded” or “officially downgraded.” Often this becomes apparent when many other countries contradict U.S. public news announcements about the magnitude of a particularly controversial earthquake. An example of this was seen surrounding the March 11, 2011 earthquake in Japan:

– Japan 1st public announcement: 9.2 Richter scale;

– United States 1st public announcement: 8.9 Richter scale;

– United States 2nd public announcement: 9.0 Richter scale; and,

– United States 3rd public announcement: 9.1 Richter scale.

What will the March 11, 2011 Japan earthquake be officially reported as in 2 years? Who knows?
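One reason reported magnitudes drift is that moment magnitude is computed from an estimated seismic moment, and revisions to that estimate shift Mw only slightly. A minimal sketch using the standard Hanks-Kanamori relation; the moment value here is an illustrative figure near the Tohoku scale, not taken from this report:

```python
import math

def moment_magnitude(m0_newton_meters: float) -> float:
    """Convert seismic moment (in newton-meters) to moment magnitude Mw
    via the Hanks-Kanamori relation: Mw = (log10(M0) - 9.1) / 1.5."""
    return (math.log10(m0_newton_meters) - 9.1) / 1.5

# An illustrative seismic moment of ~3.9e22 N*m works out to Mw ~ 9.0:
print(round(moment_magnitude(3.9e22), 1))   # -> 9.0

# Doubling the estimated moment raises Mw by only ~0.2, which is why
# magnitude revisions arrive in small steps like 8.9 -> 9.0 -> 9.1:
print(round(moment_magnitude(2 * 3.9e22) - moment_magnitude(3.9e22), 1))  # -> 0.2
```

Because the scale is logarithmic, even large revisions to the underlying moment estimate move the headline number by only a few tenths.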

Never publicly announced, however, are measurements of an earthquake’s ‘force strength pressure accumulation’ transmitted through suboceanic tectonic plates grinding against one another ( a major Earth Event ‘geographic pushing process’ ), as seen by U.S. NSA supercomputers through global ground- and space-based monitoring analysis surrounding the “Asia-Pacific Rim Ring of Fire,” stretching from the Eastern Asias to the Western Americas and beyond.

This ‘domino plate tectonic principle’ results from the combined amounts of volcanic magmatic eruptive force and accumulated tectonic plate pressure build-up against adjacent plates, causing suboceanic, subterranean and surface land to move; how significant those amounts are determines the strength of both the consequential earthquakes and the resultant tsunamis.

Waterway Tsunamis

When most of the public hears about a “tsunami,” they think of high waves near ocean coastal regions bringing significant floods over residents of nearby cities. Few realize the vast majority of Earth’s population lives along ocean coastal regions, and that one ( 1 ) gigantic tsunami could kill vast populations living near oceans in the wake of popular beaches, a tough trade-off for some, while logical others choose to live further inland. Even away from the oceans, large lakes have tide levels also affected by the gravitational pull of the moon, and depending on which direction tectonic plates move, a ‘force-directionalized earthquake’ can bring a tsunami to a vast deep lake body too, with significant inundating floods over residents living in cities near those large shoreline areas.

What most of the public does not yet fully realize is that large river bodies of water, like the Mississippi River ( a north-to-south directional river ), could easily see east-to-west directional tectonic movement shift adjacent states along the New Madrid Fault zone. Significant earthquakes from such movement are easily capable of squeezing the banks of even the Mississippi River, forcing huge amounts of water out onto both its east and west sides and resulting in seriously significant inland flooding over residents living in low-lying states of the Central Plains of the United States.

Japan “Pacific Ring Of Fire” Earthquakes To Americas Cascadia Fault Zone

Japanese accounts of a correlative tsunami suggest the Cascadia Fault rupture occurred as one ( 1 ) single earthquake, a magnitude 9 ( Mw ) Earth Event on January 26, 1700. Geological evidence obtained from a large number of coastal sites, from northern California ( USA ) up to southern Vancouver Island ( Canada ), plus historical records from Japan, shows the 1,100-kilometer length of the Cascadia Fault subduction zone ruptured ( split, causing that earthquake ) in a major Earth Event at that time. While the sizes of earlier Cascadia Fault earthquakes are unknown, some “ruptured adjacent segments” of the Cascadia Fault subduction zone over periods of time ranging from as little as hours to years, as has historically happened in Japan.

Over the past 20 years, scientific progress has been made in understanding Cascadia Fault subduction zone behavior; only 15 years ago, scientists were still debating whether ‘great earthquakes’ occurred at fault subduction zones. Today most scientists accept that great earthquakes do occur in fault subduction zone regions.

Now, scientific discussions focus on subjects, of:

– Earth crust structural changes when a “Plate Boundary” ruptures ( splits );

– Related tsunamis; and,

– The seismogenic zone ( tectonic plate locations and width sizes ).

Japan America Earthquakes And Tsunamis Exchange

Great Cascadia earthquakes generate tsunamis; the most recent sent at least a 32-foot-high tidal wave onto the Pacific Ocean west coasts of Washington, Oregon, and northern California, and that Cascadia earthquake tsunami sent a consequential 16-foot-high tidal wave onto Japan.

These Cascadia Fault subduction zone earthquake tsunamis threaten coastal communities all around the Pacific Ocean “Ring of Fire,” but have their greatest impact on the west coasts of the United States and Canada, which are struck within 15 to 40 minutes after a Cascadia Fault subduction zone earthquake occurs.

Deposits from past Cascadia Fault earthquake tsunamis have been identified at numerous coastal sites in California, Oregon, Washington, and as far north as British Columbia ( Canada ). The distribution of these deposits, combined with sophisticated computer software simulations of tsunamis, indicates many coastal communities in those same regions lie well within the flooding inundation zones of past Cascadia Fault earthquake tsunamis.

These west coast communities of California, Oregon, Washington, and as far north as British Columbia ( Canada ) are indeed threatened by future tsunamis from a Cascadia great earthquake. Tsunami arrival times depend on the distance from the point of rupture ( the tectonic plate split causing the earthquake ) within the Cascadia Fault subduction zone to the west coast “landward” side.
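As a sketch of why arrival time tracks distance, open-ocean tsunami speed is commonly approximated by the shallow-water wave relation v = sqrt(g × depth). The distance and depth figures below are illustrative assumptions, not measurements from this report:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def tsunami_travel_minutes(distance_km: float, mean_depth_m: float) -> float:
    """Estimate tsunami travel time assuming a constant mean water depth
    along the path (a rough constant-depth sketch, not a bathymetric model)."""
    speed_m_s = math.sqrt(G * mean_depth_m)          # shallow-water wave speed
    return (distance_km * 1000.0) / speed_m_s / 60.0  # minutes to cover distance

# A rupture ~100 km offshore over water averaging ~1000 m deep:
print(round(tsunami_travel_minutes(100, 1000)))  # -> 17 (minutes)
```

A constant-depth estimate of roughly 17 minutes for a 100 km offshore rupture is consistent with the 15-to-40-minute window cited above; deeper water means a faster wave and less warning.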

Cascadia Earthquake Stricken Damage Zone Data

Strong ground shaking from a “moment magnitude” ( Mw ) 9 Plate-Boundary earthquake will last 3 minutes or more, dominated by long-period motion, and ground-shaking damage will occur as far inland as the cities of Portland, Oregon; Seattle, Washington; and Vancouver, British Columbia ( Canada ).

Tsunami Optional Wave Patterns “Following Sea”

Large cities within 62 to 93 miles of the nearest point of the inferred Cascadia Plate-Boundary rupture zone will experience not only significant ground shaking but extreme-duration ground shaking lasting far longer. They will also face far more powerful tsunamis carrying far more seawater because of their “lengthened wave periods” ( longer distances between wave crests or wave curls ), bringing inland something akin to what fishermen describe as a deadly “following sea” ( swallowing everything within an even more powerful waterpath ). The inland result is far more significant damage to many tall buildings and lengthy structures, where earthquake magnitude strength will be felt strongest all along the United States Pacific Ocean west coast regions.

Data Assessments Of Recurring Cascadia Earthquakes

The Cascadia ‘great earthquake’ “mean recurrence interval” ( the time period between one earthquake and the next ), specific to the point of the Cascadia Plate-Boundary, is between 500 and 600 years; however, past Cascadia Fault earthquakes have occurred within 300-year intervals or even less, as measured since the Cascadia ‘great earthquake’ of 1700. Time intervals between successive great earthquakes, from a few centuries up to 1,000 years, have little well-measured data on recurrence because the number of recorded Cascadia earthquakes rarely exceeds five ( 5 ). The data additionally indicate Cascadia earthquakes were intermittent, at irregular intervals, and the record is too sparse to establish either random distribution ( tectonic plate shift or earth movement ) or clustering of these earthquakes over a lengthier period of time, so more accurate assessments are unavailable. Hence, because the Cascadia earthquake recurrence pattern is so poorly known, the probability of the next Cascadia earthquake occurrence is unfortunately unclear, with extremely sparse interval information.
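For illustration only: if one assumed a memoryless Poisson process, a strong simplification the sparse Cascadia record cannot actually justify, the probability of an event within a given window would follow directly from the mean recurrence interval:

```python
import math

def prob_within(years: float, mean_recurrence_years: float) -> float:
    """Probability of at least one event within `years`, under an assumed
    memoryless Poisson model: P = 1 - exp(-t / mean). Illustrative only;
    real hazard models account for elapsed time and clustering."""
    return 1.0 - math.exp(-years / mean_recurrence_years)

# With a ~550-year mean recurrence, the 50-year probability is modest:
print(round(prob_within(50, 550), 3))   # -> 0.087
```

Under this toy model the chance of a great Cascadia earthquake in any given 50-year window is roughly 9%, but as the text notes, the recurrence pattern is too poorly known for such a number to be more than a sketch.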

Cascadia Plate-Boundary “Locked” And “Not Locked”

The Cascadia Plate-Boundary zone is currently ‘locked’ off the U.S. west coast shoreline, where it has been accumulating tectonic pressure from other plates pressing into it for over 300 years.

The Cascadia Fault subduction zone, at its widest point, is located northwest just off the coast of the State of Washington where the maximum area of seismogenic rupture is approximately 1,100 kilometers long and 50 kilometers up-to 150 kilometers wide. Cascadia Plate-Boundary seismogenic portion location and size data is key-critical for determining earthquake magnitude, tsunami size, and the strength of ground shaking.

The Cascadia Plate-Boundary “landward limit” of only its “locked” portion, where no tectonic plate shift has yet occurred, lies between the Juan de Fuca tectonic plate and the North America tectonic plate, where it became “locked” between Cascadia earthquakes. This “locked” notion, however, has only been delineated from geodetic measurements of surface land deformation. Unfortunately, its “seaward limit” has very few constraints, up to no constraints, on its so-called “locked zone” portion, which could certainly move at any time.

Cascadia Plate Continues Sliding

The Cascadia transition zone, separating its “locked zone” from its “continuous sliding zone” heading east into the continent of North America, is poorly constrained ( held back ), so a Cascadia rupture may extend an unknown distance from its now “locked zone” into its continuously sliding transition zone.

On some Earth crust faults, near coastal regions, earthquakes may also experience ‘additional Plate-Boundary earthquakes’, ‘increased tsunami tidal wave size’ plus ‘intensification of local area ground shaking’.

Earth Event Mitigation Forces Global Money Flow

The primary government purpose in establishing international “risk reduction” is solely to minimize the global costs of damages associated with major-magnitude Earth Events similar to, but even greater than, what happened on March 11, 2011 all over Japan.

Historical earthquake damage assists predictive damage-loss studies, which suggest disastrous future losses will occur in the Pacific Northwest from a Cascadia Fault subduction zone ‘great earthquake’. National loss-mitigation efforts, studying other seismically active regions, plus national cost-benefit studies, indicate that earthquake damage-loss mitigation can effectively reduce losses and assist recovery efforts in the future. Accurate data acquisition, geological and geophysical research, and immediate technological information transfer to national key decision-makers were intended to reduce the additional risks the Cascadia Fault subduction zone poses to the western North America coastal region.

Damage, injuries, and loss of life from the next great Cascadia Fault subduction zone earthquake will indeed be great and widespread, significantly impacting the national economies of Canada and the United States for years to decades, which has seen a global concerted increase, in:

– International Cooperative Research;

– International Information Exchanges;

– International Disaster Preparedness;

– International Damage Loss Mitigation Planning;

– International Technology Applications; and,

– More.

Tectonics Observatory

The CALTECH Advanced Rapid Imaging and Analysis ( ARIA ) Project is a collaboration of the NASA Jet Propulsion Laboratory ( JPL ) and the California Institute of Technology ( Pasadena ) Tectonics Observatory. ARIA Project members and CALTECH scientists Shengji Wei and Anthony Sladen ( of GEOAZUR ) modeled the sub-surface ( below surface ) tectonic plate movement of the Japan Tohoku earthquake fault zone, derived from:

– TeleSeismic Body Waves ( long-distance observations ); and,

– Global Positioning Satellites ( GPS ) ( near-source observations ).

A 3D image of the fault moving can be viewed in Google Earth ( the link to that KML file is found in the “References” at the bottom of this report ). It projects the fault rupture in three-dimensional images viewable from any point of reference, with that analysis depicting the rupture ( ground splitting open 100 feet ) that triggered the earthquake from 15 miles ( 24 kilometers ) beneath the ultra-deep Western Pacific Ocean, and the entire island of Japan being moved east by 16 feet ( 5 meters ) from its pre-earthquake location.

[ IMAGE ( above ): NASA JPL Project ARIA Tectonic Plate Seismic Wave Direction Map ( click image to enlarge and read ) ]

The National Aeronautics and Space Administration ( NASA ) Jet Propulsion Laboratory ( JPL ) at the California Institute of Technology ( Pasadena ), also known as CALTECH, through its Advanced Rapid Imaging and Analysis ( ARIA ) Project, used GEONET RINEX data with JPL GIPSY-OASIS software to obtain kinematic “precise point positioning solutions” from a single-station bias-fixing method matched to JPL orbit and clock products, producing a seismic displacement projection map whose details carry an inherent ‘95% error-rating’ that is itself only an estimate. This ‘proves’ that all these U.S. government organizations supposedly know ( after spending billions of dollars ) and are willing to publicly provide may be ‘only 5% accurate’. So much for what these U.S. government organizations publicly announce as their “precise point positioning solutions.”

Pay Any Price?

More double-speak and psycho-babble serves only to distract the public from the truth of precisely what U.S. taxpayer dollars are actually producing. Now knowing this, if those same officials ever worked for a small business they would either be arrested for fraud or fired for incompetence; however, since none of them will ever admit their own incompetence, their leadership needs to see those responsible virtually swing from the end of an incredibly long U.S. Department of Justice rope.

Unfortunately, the facts surrounding all this only get worse.

[ IMAGE ( above ): Tectonic Plates ( brown color ) Sinking and Sunk On Earth Core. ]

Earthquake Prediction Fallacy

Earthquake prediction will never be an accomplished, finite science the public can rely upon, even though huge amounts of money are being spent on detection-sensor technology reading “Seismic Waveforms” ( including “S Waves” ) that can be detected and stored in computer databases. The reason is a significant fact that will never change, no matter how much money or time is devoted to the unsolvable problem: the Earth’s sub-crustal regions consist primarily of molten regions filled with floating tectonic plates, moving while sinking, which cannot be tested for rock density or accumulated pressures far beneath the surface land plates.

The very best that technology can ever perform for the public is to record surface tectonic plates grinding against one another; only that action emits the upward acoustic waveform patterns named ‘seismic waves’ or ‘S-waves’, which do occur, but only when tectonic plates are moving.

While a public early warning might be helpful for curtailing vehicular traffic crossing an interstate bridge that might collapse, or stopping train travel, and thousands of lives could be saved, it would fail to serve millions more living in buildings that collapse.
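The seconds of warning such a system can offer come from the gap between the fast P-wave and the slower, damaging S-wave. A sketch assuming typical average crustal velocities ( the 6.0 and 3.5 km/s figures are illustrative assumptions, not values from this report ):

```python
def warning_seconds(distance_km: float,
                    vp_km_s: float = 6.0,
                    vs_km_s: float = 3.5) -> float:
    """Lead time between P-wave detection and S-wave arrival at a site,
    assuming straight-line paths and constant average crustal velocities."""
    return distance_km / vs_km_s - distance_km / vp_km_s

# A site 100 km from the rupture gets roughly 12 seconds of lead time:
print(round(warning_seconds(100)))  # -> 12
```

That narrow window explains the trade-off described above: enough time to stop a train or close a bridge automatically, but rarely enough for people inside buildings to get clear.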

Early Warning Exclusivity

Governments, using publicly unfamiliar terms, have statistically analyzed the “international economics” of “national infrastructure preparedness” ( early warning systems ) in two ( 2 ) forms: “public” ( i.e. utility companies, via government meetings with industrial leadership ) and “private” ( i.e. residents, via television, radio, newspaper and internet commercial advertisements ). “National disaster mitigation” designates early warning provisions primarily for high-density population centers near coastal or low-lying regions ( large bodies of ocean, lake and river water ), but “in the interest of national security” that early warning goes to only one ( 1 ) of the two, the “public” utility-company channel, limiting national economic burdens from any significant Earth Event aftermath.

In short, and without all the governmentese psycho-babble and double-speak: costs continue being spent on high-technology efforts to perfect a “seismic early warning” for “exclusive use” ( national government control ) that “provides” ( control over ) “all major utility company distribution points” ( the facilities where electrical power is generated ) the ability to “interrupt power” ( stop the flow of electricity nationwide from distribution stations ), thus “saving additional lives” from “disastrous other problems” ( aftermath deaths and injuries caused by nuclear fallout radiation, exploding electrical transformers, and fires from overloaded electrical circuits ).

Logically, much, but not all, of the aforementioned makes perfect sense, except to a “John Doe” or “Jane Doe” ( exemplified anonymously herein ), individuals who, if warned earlier, could have stopped their vehicle before crossing the bridge that collapsed, or simply stepped out of the way of a huge falling sign, rather than being killed or maimed. One might additionally consider how many more would otherwise be killed or maimed in an ensuing mass public panic upon receiving an “early warning.” A tough call for many, but made by few.

Earth Data Publicly Minimized

Tohoku-oki earthquake seismic waveform data, showing the Japan east coast tectonic plate “bilaterally ruptured” ( split in half for a distance of over 310 miles ), was obtained from the USArray seismic stations ( United States ) and analyzed and later modeled by Caltech scientists Lingsen Meng and Jean-Paul Ampuero, who created a preliminary data animation demonstrating a ‘super major’ Earth Event occurring simultaneously when the ‘major’ earthquake struck Japan.

U.S. National Security Stations Technology Systems Projects

The United States Seismic Array ( USArray ) Data Management Plan under EarthScope is composed of three ( 3 ) Projects:

1. Incorporated Research Institutions for Seismology ( IRIS ), a National Science Foundation ( NSF ) consortium of universities, whose Data Management Center ( DMC ) is ‘managed’ by the “United States Seismic Array ( USArray )” Project;

2. UNAVCO INC. ‘implemented’ “Plate-Boundary Observatory ( PBO )” Project; and,

3. U.S. Geological Survey ( USGS ) ‘operated’ “San Andreas Fault Observatory at Depth ( SAFOD )” Project at Stanford University ( California ).

Simultaneous Earth Data Management

The data management plan for the USArray component of “EarthScope” is held by the USArray IRIS DMC.

USArray consists of four ( 4 ) data generating components:

Permanent Network

The Advanced National Seismic System ( ANSS ) BackBone ( BB ) is a joint effort between IRIS, USArray and USGS to establish a ‘Permanent Network’ of approximately one hundred ( 100 ) Earth-monitoring receiving stations located in the Continental United States ( CONUS ), the lower 48 states, in addition to other stations located in the State of Alaska.

Earth Data Multiple Other Monitors

USArray’s data contribution to the Advanced National Seismic System ( ANSS ) BackBone ( BB ) consists of:

– Nine ( 9 ) new international Earth-data receiving stations akin to the Global Seismic Network ( GSN );

– Four ( 4 ) “cooperative other stations” from Southern Methodist University and AFTAC;

– Twenty-six ( 26 ) other receiving stations of the Advanced National Seismic System ( ANSS ), with upgrade funding taken out of the USArray Project “EarthScope;” plus,

– Sixty ( 60 ) additional ANSS BackBone ( BB ) network stations that currently exist, will be installed, or will be upgraded so that data channel stream feeds can and will be made seamlessly available through the IRIS DMC. There, data can be continuously recorded at forty ( 40 ) samples per second, with 1 sample per second continuously transmitted in real-time back to the IRIS DMC, where quality assurance is held at facilities located in both Albuquerque, New Mexico and Golden, Colorado, with the U.S. Geological Survey ( USGS ) handling some operational responsibilities.
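The 40-samples-per-second recording versus 1-sample-per-second real-time feed described above is a rate reduction ( decimation ). A toy sketch of that rate change only; the actual IRIS processing chain is not documented here and would apply anti-alias filtering first:

```python
def decimate_mean(samples: list[float], factor: int) -> list[float]:
    """Reduce a high-rate sample stream by averaging each full block of
    `factor` consecutive samples into one output sample. Illustrative only;
    real seismic pipelines filter before downsampling to avoid aliasing."""
    return [
        sum(samples[i:i + factor]) / factor
        for i in range(0, len(samples) - factor + 1, factor)
    ]

# One second of a 40-samples-per-second stream collapses to a single value:
one_second = [0.0] * 20 + [1.0] * 20
print(decimate_mean(one_second, 40))   # -> [0.5]
```

Each second of raw 40-sample data thus becomes one transmitted value, trading waveform detail for a feed light enough to stream in real time.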

Albuquerque Seismological Laboratory ( ASL ) –

Albuquerque Seismological Laboratory ( ASL ) supports operation and maintenance of seismic networks for the U.S. Geological Survey ( USGS ) portion of the Global Seismographic Network ( GSN ) and Advanced National Seismic System ( ANSS ) Backbone network.

ASL runs the Advanced National Seismic System ( ANSS ) depot facility supporting the Advanced National Seismic System ( ANSS ) networks.

ASL also maintains the PASSCAL Instrument Center ( PIC ) facility at New Mexico Tech ( Socorro, New Mexico ), developing, testing, and evaluating seismology monitoring and recording equipment.

Albuquerque Seismological Laboratory ( ASL ) staff are based in ‘both’ Albuquerque, New Mexico and Golden, Colorado.

Top-Down Bottom-Up Data Building Slows Earthquake Notifications

Seismic waveform data is received by the Global Seismic Network ( GSN ) and the Advanced National Seismic System ( ANSS ) BackBone ( BB ) network through electronic transmissions sent slower than real-time, delivering only near-time data ( e.g. tape and compact disc recordings ) to the National Earthquake Information Center ( NEIC ) station of the U.S. Geological Survey ( USGS ), officially heralded for so-called “rapid earthquake response.”

Unbelievably, in addition to the aforementioned slow Earth Event data delivery process, a number of data receiving stations have absolutely no data-streaming telemetry transmission capability whatsoever, so their recordings, on tapes and compact discs, are delivered by other, even more time-consuming routes before the data can even reach the U.S. Geological Survey ( USGS ). In short, the huge amounts of money being spent go toward increasing computer technologies, sensors, satellites, data stream channel networks and secure facility building stations from the top down, instead of building monitoring and recording stations from the bottom up until the entire earthquake monitoring and notification system is finally built properly. As it currently stands, the ‘apple cart’ stations keep being built while the ‘apple tree’ stations never receive the technological nutrients to deliver apples ( data ) into the notification markets ( the public ) where all this could do some good.

U.S. National Security Reviews Delay Already Slow Earthquake Notifications

The IRIS Data Management Center ( DMC ), after processing all incoming data streams from reporting stations around the world, then distributes seismic waveform data back to both the Global Seismic Network ( GSN ) and the Advanced National Seismic System ( ANSS ) BackBone ( BB ) network operations, but only after the seismic waveform data has been thoroughly screened under what U.S. national security government Project leadership deems its need to control all data by “limiting requirements” ( red tape ): all data must undergo a long, arduous secure data clearing process before any of it can be released. Amusingly to some, the U.S. government, in its race to create another official acronym of double-speak, named that national security clearing process ever so aptly:

“Quality Assurance Framework” ( QUACK )

Enough said.

Let the public decide what to do with ‘those irresponsible officials’; after all, ‘only mass public lives’ are left ‘swinging in the breeze’ at the very end of a now-endless ‘disinformation service rope’ being paid for by the tax-paying public.

In the meantime, while we all wait for another Earth Event to take place, what ( besides this report ) might ‘slap the official horse’, spurring it to move quickly?

How about us? What should we do? Perhaps brushing up on a little basic knowledge might help.

Inner Earth Deeper Structure Deep Focus Earthquakes Rays And Related Anomalies

There is no substitute for knowledge, with information technology ( IT ) at the focal point of many new discoveries aided by supercomputing, modelling and analytics – but common sense does pretty well too.

The following information, although an incredibly brief overview of such a wide variety of topics surrounding planet Earth, scratches more than just the surface – reaching deep-structure and deep-focus matters that have impacted a multitude of generations, going as far back as 700 B.C.

Clearly referenced “Encyclopaedia Britannica” general public access information consists of second-hand observations and records drawn from other worldwide information collection sources, such as:

– Archives ( e.g. governments, institutions, public and private );

– Symposiums ( e.g. white papers );

– Journals ( professional and technical publications );

– Other information collection sources; and,

– Other information publications.

Encyclopaedias, available in a wide variety of styles and formats, are ’portable catalogs containing a large amount of basic information on a wide variety of topics’ available worldwide to billions of people for increasing their knowledge.

Encyclopaedia information comes in varied formats, including:

– Paper ‘books’ with either ’printed ink’ ( sighted ) or ’embossed dots’ ( Braille );

– Plastic ‘tape cartridges’ ( ‘electromagnetic’ media ) or ‘compact discs’ ( ‘optical’ media ) with ‘electronic device display’; or,

– Electronic ‘internet’ ( ‘signal computing’ via ‘satellite’ or ‘telecomputing’ via ’landline’ or ‘node’ networking ) with ‘electronic device display’.

After thoroughly reviewing the Encyclopaedia Britannica ‘specific compilation’, independent review found a reasonable facsimile of the original, reformatted for easier public comprehension ( reproduced further below ).

Surprisingly, after that Encyclopaedia Britannica ‘specific compilation’ was reformatted for clearer reading comprehension, the inner Earth ‘deep-structure’ geophysical studies formed an amazing correlation with additional factual activities – within an equally amazing date chronology – of man-made nuclear fracturing reformations of Earth’s geology; activities documented worldwide more than half a century ago but somehow forgotten, whether by chance or secret circumstance.

How could the Encyclopaedia Britannica – or, for that matter, anyone else – have missed something on such a grand scale that is now so obvious?

… [ TEMPORARILY EDITED-OUT FOR REVISION PURPOSES ONLY –  ] …

For more details about the aforementioned, Click: Here!

Or,

To understand how all this relates, ‘begin with a better basic understanding’ by continuing to read the researched information ( below ):

====

Circa: March 21, 2012

Source:  Encyclopaedia Britannica

Earthquakes

Definition, Earthquake: sudden shaking of the ground caused by the passage of seismic waves through Earth’s rocks.

Seismic waves are produced when some form of energy stored in the Earth’s crust is suddenly released, usually when masses of rock straining against one another suddenly fracture and “slip.” Earthquakes occur most often along geologic faults, narrow zones where rock masses move in relation to one another. Major fault lines of the world are located at the fringes of the huge tectonic plates that make up the Earth’s crust. ( see table of major earthquakes further below )

Until the early 20th Century ( 1900s ), little was understood about earthquakes; the emergence of seismology – the scientific study of all aspects of earthquakes – is now yielding answers to long-standing questions of why and how earthquakes occur.

About 50,000 earthquakes large enough to be noticed without the aid of instruments occur every year over the entire Earth; of these, approximately one hundred ( 100 ) are of sufficient size to produce substantial damage if their centres are near human habitation.

Very great earthquakes occur on average about once a year; over the centuries, however, these earthquakes have been responsible for millions of deaths and an incalculable amount of property damage.

Earthquakes A-Z

Earth’s major earthquakes occur primarily in belts coinciding with tectonic plate margins – a pattern apparent even in the earliest earthquake catalogs ( circa 700 B.C. ) and now more readily discernible in modern seismicity maps, which depict instrumentally determined earthquake epicentres.

Most important is the Circum-Pacific Belt, which affects many populated coastal regions around the Pacific Ocean, namely:

– South America;

– North America & Alaska;

– Aleutian Islands;

– Japan;

– New Zealand; and,

– New Guinea.

An estimated 80% of the energy presently released in earthquakes comes from those whose epicentres lie in the Circum-Pacific Belt.

Seismic activity is by no means uniform throughout the belt, and there are a number of branches at various points. Because at many places the Circum-Pacific Belt is associated with volcanic activity, it has been popularly dubbed the “Pacific Ring of Fire.”

A second ( 2nd ) belt, known as the Alpide Belt, passes through the Mediterranean region eastward through Asia, joining the Circum-Pacific Belt in the East Indies; energy released in earthquakes from the Alpide Belt is about 15% of the world total.

There are also striking connected belts of seismic activity, primarily along oceanic ridges, including those in the:

– Arctic Ocean;

– Atlantic Ocean;

– Indian Ocean ( western ); and along,

– East African rift valleys.

This global seismicity distribution is best understood in terms of its plate tectonic setting.

Forces

Earthquakes are caused by sudden releases of energy within a limited region of Earth’s rocks; that stored energy can be released by:

– Elastic strain;

– Gravity;

– Chemical reactions; and / or,

– Massive rock body motion.

Of all these, release of elastic rock strain is most important because this form of energy is the only kind that can be stored in sufficient quantities within the Earth to produce major ground disturbances.

Earthquakes, associated with this type of energy release, are called: Tectonic Earthquakes.

Tectonics

Tectonic plate earthquakes are explained by the so-called elastic rebound theory, formulated by the American geologist Harry Fielding Reid after the San Andreas Fault ruptured in 1906, generating the great San Francisco earthquake.

According to Reid’s theory of elastic rebound, a tectonic earthquake occurs when strain energy in rock masses has accumulated ( built up ) to the point where the resulting stresses exceed the strength of the rocks, and sudden fracturing results.

Fractures propagate ( travel ) rapidly ( see speeds further below ) through the rock, usually tending in the same direction and sometimes extending many kilometres along a local zone of weakness.

In 1906, for instance, the San Andreas Fault slipped along a plane 270 miles ( 430 kilometers ) long, a line along which the ground was displaced horizontally as much as 20 feet ( 6 meters ).

As a fault rupture progresses along or up the fault, rock masses are flung in opposite directions, and thus spring back to a position where there is less strain.

At any one point this movement may take place not at once but rather in irregular steps; these sudden slowings and restartings give rise to the vibrations that propagate as seismic waves.

Such irregular properties of fault rupture are now included in both ‘physical modeling’ and ‘mathematical modeling’ of earthquake sources.

Earthquake Focus ( plural: Foci )

Roughnesses along the fault are referred to as asperities, and places where the rupture slows or stops are said to be fault barriers. Fault rupture starts at the earthquake focus, a spot that in many cases lies 5 to 15 kilometers under the surface; the rupture then propagates ( travels ) in one ( 1 ) or both directions over the fault plane until stopped ( or slowed ) at a barrier ( boundary ).

Sometimes, instead of being stopped at the barrier, the fault rupture recommences on the far side; at other times the stresses in the rocks break the barrier, and the rupture continues.

Earthquakes have different properties depending on the type of fault slip that causes them.

The usual ‘fault model’ has a “strike” ( i.e., direction, from north, taken by a horizontal line in the fault plane ) and a “dip” ( i.e. angle from the horizontal shown by the steepest slope in the fault ).

Movement parallel to the dip is called dip-slip faulting.

The lower wall of an inclined fault is the ‘footwall’, and lying over the footwall is the hanging wall. In dip-slip faults, if the hanging-wall block moves downward relative to the footwall block, it is called “normal” faulting; the opposite motion, with the hanging wall moving upward relative to the footwall, produces reverse or thrust faulting.

When rock masses slip past each other ( parallel to the strike area ) movement is known as strike-slip faulting.

Strike-slip faults are right lateral or left lateral, depending on whether the block on the opposite side of the fault from an observer has moved to the right or left.

All known faults are assumed to have been the seat of one or more earthquakes in the past, though tectonic movements along faults are often slow, and most geologically ancient faults are now aseismic ( i.e., they no longer cause earthquakes ).

The actual faulting associated with an earthquake may be complex, and it is often unclear whether, in one ( 1 ) particular earthquake, the total energy issues from a single ( 1 ) fault plane.

Observed geologic faults sometimes show relative displacements on the order of hundreds of kilometres over geologic time, whereas the sudden slip offsets that produce seismic waves may range from only several centimetres to tens of metres.

During the 1976 Tangshan earthquake ( for example ), a surface strike-slip of about 1 meter was observed along the causative fault east of Beijing, China, and later ( as another example ) during the 1999 Taiwan earthquake the Chelung-pu fault slipped vertically up to 8 meters.

Volcanism & Earthquake Movement

A separate type of earthquake is associated with volcano activity known as a volcanic earthquake.

Even in such cases, the disturbance is officially believed to result from the sudden slip of rock masses adjacent to the volcano and the consequent release of elastic strain energy. The stored energy, however, may be partially of hydrodynamic origin, due to heat provided by magma moving ( tidally ) through underground reservoirs beneath volcanoes, or to the release of gas under pressure. There is, in any case, a clear correspondence between the geographic distribution of volcanoes and that of major earthquakes, particularly within the Circum-Pacific Belt and along ocean ridges.

Volcanic vents, however, are generally several hundred kilometres from the epicentres of most ‘major shallow earthquakes’, and many earthquake sources are believed to occur ‘nowhere near active volcanoes’.

Even in cases where an earthquake focus occurs directly below volcanic vents, there is officially probably no immediate causal connection between the two ( 2 ) activities; most likely both result from the same tectonic processes.

Earth Fracturing

Artificially Created Inductions

Earthquakes are sometimes caused by human activities, including:

– Nuclear Explosion ( large megaton yield ) detonations underground;

– Oil & Gas wells ( deep Earth fluid injections );

– Mining ( deep Earth excavations );

– Reservoirs ( deep Earth voids filled with incredibly heavy large bodies of water ).

In the case of deep mining, the removal of rock produces changes in the strain around the tunnels.

Slip on adjacent, preexisting faults or outward shattering of rock into the new cavities may occur.

In fluid injection, the slip is thought to be induced by premature release of elastic rock strain, as in the case of tectonic earthquakes after fault surfaces are lubricated by the liquid.

Large underground nuclear explosions have been known to produce slip on already strained faults in the vicinity of test devices.

Reservoir Induction

Of the various earthquake-causing activities cited above, the filling of large reservoirs ( see China ) is most prominent.

More than 20 significant cases have been documented in which local seismicity has increased following the impounding of water behind high dams. Often, causality cannot be substantiated, because no data exists to allow comparison of earthquake occurrence before and after the reservoir was filled.

Reservoir-induction effects are most marked for reservoirs exceeding 100 metres ( 330 feet ) in depth and 1 cubic km ( 0.24 cubic mile ) in volume. Three ( 3 ) sites where such connections have very probably occurred are the:

– Hoover Dam in the United States;

– Aswan High Dam in Egypt; and,

– Kariba Dam on the border between Zimbabwe and Zambia in Africa.

The most generally accepted explanation for earthquake occurrence in such cases assumes that rocks near the reservoir are already strained from regional tectonic forces to a point where nearby faults are almost ready to slip. Water in the reservoir adds a pressure perturbation that triggers the fault rupture. The pressure effect is perhaps enhanced by the fact that the rocks along the fault have lower strength because of increased water-pore pressure. These factors notwithstanding, the filling of most large reservoirs has not produced earthquakes large enough to be a hazard.

Specific seismic source mechanisms associated with reservoir induction have been established in a few cases. For the main shock at the Koyna Dam and Reservoir in India ( 1967 ), the evidence favours strike-slip faulting motion. At both the Kremasta Dam in Greece ( 1965 ) and the Kariba Dam in Zimbabwe-Zambia ( 1961 ), the generating mechanism was dip-slip on normal faults.

By contrast, thrust mechanisms have been determined for sources of earthquakes at the lake behind Nurek Dam in Tajikistan. More than 1,800 earthquakes occurred during the first nine ( 9 ) years after water was impounded in this 317-meter-deep reservoir in 1972, a rate amounting to four ( 4 ) times the average number of shocks in the region prior to filling.

Nuclear Explosion Measurement Seismology Instruments

By 1958, representatives from several countries, including the United States and the Soviet Union, met to discuss the technical basis for a nuclear test-ban treaty; among the matters considered was the feasibility of developing effective means to detect underground nuclear explosions and to distinguish them seismically from earthquakes.

After that conference, much special research was directed to seismology, leading to major advances in seismic signal detection and analysis.

Recent seismological work on treaty verification has involved using high-resolution seismographs in a worldwide network, estimating the yield of explosions, studying wave attenuation in the Earth, determining wave amplitude and frequency spectra discriminants, and applying seismic arrays. The findings of such research have shown that underground nuclear explosions, compared with natural earthquakes, usually generate seismic waves through the body of the Earth that are of much larger amplitude than the surface waves. This telltale difference, along with other types of seismic evidence, suggests that an international monitoring network of two-hundred and seventy ( 270 ) seismographic stations could detect and locate all seismic events over the globe of magnitude 4.0 and above ( corresponding to an explosive yield of about 100 tons of TNT ).

Earthquake Effects

Earthquakes have varied effects, including changes in geologic features, damage to man-made structures, and impact on human and animal life. Most of these effects occur on solid ground, but, since most earthquake foci are actually located under the ocean bottom, severe effects are often observed along the margins of oceans.

Surface Phenomena

Earthquakes often cause dramatic geomorphological changes, including ground movements – either vertical or horizontal – along geologic fault traces; rising, dropping, and tilting of the ground surface; changes in the flow of groundwater; liquefaction of sandy ground; landslides; and mudflows. The investigation of topographic changes is aided by geodetic measurements, which are made systematically in a number of countries seriously affected by earthquakes.

Earthquakes can do significant damage to buildings, bridges, pipelines, railways, embankments, and other structures. The type and extent of damage inflicted are related to the strength of the ground motions and to the behaviour of the foundation soils. In the most intensely damaged region, called the meizoseismal area, the effects of a severe earthquake are usually complicated and depend on the topography and the nature of the surface materials. They are often more severe on soft alluvium and unconsolidated sediments than on hard rock. At distances of more than 100 km (60 miles) from the source, the main damage is caused by seismic waves traveling along the surface. In mines there is frequently little damage below depths of a few hundred metres even though the ground surface immediately above is considerably affected.

Earthquakes are frequently associated with reports of distinctive sounds and lights. The sounds are generally low-pitched and have been likened to the noise of an underground train passing through a station. The occurrence of such sounds is consistent with the passage of high-frequency seismic waves through the ground. Occasionally, luminous flashes, streamers, and bright balls have been reported in the night sky during earthquakes. These lights have been attributed to electric induction in the air along the earthquake source.

Tsunamis

Following certain earthquakes, very long-wavelength water waves in oceans or seas sweep inshore. More properly called seismic sea waves or tsunamis ( tsunami is a Japanese word for “harbour wave” ), they are commonly referred to as tidal waves, although the attractions of the Moon and Sun play no role in their formation. They sometimes come ashore to great heights—tens of metres above mean tide level—and may be extremely destructive.

The usual immediate cause of a tsunami is sudden displacement in a seabed sufficient to cause the sudden raising or lowering of a large body of water. This deformation may be the fault source of an earthquake, or it may be a submarine landslide arising from an earthquake.

Large volcanic eruptions along shorelines, such as those of Thera ( c. 1580 BC ) and Krakatoa ( AD 1883 ), have also produced notable tsunamis. The most destructive tsunami ever recorded occurred on December 26, 2004, after an earthquake displaced the seabed off the coast of Sumatra, Indonesia. More than 200,000 people were killed by a series of waves that flooded coasts from Indonesia to Sri Lanka and even washed ashore on the Horn of Africa.

Following the initial disturbance to the sea surface, water waves spread in all directions. Their speed of travel in deep water is given by the formula √(gh), where h is the sea depth and g is the acceleration of gravity.

This speed may be considerable—100 metres per second ( 225 miles per hour ) when h is 1,000 metres ( 3,300 feet ). However, the amplitude ( i.e., the height of disturbance ) at the water surface does not exceed a few metres in deep water, and the principal wavelength may be on the order of hundreds of kilometres; correspondingly, the principal wave period—that is, the time interval between arrival of successive crests—may be on the order of tens of minutes. Because of these features, tsunami waves are not noticed by ships far out at sea.
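The √(gh) relation above can be sketched in a few lines of Python ( the function name is ours, purely for illustration ):

```python
import math

def tsunami_speed(depth_m: float, g: float = 9.8) -> float:
    """Deep-water tsunami speed v = sqrt(g * h), in metres per second,
    where h is the sea depth in metres and g the acceleration of gravity."""
    return math.sqrt(g * depth_m)

# At a sea depth of 1,000 m the wave travels close to 100 m/s,
# in line with the figure quoted in the text.
speed = tsunami_speed(1000.0)
print(round(speed))  # 99 m/s
```

Note how the speed depends only on depth: halving the depth reduces the speed by a factor of √2, which is why tsunamis slow down ( and pile up ) as they approach shallow water.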

When tsunamis approach shallow water, however, the wave amplitude increases. The waves may occasionally reach a height of 20 to 30 metres above mean sea level in U- and V-shaped harbours and inlets. They characteristically do a great deal of damage in low-lying ground around such inlets. Frequently, the wave front in the inlet is nearly vertical, as in a tidal bore, and the speed of onrush may be on the order of 10 metres per second. In some cases there are several great waves separated by intervals of several minutes or more. The first of these waves is often preceded by an extraordinary recession of water from the shore, which may commence several minutes or even half an hour beforehand.

Organizations, notably in Japan, Siberia, Alaska, and Hawaii, have been set up to provide tsunami warnings. A key development is the Seismic Sea Wave Warning System, an internationally supported system designed to reduce loss of life in the Pacific Ocean. Centred in Honolulu, it issues alerts based on reports of earthquakes from circum-Pacific seismographic stations.

Seiches

Seiches are rhythmic motions of water in nearly landlocked bays or lakes that are sometimes induced by earthquakes and tsunamis. Oscillations of this sort may last for hours or even for a day or two.

The great Lisbon earthquake of 1755 caused the waters of canals and lakes in regions as far away as Scotland and Sweden to go into observable oscillations. Seiche surges in lakes in Texas, in the southwestern United States, commenced between 30 and 40 minutes after the 1964 Alaska earthquake, produced by seismic surface waves passing through the area.

A related effect is the result of seismic waves from an earthquake passing through the seawater following their refraction through the seafloor. The speed of these waves is about 1.5 km (0.9 mile) per second, the speed of sound in water. If such waves meet a ship with sufficient intensity, they give the impression that the ship has struck a submerged object. This phenomenon is called a seaquake.

Earthquake Intensity and Magnitude Scales

The violence of seismic shaking varies considerably over a single affected area. Because the entire range of observed effects is not capable of simple quantitative definition, the strength of the shaking is commonly estimated by reference to intensity scales that describe the effects in qualitative terms. Intensity scales date from the late 19th and early 20th centuries, before seismographs capable of accurate measurement of ground motion were developed. Since that time, the divisions in these scales have been associated with measurable accelerations of the local ground shaking. Intensity depends, however, in a complicated way not only on ground accelerations but also on the periods and other features of seismic waves, the distance of the measuring point from the source, and the local geologic structure. Furthermore, earthquake intensity, or strength, is distinct from earthquake magnitude, which is a measure of the amplitude, or size, of seismic waves as specified by a seismograph reading ( see below Earthquake magnitude ).

A number of different intensity scales have been set up during the past century and applied to both current and ancient destructive earthquakes. For many years the most widely used was a 10-point scale devised in 1878 by Michele Stefano de Rossi and François-Alphonse Forel. The scale now generally employed in North America is the Mercalli scale, as modified by Harry O. Wood and Frank Neumann in 1931, in which intensity is considered to be more suitably graded.

A 12-point abridged form of the modified Mercalli scale is provided below. Modified Mercalli intensity VIII is roughly correlated with peak accelerations of about one-quarter that of gravity ( g = 9.8 metres, or 32.2 feet, per second squared ) and ground velocities of 20 cm ( 8 inches ) per second. Alternative scales have been developed in both Japan and Europe for local conditions.

The European ( MSK ) scale of 12 grades is similar to the abridged version of the Mercalli.

Modified Mercalli scale of earthquake intensity

  I. Not felt. Marginal and long-period effects of large earthquakes.

  II. Felt by persons at rest, on upper floors, or otherwise favourably placed to sense tremors.

  III. Felt indoors. Hanging objects swing. Vibrations are similar to those caused by the passing of light trucks. Duration can be estimated.

  IV. Vibrations are similar to those caused by the passing of heavy trucks (or a jolt similar to that caused by a heavy ball striking the walls). Standing automobiles rock. Windows, dishes, doors rattle. Glasses clink, crockery clashes. In the upper range of grade IV, wooden walls and frames creak.

  V. Felt outdoors; direction may be estimated. Sleepers awaken. Liquids are disturbed, some spilled. Small objects are displaced or upset. Doors swing, open, close. Pendulum clocks stop, start, change rate.

  VI. Felt by all; many are frightened and run outdoors. Persons walk unsteadily. Pictures fall off walls. Furniture moves or overturns. Weak plaster and masonry cracks. Small bells ring (church, school). Trees, bushes shake.

  VII. Difficult to stand. Noticed by drivers of automobiles. Hanging objects quivering. Furniture broken. Damage to weak masonry. Weak chimneys broken at roof line. Fall of plaster, loose bricks, stones, tiles, cornices. Waves on ponds; water turbid with mud. Small slides and caving along sand or gravel banks. Large bells ringing. Concrete irrigation ditches damaged.

  VIII. Steering of automobiles affected. Damage to masonry; partial collapse. Some damage to reinforced masonry; none to reinforced masonry designed to resist lateral forces. Fall of stucco and some masonry walls. Twisting, fall of chimneys, factory stacks, monuments, towers, elevated tanks. Frame houses moved on foundations if not bolted down; loose panel walls thrown out. Decayed pilings broken off. Branches broken from trees. Changes in flow or temperature of springs and wells. Cracks in wet ground and on steep slopes.

  IX. General panic. Weak masonry destroyed; ordinary masonry heavily damaged, sometimes with complete collapse; reinforced masonry seriously damaged. Serious damage to reservoirs. Underground pipes broken. Conspicuous cracks in ground. In alluvial areas, sand and mud ejected; earthquake fountains, sand craters.

  X. Most masonry and frame structures destroyed with their foundations. Some well-built wooden structures and bridges destroyed. Serious damage to dams, dikes, embankments. Large landslides. Water thrown on banks of canals, rivers, lakes, and so on. Sand and mud shifted horizontally on beaches and flat land. Railway rails bent slightly.

  XI. Rails bent greatly. Underground pipelines completely out of service.

  XII. Damage nearly total. Large rock masses displaced. Lines of sight and level distorted. Objects thrown into air.

With the use of an intensity scale, it is possible to summarize such data for an earthquake by constructing isoseismal curves, which are lines that connect points of equal intensity. If there were complete symmetry about the vertical through the earthquake’s focus, isoseismals would be circles with the epicentre (the point at the surface of the Earth immediately above where the earthquake originated) as the centre. However, because of the many unsymmetrical geologic factors influencing intensity, the curves are often far from circular. The most probable position of the epicentre is often assumed to be at a point inside the area of highest intensity. In some cases, instrumental data verify this calculation, but not infrequently the true epicentre lies outside the area of greatest intensity.

Earthquake Magnitude

Earthquake magnitude is a measure of the “size” or amplitude of the seismic waves generated by an earthquake source and recorded by seismographs.

Types and nature of these waves are described in Seismic waves ( further below ).

Because the size of earthquakes varies enormously, it is necessary for purposes of comparison to compress the range of wave amplitudes measured on seismograms by means of a mathematical device.

In 1935, American seismologist Charles F. Richter set up a magnitude scale of earthquakes as the logarithm to base 10 of the maximum seismic wave amplitude ( in thousandths of a millimetre ) recorded on a standard seismograph ( the Wood-Anderson torsion pendulum seismograph ) at a distance of 100 kilometers ( 62 miles ) from the earthquake epicentre.

Reduction of amplitudes observed at various distances to the amplitudes expected at the standard distance of 100 kilometers ( 62 miles ) is made on the basis of empirical tables.

Richter magnitudes ML are computed on the assumption that the ratio of the maximum wave amplitudes at two ( 2 ) given distances is the same for all earthquakes and is independent of azimuth.
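The logarithmic definition above can be illustrated with a short Python sketch. This is a simplified illustration only: it assumes the trace was already recorded at ( or corrected to ) the standard 100-km distance, whereas real ML computation also applies the empirical distance tables mentioned in the text.

```python
import math

def richter_ml(amplitude_mm: float) -> float:
    """Local magnitude ML: the base-10 logarithm of the maximum trace
    amplitude, expressed in thousandths of a millimetre, as recorded on
    a standard Wood-Anderson seismograph 100 km from the epicentre."""
    return math.log10(amplitude_mm * 1000.0)

# A 1 mm trace at the standard distance gives ML = 3.0; every tenfold
# increase in amplitude adds exactly one magnitude unit.
print(round(richter_ml(1.0), 2))   # 3.0
print(round(richter_ml(10.0), 2))  # 4.0
```

The compression the text describes is visible here: amplitudes spanning many orders of magnitude map onto a compact single-digit scale.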

Richter first applied his magnitude scale to shallow-focus earthquakes recorded within 600 km of the epicentre in the southern California region. Later, additional empirical tables were set up, whereby observations made at distant stations and on seismographs other than the standard type could be used. Empirical tables were extended to cover earthquakes of all significant focal depths and to enable independent magnitude estimates to be made from body- and surface-wave observations.

A current form of the Richter scale is shown in the table.

Richter scale of earthquake magnitude

magnitude level | category | effects | earthquakes per year
less than 1.0 to 2.9 | micro | generally not felt by people, though recorded on local instruments | more than 100,000
3.0 – 3.9 | minor | felt by many people; no damage | 12,000 – 100,000
4.0 – 4.9 | light | felt by all; minor breakage of objects | 2,000 – 12,000
5.0 – 5.9 | moderate | some damage to weak structures | 200 – 2,000
6.0 – 6.9 | strong | moderate damage in populated areas | 20 – 200
7.0 – 7.9 | major | serious damage over large areas; loss of life | 3 – 20
8.0 and higher | great | severe destruction and loss of life over large areas | fewer than 3

At the present time a number of different magnitude scales are used by scientists and engineers as a measure of the relative size of an earthquake. The P-wave magnitude (Mb), for one, is defined in terms of the amplitude of the P wave recorded on a standard seismograph. Similarly, the surface-wave magnitude (Ms) is defined in terms of the logarithm of the maximum amplitude of ground motion for surface waves with a wave period of 20 seconds.

As defined, an earthquake magnitude scale has no lower or upper limit. Sensitive seismographs can record earthquakes with magnitudes of negative value and have recorded magnitudes up to about 9.0 ( the 1906 San Francisco earthquake, for example, had a Richter magnitude of 8.25 ).

A scientific weakness is that there is no direct mechanical basis for magnitude as defined above. Rather, it is an empirical parameter analogous to stellar magnitude assessed by astronomers. In modern practice a more soundly based mechanical measure of earthquake size is used—namely, the seismic moment (M0). Such a parameter is related to the angular leverage of the forces that produce the slip on the causative fault. It can be calculated both from recorded seismic waves and from field measurements of the size of the fault rupture. Consequently, seismic moment provides a more uniform scale of earthquake size based on classical mechanics. This measure allows a more scientific magnitude to be used called moment magnitude (Mw). It is proportional to the logarithm of the seismic moment; values do not differ greatly from Ms values for moderate earthquakes. Given the above definitions, the great Alaska earthquake of 1964, with a Richter magnitude (ML) of 8.3, also had the values Ms = 8.4, M0 = 820 × 10^27 dyne centimetres, and Mw = 9.2.
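The Alaska figures quoted are consistent with the standard Hanks–Kanamori relation, Mw = (2/3)·log10( M0 ) − 10.7 with M0 in dyne centimetres. That formula is not spelled out in the text, so treat this as a sketch of how Mw is commonly obtained from seismic moment:

```python
import math

def moment_magnitude(m0_dyne_cm: float) -> float:
    """Hanks-Kanamori moment magnitude, with the seismic moment M0
    given in dyne centimetres: Mw = (2/3) * log10(M0) - 10.7."""
    return (2.0 / 3.0) * math.log10(m0_dyne_cm) - 10.7

# 1964 Alaska earthquake: M0 = 820 x 10^27 dyne cm  ->  Mw of about 9.2
print(round(moment_magnitude(820e27), 1))  # 9.2
```

Because Mw is tied to a physically defined quantity ( M0 ), it does not saturate for very large earthquakes the way ML and Ms do, which is why the Alaska event scores 9.2 on Mw but only 8.3 on ML.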

Earthquake Energy

Energy in an earthquake passing a particular surface site can be calculated directly from the recordings of seismic ground motion, given, for example, as ground velocity. Such recordings indicate an energy rate of 10^5 watts per square metre ( 9,300 watts per square foot ) near a moderate-size earthquake source. The total power output of a rupturing fault in a shallow earthquake is on the order of 10^14 watts, compared with the 10^5 watts generated in rocket motors.

The surface-wave magnitude Ms has also been connected with the surface energy Es of an earthquake by empirical formulas. These give Es = 6.3 × 10^11 ergs and 1.4 × 10^25 ergs for earthquakes of Ms = 0 and 8.9, respectively. A unit increase in Ms corresponds to approximately a 32-fold increase in energy. Negative magnitudes Ms correspond to the smallest instrumentally recorded earthquakes, a magnitude of 1.5 to the smallest felt earthquakes, and one of 3.0 to any shock felt at a distance of up to 20 km ( 12 miles ). Earthquakes of magnitude 5.0 cause light damage near the epicentre; those of 6.0 are destructive over a restricted area; and those of 7.5 are at the lower limit of major earthquakes.
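
The empirical connection described above is the Gutenberg-Richter energy relation, log10 Es = 11.8 + 1.5 Ms ( Es in ergs ); a quick sketch reproduces the figures quoted:

```python
# Gutenberg-Richter energy relation: log10(Es) = 11.8 + 1.5 * Ms, Es in ergs.
def surface_energy_ergs(ms: float) -> float:
    return 10.0 ** (11.8 + 1.5 * ms)

print(f"{surface_energy_ergs(0.0):.1e}")  # -> 6.3e+11 ergs for Ms = 0
print(f"{surface_energy_ergs(8.9):.1e}")  # -> 1.4e+25 ergs for Ms = 8.9
print(round(surface_energy_ergs(1.0) / surface_energy_ergs(0.0)))  # -> 32-fold per unit of Ms
```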

The total annual energy released in all earthquakes is about 10^25 ergs, corresponding to a rate of work between 10 million and 100 million kilowatts. This is approximately one ( 1 ) one-thousandth of the annual amount of heat escaping from the Earth's interior.

About 90% of the total seismic energy comes from earthquakes of magnitude 7.0 and higher – that is, those whose energy is on the order of 10^23 ergs or more.

Frequency

There also are empirical relations for the frequencies of earthquakes of various magnitudes. Suppose N to be the average number of shocks per year for which the magnitude lies in a range about Ms. Then log10 N = a - bMs fits the data well both globally and for particular regions; for example, for shallow earthquakes worldwide, a = 6.7 and b = 0.9 when Ms > 6.0. The frequency of earthquakes therefore increases by a factor of about 10 when the magnitude is diminished by one unit. The increase in frequency with reduction in Ms falls short, however, of matching the decrease in the energy E. Thus, larger earthquakes are overwhelmingly responsible for most of the total seismic energy release. The number of earthquakes per year with Mb > 4.0 reaches 50,000.
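
The relation above can be sketched directly; with the worldwide shallow-earthquake constants, lowering Ms by one unit multiplies the expected annual count by 10^0.9 ≈ 8, the "factor of about 10" in the text:

```python
# Gutenberg-Richter frequency relation: log10(N) = a - b * Ms,
# with a = 6.7 and b = 0.9 for shallow earthquakes worldwide (Ms > 6.0).
def shocks_per_year(ms: float, a: float = 6.7, b: float = 0.9) -> float:
    return 10.0 ** (a - b * ms)

print(round(shocks_per_year(7.0), 1))                         # expected annual count near Ms = 7.0
print(round(shocks_per_year(6.0) / shocks_per_year(7.0), 1))  # -> 7.9, a factor of about 10 per unit
```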

Earthquake Occurrences & Plate Tectonic Associations

Global seismicity patterns had no strong theoretical explanation until the dynamic model called plate tectonics was developed during the late 1960s. This theory holds that the Earth’s upper shell, or lithosphere, consists of nearly a dozen large, quasi-stable slabs called plates. The thickness of each of these plates is roughly 50 miles ( 80 km ). Plates move horizontally relative to neighbouring plates at a rate of 0.4 to 4 inches ( 1 to 10 cm ) per year over a shell of lesser strength called the asthenosphere. At the plate edges where there is contact between adjoining plates, boundary tectonic forces operate on the rocks, causing physical and chemical changes in them. New lithosphere is created at oceanic ridges by the upwelling and cooling of magma from the Earth’s mantle. The horizontally moving plates are believed to be absorbed at the ocean trenches, where a subduction process carries the lithosphere downward into the Earth’s interior. The total amount of lithospheric material destroyed at these subduction zones equals that generated at the ridges. Seismological evidence ( e.g. location of major earthquake belts ) is everywhere in agreement with this tectonic model.

Earthquake Types:

– Shallow Earthquakes;

– Intermediate Earthquakes;

– Deep Focus ( Deep-Foci ) Earthquakes; and

– Deeper Focus ( Deeper-Foci ) Earthquakes.

Earthquake sources are concentrated along oceanic ridges, corresponding to divergent plate boundaries.

At subduction zones, associated with convergent plate boundaries, deep-focus earthquakes and intermediate focus earthquakes mark locations of the upper part of a dipping lithosphere slab.

Focal mechanisms indicate that the stresses are aligned with the dip of the lithosphere underneath the adjacent continent or island arc.

IntraPlate Seismic Event Anomalies

Some earthquakes associated with oceanic ridges are confined to strike-slip faults, called transform faults, that offset the ridge crests. The majority of earthquakes occurring along such horizontal shear faults are characterized by slip motions.

Also in agreement with plate tectonics theory is the high seismicity encountered along the edges of plates where they slide past each other. Plate boundaries of this kind, sometimes called fracture zones, include the:

– San Andreas Fault system in California; and,

– North Anatolian fault system in Turkey.

Such plate boundaries are the site of interplate earthquakes of shallow focus.

Low seismicity within plates is consistent with the plate tectonic description. Small to large earthquakes do occur in limited regions well within the boundaries of plates; however, such intraplate seismic events can be explained by tectonic mechanisms other than plate boundary motions and their associated phenomena.

Most parts of the world experience at least occasional shallow earthquakes – those that originate within 60 km ( 40 miles ) of the Earth’s outer surface. In fact, the great majority of earthquake foci are shallow. It should be noted, however, that the geographic distribution of smaller earthquakes is less completely determined than that of more severe quakes, partly because the availability of relevant data depends on the distribution of observatories.

Of the total energy released in earthquakes, 12% comes from intermediate earthquakes – that is, quakes with a focal depth ranging from about 60 to 300 km. About 3% of total energy comes from deeper earthquakes. The frequency of occurrence falls off rapidly with increasing focal depth in the intermediate range. Below intermediate depth the distribution is fairly uniform until the greatest focal depths, of about 700 km ( 430 miles ), are approached.

Deeper-Focus Earthquakes

Deeper-focus earthquakes commonly occur in patterns called Benioff zones dipping into the Earth, indicating the presence of a subducting slab; the dip angles of these slabs average about 45° – with some shallower and others nearly vertical.

Benioff zones coincide with tectonically active island arcs, such as:

– Aleutian islands;

– Japan islands;

– Vanuatu islands; and

– Tonga.

Island arcs are normally ( but not always ) associated with:

Ultra-Deep Sea Ocean Trenches, such as the:

– South America ( Andes mountain system ).

Exceptions to this rule, include:

– Romania ( East Europe ) mountain system; and,

– Hindu Kush mountain system.

In most Benioff zones, deep-earthquake foci and intermediate-earthquake foci are found within a narrow layer; however, recent more precise hypocentral locations – in Japan and elsewhere – indicate two ( 2 ) distinct parallel bands of foci only 12 miles ( 20 kilometers ) apart.

Aftershocks, Swarms and Foreshocks

A major or even moderate earthquake of shallow focus is followed by many lesser-size earthquakes close to the original source region. This is to be expected if the fault rupture producing a major earthquake does not relieve all the accumulated strain energy at once. In fact, this dislocation is liable to cause an increase in the stress and strain at a number of places in the vicinity of the focal region, bringing crustal rocks at certain points close to the stress at which fracture occurs. In some cases an earthquake may be followed by 1,000 or more aftershocks a day.

Sometimes a large earthquake is followed by a similar one along the same fault source within an hour or perhaps a day. An extreme case of this is multiple earthquakes. In most instances, however, the first principal earthquake of a series is much more severe than the aftershocks. In general, the number of aftershocks per day decreases with time.

Aftershock frequency is roughly inversely proportional to the time elapsed since the occurrence of the largest earthquake in the series.
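
This rough inverse-time decay is known as Omori's law. A minimal sketch follows; the constants K and c are hypothetical ( real values are fit to each aftershock sequence, and none are given in the text ):

```python
# Omori's law (hyperbolic aftershock decay): n(t) ~ K / (c + t).
# K and c are hypothetical constants chosen for illustration only.
def aftershocks_per_day(t_days: float, k: float = 1000.0, c: float = 1.0) -> float:
    return k / (c + t_days)

print(round(aftershocks_per_day(0.0)))  # -> 1000 on day 0 ("1,000 or more aftershocks a day")
print(round(aftershocks_per_day(9.0)))  # -> 100 on day 9
```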

Most major earthquakes occur without detectable warning, but some principal earthquakes are preceded by foreshocks.

Japan: Two ( 2 ) Years of Hundreds of Thousands of Earthquakes

In another common pattern, large numbers of small earthquakes may occur in a region for months without a major earthquake.

In the Matsushiro region of Japan, for instance, there occurred ( between August 1965 and August 1967 ) a series of earthquakes numbering in the hundreds of thousands – some sufficiently strong ( up to Richter magnitude 5.0 ) to cause property damage but no casualties.

The maximum frequency was 6,780 small earthquakes on April 17, 1966, alone.

Such series of earthquakes are called earthquake swarms.

Earthquakes associated with volcanic activity often occur in swarms – though swarms also have been observed in many nonvolcanic regions.

Study of Earthquakes

Seismic waves

Principal types of seismic waves

Seismic waves generated by an earthquake source are commonly classified into three ( 3 ) leading types.

The first two ( 2 ) leading types, which propagate ( travel ) within the body of the Earth, are known as:

– P ( Primary ) Seismic Waves; and,

– S ( Secondary ) Seismic Waves.

The third ( 3rd ) leading type, which propagates ( travels ) along the surface of the Earth, comprises:

– L ( Love ) Seismic Waves; and,

– R ( Rayleigh ) Seismic Waves.

During the 19th century, the existence of these types of seismic waves was mathematically predicted, and modern comparisons show close correspondence between such theoretical calculations and actual measurements of seismic waves.

P seismic waves travel as elastic motions at the highest speeds, and are longitudinal waves transmitted by both solid and liquid materials within inner Earth.

With P waves, the particles of the medium vibrate in a manner similar to sound waves – the transmitting media are alternately compressed and expanded.

The slower type of body wave, the S wave, travels only through solid material. With S waves, the particle motion is transverse to the direction of travel and involves a shearing of the transmitting rock.

Focus ( Foci )

Because of their greater speed, P waves are the first ( 1st ) to reach any point on the Earth’s surface. The first ( 1st ) P-wave onset starts from the spot where an earthquake originates. This point, usually at some depth within the Earth, is called the focus, also known as the hypocentre.

Epicenter

The point at the surface, immediately above the focus, is known as the epicenter.

Love waves and Rayleigh waves, guided by the free surface of the Earth, trail after P and S waves have passed through the body of planet Earth.

Rayleigh waves ( R waves ) and Love waves ( L waves ) both involve horizontal particle motion, however only Rayleigh waves exhibit vertical ground displacements.

Rayleigh waves ( R waves ) and Love waves ( L waves ) travel ( propagate ) and disperse into long wave trains that, at substantial distances from the source ( notably in alluvial basins ), cause much of the Earth surface ground shaking felt during earthquakes.

Seismic Wave Focus ( Foci ) Properties

At all distances from the focus, mechanical properties of the rocks – such as incompressibility, rigidity and density – play roles in the:

– Speed of ‘wave travel’;

– Duration of ‘wave trains’; and,

– Shape of ‘wave trains’.

Layering of the rocks and the physical properties of surface soil also affect wave characteristics.

In most cases elastic behavior occurs in earthquakes, however strong shaking of surface soils by the incident seismic waves sometimes results in nonelastic behavior, including slumping ( i.e. downward and outward movement of unconsolidated material ) and liquefaction of sandy soil.

A seismic wave that encounters a boundary separating rocks of different elastic properties undergoes reflection and refraction. A special complication exists because conversion between wave types usually also occurs at such a boundary: an incident P or S wave can yield reflected P and S waves and refracted P and S waves.

Boundaries between structural layers of the Earth also give rise to diffracted and scattered waves, and these additional waves are partially responsible for the complications observed in ground motion during earthquakes.

Modern research is concerned with computing synthetic records of ground motion that compare realistically with observed ground shaking, using wave theory in complex structures.

Grave Duration Long-Periods, Audible Earthquake Frequencies, and Other Earth Anomalies

The frequency range of seismic waves is wide, from High Frequency ( HF ) in the audible range ( i.e. greater than 20 hertz ) to frequencies as Low ( LF ) as the subtle free oscillations of planet Earth – the gravest long period being 54 minutes ( see below, Long-Period oscillations of the globe ).

Seismic wave attenuation in rock imposes High-Frequency ( HF ) limits, and in small to moderate earthquakes the dominant frequencies of surface waves extend from about 1.0 Hz to 0.1 Hz.

Seismic wave amplitude range is also great in most earthquakes.

Displacement of the ground ranges from 10^-10 to 10^-1 metre ( 4 × 10^-9 to 4 inches ).

Great Earthquake Speed

Great Earthquake Ground Acceleration Can Exceed 32 Feet Per Second Squared ( 9.8 Metres Per Second Squared ) –

In the greatest earthquakes, the ground amplitude of predominant P waves may be several centimetres at periods of 2 to 5 seconds; however, very close to the seismic sources of great earthquakes, investigators have measured large wave amplitudes with accelerations of the ground exceeding the acceleration of gravity, 32.2 feet per second squared ( 9.8 metres per second squared ), at High Frequencies ( HF ), and ground displacements of 1 metre at Low Frequencies ( LF ).

Seismic Wave Measurement

Seismographs and Accelerometers

Seismographs ‘measure ground motion’ in both ‘earthquakes’ and ‘microseisms’ ( small oscillations described below ).

Most of these instruments are of the pendulum type. Early mechanical seismographs had a pendulum of large mass ( up to several tons ) and produced seismograms by scratching a line on smoked paper on a rotating drum.

In later instruments, seismograms were recorded via rays of light bounced off a mirror within a galvanometer, using electric current produced by electromagnetic induction when the pendulum of the seismograph moved.

Technological developments in electronics have given rise to ‘higher-precision pendulum seismometers’ and ‘sensors of ground motion’.

In these instruments, electric voltages produced by motions of the pendulum or its equivalent are passed through electronic circuitry to amplify the ground-motion signal, which is digitized for more exact readings.

Seismometer Nomenclature Meanings

Seismographs are divided into three ( 3 ) types of instruments, often confused by the public because of their varied names:

– Short-Period;

– Intermediate-Period ( also known as Long-Period ); and,

– Ultra-Long-Period ( also known as Broadband or Broad-Band ).

Short-Period instruments are used to record P and S body waves with high magnification of the ground motion. For this purpose, the seismograph response is shaped to peak at a period of about 1-second or less.

Intermediate-period instruments, the type used by the World-Wide Standardized Seismographic Network ( WWSSN ) – described in the section Earthquake observatories – had about a 20-second ( maximum ) response.

Recently, in order to provide as much flexibility as possible for research work, the trend has been toward the operation of very broadband seismographs with digital representation of signals. This is usually accomplished with very long-period pendulums and electronic amplifiers passing signals in the band between 0.005 Hz and 50 Hz.

When seismic waves close to their source are to be recorded, special design criteria are needed. Instrument sensitivity must ensure that the largest ground movements can be recorded without exceeding the upper scale limit of the device. For most seismological and engineering purposes the wave frequencies that must be recorded are higher than 1 hertz, and so the pendulum or its equivalent can be small. For this reason accelerometers that measure the rate at which the ground velocity is changing have an advantage for strong-motion recording. Integration is then performed to estimate ground velocity and displacement. The ground accelerations to be registered range up to two times that of gravity. Recording such accelerations can be accomplished mechanically with short torsion suspensions or force-balance mass-spring systems.
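
Since accelerometers record the rate at which ground velocity changes, velocity and displacement are recovered by integration, as the paragraph notes. A minimal sketch, assuming a uniform sample interval ( the sample values and 100 Hz rate are hypothetical ):

```python
# Sketch: cumulative trapezoidal integration of an acceleration trace
# (m/s^2) to estimate ground velocity, then displacement.
def integrate(samples, dt):
    out, total = [0.0], 0.0
    for a, b in zip(samples, samples[1:]):
        total += 0.5 * (a + b) * dt  # trapezoid area for each interval
        out.append(total)
    return out

accel = [0.0, 1.0, 0.0, -1.0, 0.0]           # hypothetical acceleration samples
velocity = integrate(accel, dt=0.01)          # m/s
displacement = integrate(velocity, dt=0.01)   # m
print(velocity)
```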

Because many strong-motion instruments need to be placed at unattended sites in ordinary buildings for periods of months or years before a strong earthquake occurs, they usually record only when a trigger mechanism is actuated with the onset of ground motion. Solid-state memories are now used, particularly with digital recording instruments, making it possible to preserve the first few seconds before the trigger starts the permanent recording and to store digitized signals on magnetic cassette tape or on a memory chip. In past design absolute timing was not provided on strong-motion records but only accurate relative time marks; the present trend, however, is to provide Universal Time ( the local mean time of the prime meridian ) by means of special radio receivers, small crystal clocks, or GPS ( Global Positioning System ) receivers from satellite clocks.

Prediction of strong ground motion and response of engineered structures in earthquakes depends critically on measurements of the spatial variability of earthquake intensities near the seismic wave source. In an effort to secure such measurements, special arrays of strong-motion seismographs have been installed in areas of high seismicity around the world.

Large-aperture seismic arrays ( linear dimensions on the order of 0.6 to 6 miles ( 1 to 10 kilometers )) of strong-motion accelerometers are now used to improve estimates of the speed, direction of propagation and types of seismic wave components.

Particularly important for a full understanding of seismic wave patterns at the ground surface is measurement of the variation of wave motion with depth; to aid in this effort, special digitally recording seismometers have been installed in deep boreholes.

Ocean-Bottom Measurements

70% of the Earth’s surface is covered by water, so ocean-bottom seismometers augment ( add to ) the global land-based system of recording stations.

Field tests have established the feasibility of extensive long-term recording by instruments on the seafloor.

Japan has had a semi-permanent seismograph system of this type since 1978, placed on the seafloor off the Pacific Ocean east coast of central Honshu, Japan, and connected by cable.

Because of mechanical difficulties maintaining ‘permanent ocean-bottom instrumentation’, different systems have been considered.

They all involve placement of instruments on the ocean bottom, though they employ various mechanisms for data transmission.

Signals may be transmitted to the ocean surface for retransmission by auxiliary apparatus or transmitted via cable to a shore-based station. Another system is designed to release its recording device automatically, allowing it to float to the surface for later recovery.

Ocean bottom seismograph use should yield much-improved global coverage of seismic waves and provide new information on the seismicity of oceanic regions.

Ocean-bottom seismographs will enable investigators to determine the details of the crustal structure of the seafloor and, because of the relative ‘thinness of the oceanic crust‘, should make possible collection of clear seismic information about Earth’s upper mantle.

Ocean-bottom seismograph systems are also expected to provide new data on Earth:

– Continental Shelf Plate Boundaries;

– MicroSeisms ( origins and propagations ); and,

– Ocean-to-Continent margin behavior.

MicroSeisms Measurements

MicroSeisms, also known as small ground motions, are commonly recorded by seismographs. These small, weak seismic wave motions are not generated by earthquakes, but in some instances they can complicate accurate earthquake measurement recording. MicroSeisms are of scientific interest because their form relates to the Earth’s surface structure.

Some microseisms have a local cause – for example, those due to traffic or machinery, or to local wind effects, storms and rough surf against an extended steep coastline.

Another class of microseisms exhibits features that are very similar on records traced at earthquake observatories that are widely separated, including approximately simultaneous occurrence of maximum amplitudes and similar wave frequencies. These microseisms may persist for many hours and have more or less regular periods of about five to eight seconds.

The largest amplitudes of such microseisms are on the order of 10^-3 cm ( 0.0004 inch ) and occur in coastal regions. Amplitudes also depend to some extent on local geologic structure.

Some microseisms are produced when large standing water waves are formed far out at sea. The period of this type of microseism is half that of the standing wave.

Observations of Earthquakes

Earthquake Observatories

During the late 1950s, there were only about seven-hundred ( 700 ) seismographic stations worldwide, equipped with seismographs of various types and frequency responses – few of which were calibrated; actual ground motions could not be measured, and timing errors of several seconds were common.

The World-Wide Standardized Seismographic Network ( WWSSN ) became the first modern worldwide standardized system, established to remedy that situation.

Each WWSSN station had six ( 6 ) seismographs: three ( 3 ) short-period and three ( 3 ) long-period, with timing and accuracy maintained by quartz crystal clocks and a calibration pulse placed daily on each record.

By 1967, the WWSSN consisted of about one-hundred twenty ( 120 ) stations throughout sixty ( 60 ) countries, yielding data that provided the basis for significant advances in research on:

– Earthquakes ( mechanisms );

– Plate Tectonics ( global ); and,

– Deep-Structure Earth ( interior ).

By the 1980s a further upgrading of permanent seismographic stations began with the installation of digital equipment by a number of organizations.

Global digital seismograph station networks, now in operation, consist of:

– Seismic Research Observatories ( SRO ), within boreholes drilled 330 feet ( 100 metres ) deep in the ground; and,

– Modified high-gain long-period earthquake observatories located on Earth ground surfaces.

The Global Digital Seismographic Network in particular has remarkable capability, recording all motions from Earth ocean tides to microscopic ground motions at the level of local ground noise.

At present there are about 128 sites. With this system the long-term seismological goal will have been accomplished of equipping global observatories with seismographs that can record every small earthquake anywhere over a broad band of frequencies.

Locating Earthquake Epicentres

Many observatories make provisional estimates of the epicentres of important earthquakes. These estimates provide preliminary information locally about particular earthquakes and serve as first approximations for the calculations subsequently made by large coordinating centres.

If an earthquake’s epicentre is less than 105° away from an earthquake observatory, the epicentre position can often be estimated from the readings of three ( 3 ) seismograms recording perpendicular components of the ground motion.

For a shallow earthquake, the epicentral distance is indicated by the interval between the arrival times of the P and S waves; the azimuth and angle of wave emergence at the surface are indicated by comparing the sizes and directions of the first ( 1st ) movements shown on the seismograms and the relative sizes of later waves – particularly surface waves.
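
The P-S interval method can be sketched as follows, assuming typical crustal wave speeds ( the 6.0 and 3.5 km/s values are illustrative, not from the text ):

```python
# Sketch: epicentral distance from the S-P arrival interval. A wave
# travelling distance d arrives after d/v seconds, so the observed
# interval satisfies d/VS - d/VP = sp_interval.
VP = 6.0  # assumed P-wave speed, km/s
VS = 3.5  # assumed S-wave speed, km/s

def epicentral_distance_km(sp_interval_s: float) -> float:
    return sp_interval_s * (VP * VS) / (VP - VS)

print(round(epicentral_distance_km(10.0)))  # -> 84 km for a 10-second S-P interval
```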

Anomaly

It should be noted, however, that in certain regions the first ( 1st ) wave movement at a station arrives from a direction differing from the azimuth toward the epicentre. This anomaly is usually explained by strong variations in geologic structure.

When data from more than one observatory are available, an earthquake’s epicentre may be estimated from the times of travel of the P and S waves from source to recorder. In many seismically active regions, networks of seismographs with telemetry transmission and centralized timing and recording are common. Whether analog or digital recording is used, such integrated systems greatly simplify observatory work: multichannel signal displays make identification and timing of phase onsets easier and more reliable. Moreover, online microprocessors can be programmed to pick automatically, with some degree of confidence, the onset of a significant common phase, such as P, by correlation of waveforms from parallel network channels. With the aid of specially designed computer programs, seismologists can then locate distant earthquakes to within about 10 km (6 miles) and the epicentre of a local earthquake to within a few kilometres.

Catalogs of earthquakes felt by humans and of earthquake observations have appeared intermittently for many centuries. The earliest known list of instrumentally recorded earthquakes with computed times of origin and epicentres is for the period 1899-1903, after which the cataloging of earthquakes became more uniform and complete.

Especially valuable is the service provided by the International Seismological Centre ( ISC ) in Newbury, UK, which each month receives more than 1,000,000 seismic readings from more than 2,000 seismic monitoring stations worldwide, along with preliminary location estimates for approximately 1,600 earthquakes from national and regional agencies and observatories.

The ISC publishes a monthly bulletin, about two ( 2 ) years in arrears; each bulletin provides all available information on each of more than 5,000 earthquakes.

Various national and regional centres control networks of stations and act as intermediaries between individual stations and the international organizations.

Examples of long-standing national centers include, the:

– Japan Meteorological Agency; and,

U.S. National Earthquake Information Center ( NEIC ), a subdivision of the U.S. Geological Survey ( USGS ).

Centers, such as the aforementioned, normally make ‘local earthquake estimates’, of:

– Magnitude;

– Epicentre;

– Time origin; and,

– Focal depth.

Global seismicity data is continually accessible via the Incorporated Research Institutions for Seismology ( IRIS ) website.

An important research technique infers the character of faulting ( in an earthquake ) from recorded seismograms.

For example, observed distributions ( of the directions of the first onsets in waves arriving at the Earth’s surface ) have been effectively used.

Onsets are called “compressional” or “dilatational,” according to whether the direction is away from or toward the focus, respectively.

A polarity pattern becomes recognizable when the directions of the P-wave onsets are plotted on a map – there are broad areas in which the first onsets are predominantly compressions, separated from predominantly dilatational areas by nodal curves near which the P-wave amplitudes are abnormally small.

In 1926 the American geophysicist Perry E. Byerly used patterns of P onsets over the entire globe to infer the orientation of the fault plane in a large earthquake. The polarity method yields two P-nodal curves at the Earth’s surface; one curve is in the plane containing the assumed fault, and the other is in the plane ( called the auxiliary plane ) that passes through the focus and is perpendicular to the direction of slip.

The recent availability of worldwide broadband digital recording has enabled computer programs to be written that estimate the fault mechanism and seismic moment from the complete pattern of seismic wave arrivals.

Given a well-determined pattern at a number of earthquake observatories, it is possible to locate two ( 2 ) planes, one ( 1 ) of which is the plane containing the fault.

Earthquake Prediction

Earthquake Observations & Interpretations

Statistical studies of earthquake occurrence – searching for periodic cycles in time and space – have been theorized but are not widely accepted, and no periodicity has been reliably detected. Catalogs of major / great earthquakes extend as far back as 700 BC, with China holding the world’s most extensive catalog of approximately one thousand ( 1,000 ) destructive earthquakes, whose magnitude ( size ) was assessed from damage reports and from experienced periods of shaking and other observations determining the intensity of those earthquakes.

Earthquake Attributions to Postulation

Precursor predictability approaches involve what some believe is sheer postulation of the initial trigger mechanisms that force Earth ruptures; where this becomes bizarre is that such forces have been attributed, to:

– Weather Severity;

– Volcano Activity; and,

– Ocean Tide Force ( Moon ).

EXAMPLE: Correlations between such physical phenomena and earthquake repetition, assumed to provide trigger mechanisms.

Professionals believe such correlations must always be tested to discover whether a causative link is actually present, and they further believe that, to date, no trigger mechanism – insofar as moderate to large earthquakes are concerned – has unequivocally satisfied the various necessary criteria.

Statistical methods also have been tried with populations of regional earthquakes. It has been suggested, but never generally established, that the slope b of the regression line between the logarithm of the number of earthquakes and the magnitude for a region may change characteristically with time.

Specifically, the claim is that the b value for the population of ‘foreshocks of a major earthquake’ may be ‘significantly smaller’ than the mean b value for the region averaged ‘over a long interval of time’.

Elastic rebound theory, of earthquake sources, allows rough prediction of the occurrence of large shallow earthquakes; for example, Harry F. Reid gave a crude forecast of the next great earthquake near San Francisco ( the theory also predicted, of course, that the place would be along the San Andreas Fault or an associated fault ). Geodetic data indicated that during an interval of 50 years relative displacements of 3.2 metres ( 10-1/2 feet ) had occurred at distant points across the fault. The maximum elastic-rebound offset along the fault in the 1906 earthquake was 6.5 metres. Therefore, ( 6.5 ÷ 3.2 ) × 50, or about 100 years, would again elapse before sufficient strain accumulated for the occurrence of an earthquake comparable to that of 1906; the premises being that regional strain will grow uniformly and that various constraints have not been altered by the great 1906 rupture itself ( such as by the onset of slow fault slip ).
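
Reid's recurrence estimate reduces to a one-line proportion; using the figures quoted in the text:

```python
# Elastic-rebound recurrence estimate (figures from the text):
# 3.2 m of relative displacement accumulated across the fault in 50 years,
# versus a 6.5 m maximum offset released in the 1906 earthquake.
slip_deficit_m = 3.2
interval_years = 50.0
offset_1906_m = 6.5

recurrence_years = (offset_1906_m / slip_deficit_m) * interval_years
print(round(recurrence_years))  # -> 102, i.e. "about 100 years"
```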

Such strain rates are now, however, being more adequately measured along a number of active faults ( e.g. the San Andreas Fault ) using networks of GPS sensors.

Earthquake Prediction Research

For many years prediction research has been influenced by the basic argument that strain accumulates in the rock masses in the vicinity of a fault, resulting in crustal deformation.

Deformations have been measured in horizontal directions along active faults via trilateration and triangulation, and in vertical directions via precise leveling and tiltmeters.

Some investigators believe that ground-water levels change prior to earthquakes; reports of such variations have come chiefly from China.

Ground-water levels respond to an array of complex factors ( e.g. rainfall ), whose effects would have to be removed if water-level changes were to be studied in relation to earthquakes.

Premonitory Precursor Phenomena

Dilatancy theory ( i.e., that the volume of rock increases prior to rupture ) once occupied a central position in discussions of premonitory phenomena of earthquakes, but it now receives less support. It is based on observations that many solids exhibit dilatancy during deformation. For earthquake prediction, the significance of dilatancy, if real, lies in its effect on various measurable quantities of the Earth's crust, i.e. seismic velocities, electric resistivity, and ground and water levels. The consequences of dilatancy for earthquake prediction are summarized in the table ( below ):

The best-studied consequence is the effect on seismic velocities. The influence of internal cracks and pores on the elastic properties of rocks can be clearly demonstrated in laboratory measurements of those properties as a function of hydrostatic pressure. In the case of saturated rocks, experiments predict – for shallow earthquakes – that dilatancy occurs as a portion of the crust is stressed to failure, causing a decrease in the velocities of seismic waves. Recovery of velocity is brought about by subsequent rise of the pore pressure of water, which also has the effect of weakening the rock and enhancing fault slip.

Strain buildup in the focal region may have measurable effects on other observable properties, including electrical conductivity and gas concentration. Because the electrical conductivity of rocks depends largely on interconnected water channels within the rocks, resistivity may increase before the cracks become saturated. As pore fluid is expelled from the closing cracks, the local water table would rise and concentrations of gases such as radioactive radon would increase. No unequivocal confirming measurements have yet been published.

Geologic methods of extending the seismicity record back from the present also are being explored. Field studies indicate that the sequence of surface ruptures along major active faults associated with large earthquakes can sometimes be constructed.

An example is the series of large earthquakes in Turkey in the 20th century, which were caused mainly by successive westward ruptures of the North Anatolian Fault.

Liquefaction effects preserved in beds of sand and peat have provided evidence ( using radiometric dating methods ) for large paleoearthquakes back more than 1,000 years in many seismically active zones, including the coastal region of the U.S. Pacific Northwest.

Less well-grounded precursory phenomena, particularly earthquake lights and animal behaviour, sometimes draw more public attention than the precursors discussed above.

Reports of unusual lights in the sky and of abnormal animal behaviour preceding earthquakes are known to seismologists, mostly in anecdotal form.

Both phenomena are usually explained in terms of there being, prior to earthquakes:

– Gaseous emissions from the ground;

– Various electric stimuli ( e.g. HAARP, etc. ) from the ground; and,

– Various acoustic stimuli ( e.g. subsonic seismic-wave emissions ) from the ground.

At the present time, there is no definitive experimental evidence supporting reported claims of animals sometimes sensing an approaching earthquake.

… [ CENSORED-OUT ] …

Earthquake Hazard Reduction Methods

Considerable work has been done in seismology to explain the characteristics of the recorded ground motions in earthquakes. Such knowledge is needed to predict ground motions in future earthquakes so that earthquake-resistant structures can be designed.

Although earthquakes cause death and destruction through secondary effects such as landslides, tsunamis, fires, and fault rupture, the greatest losses of human lives and property result from the collapse of man-made structures amid violent ground shaking.

The most effective way to mitigate ( minimize ) damage from earthquakes – from an engineering standpoint – is to design and construct structures capable of withstanding ‘strong ground motions’.

Interpreting recorded ground motions

Most ‘elastic waves’ recorded ( close to an extended fault source ) are complicated and difficult to interpret uniquely.

Understanding such near-source motion can be viewed as a three-part problem.

The first ( 1st ) part stems from the generation of elastic waves radiating from the slipping fault as the moving rupture sweeps out an area of slip along the fault plane within a given time.

The wave pattern produced depends on several parameters, such as fault dimension and rupture velocity.

Elastic waves of various types radiate from the vicinity of the moving rupture in all directions.

The geometric and frictional properties of the fault critically affect the pattern of waves radiated from it.

The second ( 2nd ) part of the problem concerns the passage of the waves through the intervening rocks to the site and the effect of geologic conditions.

The third ( 3rd ) part involves the conditions at the recording site itself, such as topography and highly attenuating soils. All these questions must be considered when estimating likely earthquake effects at a site of any proposed structure.

Experience has shown that strong ground-motion recordings have a variable pattern in detail but predictable regular shapes in general ( except in the case of strong multiple earthquakes ).

EXAMPLE: Actual ground shaking ( acceleration, velocity and displacement ) recorded during an earthquake ( see figure below ).

In a strong horizontal shaking of the ground near the fault source, there is an initial segment of motion made up mainly of P waves, which frequently manifest themselves strongly in the vertical motion. This is followed by the onset of S waves, often associated with a longer-period pulse of ground velocity and displacement related to the near-site fault slip or fling. This pulse is often enhanced in the direction of the fault rupture and normal to it. After the S onset there is shaking that consists of a mixture of S and P waves, but the S motions become dominant as the duration increases. Later, in the horizontal component, surface waves dominate, mixed with some S body waves. Depending on the distance of the site from the fault and the structure of the intervening rocks and soils, surface waves are spread out into long trains.
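The P-before-S pattern described above underlies a classic back-of-envelope estimate of epicentral distance from the S-minus-P arrival-time gap. A sketch, in which the crustal velocities ( vP ≈ 6.0 km/s, vS ≈ 3.5 km/s ) are assumed typical values, not measurements from any particular record:

```python
def distance_from_s_minus_p(delta_t_seconds, vp_km_s=6.0, vs_km_s=3.5):
    """Estimate epicentral distance from the S-minus-P time gap.
    Both waves travel the same path length d, so
    delta_t = d/vs - d/vp  =>  d = delta_t / (1/vs - 1/vp)."""
    return delta_t_seconds / (1.0 / vs_km_s - 1.0 / vp_km_s)

# A 10-second S-P gap with the assumed velocities above corresponds
# to an event 84 km away.
print(round(distance_from_s_minus_p(10.0)))
```

With readings from three or more stations, such distances can be intersected to locate the focus, as described later for lunar stations.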

Expectant Seismic Hazard Maps Constructed

In many regions, seismic expectancy maps or hazard maps are now available for planning purposes. The anticipated intensity of ground shaking is represented by a number called the peak acceleration or the peak velocity.

To avoid weaknesses found in earlier earthquake hazard maps, the following general principles are usually adopted today:

The map should take into account not only the size but also the frequency of earthquakes.

The broad regionalization pattern should use historical seismicity as a database, including the following factors: major tectonic trends, acceleration attenuation curves, and intensity reports.

Regionalization should be defined by means of contour lines with design parameters referred to ordered numbers on neighbouring contour lines ( this procedure minimizes sensitivity concerning the exact location of boundary lines between separate zones ).

The map should be simple and not attempt to microzone the region.

The mapped contoured surface should not contain discontinuities, so that the level of hazard progresses gradually and in order across any profile drawn on the map.

Developing resistant structures

Developing engineered structural designs that are able to resist the forces generated by seismic waves can be achieved either by following building codes based on hazard maps or by appropriate methods of analysis. Many countries reserve theoretical structural analyses for the larger, more costly, or critical buildings to be constructed in the most seismically active regions, while simply requiring that ordinary structures conform to local building codes. Economic realities usually determine the goal, not of preventing all damage in all earthquakes but of minimizing damage in moderate, more common earthquakes and ensuring no major collapse at the strongest intensities. An essential part of what goes into engineering decisions on design and into the development and revision of earthquake-resistant design codes is therefore seismological, involving measurement of strong seismic waves, field studies of intensity and damage, and the probability of earthquake occurrence.

Earthquake risk can also be reduced by rapid post-earthquake response. Strong-motion accelerographs have been connected in some urban areas, such as Los Angeles, Tokyo, and Mexico City, to interactive computers.

Recorded waves are correlated with seismic intensity scales and rapidly displayed graphically on regional maps via the World Wide Web.

Exploration of the Earth’s interior with seismic waves

Seismological Tomography

Knowledge of the Earth's deep structure comes from seismological data of several kinds, including:

– P waves and S waves from nuclear explosions;

– P waves and S waves from earthquakes;

– Surface-wave dispersion from distant earthquakes; and,

– Whole-planet vibrations from great earthquakes.

One of the major aims of seismology has been to infer a minimum set of properties of the Earth's interior that can explain recorded seismic wave trains in detail.

Exploration of the Earth's deep structure made tremendous progress during the first half of the 20th century, but realization of this goal was severely limited until the 1960s because of the laborious effort required just to evaluate theoretical models and to process the large amounts of recorded earthquake data.

Today, supercomputers able to process enormous quantities of stored data, together with fast information retrieval, have opened the way to major advances in data handling for theoretical modeling, research analysis, and prototype development.

Since the mid-1970s, researchers have been able to model realistic Earth structures that include continental and oceanic boundaries, mountains, and river valleys, rather than simple structures varying only with depth; various other technical developments have also benefited observational seismology.

EXAMPLE: Significant exploration of the Earth's deep structure using 3D ( three-dimensional ) imaging, with equally impressive display equipment, has been made possible by advances in microprocessor architecture, new materials, and new concepts; seismic exploratory techniques developed by the petroleum industry ( e.g. seismic reflection ) have become widely adopted procedures.

The major method for determining the structure of the Earth's interior is the detailed analysis of seismograms of seismic waves; earthquake readings additionally provide estimates of the Earth's internal:

– Wave velocities;

– Density; and,

– Parameters of elasticity and inelasticity.

Earthquake Travel Time

The primary procedure is to measure the travel times of various wave types, such as P and S, from their source to the recording seismograph. First, however, each wave type must be identified with its ray path through the Earth.

Seismic rays for many paths of P and S waves leaving the earthquake focus F are shown in the figure.

Deep-Focus Deep-Structure Earth Coremetrics

Rays corresponding to waves that have been reflected at the Earth’s outer surface (or possibly at one of the interior discontinuity surfaces) are denoted as PP, PS, SP, PSS, and so on. For example, PS corresponds to a wave that is of P type before surface reflection and of S type afterward. In addition, there are rays such as pPP, sPP, and sPS, the symbols p and s corresponding to an initial ascent to the outer surface as P or S waves, respectively, from a deep focus.

An especially important class of rays is associated with a discontinuity surface separating the central core of the Earth from the mantle at a depth of about 2,900 km (1,800 miles) below the outer surface. The symbol c is used to indicate an upward reflection at this discontinuity. Thus, if a P wave travels down from a focus to the discontinuity surface in question, the upward reflection into an S wave is recorded at an observing station as the ray PcS and similarly with PcP, ScS, and ScP. The symbol K is used to denote the part (of P type) of the path of a wave that passes through the liquid central core. Thus, the ray SKS corresponds to a wave that starts as an S wave, is refracted into the central core as a P wave, and is refracted back into the mantle, wherein it finally emerges as an S wave. Such rays as SKKS correspond to waves that have suffered an internal reflection at the boundary of the central core.

The discovery of the existence of an inner core in 1936 by the Danish seismologist Inge Lehmann made it necessary to introduce additional basic symbols. For paths of waves inside the central core, the symbols i and I are used analogously to c and K for the whole Earth; therefore, i indicates reflection upward at the boundary between the outer and inner portions of the central core, and I corresponds to the part (of P type) of the path of a wave that lies inside the inner portion. Thus, for instance, discrimination needs to be made between the rays PKP, PKiKP, and PKIKP. The first of these corresponds to a wave that has entered the outer part of the central core but has not reached the inner core, the second to one that has been reflected upward at the inner core boundary, and the third to one that has penetrated into the inner portion.

By combining the symbols p, s, P, S, c, K, i, and I in various ways, notation is developed for all the main rays associated with body earthquake waves.
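The symbol-by-symbol reading of these ray names can be sketched as a small lookup table ( a simplified illustration; real phase nomenclature has additional conventions not handled here ):

```python
# Symbol meanings as given in the text above.
LEG = {
    "p": "ascent from a deep focus to the surface as a P wave",
    "s": "ascent from a deep focus to the surface as an S wave",
    "P": "P-wave leg through the mantle",
    "S": "S-wave leg through the mantle",
    "c": "reflection upward at the core-mantle boundary",
    "K": "P-type leg through the liquid outer core",
    "i": "reflection upward at the inner-core boundary",
    "I": "P-type leg through the solid inner core",
}

def describe_phase(name):
    """Expand a seismic phase name, symbol by symbol, into the leg
    descriptions above.  Raises KeyError on an unknown symbol."""
    return [LEG[ch] for ch in name]

# PKIKP: mantle P, outer core, inner core, outer core, mantle P.
for leg in describe_phase("PKIKP"):
    print(leg)
```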

Hidden Inner Earth Deep Structure Anomalies

The symbol J has been introduced to correspond to S waves in the inner core, should evidence for such waves ever be found. The use of travel times along rays to infer hidden structure is analogous to the use of X-rays in medical tomography. The method involves reconstructing an image of internal anomalies from measurements made at the outer surface. Nowadays, hundreds of thousands of travel times of P and S waves are available in earthquake catalogs for the tomographic imaging of the Earth's interior and the mapping of internal structure.

Inner Earth Deep Structure

Thinnest & Thickest Parts of Earth's Crust

Based on earthquake records and imaging studies, the Earth's interior is represented as follows:

A solid but slowly flowing mantle extending to a depth of about 1,800 miles ( 2,900 kilometers ), its upper boundary lying less than 6 miles ( 10 kilometers ) beneath the seafloor of the deep ocean.

The thin surface rock layer surrounding the mantle is the crust, whose lower boundary is called the Mohorovičić discontinuity. In normal continental regions the crust is about 30 to 40 km thick; there is usually a superficial low-velocity sedimentary layer underlain by a zone in which seismic velocity increases with depth. Beneath this zone there is a layer in which P-wave velocities in some places fall from 6 to 5.6 km per second. The middle part of the crust is characterized by a heterogeneous zone with P velocities of nearly 6 to 6.3 km per second. The lowest layer of the crust ( about 10 km thick ) has significantly higher P velocities, ranging up to nearly 7 km per second.

In the deep ocean there is a sedimentary layer that is about 1 km thick. Underneath is the lower layer of the oceanic crust, which is about 4 km thick. This layer is inferred to consist of basalt that formed where extrusions of basaltic magma at oceanic ridges have been added to the upper part of lithospheric plates as they spread away from the ridge crests. This crustal layer cools as it moves away from the ridge crest, and its seismic velocities increase correspondingly.
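The velocity layering described above can be turned into a rough one-way vertical P-wave travel time by summing thickness divided by velocity for each layer. In the sketch below, only the velocities of the crystalline layers come from the text; the layer thicknesses and the sedimentary velocity are assumptions chosen to total a 40-km continental crust:

```python
def vertical_travel_time(layers):
    """One-way vertical travel time through a stack of
    (thickness_km, velocity_km_s) layers: sum of thickness / velocity."""
    return sum(thickness / velocity for thickness, velocity in layers)

# Illustrative continental-crust column (thicknesses are assumptions):
crust = [
    (2.0, 3.0),   # low-velocity sedimentary cover (assumed velocity)
    (13.0, 6.0),  # upper crystalline crust
    (15.0, 6.2),  # heterogeneous middle crust
    (10.0, 7.0),  # high-velocity lowest crustal layer
]
print(round(vertical_travel_time(crust), 2))  # one-way time in seconds
```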

Below the mantle, from a depth of about 1,800 miles ( 2,900 km ), lies a shell about 1,400 miles ( 2,255 km ) thick, which seismic waves indicate has the properties of a liquid.

At the very centre of the planet is a separate solid core with a radius of 1,216 km. Recent work with observed seismic waves has revealed three-dimensional structural details inside the Earth, especially in the crust and lithosphere, under the subduction zones, at the base of the mantle, and in the inner core. These regional variations are important in explaining the dynamic history of the planet.

Long-Period Global Oscillations

Sometimes earthquakes are so great that the entire planet vibrates like a struck bell. The deepest tone of vibration yet recorded has a period ( the length of time between the arrival of successive crests in a wave train ) of 54 minutes, referred to by seismologists as ‘grave’ ( in the musical sense of the lowest-pitched mode ).

Knowledge of these vibrations has come from a remarkable extension in the range of periods of ground movement now able to be recorded by modern digital long-period seismographs, which span the entire spectrum of earthquake wave periods: from ordinary P waves, with periods of tenths of seconds, to free vibrations with periods on the order of 12 and 24 hours, i.e. those of the movements occurring within the Earth's ocean tides.

The measurements of vibrations of the whole Earth provide important information on the properties of the interior of the planet. It should be emphasized that these free vibrations are set up by the energy release of the earthquake source but continue for many hours and sometimes even days. For an elastic sphere such as the Earth, two types of vibrations are known to be possible. In one type, called S modes, or spheroidal vibrations, the motions of the elements of the sphere have components along the radius as well as along the tangent. In the second ( 2nd ) type, designated as T modes or torsional vibrations, there is shear but no radial displacement. The nomenclature is nSl and nTl, where the letters n and l are related to the surfaces in the vibration at which there is zero motion. Four ( 4 ) examples are illustrated in the figure. The subscript n gives a count of the number of internal zero-motion ( nodal ) surfaces, and l indicates the number of surface nodal lines.

Several hundred types of S and T vibrations have been identified and the associated periods measured. The amplitudes of the ground motion in the vibrations have been determined for particular earthquakes, and, more important, the attenuation of each component vibration has been measured. The dimensionless measure of this decay constant is called the quality factor Q. The greater the value of Q, the less the wave or vibration damping. Typically, for oS10 and oT10, the Q values are about 250.
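The quality factor Q can be related to how quickly a mode's amplitude dies away: a standard damping relation is A(t) = A0·exp( −πt / (Q·T) ), where T is the mode period. A sketch using the Q ≈ 250 quoted above and an illustrative 10-minute period ( an assumption for demonstration, not a measured mode period ):

```python
import math

def mode_amplitude(t_seconds, period_seconds, q, a0=1.0):
    """Free-oscillation amplitude decay: A(t) = A0 * exp(-pi*t / (Q*T)).
    The larger Q is, the slower the decay, as stated in the text."""
    return a0 * math.exp(-math.pi * t_seconds / (q * period_seconds))

# With Q = 250 and an assumed 10-minute period, after 24 hours the
# amplitude has fallen to roughly 16% of its starting value, which is
# why such vibrations remain observable for hours and even days.
day = 24 * 3600
print(round(mode_amplitude(day, 600.0, 250), 3))
```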

The rate of decay of the vibrations of the whole Earth with the passage of time can be seen in the figure, where they appear superimposed for 20 hours on the 12-hour tidal deformations of the Earth. At the bottom of the figure these vibrations have been split up into a series of peaks, each with a definite frequency, similar to the spectrum of light.

Such a spectrum indicates the relative amplitude of each harmonic present in the free oscillations. If the physical properties of the Earth’s interior were known, all these individual peaks could be calculated directly. Instead, the internal structure must be estimated from the observed peaks.

Recent research has shown that observations of long-period oscillations of the Earth discriminate fairly finely between different Earth models. In applying the observations to improve the resolution and precision of such representations of the planet’s internal structure, a considerable number of Earth models are set up, and all the periods of their free oscillations are computed and checked against the observations. Models can then be successively eliminated until only a small range remains. In practice, the work starts with existing models; efforts are made to amend them by sequential steps until full compatibility with the observations is achieved, within the uncertainties of the observations. Even so, the resulting computed Earth structure is not a unique solution to the problem.

Extraterrestrial Seismic Phenomena

Space vehicles have carried seismographs to the surfaces of the Moon and Mars, and seismologists on Earth have received telemetered signals of seismic events from both.

By 1969, seismographs had been placed at six sites on the Moon during the U.S. Apollo missions. Recording of seismic data ceased in September 1977. The instruments detected between 600 and 3,000 moonquakes during each year of their operation, though most of these seismic events were very small. The ground noise on the lunar surface is low compared with that of the Earth, so that the seismographs could be operated at very high magnifications. Because there was more than one station on the Moon, it was possible to use the arrival times of P and S waves at the lunar stations from the moonquakes to determine foci in the same way as is done on the Earth.

Moonquakes are of three types. First, there are the events caused by the impact of lunar modules, booster rockets, and meteorites. The lunar seismograph stations were able to detect meteorites hitting the Moon’s surface more than 1,000 km (600 miles) away. The two other types of moonquakes had natural sources in the Moon’s interior: they presumably resulted from rock fracturing, as on Earth. The most common type of natural moonquake had deep foci, at depths of 600 to 1,000 km; the less common variety had shallow focal depths.

Seismological research on Mars has been less successful. Only one of the seismometers carried to the Martian surface by the U.S. Viking landers during the mid-1970s remained operational, and only one potential marsquake was detected in 546 Martian days.

Historical Major Earthquakes

Major historical earthquakes are listed chronologically in the table ( below ).

 

Major Earthquake History

Year | Region / Area Affected | Mag.* | Intensity | Deaths ( approx. ) | Remarks
c. 1500 BCE | Knossos, Crete ( Greece ) | | X | | One of several events that leveled the capital of Minoan civilization; this quake accompanied the explosion of the nearby volcanic island of Thera.
27 BCE | Thebes ( Egypt ) | | | | This quake cracked one of the statues known as the Colossi of Memnon, and for almost two centuries the “singing Memnon” emitted musical tones on certain mornings as it was warmed by the Sun's rays.
62 CE | Pompeii and Herculaneum ( Italy ) | | X | | These two prosperous Roman cities had not yet recovered from the quake of 62 when they were buried by the eruption of Mount Vesuvius in 79.
115 | Antioch ( Antakya, Turkey ) | | XI | | A centre of Hellenistic and early Christian culture, Antioch suffered many devastating quakes; this one almost killed the visiting Roman emperor Trajan.
1556 | Shaanxi province ( China ) | | IX | 830,000 | Possibly the deadliest earthquake ever recorded.
1650 | Cuzco ( Peru ) | 8.1 | VIII | | Many of Cuzco's Baroque monuments date to the rebuilding of the city after this quake.
1692 | Port Royal ( Jamaica ) | | | 2,000 | Much of this British West Indies port, a notorious haven for buccaneers and slave traders, sank beneath the sea following the quake.
1693 | southeastern Sicily ( Italy ) | | XI | 93,000 | Syracuse, Catania, and Ragusa were almost completely destroyed but were rebuilt with a Baroque splendour that still attracts tourists.
1755 | Lisbon ( Portugal ) | | XI | 62,000 | The Lisbon earthquake of 1755 was felt as far away as Algiers and caused a tsunami that reached the Caribbean.
1780 | Tabriz ( Iran ) | 7.7 | | 200,000 | This ancient highland city was destroyed and rebuilt, as it had been in 791, 858, 1041, and 1721 and would be again in 1927.
1811–1812 | New Madrid, Missouri ( USA ) | 7.5–7.7 | XII | | A series of quakes at the New Madrid Fault caused few deaths, but the New Madrid earthquake of 1811–1812 rerouted portions of the Mississippi River and was felt from Canada to the Gulf of Mexico.
1812 | Caracas ( Venezuela ) | 9.6 | X | 26,000 | A provincial town in 1812, Caracas recovered and eventually became Venezuela's capital.
1835 | Concepción ( Chile ) | 8.5 | | 35 | British naturalist Charles Darwin, witnessing this quake, marveled at the power of the Earth to destroy cities and alter landscapes.
1886 | Charleston, South Carolina ( USA ) | | IX | 60 | This was one of the largest quakes ever to hit the eastern United States.
1895 | Ljubljana ( Slovenia ) | 6.1 | VIII | | Modern Ljubljana is said to have been born in the rebuilding after this quake.
1906 | San Francisco, California ( USA ) | 7.9 | XI | 700 | San Francisco still dates its modern development from the San Francisco earthquake of 1906 and the resulting fires.
1908 | Messina and Reggio di Calabria ( Italy ) | 7.5 | XII | 110,000 | These two cities on the Strait of Messina were almost completely destroyed in what is said to be Europe's worst earthquake ever.
1920 | Gansu province ( China ) | 8.5 | | 200,000 | Many of the deaths in this quake-prone province were caused by huge landslides.
1923 | Tokyo–Yokohama ( Japan ) | 7.9 | | 142,800 | Japan's capital and its principal port, located on soft alluvial ground, suffered severely from the Tokyo–Yokohama earthquake of 1923.
1931 | Hawke Bay ( New Zealand ) | 7.9 | | 256 | The bayside towns of Napier and Hastings were rebuilt in an Art Deco style that is now a great tourist attraction.
1935 | Quetta ( Pakistan ) | 7.5 | X | 20,000 | The capital of Balochistan province was severely damaged in the most destructive quake to hit South Asia in the 20th century.
1948 | Ashgabat ( Turkmenistan ) | 7.3 | X | 176,000 | Every year, Turkmenistan commemorates the utter destruction of its capital in this quake.
1950 | Assam ( India ) | 8.7 | X | 574 | The largest quake ever recorded in South Asia killed relatively few people in a lightly populated region along the Indo-Chinese border.
1960 | Valdivia and Puerto Montt ( Chile ) | 9.5 | XI | 5,700 | The Chile earthquake of 1960, the largest quake ever recorded in the world, produced a tsunami that crossed the Pacific Ocean to Japan, where it killed more than 100 people.
1963 | Skopje ( Macedonia ) | 6.9 | X | 1,070 | The capital of Macedonia had to be rebuilt almost completely following this quake.
1964 | Prince William Sound, Alaska ( USA ) | 9.2 | | 131 | Anchorage, Seward, and Valdez were damaged, but most deaths in the Alaska earthquake of 1964 were caused by tsunamis in Alaska and as far away as California.
1970 | Chimbote ( Peru ) | 7.9 | | 70,000 | Most of the damage and loss of life resulting from the Ancash earthquake of 1970 was caused by landslides and the collapse of poorly constructed buildings.
1972 | Managua ( Nicaragua ) | 6.2 | | 10,000 | The centre of the capital of Nicaragua was almost completely destroyed; the business section was later rebuilt some 6 miles ( 10 km ) away.
1976 | Guatemala City ( Guatemala ) | 7.5 | IX | 23,000 | Rebuilt following a series of devastating quakes in 1917–18, the capital of Guatemala again suffered great destruction.
1976 | Tangshan ( China ) | 7.5 | X | 242,000 | In the Tangshan earthquake of 1976, this industrial city was almost completely destroyed in the worst earthquake disaster in modern history.
1985 | Michoacán state and Mexico City ( Mexico ) | 8.1 | IX | 10,000 | The centre of Mexico City, built largely on the soft subsoil of an ancient lake, suffered great damage in the Mexico City earthquake of 1985.
1988 | Spitak and Gyumri ( Armenia ) | 6.8 | X | 25,000 | This quake destroyed nearly one-third of Armenia's industrial capacity.
1989 | Loma Prieta, California ( USA ) | 7.1 | IX | 62 | The San Francisco–Oakland earthquake of 1989, the first sizable movement of the San Andreas Fault since 1906, collapsed a section of the San Francisco–Oakland Bay Bridge.
1994 | Northridge, California ( USA ) | 6.8 | IX | 60 | Centred in the urbanized San Fernando Valley, the Northridge earthquake of 1994 collapsed freeways and some buildings, but damage was limited by earthquake-resistant construction.
1995 | Kobe ( Japan ) | 6.9 | XI | 5,502 | The Great Hanshin Earthquake destroyed or damaged 200,000 buildings and left 300,000 people homeless.
1999 | Izmit ( Turkey ) | 7.4 | X | 17,000 | The Izmit earthquake of 1999 heavily damaged the industrial city of Izmit and the naval base at Golcuk.
1999 | Nan-t'ou county ( Taiwan ) | 7.7 | X | 2,400 | The Taiwan earthquake of 1999, the worst to hit Taiwan since 1935, provided a wealth of digitized data for seismic and engineering studies.
2001 | Bhuj, Gujarat state ( India ) | 8.0 | X | 20,000 | The Bhuj earthquake of 2001, possibly the deadliest ever to hit India, was felt across India and Pakistan.
2003 | Bam ( Iran ) | 6.6 | IX | 26,000 | This ancient Silk Road fortress city, built mostly of mud brick, was almost completely destroyed.
2004 | Aceh province, Sumatra ( Indonesia ) | 9.1 | | 200,000 | The deaths resulting from this offshore quake actually were caused by a tsunami originating in the Indian Ocean that, in addition to killing more than 150,000 in Indonesia, killed people as far away as Sri Lanka and Somalia.
2005 | Azad Kashmir ( Pakistan-administered Kashmir ) | 7.6 | VIII | 80,000 | The Kashmir earthquake of 2005, perhaps the deadliest shock ever to strike South Asia, left hundreds of thousands of people exposed to the coming winter weather.
2008 | Sichuan province ( China ) | 7.9 | IX | 69,000 | The Sichuan earthquake of 2008 left over 5 million people homeless across the region, and over half of Beichuan city was destroyed by the initial seismic event and the release of water from a lake formed by nearby landslides.
2009 | L'Aquila ( Italy ) | 6.3 | VIII | 300 | The L'Aquila earthquake of 2009 left more than 60,000 people homeless and damaged many of the city's medieval buildings.
2010 | Port-au-Prince ( Haiti ) | 7.0 | IX | 316,000 | The Haiti earthquake of 2010 devastated the metropolitan area of Port-au-Prince and left an estimated 1.5 million survivors homeless.
2010 | Maule ( Chile ) | 8.8 | VIII | 521 | The Chile earthquake of 2010 produced widespread damage in Chile's central region and triggered tsunami warnings throughout the Pacific basin.
2010 | Christchurch ( New Zealand ) | 7.0 | VIII | 180 | Most of the devastation associated with the Christchurch earthquakes of 2010–11 resulted from a magnitude-6.3 aftershock that struck on February 22, 2011.
2011 | Honshu ( Japan ) | 9.0 | VIII | 20,000 | The powerful Japan earthquake and tsunami of 2011, which sent tsunami waves across the Pacific basin, caused widespread damage throughout eastern Honshu.
2011 | Erciş and Van ( Turkey ) | 7.2 | IX | | The Erciş–Van earthquake of 2011 destroyed several apartment complexes and shattered mud-brick homes throughout the region.

Data Sources: National Oceanic and Atmospheric Administration ( NOAA ), National Geophysical Data Center ( NGDC ), Significant Earthquake Database ( SED ), a searchable online database using the Catalog of Significant Earthquakes 2150 B.C. – 1991 A.D. ( with Addenda ), and U.S. Geological Survey ( USGS ), Earthquake Hazards Program.
* Measures of magnitude may differ from other sources.

Additional Reading

Earthquakes are covered mainly in books on seismology.

Recommended introductory texts are:

Bruce A. Bolt, Earthquakes, 4th ed. (1999), and Earthquakes and Geological Discovery (1993); and,

Jack Oliver, Shocks and Rocks: Seismology and the Plate Tectonics Revolution (1996).

Comprehensive books on key aspects of seismic hazards are:

Leon Reiter, Earthquake Hazard Analysis – Issues and Insights (1990); and,

Robert S. Yeats, Kerry Sieh, and Clarence R. Allen, The Geology of Earthquakes (1997).

A history of discriminating between underground nuclear explosions and natural earthquakes is given by:

Bruce A. Bolt, Nuclear Explosions and Earthquakes: The Parted Veil (1976).

More advanced texts that treat the theory of earthquake waves in detail are:

Agustín Udías, Principles of Seismology (1999);

Thorne Lay and Terry C. Wallace, Modern Global Seismology (1995);

Peter M. Shearer, Introduction to Seismology (1999); and,

K.E. Bullen and Bruce A. Bolt, An Introduction to the Theory of Seismology, 4th ed. (1985).



Citations


MLA Style: “earthquake.” Encyclopædia Britannica. Encyclopædia Britannica Online. Encyclopædia Britannica Inc., 2012. Web. 21 Mar. 2012. http://www.britannica.com/EBchecked/topic/176199/earthquake

Reference

http://www.britannica.com/EBchecked/topic/176199/earthquake/247989/Shallow-intermediate-and-deep-foci?anchor=ref105456

– – – –

Feeling ‘educated’? Think you’re out of the water on the subject of earthquakes and tsunamis?

The March 23, 2012 news, however, contradicts decades of professional scientific knowledge and studies; so, if you were just feeling ‘overly educated’ about earthquakes and tsunamis – don’t be. You’re now lost at sea, in the same proverbial ‘boat’, with all those global government scientific and technical ( S&T ) professionals who thought they understood previous information surrounding earthquakes and tsunamis.

After comparing the Japan 9.0 ‘earthquake directional arrows’ depicted on the charts ( further above ) with ocean currents, tidal charts and trade winds from the global jet stream, there is a problem that cannot be explained: on March 23, 2012 British Columbia, Canada reported that its northwest Pacific Ocean coastal sea waters held a 100-foot fishing boat ‘still afloat’ – more than 1-year after the tsunami from Japan’s 9.0 earthquake on March 11, 2011.

[ IMAGE ( above ): 11MAR11 Japan 9.0 earthquake tsunami victim fishing boat ( 50-metre ) found more than 1-year later still adrift in the Pacific Ocean – but thousands of miles away – off the North America Pacific Ocean west coastal territory of Haida Gwaii, British Columbia, Canada ( Click on image to enlarge ) ]

– – – –

Source: CBS News – British Columbia ( Canada )

Tsunami Linked Fishing Boat Adrift Off B.C.

Nobody Believed Aboard 50-Metre Vessel Swept Away In 2011 Japanese Disaster – CBC News

March 23, 2012 21:35 ( PST ) Updated from: 23MAR12 18:59 ( PST )

A Japanese fishing boat that was washed out to sea in the March 2011 Japanese tsunami has been located adrift off the coast of British Columbia ( B.C. ), according to the federal Transport Ministry.

The 50-metre vessel was spotted by the crew of an aircraft on routine patrol about 275 kilometres off Haida Gwaii, formerly known as the Queen Charlotte Islands, ministry spokeswoman Sau Sau Liu said Friday.

“Close visual aerial inspection and hails to the ship indicate there is no one on board,” Liu said. “The owner of the vessel has been contacted and made aware of its location.”

U.S. Senator Maria Cantwell, of Washington, said in a release that the boat was expected to drift slowly southeast.

“On its current trajectory and speed, the vessel would not [ yet ] make landfall for approximately 50-days,” Cantwell said. Cantwell did not specify where landfall was expected to be.

First large debris

The boat is the first large piece of debris found following the earthquake and tsunami that struck Japan one year ago.

Scientists at the University of Hawaii say a field of about 18 million tonnes of debris is slowly being carried by ocean currents toward North America. The field is estimated to be about 3,200 kilometres long and 1,600 kilometres wide.

Scientists have estimated some of the debris would hit B.C. shores by 2014.
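The figures quoted in the report above can be checked against each other with a back-of-envelope sketch ( the 275-kilometre distance and 50-day landfall estimate are taken directly from the article; a straight-line drift at constant speed is assumed ):

```python
# All input values come from the CBC report above; straight-line drift assumed.
distance_km = 275.0       # vessel's reported distance off Haida Gwaii when spotted
days_to_landfall = 50.0   # Sen. Cantwell's estimated drift time to landfall

speed_km_per_day = distance_km / days_to_landfall
print(f"Implied drift speed: {speed_km_per_day:.1f} km/day "
      f"(~{speed_km_per_day / 24.0:.2f} km/h)")

# Reported debris-field extent: roughly 3,200 km long by 1,600 km wide
field_area_km2 = 3_200 * 1_600
print(f"Debris-field area: ~{field_area_km2:,} square kilometres")
```

The implied drift of a few kilometres per day is consistent with slow surface-current transport, which also squares with the scientists’ estimate that smaller debris would not reach B.C. shores until 2014.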

Some people on the west coast of Vancouver Island believe smaller pieces of debris have already washed ashore there.

The March 11, 2011, tsunami was generated after a magnitude 9.0 earthquake struck off the coast of northern Japan. The huge waves and swells of the tsunami moved inland and then retreated back into the Pacific Ocean, carrying human beings, wreckage of buildings, cars and boats.

Nearly 19,000 people were killed.

Reference

http://www.cbc.ca/news/canada/british-columbia/story/2012/03/23/bc-fishing-boat-tsunami-debris.html?cmp=rss

– – – –

Submitted for review and commentary by,

Kentron Intellect Research Vault

E-MAIL: KentronIntellectResearchVault@Gmail.Com

WWW: http://KentronIntellectResearchVault.WordPress.Com

References

http://earthquake.usgs.gov/earthquakes/world/japan/031111_M9.0prelim_geodetic_slip.php
http://en.wikipedia.org/wiki/Moment_magnitude_scale
http://www.gsi.go.jp/cais/topic110315.2-index-e.html
http://www.seismolab.caltech.edu
http://www.tectonics.caltech.edu/slip_history/2011_taiheiyo-oki
http://supersites.earthobservations.org/ARIA_japan_co_postseismic.pdf
ftp://sideshow.jpl.nasa.gov/pub/usrs/ARIA/README.txt
http://speclib.jpl.nasa.gov/documents/jhu_desc
http://earthquake.usgs.gov/regional/pacnw/paleo/greateq/conf.php
http://www.passcal.nmt.edu/content/array-arrays-elusive-ets-cascadia-subduction-zone
http://wcda.pgc.nrcan.gc.ca:8080/wcda/tams_e.php
http://www.pnsn.org/tremor
http://earthquake.usgs.gov/earthquakes/recenteqscanv/Quakes/quakes_all.html
http://nthmp.tsunami.gov
http://wcatwc.arh.noaa.gov
http://www.pnsn.org/NEWS/PRESS_RELEASES/CAFE/CAFE_intro.html
http://www.pnsn.org/WEBICORDER/DEEPTREM/summer2009.html
http://earthquake.usgs.gov/prepare
http://www.passcal.nmt.edu/content/usarray
http://www.iris.washington.edu/hq
http://www.iris.edu/dms/dmc
http://www.iris.edu/dhi/clients.htm
http://www.iris.edu/hq/middle_america/docs/presentations/1026/MORENO.pdf
http://www.unavco.org/aboutus/history.html
http://earthquake.usgs.gov/monitoring/anss
http://earthquake.usgs.gov/regional/asl
http://earthquake.usgs.gov/regional/asl/data
http://www.usarray.org/files/docs/pubs/US_Data_Plan_Final-V7.pdf
http://neic.usgs.gov/neis/gis/station_comma_list.asc
http://earthquake.usgs.gov/research/physics/lab
http://earthquake.usgs.gov/regional/asl/data
http://pubs.usgs.gov/gip/dynamic/Pangaea.html
http://coaps.fsu.edu/scatterometry/meeting/docs/2009_august/intro/shimoda.pdf
http://coaps.fsu.edu/scatterometry/meeting/past.php
http://eqinfo.ucsd.edu/dbrecenteqs/anza
http://www.ceri.memphis.edu/seismic
http://conceptactivityresearchvault.wordpress.com/2011/03/28/global-environmental-intelligence-gei
http://www.af.mil/information/factsheets/factsheet_print.asp?fsID=157&page=1
https://login.afwa.af.mil/register/
https://login.afwa.af.mil/amserver/UI/Login?goto=https%3A%2F%2Fweather.afwa.af.mil%3A443%2FHOST_HOME%2FDNXM%2FWRF%2Findex.html
http://www.globalsecurity.org/space/library/news/2011/space-110225-afns01.htm
http://www.wrf-model.org/plots/wrfrealtime.php

Earth Events Forecast

Eyjafjallajokull Iceland volcano
[ PHOTO ( above ): click to enlarge image ]

Earth Events Forecast
by, Kentron Intellect Research Vault ( KIRV )

March 7, 2011 17:32:08 ( PST ) Updated ( February 27, 2011 )

LOS ANGELES – November 10, 2011 – The U.S. House of Representatives Committee on Science and its Sub-Committee on Environment, Technology and Standards held a hearing on October 30, 2003 about ‘Space Weather’ and the roles and responsibilities of various government agencies as to the collection, dissemination and use of ‘space weather’ data, hearing from the U.S. National Oceanic and Atmospheric Administration ( NOAA ), the U.S. National Aeronautics and Space Administration ( NASA ), the U.S. Air Force Weather Agency and representatives of government contractors in the private sector. Amongst the questions presented were the following:

– “What is the ‘proper level of funding for agencies’ involved with ‘space environmental predictions’?” and,

– “What is the importance of ‘predictions to industry and commerce’?”

Interestingly, the U.S. House of Representatives Sub-Committee hearing – from October 19, 2003 through November 7, 2003 – happened to fall during the same time period when the Sun’s strongest Coronal Mass Ejections ( CME ) and “Solar Flares” were erupting; stronger than at any time in 30-years.

Those late 2003 enormous solar flare outbursts consisted of Solar Energetic Particle ( SEP ) energy that triggered intense Solar Energetic Particle Events ( SEPE ) on Earth resulting in Geomagnetic Storms that ‘instantly brought serious disruptions’ to U.S. national infrastructure electricity grids.

Solar Energetic Particle ( SEP ) releases cause Geomagnetically Induced Currents ( GIC ) – also known as “Auroral Currents” – consisting of high-energy electrons ( ‘invisible’ except in Earth’s ‘upper atmospheric region’, where the “Aurora Borealis” or “Northern Lights” appear ). When such bombardment reaches Earth’s ‘lower atmospheric region’ it can seriously impact ‘national electricity infrastructure grids’, which become instantly overcharged ( overloaded ) during what is referred to as a Solar Energetic Particle Event ( SEPE ). During latter 2003 just such an emergency struck throughout northern Europe, where amongst many utility companies SYDKRAFT ( Sweden ) reported:

– Electricity Generation Supply Failure;

– Blackouts ( ‘no electricity’ and ‘no communications’ ); and,

– Transformer Explosions ( the ‘coils’ therein for ‘retaining electricity’ damaged by overloads ).

NASA issued a Space Flight Early Warning Alert, because:

– Heightened solar radiation levels caused NASA to issue a directive ordering International Space Station ( ISS ) astronauts to ‘enter their onboard solar radiation shelter’ within that Earth-orbiting spacecraft.

All passenger airlines worldwide were contacted and issued an unprecedented ( never before ) order to take ‘evasive maneuvers’ away from flying within, or being scheduled to fly within, both:

– Polar routes ( over Earth’s far northern and far southern latitudes ); and,

– Equatorial routes ( over Earth’s low latitudes ).

Aircraft on the former routes normally fly at high Earth latitudes; however, these flights were purposely redirected to avoid:

– Solar ‘radiation level exposure’ hazards for all passengers and crew members;

– Damage to aircraft from electronic system failure; and,

– Communication and navigation failures from satellite damage.

Costs for airlines ran up to $100,000 for each flight re-routed.

– Satellites, plus deep space missions, reported ‘numerous anomalies at all orbit levels’.

Later, the NASA GSFC Space Science Mission Operations Team estimated that a minimum of 59% of all ‘space science missions’ were impacted by the solar flares of late 2003.

– Casualties in space from the solar flares included, amongst many, the:

– ADEOS 2 spacecraft, which cost $640,000,000 to build, plus its onboard $150,000,000 NASA SeaWinds instrument – all lost to serious impact from these solar flares.

Total damage costs resulting from the late 2003 impact of this Solar Energetic Particle Event ( SEPE ) were ‘never publicly reported’.

– –

Solar flare outbursts, varying in intensity and variety, impact ( to varying degrees ) global business operations on Earth that are ‘vulnerable to serious space weather’ from high-level solar flare activity.

Solar Energetic Particle Events ( SEPE ), which ‘affect everyone on Earth’, should mobilize governments because of, how:

– Quickly ( from a ‘few hours’ to ‘4-days’ ) ‘space weather’ hits Earth; and,

– Instantly human life can be impacted, given society’s heavy operational reliance on everything from ‘Earth based electronics’ to ‘space based communication equipment’.
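The ‘few hours to 4-days’ arrival window quoted above matches the usual published range of coronal mass ejection speeds. A minimal sketch – assuming a constant speed over the mean Sun-Earth distance of one astronomical unit, an idealization of a real ejection’s variable trajectory – recovers those transit times:

```python
AU_KM = 149_600_000  # mean Sun-Earth distance, in kilometres

def transit_time_days(speed_km_s: float) -> float:
    # Straight-line Sun-to-Earth travel time at a constant speed
    return AU_KM / speed_km_s / 86_400  # 86,400 seconds per day

# Slow, average, and very fast ejections (typical published speed range)
for speed in (400, 1000, 3000):
    print(f"{speed:>5} km/s -> {transit_time_days(speed):.1f} days")
```

A slow ejection at around 400 km/s takes roughly four days to arrive, while the fastest recorded events at about 3,000 km/s arrive in well under a day – which is exactly the warning window the text describes.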

Since at least 2001, ‘other superpower country governments’ ( e.g. China, Russia, France, Japan, etc. ) have been providing ‘public service early warnings’ for serious ‘space weather’ in order to ‘preserve the human life of citizens’.

By 2004, tragically, the United States of America ( USA ) National Research Council ( NRC ) Committee on Solar and Space Physics ( CSSP ) had only ‘begun to consider’ – but had not yet actually considered – any need for ‘human impact assessments’ or ‘economic impact’ studies surrounding serious ‘solar flare’ impacts, an immediate ‘key critical need’ given everyone’s reliance on ‘space weather information news’ that ought to reign supreme over all publicly broadcast ( television and radio ) news stories; ‘space weather assessments’ have been in ‘existence for decades’ – and at ‘considerable taxpayer cost’ through NASA and NOAA ‘programs’ and ‘projects’.

The United States of America government could easily provide ‘widely broadcasted public service educational announcements’ ( in multiple languages ) on a “Space Weather Channel”, providing:

1. Space Weather Education;

2. Space Weather Preparedness; and,

3. Space Weather Alert Warnings.

To-date ( 26FEB11 ), the U.S. has ‘exhibited absolutely no regard for its citizens’ surrounding ‘space weather’ importance, with no:

A. Lessons on ‘serious space weather’;

B. Planning for ‘serious space weather’; and,

C. Announcements for ‘serious space weather’.

Think-tanks, under government contracts, have studied and provided information to government on how to order ‘national highway safety bridge improvements’ and thereby raise taxes – and by many other means, for a variety of other reasons too.

Alternatively, citizens could receive an ‘immediate extension on surviving life’ upon experiencing a “National Serious Space Weather Event” through a ‘government public service television channel broadcast’ ( 24-hours a day, 7-days a week, 365-days a year ) helping ‘all citizens’ ( elders & youth ) deal more effectively – by ‘single-handedly managing their own affairs’ ( as best they can ) – during the onslaught of a Solar Energetic Particle Event ( SEPE ), by:

1. Learning.

a. Solar Flare High-Energy Particle Hazards;

b. Earth Magnetic Core Trigger Mechanisms;

c. Earth Axis Plate Shifts ( Earthquakes & Tsunamis ); and,

d. Volcanic ( Eruptions & Thermal Atmospheric Lightning Storms ).

2. Understanding.

a. Natural Disasters ( Type Dependent ); and,

b. Planning ( Geography, Elevations, Weather ).

3. Planning.

a. Survival Methods ( rudimentary basics – adults & children ).

Learning – kept real simple – is important because ‘common sense’ sometimes fails during an actual emergency; ‘emergencies are not routine matters’ and are rarely rehearsed by most, except by well-trained ‘emergency first responders’ – who are unfortunately ‘not trained on what to do during all types of disasters’. This is why it is so ‘very important for everyone to know what to do’, and ‘when’, during ‘each type of disaster’.

Many do ‘not exercise proper judgment’ during an emergency, and that ‘knowledge failure’ creates ‘life threatening problems’ for ‘them’ and ‘their loved ones’, because most ‘do not know how to effectively deal with each type of disaster’ since such disasters are rarely experienced.

There are some basic ‘emergency common sense lessons’, easily learned, that quickly become second-nature if ‘at least taught’ using a ‘proper presentation medium’ so people ‘can know precisely what to do’; disasters are ‘type dependent’, and ‘no disaster preparedness is the same’ – it varies according to type.

Fatal Mistakes During Disasters

Citizens’ ‘disaster emergency preparedness’ seriously saves lives, but because of a ‘lack of understanding’ there continues to be a significant number of serious mistakes made by many who ‘believe they are exercising logical steps at-first to save their life’ – until they die from having made that mistake – because they have ‘not been taught certain ramifications of type-dependent disasters’.

For serious space weather, a prime example is a Solar Energetic Particle Event ( SEPE ) occurring during winter cold weather: electricity becomes completely shut-off and people bring their ‘barbeque pits indoors’ where they can ‘create wood fires’. The end result is that ‘all occupants of the dwelling die overnight’ because ‘constant wood-fire burning depletes oxygen levels that become overpowered by carbon monoxide levels’.

Somewhat similar are those who retreat to their vehicle – inside a closed garage – where the engine is started to run the heater to stay warm, while a steady stream of ‘vehicle exhaust fumes slowly enters the interior of the car’ – unnoticed by the vehicle occupants, who are put to sleep and die from being slowly asphyxiated.

The U.S. government, for years, has been providing a very expensive ( to taxpayers ) NASA television channel broadcast ( 24-hours a day, 7-days a week, year round ) that continues ‘uninterrupted’ – whether there is a NASA spacecraft in space or not ( sometimes with a ‘blackened screen only filled with trajectory lines for hours on-end’ ) – which ‘does absolutely nothing for the preservation of human life of any taxpaying American citizen’; yet American taxpayers continue paying for this ‘government wasteful spending’ nonsense.

Why No Government Lessons Teach The Public Anything

Global panic is the only reason ‘why’ most were ‘not warned in-advance’, and as the following ( below ) points out, there is ‘absolutely no time left for people to wait for their government warning’ because ‘no warning will ever be issued’ – people are simply going to live-out their lives until the end comes soon.

No longer a ‘doomsday prediction’, the following information ( below ) is based on ‘official information compiled’, clearly demonstrating ‘why’ it has now become logical for everyone to ‘self-mandate learning how to self-prepare’ for a ‘serious space weather disaster’ that will soon affect the very ground stood upon today – a disaster ‘secretly predicted for key citizens’ as far back as at least 2004.

The Government Is Always Correct, Right? Wrong!

Had anyone from the public been tasked to closely monitor not only mainstream news media broadcasts of certain events, but also to obtain background studies from ‘official U.S. government data’ supplied by both NASA and the NOAA Space Weather Prediction Center ( drawing data from several spacecraft ), taken college studies in ‘plate tectonics’, and performed significant comparative analysis of ‘ultra-deep sea arc volcanic explosive phreatic magmatic activity’, the ‘government secret’ would have been revealed a lot sooner – after ‘ascertaining it all in a final understanding’.

What began on February 8, 2011 with the U.S. Department of Homeland Security elevating its Threat Level to “YELLOW” for a “Significant National Weather Alert” – issued by the U.S. Federal Emergency Management Agency ( FEMA ) based on NASA and NOAA information – left only a few to wonder ‘why’ many U.S. government agencies were all climbing into the same information bed at one time. Why are they all so nervous? Might they have somehow forgotten to tell the ‘little people’ ( public ) some ‘new information’ that is now ‘old news’ to all but a few in government?

Recent facts show Earth ‘true north’ has ‘quickly shifted’ – dramatically, and in the ‘opposite direction’ of what government reported as its “earlier prediction”, which was recently ‘officially admitted’ as having been incorrect.

Why has such information been recently released by mainstream news broadcast reports?

Airline pilots noticed their aircraft navigation compass headings required changes to match airport runway heading indicator signs, because the two no longer agreed.

The anomaly instantly saw airline pilots and navigators worldwide talking amongst themselves about their ‘compass headings no longer pointing “True North”’, now requiring navigation corrections on aircraft onboard computers and even airport runways, as reported in February 2011.

Many do not realize that if an airplane or ship ‘navigation is incorrect by only a few degrees’, entire land masses will be missed and fuel will run-out, leaving passengers either stranded or dead from aircraft running out-of fuel in mid-air.

Recently, it was discovered that the entire Earth axis has ‘definitely shifted’ and ‘much quicker’ than expected.

What was ‘unexpected’ was that this Earth axis shift was ‘totally in the opposite direction’ of what ‘government standards of measurement’ had provided as ‘scientific data accuracy outcomes’.

That was only a few ‘small parts’ of what has gone seriously wrong about a ‘much greater secret’ never leaked to the press.

What came next, in February 2011, was yet another secret leaked just last week: the Earth axis not only shifted a total of 4-degrees, but it did-so due ‘East’ – not ‘West’, as the government had again predicted.

Unless the government ‘did not make any mistakes in what it predicted earlier to the public’, could it have secretly wanted to see people worldwide taken by surprise? If not, then ‘how could the government make so many serious scientific mistakes’?

Pole Shift Reported By Mainstream News Broadcasts

In February 2011, when ‘airports began changing runway compass number headings’ – displayed on ‘extremely large portrait-size signs’ standing upright at the ends of runways – news of such was leaked to mainstream press outlets, and broadcast television news reports announced a “sudden pole shift” in the Earth.

Science Fiction “Deep Space 9″ Strange ISS Payloads & Movements

Since at least 2004, government secrets have surrounded how ‘space weather’ ( extremely serious solar flare activity ) is very capable of ‘triggering magnetic mechanisms’ deep within the magnetic core of the Earth. The effects extend upward to the surface through something most are unaware of: ‘ultra-deep sea volcanoes’ that erupt so incredibly explosively ( volcanic explosive phreatic magmatics, or phreatomagmatic eruptions ) that Earth’s crust plate tectonics begin moving as a result of ‘ultra-deep magmatic and magnetic force increases from increased high-energy particles received and stored in Earth’s core’ – something that ‘geoscientific astrophysicists’ know plenty about.

This is believed to be the ‘secret reason’ behind ‘why’, since 2004, the following ( below ) has been conducted, and precisely ‘why’ the United States of America ( USA ), Russia ( CIS ), Japan and France government timelines have been ‘secretly speeding’ “unmanned space freighters” ( see NASA news below ) with ‘continual non-stop secret payload amounts’ from Earth up to the International Space Station ( ISS ), which is experiencing ‘significant increases in its fuel payloads’ – payloads NASA says are “designed to boost the International Space Station ( ISS ) further into orbit” should it become necessary for the ISS to be moved further away from the Earth.

Questions have recently begun to surface, which see ‘no answers being provided to the public by any government agency service’:

– “Why is the International Space Station ( ISS ) being considered ‘moved further away from Earth’?”

– “Could there be a government secret that brought about a plan to send the International Space Station ( ISS ) ‘farther away from Earth into space’ so the NASA “Robonaut 2” ( also known as ) “R2” ‘may eventually be the only thing performing any physical work aboard’, but that may only be to ‘maintain an earlier recorded Earth event’ that to-date has not yet occurred?”

– “Is the ISS secretly expected to ‘hitch a galactic ride’ due to a ‘gravity well’ created by another interstellar body ( asteroid, comet, planet, or dwarf star ) entering our Solar System planetary plane soon?”

– “Is the ISS being ‘secretly missioned to record data of a cataclysmic Earth Extinction Level Event ( ELE )’ so, in the future, ‘other space goers will know that we once existed and had progressed as far as we did in technology and outer space use’? If so, does a ‘limited few on Earth’ – knowing this – believe such ‘interstellar historical information’ may be valued by others coming upon it later? If so, then what is about to happen on Earth that people are not being told?”

Geoscience and astrophysics data facts, once researched and extrapolated for comparative analysis, point directly to an Earth Extinction Level Event ( ELE ) coming soon – and that data couples with known facts that ‘certain superpower governments’ are building deep underground military-protected bases.

Not Science Fiction But Geoscience Astrophysics Reality

Before February 16, 2011 an ‘earlier secret payload’ of ‘extreme long-range optical sensors and data recorders’ was delivered to the International Space Station ( ISS ) to ‘record from space only’ an ‘extremely powerful global seismic shock’ ( earthquake ) – overall Earth plate tectonic movement the likes of which has never been recorded during the history of man, and which ‘Earth seismic detectors will be unable to fully and accurately record’.

On February 17, 2011 the aforementioned monitoring equipment – optical imaging long-range sensing detectors and data recording equipment rendering the International Space Station ( ISS ) a space-based Planet Monitoring System ( PMS ) focused on Earth ground events coming – was finally completed by two ( 2 ) Russia cosmonauts positioning the space-based external sensors ( SES ) outside the International Space Station ( ISS ) to record a major Solar Energetic Particle Event ( SEPE ) impact triggering effects in yet other Earth mechanisms, beginning with:

An Earth axis ‘pole shift’ as ‘high-energy particles invisibly penetrate the magnetic core of the Earth’ – much of which danger ‘primarily enters en-masse through planet Earth’s now-open Magnetosphere’, no longer protecting the entire circumference of the Earth due to a hole in it – followed by explosive volcanic phreatomagmatic eruptions forcing great tectonic movement with resultant ‘major earthquakes’, ‘thermal lightning storms’, and great waterway upheavals of lakes, major rivers and oceanic tsunamis; an event that – for the surface of the Earth – will likely result in what is known as an Extinction Level Event ( ELE ).

Known well in-advance by only a few, a growing number of so-called ‘conspiracy nuts’ and ‘doomsday prophets’ have been surmising for decades that ‘some unknown but catastrophic event or chain of events is going to occur on Earth’.

Now, recent information – contained in this report – could explain the primary reason behind why so many government protected ‘ultra-deep underground facilities’ are being ‘stockpiled with only the purest of natural agricultural produce seeds’ ( recently obtained from a few Native American Indian tribes ), and why ‘ultra-deep subterranean bases have alternative suboceanic tunnel system entry and exit access points as well’.

Deep underground ( subterranean ) ‘bases’ have been photographed and even ‘recently videotaped on a high-quality cellphone camera’ used by a ‘truck driver making secret deliveries’ deep down inside one ( 1 ) facility carved into gigantic cavern warehouse-size rooms with blast-proof doors and extremely huge tunnel access protected by armed government military soldiers. Hence, the nickname “DUMB” ( “Deep Underground Military Base” ).

Deep underground ( subterranean ) ‘tunnels’ have also been recently filmed with still-image cameras providing amazingly crystal-clear color photographs showing tunnels ‘carved out-of solid rock’ by huge equipment believed propelled by what are known as “Terron Drives” that ‘cut-through and liquefy rock cuts simultaneously’, providing a seamless seal said to be totally impenetrable by anything known to man – including multiple back-to-back ‘bunker buster’ directed nuclear bomb strikes.

On February 24, 2011 the NASA space shuttle Discovery left Earth with the first ( 1st ) – and probably last for a long while – ‘humanoid robot’, nicknamed “Robonaut” ( also known as “R2” ), that will be placed under a ‘top secret assignment’ aboard the International Space Station ( ISS ).

It is believed the purpose of the “Robonaut” ‘secret assignment’ will be to ‘eventually takeover ISS space-based data recording’ of ‘multiple Earth events unfolding’, once ‘extreme solar flare high-energy particle irradiation exposure’ after the Earth “occurrence” eventually renders it ‘prohibitive for human life ever to be stationed aboard the International Space Station ( ISS )’.

The rapid increases in ISS payload, recent types of payloads brought onboard the ISS, and ISS projected movement is ‘absolutely chilling’.

Official documented government reports, coupled with ‘genuine news facts reported’ from mainstream media television broadcasts, leave the information analysis to be ascertained by the public.

Here are only ‘a very few’ but ‘key critical officially recorded’ recent ‘facts’ – though ‘not all’ of what is additionally contained within the Concept Activity Research Vault database. ( see below )

– –

Source ( secondary ): MSNBC.Com – Space

[ PHOTO ( above ): The European Space Agency ( ESA ) ATV-2 “Johannes Kepler” docks at the International Space Station ( ISS ) on February 24, 2011 after an 8-day flight. ]

Cargo Ship Gets To Space Station Ahead Of Shuttle Discovery

Source ( primary ): Space.Com

Robotic Spacecraft Delivers 7 Tons Of Supplies For Crew by, Tariq Malik [ Space.Com Managing Editor ]

February 24, 2011 18:10:33 Thursday ( Updated: 2 hours 21 minutes ago )

A huge European cargo ship linked up with the International Space Station Thursday, delivering tons of supplies for the outpost’s crew just ‘hours before’ NASA’s planned launch of the shuttle Discovery [ STS-133 ].

The robotic spacecraft, known as the Automated Transfer Vehicle 2 ( ATV-2 ) and the ‘size of a double-decker bus’, docked with the space station [ ISS ] [ on February 24, 2011 ] at 10:59 a.m. EST ( 15:59 GMT ) as the two [ 2 ] vehicles [ ATV-2 & ISS ] soared high over the Atlantic Ocean. The hefty space freighter is packed with 7 tons [ about 14,000 pounds ] of supplies for the station’s [ ISS ] 6 person crew.

“Contact confirmed, capture confirmed,” Russian cosmonauts on the station [ ISS ] radioed mission control [ <?> NASA / ESA <?> ] after the successful docking.

The ATV-2, named the ” Johannes Kepler ” [ http://www.space.com/10884-huge-european-spaceship-launches.html ], is the second [ 2nd ] ‘robotic cargo spacecraft’ built by the European Space Agency [ ESA ] to ferry supplies to the International Space Station [ ISS ]. It launched atop a [ French ] Ariane 5 rocket on February 16, 2011 – beating the NASA space shuttle [ Discovery STS-133 ] to orbit.

The cargo ship [ ATV-2 ] hooked up with the space station [ ISS ] ‘less than 6-hours before’ the planned launch of the Discovery [ STS-133 ] space shuttle [ http://www.space.com/10937-space-shuttle-discovery-final-launch-preview.html ] toward the orbiting laboratory [ ISS ] – clearing the way for the orbiter’s [ Discovery space shuttle ] flight, NASA officials said.

NASA had already fueled Discovery [ space shuttle ] – for a 4:50 p.m. EST ( 21:50 GMT ) launch from the Kennedy Space Center in Cape Canaveral, Florida – at the time of the docking in space.
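The ‘less than 6-hours’ framing is easy to verify from the two clock times the article quotes ( a quick sketch; the date arithmetic is ours, not the article’s ):

```python
# Gap between the ATV-2 docking (10:59 a.m. EST) and Discovery's
# scheduled liftoff (4:50 p.m. EST) on February 24, 2011, using
# only the clock times quoted in the article.
from datetime import datetime

docking = datetime(2011, 2, 24, 10, 59)   # 10:59 a.m. EST
liftoff = datetime(2011, 2, 24, 16, 50)   # 4:50 p.m. EST

gap = liftoff - docking
hours, minutes = divmod(gap.seconds // 60, 60)
print(f"{hours} h {minutes} min")  # 5 h 51 min -- just under 6 hours
```

A 5-hour-51-minute gap is consistent with the “less than 6-hours before” wording above.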

The six [ 6 ] astronauts – set to launch on Discovery – were apparently keeping track of the ATV-2 docking even as they geared up for their own flight [ Discovery ].

“ATV docking complete! ISS just got bigger,” Discovery astronaut Nicole Stott wrote in a Twitter post. “Congrats to ESA and all the station partners around the world!”

‘If’ Discovery ‘launches on time’, the shuttle will arrive at the International Space Station [ ISS ] on Saturday [ 26FEB11 ] – and will be the final flight of space shuttle Discovery [ http://www.space.com/10938-space-shuttle-discovery-legacy-numbers.html ] before the orbiter is retired later this year [ 2011 ].

When NASA’s shuttle Discovery docks at the station, spacecraft and robotic arms from all five of the major international space agencies building the $100 billion space station will be at the orbiting laboratory at the same time.

NASA and its Russian Federal Space Agency ( FSA ) partners are discussing the possibility of staging a photo session to capture the space station scene.

That idea would send station [ ISS ] crew members ‘around the orbiting lab’ [ for ISS external visual inspection photos ] on a flight in a Russian Soyuz spacecraft, from which they would snap ‘photographs’.

A final decision, on whether to go ahead with the photo opportunity, is expected during Discovery’s mission, NASA officials have said.

Europe’s Space Freighters –

Like their name suggests, Europe’s Automated Transfer Vehicles [ ATV ] [ http://www.space.com/10859-heaviest-atv-orbit.html ] are ‘unmanned spacecraft’ designed to ‘fly themselves to a docking port’ on the aft [ rear ] end of the space station [ ISS ].

After months parked at the station [ ISS ], they undock and are then ‘intentionally destroyed’ – burned up in Earth’s atmosphere.

ATV spacecraft look like large cylindrical objects with four [ 4 ] solar arrays, giving them a ‘dragonfly’-like appearance; they are about 35-feet long and 14.7-feet wide.

The first [ 1st ] vehicle of the fleet [ Automated Transfer Vehicles – space freighters ] was the ATV-1, [ known as the ] ” Jules Verne, ” which launched to the space station [ ISS ] in March 2008 and concluded its mission in September of that year [ 2008 ].

ESA [ European Space Agency ] officials have said each ATV [ Automated Transfer Vehicle – space freighter ] mission costs about $600,000,000 ( $600 million ), and the ‘entire fleet’ cost about $2,000,000,000 ( $2 billion ) just to develop, according to Spaceflight Now.

The cargo, packed inside the ATV-2 Johannes Kepler, includes:

– 3,527 pounds of ‘equipment’ and ‘other dry supplies’;

– 2,200 pounds of ‘rocket propellant’ ( for the station [ ISS ] thrusters );

– 220 pounds of ‘oxygen’ ( for the station [ ISS ] crew ); and,

– 10,700 pounds of propellant stored aboard the ATV-2 craft itself ( to ‘help the space station [ ISS ] boost orbit’ from time-to-time ), NASA officials said today 24FEB11.
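As a back-of-the-envelope check ( the totaling and the pound-to-kilogram conversion are ours, not the article’s ), the itemized manifest can be summed and converted to metric tonnes, which lands near the headline ‘7 tons’ figure:

```python
# Back-of-the-envelope total of the ATV-2 cargo manifest quoted
# above (pound figures are from the article; the conversion factor
# is the standard pound-to-kilogram definition).
LB_TO_KG = 0.45359237

manifest_lb = {
    "dry supplies / equipment": 3_527,
    "rocket propellant (ISS thrusters)": 2_200,
    "oxygen (crew)": 220,
    "ATV-2 onboard propellant (orbit reboost)": 10_700,
}

total_lb = sum(manifest_lb.values())
total_tonnes = total_lb * LB_TO_KG / 1000  # metric tonnes

print(f"total: {total_lb:,} lb = {total_tonnes:.1f} tonnes")
```

The itemized lines sum to 16,647 pounds – more than the 14,000-pound figure quoted earlier – but about 7.6 metric tonnes, which suggests the headline “7 tons” counts the full manifest in metric tonnes rather than short tons of dry cargo.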

January 2011 Other Deliveries To ISS –

The ATV-2 Johannes Kepler is expected to spend about 4-months docked at the International Space Station.

The ATV-2 followed the earlier arrival of two [ 2 ] ‘other unmanned cargo ships’ [ space freighters ] sent to the space station [ ISS ] last month [ January 2011 ]:

– H-2 Transfer Vehicle ( Japan ); and,

– Progress 41 ( Russia ).

After the NASA space shuttle fleet retires this year [ 2011 ], the United States plans to rely on spacecraft from:

Russia; Europe; and, Japan.

These will be the United States’ only means to ‘ferry crews and cargo’ to the International Space Station [ ISS ] until American-built commercial [ private-sector ] space vehicles become available.

After Discovery [ space shuttle ] launches today [ 24FEB11 ], NASA plans to fly two [ 2 ] more shuttle missions [ Endeavour STS-134 ( 14APR11 ) & Atlantis STS-135 ( 28JUN11 ); the latter returning a ‘failed ammonia pump module’ ] – and ‘then retire the fleet’ [ space shuttles ] for good after [ their ] 30-years of service.

SPACE.com Managing Editor, Tariq Malik, on: Twitter@tariqjmalik

Staff writer Denise Chow ( @denisechow ) is providing mission coverage of Discovery’s final space voyage from Cape Canaveral, Fla.

– Gallery – Building the International Space Station [ http://www.space.com/50-building-international-space-station.html ]; and,

– Video – Europe’s Heaviest Spaceship: ATV Johannes Kepler [ http://www.space.com/10859-heaviest-atv-orbit.html ];

Reference

http://www.msnbc.msn.com/id/41761320/ns/technology_and_science-space/?GT1=43001

– –

Source: MSNBC.Com

TV Video Clip

Cosmonauts Install Earthquake Predictor On ISS by, Al Stirrett ( NBC News reporter )

February 24, 2011

On February 16, 2011 two ( 2 ) Russian cosmonauts – aboard the International Space Station ( ISS ) – ‘began installing the following outside the ISS’:

– ‘Earthquake prediction and seismic forecast’ monitoring via ‘experimental optic video camera sensor’ ( the MSNBC.Com TV broadcast pronounced the monitoring program in ‘Russian-language phonetics’ as “Radio Metria,” which in English translates to “radiometry” ); and,

– ‘Terrestrial lightning storms’ monitoring via ‘experimental optical radiation sensors’ ( multiple ).

On February 23, 2011 both installations were completed ‘outside the ISS’, which began providing both ‘long-range optical image data streams’ and ‘long-range scientific sensing measurements’.

These data monitoring feeds are being received ‘in’ Moscow, Russia – according to the MSNBC.Com TV video report ( 24FEB11 ).

Reference

http://www.msnbc.msn.com/id/21134540/vp/41626743#41626743

Image: http://msnbcmedia1.msn.com/j/MSNBC/Components/Video/110216/a_ast_space_110216.thumb-m.jpg


– –

Source: MSNBC.COM & Associated Press ( AP )

Space

February 24, 2011 13:58 PM ( PST ) – Updated 42-minutes ago

Space Shuttle Discovery Blasts Off On Its Final Flight by, MSNBC.Com Staff and Associated Press news service reports

Problem with ‘range safety’ [ Discovery space shuttle ] adds ‘last minute drama’ to space station mission

CAPE CANAVERAL, Florida — After a last-minute glitch, the space shuttle Discovery lifted off Thursday on its final voyage, heading for the International Space Station with 6 astronauts and 1 humanoid robot.

The countdown proceeded smoothly, until there were just minutes left before liftoff.

Range safety officers reported that their ‘command system display was not working’, which could have forced a halt to the count.

“Calm down,” shuttle launch director Mike Leinbach told his colleagues.

He asked for some extra time to get the problem fixed — and received the go-ahead.

The countdown had to be held at T-minus-5 minutes, but the range safety system was fixed with just seconds left to make Thursday’s launch window.

It was an extra bit of drama for a mission that was initially scheduled for launch last November [ 2010 ]. At that time, ‘fueling problems’ held up the launch, forcing months of repairs.

No such problems arose this time.

“Enjoy the ride,” mission control told Discovery commander Steven Lindsey as the clock wound down toward zero. “For those watching, get ready to witness the majesty and power of Discovery as she lifts off one last time,” Lindsey replied.

Launch came at 4:53 p.m. ET, 3-minutes later than scheduled.

“The final liftoff of Discovery,” launch commentator Mike Curie said, “a tribute to the dedication, hard work and pride of America’s space shuttle team.”

The shuttle is heading to the International Space Station [ ISS ] with a load of supplies, a new orbital closet, and a humanoid robot.

Docking is set for Saturday [ 26FEB11 ].

World’s Most Traveled Spaceship

Discovery is NASA’s most traveled space shuttle, putting in nearly three decades [ 30-years ] of service.

Now it’s slated to become the first [ 1st ] of NASA’s three [ 3 ] remaining shuttles to retire.

This is the 39th flight for Discovery, which has logged 143,000,000 miles ( 230,000,000 kilometers ) since its first [ 1st ] mission in 1984.

After retirement, the orbiter is expected to go on display at the Smithsonian Institution Udvar-Hazy Center, an annex of the National Air and Space Museum.

NASA estimated that 40,000 guests were on hand for Discovery’s farewell launch, including a small contingent from Congress and Florida’s new governor, Rick Scott.

Watching with special interest, from Mission Control in Houston, was astronaut Timothy Kopra, who was supposed to be the flight’s lead spacewalker. He was hurt in a bicycle crash last month and was replaced by Stephen Bowen, who will become the first [ 1st ] astronaut to fly ‘back-to-back shuttle missions’.

In addition to Lindsey and Bowen, Discovery’s crew includes pilot Eric Boe, spacewalker Benjamin Alvin Drew Jr. and mission specialists Michael Barratt and Nicole Stott. All 6 astronauts are veterans.

Stott and Barratt served long-term stints on the space station in 2009.

Onlookers Flock To Watch Launch

Roads leading to the launch site were jammed with cars parked two and three deep; recreational vehicles snagged prime viewing spots along the Banana River well before dawn. Businesses and governments joined in, their signs offering words of encouragement. “The heavens await Discovery,” a Cocoa Beach church proclaimed. Groceries stocked up on extra red, white and blue cakes with shuttle pictures. Stores ran out of camera batteries.

The launch team also got into the act. A competition was held to craft the departing salutation from Launch Control; Kennedy Space Center’s public affairs office normally comes up with the parting line. Souvenir photos of Discovery were set aside for controllers in the firing room. Many posed for group shots.

Leinbach noted that it would be “tough” to see Discovery soar one last time. “What will be most difficult will be on landing day when we know that that’s the end of her mission completely,” he said.

Discovery will spend 11-days in orbit – on top of the 352-days it’s already spent circling the planet – and will rack up another 4,500,000 miles ( 7,200,000 kilometers ).
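Those mileage figures can be roughly cross-checked against the typical low-Earth-orbit speed of about 17,500 mph ( an assumed round number, not stated in the article ):

```python
# Rough cross-check of Discovery's quoted mileage, assuming a
# constant ~17,500 mph low-Earth-orbit speed (a standard round
# figure, not taken from the article).
ORBITAL_SPEED_MPH = 17_500

def miles_in_orbit(days: float) -> float:
    """Distance covered after `days` in orbit at constant speed."""
    return days * 24 * ORBITAL_SPEED_MPH

# 11-day final mission vs. the article's roughly 4.5 million miles
print(f"{miles_in_orbit(11):,.0f} miles")

# 352 cumulative days vs. the article's roughly 143 million miles
print(f"{miles_in_orbit(352):,.0f} miles")
```

Eleven days works out to about 4.6 million miles and 352 days to about 148 million miles, both within a few percent of the figures quoted in the article ( actual mileage varies with orbital altitude ).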

Its achievements include delivering the Hubble Space Telescope to orbit, carrying the first Russian cosmonaut to launch on a U.S. spaceship, returning Mercury astronaut John Glenn to orbit, and bringing shuttle flights back to life after the Challenger [ space shuttle ] and Columbia [ space shuttle ] accidents.

“She’s been an amazing machine,” Leinbach said Wednesday. “She’s done everything we’ve asked of her.”

Discovery’s crew will deliver and install a closet-like compartment full of space station supplies. The Italian-built module was named Leonardo, after the Italian artist and inventor Leonardo da Vinci.

Packed inside the compartment is “Robonaut 2” or R2, set to become the ‘first humanoid robot in space’. The experimental machine – looking human from the waist up – ‘will remain boxed until after Discovery departs’.

Up at the space station [ ISS ], meanwhile, the 6 person crew welcomed a European cargo ship [ yet another “space freighter” ] that was ‘launched last week’ from French Guiana. The robotic [ ‘unmanned’ ] spaceship [ “space freighter” ] docked successfully, ‘just 6-hours before’ Discovery lift-off, keeping the shuttle countdown on-track.

“Busy day in space,” [ ISS ] station commander Scott Kelly noted in a Twitter tweet.

NASA is under a [ U.S. ] Presidential Directive to retire the shuttle fleet this [ 2011 ] summer, letting private companies [ private sector ] take over trips [ in space ] to orbit, and focus on ‘getting astronauts to asteroids and Mars’.

There has been ‘considerable disagreement’ among lawmakers and the space community on ‘how best to accomplish this’.

“Godspeed Discovery,” retired space shuttle program manager Wayne Hale said in a Twitter update Thursday. “Prayers for a safe flight and wisdom for decision makers.”

Reference

http://www.msnbc.msn.com/id/41757735/ns/technology_and_science-space/

– –

The U.S. Geological Survey ( USGS ) provides a relatively recent ( 2004 ) report indicating that “strong solar storm activity” ( also known as Coronal Mass Ejection flares, or the “solar wind” from the Sun ) brings geomagnetic storms to Earth, where interaction with the magnetosphere ( ionosphere ) produces resonance cavity field fluctuations [ Berdichevsky and Dmitriev ( 2002 ) ] of apparent magnetic resistivity. These fluctuations have had little ‘long-term monitoring’, which was only done ‘passively in the past’ by [ Eisel and Egbert ( 2002 ) ] and [ Zhao and Qian ( 1992 ) ]. This report studies strong geomagnetic storms bringing “strong solar particle radiation” and – with it – strong “magnetic effects” on Earth tectonic plate movement ( earthquake ) reactions.

In this relatively recent USGS report, the only variations anywhere near as large as those reported in the paper of Fraser-Smith et al. ( 1990 ) ‘occurred during times of enhanced solar activity’. It is important to note this because there now exists a ‘huge hole’ in the center of Earth’s magnetosphere ( ionosphere ) where there will be ‘absolutely no protection against other consequential reactions’ when Earth is hit by a significant solar flare’s high-energy ( energetic ) ‘electron’ particles. These particles will enter through the hole and eventually be absorbed into Earth’s ‘liquid molten magnetic core, with magma flowing upward into volcanoes’ that erupt explosively outward, pushing Earth crust plates up and on top of one another.

If a significant solar flare, like the one recently predicted by NASA indicating it is already something “we all need to be concerned about,” strikes Earth in its ‘most vulnerable spot’ ( magnetosphere / ionosphere ), the weight of Earth’s crust plates moving up, over and on top of other plates will ‘weigh Earth down in highly concentrated new locations’. That shift could trigger an Earth “wobble” on its axis, and may also see Earth’s magnetic polar region shift a significant number of degrees even further than it recently has – where Earth may then remain for many, many years to come – and ‘this’ could very well result in the aforementioned serious events occurring sooner here on Earth.

Geoscientists and astrophysicists who have studied these events for many years know Earth’s ‘geomagnetic activity’ is ‘typically stable’; however, recent studies at astrophysics observatories have been analyzing ‘significant record periods of Earth seismic instability’ during which ‘serious solar flare storms’ have been hitting Earth.

Is this what ‘a few governments are secretly preparing for’ – the ‘possibility’ of something ‘occurring soon’ that triggers more events on Earth than just a Solar Energetic Particle Event ( SEPE )?

Unfortunately, recent research and analysis indicates, “yes.”

 

Submitted for review and open public commentary by,

 

Kentron Intellect Research Vault ( KIRV )
E-MAIL: KentronIntellectResearchVault@gmail.com
WWW: http://KentronIntellectResearchVault.WordPress.Com

References

http://earthquake.usgs.gov/research/external/reports/05HQGR0077.pdf
http://www.jstor.org/pss/74163
http://lasp.colorado.edu/education/journalists/solar_dynamics_ws/papers/lowres%20Severe%20Space%20Weather%20FINAL.pdf
http://conceptactivityresearchvault.wordpress.com/2011/01/02/solar-energetic-particle-event-effects
http://upintelligence.wordpress.com/2010/12/09/secret-hfse-properties-part-1
http://upintelligence.wordpress.com/2010/12/26/secret-hfse-properties-part-3
http://earthquake.usgs.gov/earthquakes/recenteqsww/
http://earthquake.usgs.gov/earthquakes/recenteqscanv/
http://en.wikipedia.org/wiki/Types_of_volcanic_eruptions

 

MSN Warns Disasters


 

MSN Warns Disasters by, Concept Activity Research Vault

March 7, 2012 18:22:42 ( PST ) Updated ( Originally Published: May 16, 2011 )

Los Angeles – May 16, 2011 – MSN Slate News reported ( read article below ) that a host of disasters are coming, which ‘the public should not become overly worried about’, but suggests throwing a celebration-like “18th Century Weekend” in advance so ‘people can experience what a solar flare disaster might be like to live through’.

While the suggested medieval celebratory affair ‘concept’ is ‘unique’, MSN suggesting the public ‘stock up on batteries’ was a bit off, because apparently the journalist did not realize ‘batteries become drained’ after an environmental anomaly ‘overcharges’ them with an ambient auroral current, which is what occurs during a Solar Energetic Particle Event ( SEPE ). ‘Candles’ or ‘luminescent gel sticks’ ( shake lights ) would work amidst such an event. However, mainstream news media broadcasts and print media, without thoroughly researching facts first, have a habit of passing inaccurate information on to the general public, and at the same time doing it in a mostly whimsical fashion so it can be easily swallowed by the public. That type of reporting does ‘not’ help the public; it only serves to provide the illusion that what is being reported about will probably never happen. Big mistake!

News reporting, as a public service, should take far more care when reporting about emergency disaster preparedness on ‘what to do’ and just ‘how to prepare’; especially when it comes to mentioning a ‘significant’ solar flare ( also known as ) a Solar Energetic Particle Event ( SEPE ) that could quickly and very seriously disable the national electricity infrastructure without warning.

To let CARV readers review how MSN Slate News recently put it to the general public, we cordially invite ‘you’ ( our readers ) to review the MSN Slate News report ( below ) so you can be the judge of whom to rely on for your emergency disaster preparedness information.

– –

Source: MSN Slate News

Meltdowns. Floods. Tornadoes. Oil spills. Grid crashes. Why more and more things seem to be going wrong, and what we can do about it.

The Century of Disasters by, Joel Achenbach

May 13, 2011 5:56 PM ( EST )

This will be the century of disasters.

In the same way that the 20th century was the century of world wars, genocide, and grinding ideological conflict, the 21st century will be the century of natural disasters and technological crises and unholy combinations of the two.

It will be the century when the things we count on to go right will – for whatever reason – go wrong.

Late last month ( April 2011 ), as the Mississippi River rose in what is destined to be the worst flood in decades, residents of Alabama and other states rummaged through the debris of a historic tornado outbreak.

Physicists at a meeting in Anaheim, California had a discussion about the dangers posed by the Sun.

Solar flares, scientists believe, are a disaster waiting to happen. Thus one of the sessions at the American Physical Society annual meeting was devoted to discussing the hazard of electromagnetic pulses ( EMP ) caused by solar flares – or terrorist attacks. Such pulses ( EMP ) could fry transformers and knock out the electrical grid over much of the nation. Last year the Oak Ridge National Laboratory released a study saying the damage might take years to fix and cost trillions of dollars.

But maybe even that is not the disaster people should be worrying about.

Maybe they should worry instead about the “ARkStorm.” That’s the name the U.S. Geological Survey ( USGS ) Multihazards Demonstration Project ( MDP ) gave to a hypothetical storm that would essentially turn much of the California Central Valley into a bathtub. It has happened before, in 1861 – 1862, when it rained for 45-days continuously. USGS explains, “The ARkStorm draws heat and moisture from the tropical Pacific, forming a series of ‘Atmospheric Rivers’ ( AR ) that approach the ferocity of hurricanes and then slam into the United States West Coast over several weeks.” The result, the USGS determined, could be a flood that would cost $725,000,000,000 ( $725 billion ) in direct property losses and economic impact.

While pondering this, don’t forget the Cascadia subduction zone, the plate boundary off the coast of the Pacific Northwest that could generate a tsunami much like the one that devastated Japan in March 2011. The Cascadia subduction zone runs from Vancouver Island to northern California, and last ruptured in a major tsunami-spawning earthquake on January 26, 1700. It could break at any moment, with catastrophic consequences.

All of these things have the common feature of low probability and high consequence.

They are known as “black swan” events.

They are unpredictable in any practical sense.

There are also things ordinary people probably should not worry about on a daily basis.

You can’t fear the Sun.

You cannot worry a rock will fall out of the sky and smash the Earth, or that the ground will open up and swallow you like a vitamin.

A key element of maintaining one’s sanity is ‘knowing how to ignore risks’ that are highly improbable at any given point in time.

And yet in the coming century, these or other ‘black swan events’ will seem to occur with surprising frequency.

There are several reasons for this.

We have chosen to engineer the planet.

We have built vast networks of technology.

We have created systems that, in general, work very well, but are still vulnerable to catastrophic failures.

It is harder and harder for any one person, institution, or agency to perceive all the interconnected elements of the technological society.

Failures can cascade.

There are unseen weak points in the network.

Small failures can have broad consequences.

Most importantly, we have more people and more stuff standing in the way of calamity.

We are not suddenly having more earthquakes, but there are now 7,000,000,000 ( 7 billion ) of us, a majority living in cities.

In 1800, only Beijing, China could count 1,000,000 inhabitants, but at last count there were 381 cities with at least 1,000,000 people.

Many are megacities in seismically hazardous places – Mexico City, Mexico; Caracas, Venezuela; Tehran, Iran; and Kathmandu, Nepal – amongst those with a lethal combination of weak infrastructure ( unreinforced masonry buildings ) and shaky foundations.

Natural disasters will increasingly be accompanied by technological crises, and the other way around.

In March 2011, the Japan earthquake triggered the Fukushima Dai-Ichi nuclear power plant meltdown.

Last year ( 2010 ), a technological failure on the Deepwater Horizon drilling rig – in the Gulf of Mexico – led to the environmental crisis of the oil spill. ( I chronicle the Deepwater Horizon blowout and the ensuing crisis management in a new book: A Hole at the Bottom of the Sea: The Race to Kill the BP Oil Gusher. )

In both the Deepwater Horizon and Fukushima disasters, the safety systems were not nearly as robust as the industries believed.

In these technological accidents, there are hidden pathways for the gremlins to infiltrate the operation.

In the case of Deepwater Horizon, a series of decisions by BP ( oil company ) and its contractors led to a loss of well control — the initial blowout. The massive blowout preventer on the sea floor was equipped with a pair of pinchers known as ‘blind shear rams’. They were supposed to cut the drillpipe and shear the well. The forensic investigation indicated the initial eruption of gas buckled the pipe and prevented the blind shear rams from getting a clean bite on it so, the “backup” plan — of cutting the pipe — was effectively eliminated in the initial event; the loss of well control.

Fukushima also had a backup plan that was not far enough back. The nuclear power plant had backup generators – in case the grid went down – but the generators were on low ground and were blasted by the tsunami.

Without electricity the power company had no way to cool the nuclear fuel rods.

In a sense, it was a very simple problem: a power outage.

Some modern reactors coming online have passive cooling systems for backups that rely on gravity and evaporation to circulate the cooling water.

Charles Perrow, author of Normal Accidents, told me that computer infrastructure is a disaster in the making.

“Watch out for failures in cloud computing,” he said by e-mail, “They will have consequences for medical monitoring systems and much else.”

Technology also mitigates disasters, of course.

Pandemics remain a threat, but modern medicine can help us stay a step ahead of evolving microbes.

Satellites and computer models helped meteorologists anticipate the deadly storms of April 27, 2011 and warn people to find cover in advance of the twisters.

Better building codes save lives in earthquakes. Chile, which has strict building codes, was hit with a powerful earthquake last year ( 2010 ) but suffered only a fraction of the fatalities and damage that impoverished Haiti endured just weeks earlier.

The current ( 2011 ) Mississippi flood is an example of technology at work for better and for worse.

As I write, the Army Corps of Engineers are poised to open the Morganza spillway and flood much of the Atchafalaya basin. That’s not a “disaster” but a solution of sorts, since the alternative is the flooding of cities downstream and possible levee failure. Of course, the levees might still fail. We’ll see. But this is how the system is ‘supposed’ to work.

On the other hand, the broader drainage system of the Mississippi River watershed is set up in a way that makes floods more likely. Corn fields, for example in parts of the upper Midwest, have been “tiled” with pipes that carry excess rainwater rapidly to riprap-lined ( small-stone laden ) streams and onward down to rivers lined with levees. We gave up natural drainage decades ago.

The Mississippi is like a catheter, at this point. Had nature remained in charge, the river would have mitigated much of its downstream flooding by spreading into natural floodplains further up river ( and the main channel would have long ago switched to the Atchafalaya river basin — see John McPhee “The Control of Nature” — and New Orleans would no longer be a riverfront city).

One wild card for how disastrous this century will become is climate change.

There’s been a robust debate on the blogs about whether the recent weather events ( tornadoes and floods ) can be attributed to climate change.

It is a briar patch of an issue and I’ll exercise my right to skip past it for the most part.

But I think it’s clear that climate change will exacerbate natural disasters in general in coming years, and introduce a new element of risk and uncertainty into a future in which we have plenty of risks and uncertainties already. This, we don’t need.

And by the way, any discussion of “geoengineering” as a solution to climate change needs to be examined with the understanding that engineering systems can and will fail.

You don’t want to bet the future of the planet on an elaborate technological fix in which everything has to work perfectly. If failure is not an option, maybe you ‘should not’ try it to begin with.

So if we cannot engineer our way out of our ‘engineered disasters’, and if ‘natural disasters’ are going to keep pummeling us – as they have since the dawn of time – what is our strategy? Other than, you know, despair? Well, that has always worked for me, but here are a few more practical thoughts to throw in the mix:

First [ 1st ], we might want to try some regulation by people with no skin in the game. That might mean, for example, government regulators who make as much money as the people they’re regulating. Or it could even mean a ‘private-sector regulatory apparatus policing the industry’, cracking down on rogue operators. The point is, we don’t want every risky decision made by people with pecuniary interests.

Second [ 2nd ], we need to keep things in perspective. The apparent onslaught of disasters does not portend the end of the world. Beware of ‘disaster hysteria in the news media’. The serial disasters of the 21st century will be – to some extent – a matter of perception. It will feel like we are bouncing from disaster to disaster, in part because of the shrinking of the world and the ubiquity of communications technology. Anderson Cooper and Sanjay Gupta are always in a disaster zone somewhere – demanding to know why the cavalry [ emergency first responders ] has not shown up.

Third [ 3rd ], we should think in terms of ‘how we can boost’ our “societal resilience,” the buzzword in the ‘disaster preparedness industry’.

Think of what you would do, and what your community would do, after a disaster.

You cannot always dodge the disaster, but perhaps you can still figure out how to recover quickly.

How would we ‘communicate’ if we got [ solar ] flared by the Sun and the [ electricity ] grid went down over 2/3rds of the country?

How would we even know what was going on?

Maybe we need to have the occasional “18th Century weekend” – to see how people might get through a couple of days without the [ electricity ] grid, cell [ telephone ] towers, cable TV [ television ], iTunes downloads – the full Hobbesian nightmare. And make an emergency plan: buy some ‘batteries’ [ NOTE: solar flare effects, during a Solar Energetic Particle Event ( SEPE ), render ‘all batteries dead’. ] and jugs of water – just for starters.

Figure out how things around you work.

Learn about your community infrastructure.

Read about science, technology, engineering and ‘do not worry if you do not understand all the jargon’.

And then – having done that – go on about your lives, pursuing happiness on a planet that, though sometimes dangerous, is by-far the best one we’ve got.

Reference

http://www.slate.com/id/2294013/pagenum/all/#p2

– –

Hopefully, people will take an opportunity to read the CARV report on Solar Energetic Particle Event Effects so they can ‘really know what to prepare for soon’, ‘before celebrating’ an “18th Century weekend” affair – complete with “batteries” – as suggested by the Slate News article ( above ).

Although the aforementioned Slate News article indicates, “Solar flares, scientists believe, are a disaster waiting to happen. Thus one of the sessions at the American Physical Society annual meeting was devoted to discussing the hazard of electromagnetic pulses ( EMP ) caused by solar flares – or terrorist attacks. Such pulses ( EMP ) could fry transformers and knock out the electrical grid over much of the nation. Last year the Oak Ridge National Laboratory released a study saying the damage might take years to fix and cost trillions of dollars. But maybe even that is not the disaster people should be worrying about,” – we actually ‘may’ have something “people should be worrying about,” as Slate puts it, or “concerned about,” according to NASA, in light of the following MSNBC Space.Com report ( below ):

– – – –

 

Source: MSNBC.COM

 

Huge Solar Flare’s Magnetic Storm May Disrupt Satellites, Power Grids

 

March 7, 2012 1:19 p.m. Eastern Standard Time ( EST )

 

A massive solar flare that erupted from the Sun late Tuesday ( March 6, 2012 ) is unleashing one of the most powerful solar storms in more than five ( 5 ) years, ‘a solar tempest that may potentially interfere with satellites in orbit and power grids when it reaches Earth’.

 

“Space weather has gotten very interesting over the last 24 hours,” Joseph Kunches, a space weather scientist at the National Oceanic and Atmospheric Administration ( NOAA ), told reporters today ( March 7, 2012 ). “This was quite the Super Tuesday — you bet.”

 

Several NASA spacecraft caught videos of the solar flare as it hurled a wave of solar plasma and charged particles, called a Coronal Mass Ejection ( CME ), into space. The CME is not expected to hit Earth directly, but the cloud of charged particles could deliver a glancing blow to the planet.

 

Early predictions estimate that the Coronal Mass Ejections ( CMEs ) will reach Earth tomorrow ( March 8, 2012 ) at 7:00 a.m. Eastern Standard Time ( EST ), with the ‘effects likely lasting for 24-hours and possibly lingering into Friday ( March 9, 2012 )’, Kunches said.
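As an illustrative aside ( an editorial sketch, not part of the Space.com report ): the quoted timeline itself implies the storm’s speed. A CME leaving the Sun late Tuesday night and arriving Thursday morning crosses roughly one ( 1 ) astronomical unit in somewhere around 32 thru 36 hours. Assuming a 34-hour transit:

```python
# Editorial back-of-the-envelope check of the implied CME speed.
# The 34-hour transit time is an assumption based on the article's
# "late Tuesday night" eruption and ~7:00 a.m. EST Thursday arrival.
AU_KM = 149_597_871                        # mean Sun-Earth distance, km
travel_hours = 34                          # assumed transit time
speed_km_s = AU_KM / (travel_hours * 3600)
print(round(speed_km_s))                   # ~1222 km/s for these assumptions
```

Typical slow CMEs drift outward at a few hundred kilometers per second; anything above roughly 1,000 km/s is considered fast, which is consistent with NOAA’s level of concern here.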

 

The solar eruptions occurred late Tuesday night ( March 6, 2012 ) when the Sun let loose two ( 2 ) huge X-Class solar flares – the strongest category of Sun storm. The bigger of the two ( 2 ) flares registered as an X5.4 solar flare on the space weather scale, making it ‘the strongest Sun eruption so far this year’.
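The “X5.4” label comes from the standard GOES soft X-ray flare scale: flares are binned A, B, C, M or X by peak 1-8 Angstrom X-ray flux, each letter ten ( 10 ) times stronger than the last, with the trailing number giving the multiplier within the bin. A minimal sketch of that classification ( an editorial illustration, not code from any NOAA product ):

```python
def goes_class(flux_w_m2: float) -> str:
    """Map a GOES 1-8 Angstrom peak X-ray flux (W/m^2) to a flare class.

    Standard bins: A >= 1e-8, B >= 1e-7, C >= 1e-6, M >= 1e-5, X >= 1e-4;
    the trailing number is the flux divided by the bin's base value.
    """
    for letter, base in (("X", 1e-4), ("M", 1e-5), ("C", 1e-6), ("B", 1e-7)):
        if flux_w_m2 >= base:
            return f"{letter}{flux_w_m2 / base:.1f}"
    return f"A{flux_w_m2 / 1e-8:.1f}"

print(goes_class(5.4e-4))  # X5.4 -- the March 6, 2012 flare
print(goes_class(1.1e-4))  # X1.1 -- the March 4, 2012 flare from the same sunspot
```

Note that the flare class describes X-ray output at the Sun; the geomagnetic storm it may trigger at Earth is rated separately, on NOAA’s G1 thru G5 scale.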

 

Typically, Coronal Mass Ejections ( CMEs ) contain 10,000,000,000 ( ten billion ) tons of solar plasma and material, and the CME triggered by last night’s ( March 6, 2012 ) X5.4 solar flare is ‘the one’ that could disrupt satellite operations, Kunches said.
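To put that ten-billion-ton figure in perspective ( an editorial estimate, not from the article ): combined with an assumed fast-CME speed of roughly 1,000 km/s, the ejected mass carries an enormous amount of kinetic energy.

```python
# Rough kinetic energy of a typical fast CME. Both inputs are
# order-of-magnitude assumptions: the article's ~10 billion ton mass
# and an assumed bulk speed of 1,000 km/s.
mass_kg = 10e9 * 1000                    # 10 billion metric tons -> kg
speed_m_s = 1.0e6                        # 1,000 km/s in m/s
energy_j = 0.5 * mass_kg * speed_m_s ** 2
print(f"{energy_j:.1e} J")               # 5.0e+24 J
```

That is on the order of 10^24 joules, which is why even a ‘glancing blow’ can drive measurable geomagnetic effects; most of that energy, of course, never couples into Earth’s magnetosphere.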

 

“When the shock arrives, the expectation is for heightened geomagnetic storm activity and the potential for heightened solar radiation,” Kunches said.

 

This heightened geomagnetic activity and increase in solar radiation could impact satellites in space and ‘power grids on the ground’.

 

Some high-precision GPS ( Global Positioning System ) users could also be affected, he said.

 

“There is the potential for ‘induced currents in power grids’,” Kunches said. “‘Power grid operators have all been alerted’. It could start to ’cause some unwanted induced currents’.”

 

Airplanes that fly over the polar caps could also experience communications issues during this time, and some commercial airliners have already taken precautionary actions, Kunches said.

 

Powerful solar storms can also be hazardous to astronauts in space, and NOAA is working closely with NASA’s Johnson Space Center to determine if the six ( 6 ) crew members of the International Space Station ( ISS ) need to take shelter in more protected areas of the orbiting laboratory, he added.

 

The flurry of recent space weather events could also supercharge aurora displays ( also known as the Northern Lights and Southern Lights ) for sky-watchers at high latitudes.

 

“Auroras are probably the treat that we get when the sun erupts,” Kunches said.

 

Over the next couple days, Kunches estimates that brightened auroras could potentially be seen as far south as the southern Great Lakes region, provided the skies are clear.

 

Yesterday’s ( March 6, 2012 ) solar flares erupted from the giant active sunspot AR1429, which spewed an earlier X1.1 solar flare on Sunday ( March 4, 2012 ). The CME from that one ( 1 ) outburst mostly missed Earth, passing by last night ( March 6, 2012 ) at around 11:00 p.m. EST, according to the Space Weather Prediction Center ( SWPC ), which is jointly managed by NOAA and the National Weather Service ( NWS ).

 

This means that the planet ( Earth ) is ‘already experiencing heightened geomagnetic and radiation effects in-advance’ of the next oncoming ( March 8, 2012 thru March 9, 2012 ) Coronal Mass Ejection ( CME ).

 

“We’ve got ‘a whole series of things going off’, and ‘they take different times to arrive’, so they’re ‘all piling on top of each other’,” Harlan Spence, an astrophysicist at the University of New Hampshire, told SPACE.com. “It ‘complicates the forecasting and predicting’ because ‘there are always inherent uncertainties with any single event’ but now ‘with multiple events piling on top of one another’, that ‘uncertainty grows’.”

 

Scientists are closely monitoring the situation, particularly because ‘the AR1429 sunspot region remains potent’. “We think ‘there will be more coming’,” Kunches said. “The ‘potential for more activity’ still looms.”

 

As the Sun rotates, the AR1429 region is shifting closer to the central meridian of the solar disk, where flares and associated Coronal Mass Ejections ( CMEs ) may ‘pack more of a punch’ because ‘they are more directly pointed at Earth’.

 

“The Sun is waking up at a time in the month when ‘Earth is coming into harm’s way’,” Spence said. “Think of these ‘CMEs somewhat like a bullet that is shot from the Sun in more or less a straight line’. ‘When the sunspot is right in the middle of the Sun’, something ‘launched from there is more or less directed right at Earth’. It’s kind of like how getting sideswiped by a car is different than ‘a head-on collision’. Even still, being ‘sideswiped by a big CME can be quite dramatic’.” Spence estimates that ‘sunspot region AR1429 will rotate past the central meridian in about 1-week’.

 

The Sun’s activity ebbs and flows on an 11-year cycle. The Sun is in the midst of Solar Cycle 24, and activity is expected to ramp up toward the height of the Solar Maximum in 2013.

 

Reference

 

http://www.msnbc.msn.com/id/46655901/

 

– – – –

Do we need “Planetary Protection?” NASA has a specific website, referenced here ( below ) as do others ( below ), including The Guardians of the Millennium.

Submitted for review and commentary by,

Concept Activity Research Vault E-MAIL: ConceptActivityResearchVault@Gmail.Com WWW: http://ConceptActivityResearchVault.WordPress.Com

Reference

http://planetaryprotection.nasa.gov/about/ [ Planetary Protection ]
http://www.lpi.usra.edu/captem/ [ CAPTEM ]
http://www.nrl.navy.mil/pao/pressRelease.php?Y=2008&R=39-08r [ U.S. Naval Research Laboratory ]
http://hesperia.gsfc.nasa.gov/sftheory/imager.htm [ RHESSI ]

 

Global Environmental Intelligence ( GEI )

Global Environmental Intelligence - GEI

Global Environmental Intelligence ( GEI )
by, Kentron Intellect Research Vault ( KIRV )

TUCSON, Arizona – March 7, 2012 – Nebraska is part of America’s heartland where fewer than fifty ( 50 ) people are ‘officially designated’ – by the U.S. Air Force Weather Agency ( AFWA ) – to analyze the pulsebeat of ‘doom and gloom’ in ‘scientific terms’, calculated within a new ‘strategic center’ using the latest ultra high-tech computer systems, software application program modules and networking capabilities. They track what ‘global government national infrastructures’ expect, based on what the National Aeronautics and Space Administration ( NASA ) warned would hit Earth during the current “Solar Maximum” ( Cycle 24 ): a ‘significant’ “Solar Energetic Particle Event” ( SEPE ) that “we all need to be concerned about.”

What possibly could go wrong?

We’re all protected, aren’t we?

We are ‘not protected against natural events’ that we have ‘no control over’, but governments have been doing their best to monitor activities, which is one ( 1 ) reason ‘why’ the United States Air Force ( USAF ) ‘subordinate organization’ known as the Air Force Weather Agency ( AFWA ) analyzes significant scientific data from horizons above and below the Earth, searching for clues as to what ‘natural significant event’ will occur ( i.e. a Solar Energetic Particle Event ( SEPE ), a tectonic plate shift ( continental earthquakes ), a magnetic pole shift ( climate change ), or something else as dramatic ).

The sole “Mission” of the U.S. Air Force Weather Agency ( USAFWA ) – the only U.S. government agency ‘officially designated’ for the task – is to analyze incredible amounts of scientific doom-and-gloom information the public might think came from ‘doomsday prophets’ concerned about Earth calamities soon to happen.

Space Weather Earth Forecasting

Official information is received ( 24-hours a day, 365-days a year ) by the U.S. Air Force Weather Agency ( USAFWA ) and its host Directorate of Operational Weather Squadrons ( OWS ) from a group of ‘solar observatories’, ‘satellites’, ‘spacecraft’, ‘interstellar lightwave spectral and radio-frequency ( RF ) imaging sensors’, and ‘telescopes’ with a wide array of ‘cameras’, as well as from ‘human observers’. Together they continue to report enough worrisome information to scare many of us, but – ‘officially’ – the public will never know ‘what’ is about to befall them or ‘when’ that will occur.

After a considerable amount of research into this subject, the U.S. Air Force was caught ‘publicly admitting’ it is able to “know within minutes” ‘what significant Earth Event is coming’ and ‘when it will arrive’ – anywhere from a few hours up-to days in-advance – but the public will never be informed. Why?

Doom & Gloom or Fantasy Island?

Undoubtedly, “classified ( in the ‘interest of national security’ )” would be the official reason given to the public. Some might ask, “But for ‘whose’ security?” People, for the most part, are ‘not secure’ – especially when facing a pending injurious consequence resulting in the ruination of life as they have come to know theirs – and most gravitate toward wanting to “be with family” ( or “loved ones” ) as any pending cataclysmic Earth Event draws nearer. The U.S. Air Force Weather Agency ( AFWA ), however, realizes that being occupants of planet Earth is ‘not the same thing as’ “Fantasy Island” or a trip to “McDonaldland,” as the rest of the public at-large has come to know ‘their version’ of what planet Earth is for ‘them’ and their children.

Preparedness

What may be even more interesting to the public is to review official U.S. data surrounding what U.S. Air Force commands ‘are already preparing for’, what ‘they already have set in-place’ for ‘everyone living in the Continental United States ( CONUS )’, and ‘how they are going to accomplish’ their secret-sensitive “Mission” on ‘people’ throughout the United States. This report is ‘definitely not’ ‘science fiction’ or ‘Orwellian Theory’; it is strictly what the U.S. Air Force Weather Agency and another U.S. Air Force ‘subordinate organization’ – the Air and Space Operations ( ASO ), Director of Weather ( USAF Deputy Chief of Staff ) – has ‘recently laid-out publicly’. But so much of what was officially presented was government psychobabble that many in the ‘public’ were lost trying to decipher it, resulting in no impact on the public as to the ‘seriousness’ of what they need to ‘begin preparing for’.

Television Commercial ‘Official Advertisement’ Already Warned Public!

Recently ( 2010 and 2011 ), the U.S. Federal Emergency Management Agency ( FEMA ) began running its own commercial on American television. The FEMA TV commercial ‘warns the public to prepare’ for an “unexpected Event” that “can suddenly turn your life – and those around you – ‘upside down’.”

FEMA’s TV commercial advertisement ( see video clip – below ) is easily recalled by those who watched it – depicting a ‘family inside a home’ where all of a sudden ‘everything inside the home’ begins floating up in the air, as though ‘gravity was somehow suddenly lost’. This ‘professionally produced television commercial advertisement’ from FEMA ‘stylishly warns people to prepare’ for any unexpected natural disaster Earth Event, however FEMA did not describe ‘what type of event’ people should expect to strike.

The saddest part about this is that the U.S. government actually considers this television commercial advertisement an “official public warning to prepare for a national disaster,” while simultaneously hoping the commercial will ‘not’ create ‘panic’ or ‘chaos’ in the streets that would disrupt the ‘skyrocketing profits’ received by the ‘global elite’ and/or ‘big business’ through their stock market portfolios. In short, this allusion to a national disaster is nothing short of the ‘Biggest Show On Earth’: continuing profits from the basic living expenses saddling the ‘little people’, ‘worthless eaters’, and consumers of “blue gold” ( water ) – which the global elite predict will rise in value, more-so than precious metals, after such an Earth Disaster Event ( EDE ).

Survival By Chance or Circumstance

One of the most worrisome concerns, after reviewing this report ( filled with ‘official U.S. government information’ ), is ‘what’, ‘when’ and ‘how’ U.S. government ‘decision-makers’ react to their ‘official alert’. Many will undoubtedly and instantly gather with ‘their families’, however ‘what about the rest of us’ gathering with ‘our families’? We will not be provided the same ‘early alert’ – we will not even know until after it ( the significant Earth Event ) already takes place! Is this ‘fair’? Is this ‘just’? Is this ‘humane’? Or is this pure and unadulterated ‘elitist selfishness’ that government affords only a ‘select few’ lucky souls, fortunate enough to be provided sufficient warning in-advance and additionally told ‘where to go’ and ‘what to do’ in order to escape the wrath of such a cataclysmic Earth Event? Few people realize that those ‘decision-makers’ have already been instructed in-advance as to ‘what’ they need to do and ‘when’ they need to do it.

AFWA Notifies 350 Military Installations About Earth Disaster Event

Even more frightening is ‘what’ the U.S. military has ‘already been commanded’ as to ‘how to accomplish’ its “Mission,” based on a highly-classified U.S. Presidential Executive Order ( EO ) turning the U.S. population over to U.S. military control. The U.S. Air Force Weather Agency ( AFWA ) was also ‘officially designated’ as the “Agency” ‘controlling internet ( cyberspace ) shutdown’, according to the report ( below ) and further research into the capabilities of its un-named ‘subordinate organization’ “Strategic Center.”

Current Reporting Revelations

This report contains ‘no old news’ but ‘new official U.S. government revelations’, which should cause the public to re-think where they are and what is coming. The only information this report does ‘not’ provide publicly are the ‘names’, ‘addresses’, ‘phone numbers’, and ‘photographs’ of those ‘key individuals’ who are ‘already designated’ as ‘ready’, ‘willing’ and ‘able’ to ‘perform as ordered on-command’ – even if that command is ‘unable to be given’ after a cataclysmic Earth Event!

Reviewers Need To Know

There are only two ( 2 ) ‘official U.S. government reports’ provided ( below ) in this report, which contain internet links to more ‘official information’ on this subject.

Knowing the public at-large, and how it predominantly enjoys ‘entertaining flowery written articles’ – which it never bothers to ‘research’ any further – this report was purposely kept ‘short and sweet’ ( no flowers – no frills ). It includes an ‘official government article’ that was converted into a ‘very simple outline’, followed by yet another ‘official government article’ left as-is. Hopefully, some may consume this report far more easily than others at this website, so its point on this subject may be even better received publicly.

The first ( 1st ) report ( included immediately below ) was ‘thoroughly analyzed’ in its ‘original format’ and then – based on ‘even more detailed official information’ discovered – ‘re-formatted into an even-more proper perspective’. For the sake of making this report far more easily comprehensible for public review, it was finally ‘converted into a simple outline format’ – but only because of the fashion in-which government complicated the ‘original public information release’; saying everything, but not saying it very well for the public to easily understand.

The second ( 2nd ) report ( further below ) was left in its ‘original format’ because anyone reviewing it can quickly understand the nature of the ‘true problems’ surrounding what we all are going to be facing soon enough.

You be the judge as to ‘what’ you will do to ‘prepare’ for the ‘officially expected’ Earth Event. Enjoy the report ( below ) and be sure to click on the embedded links to learn more about ‘what effects’ are ‘coming’ and ‘what’ the U.S. Department of Homeland Security ( DHS ) is preparing for.

Tell a friend about what you review ( below ) and the reply may be, “You can’t trust what you read on the internet.” If anyone does ‘not trust’ this report ( below ) – or anything else on this website – they should click on the ‘official government website links’ ( provided at the bottom of this report, and other links found herein ). If they are equipped with an ‘attention-span’ ( longer than a bug’s ) and are ‘not easily distracted’ by something as simple as a ‘horn honk’, they can ‘research it all’ for themselves and try to prove this information incorrect.

Many will be amazed at how much these official government revelations will help them to prepare for more than ‘entertainment’.

– – – –

Source: United States Air Force ( USAF ) Weather Agency ( AFWA )

Air Force Weather Agency ( AFWA ) 106 Peacekeeper Drive, Suite 2 Offutt Air Force Base, Nebraska 68113-4039 USA TEL: +1 (402) 232-8166 ( Public Affairs ) TEL: 272-8166 ( DSN )

HISTORY –

United States Air Force Weather Agency ( USAFWA ) – aka – Air Force Weather Agency ( AFWA ) traces its heritage back to World War I, when the U.S. Army ( USA ) ‘subordinate organization’ Signal Corps ( SC ) ‘subordinate organization’ “Meteorological Service” ( MS ) provided ‘weather support’ for U.S. Army defense aircraft pilots.

Amidst the legacy of the U.S. Secretary of War – ( known ‘today’ as ) the U.S. Department of Defense ( DoD ) – and having realized its ‘subordinate organization’ the U.S. Army had ‘increasing military personnel numbers’ due to an ‘increasing defense stockpile inventory’ of ‘militarized aircraft’, it created a ‘then-new subordinate organization’ known as the U.S. Army ( USA ) Air Corps ( AAC ) that became well-known as the U.S. Army Air Corps ( USAAC ).

On July 1, 1937 the U.S. Secretary of War ‘reorganized’ the U.S. Army ‘subordinate organization’ Signal Corps ( SC ) ‘subordinate organization’ Meteorological Service ( MS ) ‘out-of’ the Signal Corps ( SC ) and ‘in-to’ the U.S. Army ( USA ) ‘subordinate organization’ Air Corps ( AAC ) as its ‘then-new subordinate organization’ Meteorological Service ( ACMS ).

On April 14, 1943 the U.S. Army Air Corps ( USAAC ) ‘subordinate organization’ Meteorological Service ( MS ) was ‘renamed’ “Weather Wing” ( WW ) and ‘physically moved’ to an Asheville, North Carolina location – where ‘today’ the U.S. Air Force Weather Agency ( USAFWA ) has its 14th Operational Weather Squadron ( OWS ) located.

In 1945, the U.S. Army Air Corps ( USAAC ) Weather Wing ( ACWW ) was ‘renamed’ the U.S. Army Air Corps “Weather Service” ( ACWS ).

In early 1946, the U.S. Army Air Corps Weather Service ( ACWS ) was ‘physically moved’ from Asheville, North Carolina to the U.S. Army Air Corps Langley Field Station – where ‘today’ Langley Air Force Base ( LAFB ) is located.

On March 13, 1946 the U.S. Army Air Corps Weather Service ( ACWS ) was ‘reorganized’ under the U.S. Army Air Corps ( USAAC ) ‘new subordinate organization’ Air Transport Command ( ATC ) where its ‘then-new subordinate organization’ “Weather Service” ( ACWS ) was ‘renamed’ “Air Weather Service” ( AWS ).

Later-on, in 1946, the U.S. Army Air Corps ( USAAC ) Air Transport Command ( ATC ) Air Weather Service ( AWS ) was ‘physically moved’ to Gravelly Point, Virginia.

In 1947, the U.S. Army Air Corps ( USAAC ) was ‘renamed’ United States Air Force ( USAF ) whereupon the “Air Weather Service” ( AWS ) became ‘directly under it’ and assumed ‘sole responsibility’ over ‘all global weather reporting’ and ‘all global weather forecasting’ for ‘both’ the U.S. Air Force and U.S. Army.

In 1948, the USAF ‘newly activated’ Military Air Transport Service ( MATS ) – ( known ‘today’ as ) Military Airlift Command ( MAC ) – then-received its ‘newly-transferred subordinate organization’ Air Weather Service ( AWS ) that was ‘removed from being directly under’ USAF ‘headquarters’. USAF Military Air Transport Service ( MATS ) ‘subordinate organization’ Air Weather Service ( AWS ) was then ‘physically moved’ to Andrews Air Force Base ( AAFB ) in Maryland.

In 1958, the USAF Military Airlift Command ( MAC ) ‘subordinate organization’ Air Weather Service ( AWS ) was then ‘physically moved’ to Scott Air Force Base ( SAFB ) in Illinois where – for almost 40-years – it remained.

In 1991, the USAF Military Airlift Command ( MAC ) ‘subordinate organization’ Air Weather Service ( AWS ) was ‘redesignated’ as a “field operating agency” and then ‘reorganized’ directly back under USAF ‘headquarters’.

On October 15, 1997 the USAF Air Weather Service ( AWS ) was ‘redesignated’ as the “Air Force Weather Agency” ( AFWA ) and ‘physically moved’ to Offutt Air Force Base ( OAFB ) in Nebraska where it remains ‘today’ – at least for the moment.

– –

United States Air Force ( USAF )

USAF Weather Agency ( USAFWA )

BUDGET –

$183,000,000 ( $183 million ) U.S. dollars ( annually ), including $98,000,000 ( $98 million ) for ‘Operations’ and ‘Maintenance’.

ESTABLISHED –

October 15, 1997.

LOCATION –

Offutt Air Force Base ( OAFB ) in Nebraska ( USA ).

REPORTING –

USAF Air and Space Operations ( ASO ), Director of Weather ( DOW ), Deputy Chief of Staff ( DCS ).

MISSION –

Enable U.S. decision-makers’ exploitation of all ‘resources’ based-on ‘relevant’ Global Environmental Intelligence ( GEI ) for a ‘global spectrum of military warfare’ by ‘maximizing U.S. power’ over:

– Land; – Air; – Space; and, – Internet ( cyberspace ).

PERSONNEL –

1,400 + manned, by:

– Military ( active-duty and reserve ); – Contractors ( government contracts ); and, – Civilians.

ORGANIZATION –

– ? ( # ) Agencies ( staff ); – Two ( 2 ) Weather Groups ( WXG – Coordinators ); – Fourteen + ( 14 + ) Weather Squadrons ( OWS – Directorates ); – Five ( 5 ) Observatories ( solar ); and, – One ( 1 ) Strategic Operation Systems Center ( Unknown – ‘subordinate organization’ ).

– –

USAFWA

Global Environmental Intelligence ( GEI )

DATA –

Global Environmental Intelligence ( GEI ) data is ‘collected’ from a variety of ‘information monitoring sources’ ( Earth-based and space-based locations ) providing a variety of ‘data feed streams’ into a computer-operated system network out-of a ‘Strategic Center’, which processes indications to detect almost any ‘significant natural negative impact’ that ‘could occur’ on ‘people’ and ‘national infrastructures’ throughout ‘global geographic regions’ and in ‘outerspace’.

SOURCES –

GEI data is provided by many sources, amongst-which its ( subordinate ) Strategic Center processes data through its Computer Operation Systems Network, receiving ‘channeled data streams’ of ‘encoded data telemetry’ sent from ‘ground-based’ and ‘space-based’ detection monitors comprised of:

– Ground Sensors ( infra-red, spectral and sonic ); – Submerged Sensors ( infra-red, spectral and sonic ); – Space-based Sensors ( lightwave, spectral and sonic ); – Long-Range Telescopes ( radio-frequency wave, optical lightwave and sonic ); – Imaging Cameras ( lightwave, spectral and sonic ); and, – Human Observers ( various means ).

SPECIAL NOTE:

USAF ASO A6 –

The U.S. Air Force ( USAF ) Air and Space Operations ( ASO ) Communications Directorate ( ” A6 ” ) ‘provides’ the ‘policy oversight’ and ‘planning oversight’ of Command, Control, and Communication Intelligence ( C3I ) for the U.S. Air Force Weather Agency ( USAFWA ).

USAF Air and Space Operations ( ASO ) Communications Directorate ( A6 ) also ‘provides support’ over ‘daily operations’, ‘contingency actions’, and ‘general’ Command, Control, Communication and Computer Intelligence ( C4I ) for the U.S. Air Force Weather Agency ( USAFWA ).

– –

USAFWA

Weather Groups ( WXG ) –

Weather Groups ( WXG ) are specifically designated to coordinate:

– Global Environmental Intelligence ( GEI ); and, – Operational Weather Squadrons ( OWS ).

– –

USAFWA

Operational Weather Squadrons ( OWS ) –

STATIONS –

A ‘minimum’ of fourteen ( 14 ) Operational Weather Squadrons ( OWS ) are on-station ( active ) 24-hours a day, 7-days a week, 365-days a year to mitigate ( handle ) Global Environmental Intelligence ( GEI ) issues.

Operational Weather Squadrons ( OWS ) – also known as – Weather Squadrons ( WS ) are ‘designated’, to:

– ‘Process’ Global Environmental Intelligence ( GEI ) data; and, – ‘Provisionally Distribute’ Global Environmental Intelligence ( GEI ) data.

Operational Weather Squadron Global Environmental Intelligence ( OWS GEI )

– –

2nd Weather Group ( WXG )

2ND WXG

OVERSIGHT –

– 2nd ( Strategic Computing Network Center ) Systems Operations Squadron ( SOS ) – Offutt Air Force Base, Nebraska; – 2nd Operational Weather Squadron ( OWS ) – Offutt Air Force Base, Nebraska; and, – 14th Operational Weather Squadron ( OWS ) – Asheville, North Carolina.

2ND WXG Global Environmental Intelligence ( GEI )

DISTRIBUTION –

– U.S. Agency ‘decision-makers’; – U.S. Department of Defense ( DoD ) ‘decision-makers’; – U.S. Allied Foreign Nation ‘decision-makers’; and, – U.S. Joint Operations ‘Warfighters’.

SPECIAL NOTE:

USAF ASO A3 and USAF ASO A5 –

USAF Air and Space Operations ( ASO ) Plans and Requirements Directorate ( PRD ) – also known as – ( ” A3 ” and ” A5 ” ) ‘develops’ and ‘maintains’ the ‘concepts of operations’ for how U.S. Air Force Weather Agency ( USAFWA ) – also known as – Air Force Weather ( AFW ) supports “most weather sensitive” areas of ‘joint capabilities’.

REPORTS –

– Timely; – Relevant; and, – Specialized.

SOURCES –

Solar Observatories ( Detachments – Det. ), four ( 4 ):

Det. 1 ( 2ND WXG SO Detachment 1 – Learmonth, Australia ); Det. 2 ( 2ND WXG SO Detachment 2 – Sagamore Hill, Massachusetts ); Det. 4 ( 2ND WXG SO Detachment 4 – Holloman Air Force Base, New Mexico ); and, Det. 5 ( 2ND WXG SO Detachment 5 – Palehua, Hawaii ).

SPECIAL NOTE:

USAF ASO A3 and USAF ASO A5 –

USAF Air and Space Operations ( ASO ) Plans and Requirements Directorate ( PRD ) – also known as – ( ” A3 ” and ” A5 ” ) ‘works in conjunction with’ MAJCOM ‘functional counterpart users’ of products, data and services supplied from U.S. Air Force Weather Agency ( USAFWA ) – also known as – Air Force Weather ( AFW ).

USAF Air and Space Operations ( ASO ) Plans and Requirements Directorate ( PRD ) – also known as – ( ” A3 ” and ” A5 ” ) is ‘one ( 1 ) agent’ of the ‘lead Command’ for ‘gathering operational requirements’.

USAGES –

– Planning – Military Operation Missions ( global spectrum ); and, – Executing – Military Operation Missions ( global spectrum ).

SPECIAL NOTE:

USAF ASO A3 and USAF ASO A5 –

USAF Air and Space Operations ( ASO ) Plans and Requirements Directorate ( PRD ) – also known as – ( ” A3 ” and ” A5 ” ) ‘coordinates’ U.S. Air Force Weather Agency ( USAFWA ) – also known as – Air Force Weather ( AFW ) policy issues.

TASKS –

Four ( 4 ) Operational Weather Squadrons ( OWS ) ‘specifically conduct’ Global Environmental Intelligence ( GEI ):

– Warnings ( 24-hours / 7-days a week / 365-days a year ); – Military Mission Briefings ( 24-hours / 7-days a week / 365-days a year ); – Forecasts ( 24-hours / 7-days a week / 365-days a year ); and, – Analysis ( 24-hours / 7-days a week / 365-days a year ).

PROCESSING –

Strategic Center ( cost: $277,000,000 – $277 million – U.S. dollars ) ‘subordinate organization’, provides:

– Computer System Complex; – Computer System Network Productions; – Computer System Applications; – Computer System Operations; – Computer System Sustainment; and, – Computer System Maintenance.

Air Force Weather Enterprise ( AFWE ) is a ‘computer system’ of the U.S. Department of Defense ( DoD ). AFWE computer system access is ‘restricted’ to members of the United States military ( Active Duty, National Guard, or Reserve Forces ), the U.S. Government, or U.S. government contractors that do business with the U.S. government and require ‘weather information’.

SPECIAL NOTE(S):

USAF ASO A8 –

The U.S. Air Force ( USAF ) Air and Space Operations ( ASO ) Strategic Plans and Programs Directorate ( ” A8 ” ) ‘directs’ the ‘planning’, ‘programming’, ‘budgeting’, ‘development’, ‘acquisition’, ‘engineering’, ‘configuration management’, ‘modification’, ‘installation’, ‘integration’, ‘logistics’ and ‘life cycle maintenance support’ over ‘all’ of the ‘computer processing equipment’ and over ‘all’ of the ‘standard weather systems’.

– –

USAFWA

1st Weather Group ( 1ST WXG ) Directorate

1ST WXG

Global Environmental Intelligence ( GEI )

GEI DISTRIBUTION –

The USAF Air Force Weather Agency ( AFWA ) 1st Weather Group ( 1ST WXG ) Directorate uses four ( 4 ) Operational Weather Squadrons ( OWS ) – ready 24-hours a day, 7-days a week, 365-days a year – to provide its Global Environmental Intelligence ( GEI ) notification distributions to U.S. ‘military forces’ ( see military branches below ) at 350 ‘specific installations’ located in five ( 5 ) Continental United States ( CONUS ) Regions.

OWS GEI NOTIFICATION REGIONS –

SOUTHEASTERN ( CONUS )

9th Operational Weather Squadron ( OWS ) Shaw Air Force Base South Carolina

9TH OWS – Notifies:

– U.S. Air Force; – U.S. Air Force Reserve; – U.S. Air Force National Guard; – U.S. Army; – U.S. Army Reserve; and, – U.S. Army National Guard.

NORTHERN ( CONUS ) AND, NORTHEASTERN ( CONUS )

15th Operational Weather Squadron ( OWS ) Scott Air Force Base Illinois

15TH OWS – Notifies ( Both Regions ):

– U.S. Air Force; – U.S. Air Force Reserve; – U.S. Air Force National Guard; – U.S. Army; – U.S. Army Reserve; and, – U.S. Army National Guard.

WESTERN ( CONUS )

25th Operational Weather Squadron ( OWS ) Davis-Monthan Air Force Base Tucson, Arizona

25TH OWS – Notifies:

– U.S. Air Force; – U.S. Air Force Reserve; – U.S. Air Force National Guard; – U.S. Army; – U.S. Army Reserve; and, – U.S. Army National Guard.

SOUTHERN ( CONUS )

26th Operational Weather Squadron ( OWS ) Barksdale Air Force Base Louisiana

26TH OWS – Notifies:

– U.S. Air Force; – U.S. Air Force Reserve; – U.S. Air Force National Guard; – U.S. Army; – U.S. Army Reserve; and, – U.S. Army National Guard.

SPECIAL NOTE:

USAF ASO A3 and USAF ASO A5 –

USAF Air and Space Operations ( ASO ) Plans and Requirements Directorate ( PRD ) – also known as – ( ” A3 ” and ” A5 ” ) ‘works with’ USAF Air and Space Operations ( ASO ) staff to integrate U.S. Air Force Weather Agency ( USAFWA ) – also known as – Air Force Weather ( AFW ) Continental Operations ( CONOPS ) with U.S. Air Force ‘plans’ and ‘Programs’.

USAF Air and Space Operations ( ASO ) Plans and Requirements Directorate ( PRD ) – also known as – ( ” A3 ” and ” A5 ” ) ‘assists’ in the ‘exploitation of weather information’ for ‘warfighting operations’.

– –

The aforementioned four ( 4 ) Operational Weather Squadrons ( OWS ) additionally, provide:

– USAF Weather Agency officer ‘upgrade’ training; and, – Apprentice forecaster ‘initial’ qualification.

– –

USAFWA Manpower & Personnel Directorate ( A1 ) ‘provides’ the global spectrum, of:

– Manpower; – Organization; – Personnel; and, – Training.

SPECIAL NOTE:

USAF ASO A3 and USAF ASO A5 –

USAF Air and Space Operations ( ASO ) Plans and Requirements Directorate ( PRD ) – also known as – ( ” A3 ” and ” A5 ” ) ‘oversees’ and ‘executes’ U.S. Air Force Weather Agency ( USAFWA ) – also known as – Air Force Weather ( AFW ) Standardization and Evaluation Program for ‘Weather Operations’.

U.S. Air Force ( USAF ) Air and Space Operations ( ASO ) Plans and Requirements Directorate ( PRD ) – also known as – ( ” A3 ” and ” A5 ” ) ‘assists’ the USAF Air and Space Operations ( ASO ) staff with managing the U.S. Air Force Weather Agency ( USAFWA ) – also known as – Air Force Weather ( AFW ) ‘career field-training’ process by ‘obtaining training’ and ‘implementing training’ to meet ‘career field-training requirements’.

– –

USAF

Air Force Combat Weather Center ( USAFCWC – aka – AFCWC )

USAF Combat Weather Center ( Hurlburt Field, Florida ) ‘develops’, ‘evaluates’, ‘exploits’ and ‘implements’ a variety of ‘new tactics’, ‘new techniques’, ‘new procedures’ and ‘new technologies’ across U.S. Air Force Weather Agency ( USAFWA ) – also known as – Air Force Weather ( AFW ) ‘enhancing effectiveness’ for U.S.  military force branches, of the:

U.S. Special Forces; U.S. Joint Operations; U.S. Combined Operations; U.S. Air Force; and, U.S. Army.

– – – –

Source: Global Security

Space

AFNS

Space Weather Team Readies For Upcoming Solar Max
by, Ryan Hansen ( 55th Wing Public Affairs )

February 25, 2011

NEBRASKA, Offutt Air Force Base – February 25, 2011 – Solar Maximum may sound like the name of a super hero, but it’s certainly no comic book or 3-D movie. Solar Maximum is actually the name for the Sun’s most active period in the solar cycle, consistently producing solar emissions, solar flares and sun spots.

For a little background on the Sun’s activities: the star goes through roughly 11-year cycles, alternating between periods when it is very active and periods when it is relatively calm. The Sun’s last Solar Maximum occurred in 2000, and it is expected to awaken from its current solar minimum and become more active this year.

According to the members of the 2nd Weather Squadron ( WS ), an active sun can cause all sorts of problems for us. “Solar weather plays a huge part in the warfighter’s mission,” said Staff Sgt. Matthew Money, a forecaster with the space weather flight. “Impacts from solar weather can cause radio blackouts, satellite communication failure, satellite orbit changes, satellite surface charging, or short circuits, and radar clutter.”

That is why the squadron’s worldwide space weather team of roughly 50 active-duty members, civilians and contractors continually analyzes, forecasts and provides alert notifications for the entire U.S. Department of Defense ( DoD ), as well as a slew of other government agencies.

“When ‘space weather’ causes impacts to earth that meet or exceed warning thresholds our end users are informed within minutes,” said Staff Sgt. Jonathan Lash, space weather flight forecaster. “We send out warning bulletins through a computerized distribution system, [ and ] we have other graphical products that show what happened in the past 6-hours around the globe as well as what we expect to happen in the upcoming 6-hours,” he said.

Members of the 2nd Weather Squadron [ OWS ] rely on five ( 5 ) ground-based Solar Observatories, as well as a network of satellites orbiting the earth, to accomplish their mission.

“There aren’t too many opportunities to be the Air Force’s sole provider of something,” said Lt. Colonel Jim Jones, 2nd Weather Squadron [ OWS ] Commander. “In this case, the mission is unique to the entire DoD.”

Solar Observatories are ‘strategically placed’ around the globe in such places, as:

– Australia; – Hawaii; – Italy; – Massachusetts; and, – New Mexico.

They include both optical telescopes and radio telescopes and ensure the Weather Squadron always has one eye, or ear, on the sun.

“The optical telescope network monitors solar surface features,” said Master Sgt. Shane McIntire, the space weather flight chief. “It automatically tracks the sun and directs light to the instruments, which collect data and are controlled by computers. It scans specific regions at a rate of at least twice per minute.”

Through filtered lenses space weather analysts are able to perform flare patrol and view sunspots to determine the magnetic complexity of the region.

“The telescope has special filters that isolate a single optical wavelength,” said Master Sgt. Shane Siebert, who leads the Detachment 4 solar observatory for the 2nd Weather Squadron at Holloman Air Force Base, New Mexico. “This wavelength ( 6563 angstroms ) is called ‘hydrogen alpha’ ( H-Alpha ), where the majority of solar activity occurs,” he said. “Analysts monitor this wavelength from sunrise to sunset, and are looking for specific signatures that may lead to solar flares and other adverse activity.”
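The 6563-angstrom H-Alpha figure quoted above can be sanity-checked with a few lines of arithmetic. The snippet below is purely illustrative ( it is not squadron software ); it simply converts the quoted wavelength into nanometres and the corresponding optical frequency.

```python
# Convert the H-Alpha wavelength quoted above (6563 angstroms) into
# more familiar units. Illustrative only; not observatory software.
ANGSTROM_M = 1e-10           # metres per angstrom
C = 299_792_458.0            # speed of light, m/s

def angstrom_to_nm(wavelength_angstrom: float) -> float:
    """Angstroms -> nanometres (1 angstrom = 0.1 nm)."""
    return wavelength_angstrom * 0.1

def frequency_hz(wavelength_angstrom: float) -> float:
    """Optical frequency of light at the given wavelength."""
    return C / (wavelength_angstrom * ANGSTROM_M)

h_alpha = 6563.0
print(angstrom_to_nm(h_alpha))   # 656.3 nm: deep-red visible light
print(frequency_hz(h_alpha))     # roughly 4.57e14 Hz
```

At 656.3 nm, H-Alpha sits in the red part of the visible spectrum, which is why H-Alpha filtergrams of the Sun appear red.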

But not all of the sun’s activities can be captured using optical telescopes.

Some events have a unique radio-frequency signature that can also be measured.

Using a mixture of technology from the 1970s to the present day, radio observatories are able to monitor frequencies in the 25 MHz to 180 MHz range, as well as eight ( 8 ) other discrete frequencies. Their digitized output is collected by a computer and then processed and analyzed for solar activity.
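As a rough illustration of the processing step described above ( digitized output collected by a computer, then analyzed for solar activity ), the sketch below flags frequency bins whose power jumps well above a quiet-sun background. The frequencies, power levels and 10 dB threshold are invented for illustration; this is not the actual observatory software.

```python
# A minimal sketch of burst detection over a swept radio band, assuming
# spectra arrive as {frequency_MHz: power_dB} dictionaries. All values
# and the threshold are hypothetical, for illustration only.

def detect_bursts(spectrum_db, background_db, threshold_db=10.0):
    """Return frequencies (MHz) whose power exceeds the quiet-sun
    background by more than threshold_db decibels."""
    return sorted(f for f, p in spectrum_db.items()
                  if p - background_db.get(f, p) > threshold_db)

# Hypothetical sweep samples across the 25 MHz - 180 MHz band:
quiet = {25: -120.0, 50: -122.0, 100: -125.0, 180: -128.0}
now = {25: -118.0, 50: -105.0, 100: -124.0, 180: -110.0}
print(detect_bursts(now, quiet))  # [50, 180]: candidate burst bins
```

A real pipeline would also classify the burst type from its time-frequency signature, which is what lets analysts describe "start time, duration, intensity and type" as Major Harbaugh notes below.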

“We actually are able to detect the specific strength at a given radio frequency,” said Major Bradley Harbaugh, who commands the Detachment 5 solar observatory for the 2nd Weather Squadron [ OWS ] at Palehua, Hawaii. “What we detect are energetic solar emissions in [ specific ] frequency bands or ranges. When detected, we [ are able to describe ] the start time, duration, intensity and type of solar emission. This helps describe the potential impacts by identifying the characteristics of what may impact earth,” he said.

Identifying these solar emissions is crucial to warfighter communication abilities.

“If there is solar energy that increases on your frequency, you can try to talk into your radio, but the noise from the sun will be stronger than your transmission, therefore drowning-out what you are saying,” Major Harbaugh said. “As an operator, you can increase your radio power to try and ‘out-broadcast’ the sun, but you are also now broadcasting over a much larger area, making your transmission more susceptible to enemy detection. Therefore, the Sun’s impact must be a consideration when planning a mission.”

The Weather Squadron’s network of satellites includes spacecraft owned and operated by the U.S. Department of Defense ( DoD ), the National Aeronautics and Space Administration ( NASA ) and the National Oceanic and Atmospheric Administration ( NOAA ) – a combination of systems dedicated solely to space weather along with a few that carry space weather sensors.

“We gather a significant amount of data from satellites,” Sergeant McIntire said. “Imagery from [ satellites ] can augment the ground-based network, providing real-time monitoring of solar features at wavelengths that can’t be seen from the ground.”

Data from all of these sources are continually pushed to the space weather operations center at the U.S. Air Force Weather Agency ( USAFWA ) here.

With this information in hand, the Weather Squadron can produce the most reliable space weather forecast possible. However, even with all of this data, producing a space weather forecast is still much more difficult than creating one for terrestrial weather.

“Space weather is a terribly difficult science and it takes a lot of training and experience,” Colonel Jones said.

“Space weather forecasting is very reactive,” Sergeant Money said. “The ‘knowledge and tools are not quite up to par in order to do accurate forecasting’ like we do here on Earth.”

It is also important to note that today the world is much more reliant on space-based assets than it was during the last Solar Maximum, officials said. With cellphones, portable navigation devices and satellite television receivers all part of our daily lives, a huge solar weather event could wreak havoc on quite a few different platforms.

“The impact of a solar storm in 2000 was probably not as great, due to the lower density of space technology and the limited number of consumers utilizing the data,” Major Harbaugh said. “However, the ripple from a ‘major solar event now will more likely be felt across a much broader consumer base’ [ the public ] since there are many more assets and many more users of space data.”

However, with improved technology and an increased knowledge of the sun’s activities, the Weather Squadron ( WS ) is more prepared than ever for the upcoming Solar Maximum, Colonel Jones said. “Since the last solar maximum, we’ve upgraded most of our numerical models in terms of both their basic science and the data they ingest,” he said. “That’s a direct result of the advances in sensors and the technology that enables rapid data transfer. We can react faster and see farther than ever before.”

“We already have members within the unit developing forecast techniques based on signatures we see on the sensors,” Sergeant Money said.

So it’s a safe bet, that ‘the next 2-years’ will be ‘hectic’ for the 2nd Weather Squadron [ OWS ].

Their mission, to provide ‘situational awareness to key decision-makers’ will certainly keep ‘them’ busy.

“In the last 30-days alone ( February 2011 ), we’ve had [ more than 30 ] reportable [ solar ] energy events,” Major Harbaugh said. “The workload has ‘already increased’ and ‘will continue to do so’ for probably ‘the next 1-year’ or ’2-years’.”

“About 1-year ago, it was not uncommon for an analyst to have only one ( 1 ) very small Solar Region of the Sun to monitor,” Sergeant Siebert said. “Today, it is normal for analysts to keep fairly busy monitoring 4 Solar Regions to 6 Solar Regions. Studies of the last Solar Maximum show that a typical day included twenty-two ( 22 ) active Solar Regions – almost 4 times our current workload,” he added.

Regardless, Weather Squadron [ OWS ] ‘space weather’ analysts, forecasters and technicians globally are ready for the upcoming solar fury [ see, e.g. Solar Energetic Particle Event ], Colonel Jones said.

– – – –

The U.S. Air Force obviously needs ‘far-more’:

– More personnel for U.S. Air Force Weather Agency ( USAFWA ) Weather Group ( WXG ) Operational Weather Squadron ( OWS ) efforts;

– An independent ‘governing board’ or ‘blue ribbon watchdog committee’ overseeing an ‘independent monitoring agency’, placing the U.S. Air Force Weather Agency ( AFWA ) and its USAF Air and Space Operations ( ASO ) under an independent ‘microscope’; and,

– Public Transparency notifications, would also be nice, but then…

– – – –

Source: MSNBC.COM

Huge Solar Flare’s Magnetic Storm May Disrupt Satellites, Power Grids

March 7, 2012 1:19 p.m. Eastern Standard Time ( EST )

A massive solar flare that erupted from the Sun late Tuesday ( March 6, 2012 ) is unleashing one of the most powerful solar storms in more than 5-years, ‘a solar tempest that may potentially interfere with satellites in orbit and power grids when it reaches Earth’.

“Space weather has gotten very interesting over the last 24 hours,” Joseph Kunches, a space weather scientist at the National Oceanic and Atmospheric Administration ( NOAA ), told reporters today ( March 7, 2012 ). “This was quite the Super Tuesday — you bet.”

Several NASA spacecraft caught videos of the solar flare as it hurled a wave of solar plasma and charged particles, called a Coronal Mass Ejection ( CME ), into space. The CME is not expected to hit Earth directly, but the cloud of charged particles could deliver a glancing blow to the planet.

Early predictions estimate that the Coronal Mass Ejection ( CME ) will reach Earth tomorrow ( March 8, 2012 ) at 07:00 a.m. Eastern Standard Time ( EST ), with the ‘effects likely lasting for 24-hours and possibly lingering into Friday ( March 9, 2012 )’, Kunches said.
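The quoted arrival estimate implies an average Sun-to-Earth transit speed that is easy to back-calculate. Assuming the eruption left the Sun around 11 p.m. EST on March 6 and arrived about 7 a.m. EST on March 8 ( roughly a 32-hour transit; that figure is an approximation inferred here, not stated in the article ), the sketch below works out the implied mean speed.

```python
# Back-of-the-envelope transit-speed estimate for the CME described
# above. The 32-hour transit time is an inferred assumption, not a
# figure from the article.
AU_KM = 149_597_870.7  # mean Sun-Earth distance in kilometres

def mean_transit_speed_km_s(transit_hours: float) -> float:
    """Average Sun-to-Earth speed implied by a given transit time."""
    return AU_KM / (transit_hours * 3600.0)

print(round(mean_transit_speed_km_s(32.0)))  # roughly 1300 km/s
```

A mean speed on the order of 1,000 km/s to 1,500 km/s is typical of fast CMEs, which is consistent with the storm severity the forecasters describe.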

The solar eruptions occurred late Tuesday night ( March 6, 2012 ) when the sun let loose two ( 2 ) huge X-Class solar flares that ‘ranked among the strongest type’ of sun storms. The bigger of those 2 flares registered as an X5.4 solar flare on the space weather scale, making it ‘the strongest sun eruption so far this year’.

Typically, Coronal Mass Ejections ( CMEs ) contain 10 billion tons of solar plasma and material, and the CME triggered by last night’s ( March 6, 2012 ) X5.4 solar flare is ‘the one’ that could disrupt satellite operations, Kunches said.

“When the shock arrives, the expectation is for heightened geomagnetic storm activity and the potential for heightened solar radiation,” Kunches said.

This heightened geomagnetic activity and increase in solar radiation could impact satellites in space and ‘power grids on the ground’.

Some high-precision GPS ( Global Positioning Satellite ) users could also be affected, he said.

“There is the potential for ‘induced currents in power grids’,” Kunches said. “‘Power grid operators have all been alerted’. It could start to ’cause some unwanted induced currents’.”

Airplanes that fly over the polar caps could also experience communications issues during this time, and some commercial airliners have already taken precautionary actions, Kunches said.

Powerful solar storms can also be hazardous to astronauts in space, and NOAA is working closely with NASA’s Johnson Space Center to determine if the six ( 6 ) spacecraft residents of the International Space Station ( ISS ) need to take shelter in more protected areas of the orbiting laboratory, he added.

The flurry of recent space weather events could also supercharge aurora displays ( also known as the Northern Lights and Southern Lights ) for sky-watchers at high latitudes.

“Auroras are probably the treat that we get when the sun erupts,” Kunches said.

Over the next couple days, Kunches estimates that brightened auroras could potentially be seen as far south as the southern Great Lakes region, provided the skies are clear.

Yesterday’s ( March 6, 2012 ) solar flares erupted from the giant active sunspot AR1429, which spewed an earlier X1.1 solar flare on Sunday ( March 4, 2012 ). The CME from that one ( 1 ) outburst mostly missed Earth, passing by last night ( March 6, 2012 ) at around 11 p.m. EST, according to the Space Weather Prediction Center ( SWPC ), which is jointly managed by NOAA and the National Weather Service ( NWS ).

This means that the planet ( Earth ) is ‘already experiencing heightened geomagnetic and radiation effects in-advance’ of the next oncoming ( March 8, 2012 thru March 9, 2012 ) Coronal Mass Ejection ( CME ).

“We’ve got ‘a whole series of things going off’, and ‘they take different times to arrive’, so they’re ‘all piling on top of each other’,” Harlan Spence, an astrophysicist at the University of New Hampshire, told SPACE.com. “It ‘complicates the forecasting and predicting’ because ‘there are always inherent uncertainties with any single event’ but now ‘with multiple events piling on top of one another’, that ‘uncertainty grows’.”

Scientists are closely monitoring the situation, particularly because ‘the AR1429 sunspot region remains potent’. “We think ‘there will be more coming’,” Kunches said. “The ‘potential for more activity’ still looms.”

As the Sun rotates, the AR1429 region is shifting closer to the central meridian of the solar disk, where flares and associated Coronal Mass Ejections ( CMEs ) may ‘pack more of a punch’ because ‘they are more directly pointed at Earth’.

“The Sun is waking up at a time in the month when ‘Earth is coming into harm’s way’,” Spence said. “Think of these ‘CMEs somewhat like a bullet that is shot from the sun in more or less a straight line’. ‘When the sunspot is right in the middle of the sun’, something ‘launched from there is more or less directed right at Earth’. It’s kind of like how getting sideswiped by a car is different than ‘a head-on collision’. Even still, being ‘sideswiped by a big CME can be quite dramatic’.” Spence estimates that ‘sunspot region AR 1429 will rotate past the central meridian in about 1-week’.

The Sun’s activity ebbs and flows on an 11-year cycle. The Sun is in the midst of Solar Cycle 24, and activity is expected to ramp up toward the height of the Solar Maximum in 2013.

Reference

http://www.msnbc.msn.com/id/46655901/

– – – –

Cordially submitted for review and commentary by,

 

Kentron Intellect Research Vault ( KIRV )
E-MAIL: KentronIntellectResearchVault@Gmail.Com
WWW: http://KentronIntellectResearchVault.WordPress.Com


Research References

http://www.af.mil/information/factsheets/factsheet_print.asp?fsID=157&page=1
https://login.afwa.af.mil/register/
https://login.afwa.af.mil/amserver/UI/Login?goto=https%3A%2F%2Fweather.afwa.af.mil%3A443%2FHOST_HOME%2FDNXM%2FWRF%2Findex.html
http://conceptactivityresearchvault.wordpress.com/2011/01/02/solar-energetic-particle-event-effects
http://www.globalsecurity.org/space/library/news/2011/space-110225-afns01.htm
http://www.wrf-model.org/plots/wrfrealtime.php

 

Fireball Special Projects Program

Fireball Objects

Fireball Special Projects Program
by, Concept Activity Research Vault ( CARV )

August 16, 2011 12:30:08 ( PST ) Updated ( Published: April 11, 2011 )

CALIFORNIA, Los Angeles – August 16, 2011 – When a brilliant object appears, twinkling while hanging motionless like a star against a darkened sky, but is ‘not a star’, what is it?

Between 2010 and 2011, residents throughout South Bay cities surrounding Los Angeles in southern California have been trying to figure out what keeps appearing to them like an early dark-morning star, but is not.

What’s odd about this object’s brilliance is that this ‘bright white light’ shines from the opposite side of the object, facing the horizon where the early morning Sun rises later in the day. The only way the Sun could illuminate the object would be if the object were either ‘transparent’ or equipped with the new advanced military defense ‘light reflection projection’ ( LRP ) technology, also known as “Advanced Stealth.”

Does this mysterious hovering object belong to the U.S. Department of Defense ( DOD ) Defense Advanced Research Projects Agency ( DARPA ), U.S. Navy ( USN ), U.S. Air Force ( USAF ), U.S. Navy National Reconnaissance Office ( NRO ), or is it an Unidentified Flying Object ( UFO )?

In early January 2011, an early-rising resident of Carson, California assumed the ‘flickering starlight’ he saw against the pitch-black sky of early morning was simply a passenger airliner with runway lights on its approach to Los Angeles International Airport not far away. A Torrance, California resident, however, claimed he had been outside sipping a coffee for almost 2-hours while the starlight object had ‘not moved’ from its position in the darkened sky, so he believed the starlight object was just a ‘star’ that would disappear when the Sun rose. Both residents were wrong, because as the Sun rose the brilliant starlight remained where it had been positioned in the darkened sky for hours.

One businessman, stepping from his car in a West Carson, California McDonald’s fast-food parking lot, walked over to see what everyone was looking up at ( the starlight object ) in the early morning darkened sky, and overheard one woman in a small crowd of people say, “Maybe it’s a UFO!”

Smiling as though he would settle the dispute he asked, “Don’t you people ever read the newspaper around here? That’s just the U.S. Department of Defense ‘Unmanned Aerial Vehicle’ ( UAV ) assigned to keep watch over the Port of Los Angeles from terrorist attacks.”

A homeless man, sitting nearby drinking his cup of coffee, looked at the businessman and asked, “Does that ‘look’ like any UAV or balloon you ever saw before?” The businessman shook his head in a negative reply, whereupon the homeless man further enquired, “Then ‘why’ hasn’t that object moved in the sky for over 2-hours now?”

The businessman, saying nothing further in reply, turned away and went inside to get his early morning McDonald’s breakfast.

Although few bother looking up at darkened skies – or, for that matter, after daybreak – there are now more people, especially in southern California, beginning to look up and study what’s hovering overhead. Are residents expecting to see something new, or becoming paranoid after watching so many television news broadcasts about the rash of UFO sightings and lights all over the world?

One man pulled up in a minivan with Virginia license plates for his early morning breakfast, but also strolled over to where the commotion was going on; after observing it for about a minute, he smiled and said, “Heck, we got one of those over in Charlottesville. Shows up like clockwork, every night, ’round 8:00 p.m.,” and then walked away toward his breakfast shaking his head while mumbling, “Durn government spooks – spyin’ on us everywhere!”

As the Virginian left with his McDonald’s bagged breakfast in-hand, he stopped at the table outside to share some of his first-hand experiences with ‘fireballs’ and other ‘strange lights’ in the sky – some that look like ‘aircraft landing lights’ twinkling off in the ‘low horizon distance’ and similar lights appearing like ‘real bright stars’ that suddenly disappear a short time later.

He reported that whatever this mysterious Virginia sighting really ‘was’, there was no public information detailing its ‘flight path trajectory’ or where it landed – and it ‘did land’! In fact, it exploded, according to the Greene County Record newspaper in Stanardsville, Virginia, which spoke to an eyewitness named Judy Kilgus, who gave her own account of what happened just outside a farm-neighborhood 7-Eleven convenience store.

Further research, into the Virginian’s story, found the following details:

The 7-Eleven ( store number 21482 ) was located at 1199 29th Infantry Division Memorial Highway ( also known as ) U.S. Route 29 ( also known as ) North Seminole Trail in Madison, Virginia 22727-0024 where Judy Kilgus additionally experienced strange pungent aromatics she identified as an “eerie odor” lingering in the night air after the unidentified flying object ( UFO ) – resembling a “fireball” – crash landed. ( See Newspaper Report – Immediately Below )

 

– –

Source: GREENE COUNTY RECORD – a Media General newspaper – ( Stanardsville, Virginia, USA )

Experts Offer Guesses On Mysterious Fire Ball Sighting
by, April Taylor – Greene County Record Reporter

April 2, 2009

VIRGINIA, Stanardsville – April 2, 2009 – Judy Kilgus of Stanardsville was driving on [ Virginia, U.S. ] Route 522 [ Sperryville Pike ] near Culpeper [ Virginia ] Sunday [ March 29, 2009 ] night when she saw what she says looked like a burning ball of fire falling from the sky.

“It was me, my mother and one of my brothers, and we were on our way back from Salem [ Virginia ],” Kilgus recalled. She said they were about 3-miles out of Culpeper [ Virginia ] when the “ball” came shooting across [ the sky ].

“At first it gave the appearance of a shooting ‘star’, and then it was just huge,” she said, continuing, “This was not like some little light. The colors were brilliant and it was ‘flickering’. It was falling and looked as if it was burning like a ball of fire – reddish with the hint of a green tint. It was frightening.”

Kilgus said that once the mysterious thing fell, “There was a big light, like an explosion. I said to my mom, ‘Oh, my gosh, it hit something’. I had never seen anything like it. It was traveling very fast.”

Kilgus said she stopped at the 7-Eleven in Madison [ Virginia ], about 15-miles or-so from where she saw the object, and then noticed an “eerie” odor when she got out of her car. She says she’s not sure if the smell is related to what she saw earlier.

Regardless, Kilgus is ‘not the only one who witnessed’ a strange “ball of fire” and big boom that night.

Reports of a “bright light,” and in some places an ‘explosion like sound’, poured into law enforcement offices across eastern Virginia, Maryland and North Carolina on Sunday night [ March 29, 2009 ].

“The phone is ringing off the hook,” said meteorologist Sonia Mark at the U.S. National Weather Service ( NWS ) Wakefield station.

All of the reports dealt with incidents that occurred ‘about’ 9:45 p.m.

Several calls came to Richmond International Airport, but [ airport ] tower personnel did not see anything unusual related to aircraft, airport spokesman Troy Bell said.

It could have been caused by a ‘meteor’, or even a falling part from a Russian ‘spacecraft’, experts said earlier this week.

“I know it’s one of the two,” said Geoff Chester, an astronomer and public relations officer with the U.S. Naval Observatory in Washington, D.C. “I just can’t tell you ‘definitively’ which one it actually was.” Geoff Chester suggested that a falling Russian booster rocket caused the hubbub. The booster – a steel cylinder about 25-feet long and 8-feet wide – was part of the Soyuz spacecraft launched Thursday on a mission to the International Space Station [ ISS ]. The booster was expected to fall toward Earth on a path headed ‘East’ that would have taken it across the Chesapeake Bay [ eastern-most eastcoast ] region Sunday night, Chester said. The booster would have burned in the friction of Earth’s atmosphere and, as it slowed below the speed of sound, it would have released energy that caused a sonic boom, Chester said. “My feeling is this is what people actually saw,” Chester said.

Stefan Bocchino, a spokesman for the U.S. Joint Space Operations Center [ JSOC ] at Vandenberg Air Force Base [ VAFB ] in California, said experts there [ in California ] do ‘not think the light was caused by a manmade object’.

Joint Space Operations Center tracks manmade objects that enter the atmosphere.

U.S. National Weather Service ( NWS ) has ‘ruled out any weather related cause’.

Other experts said the ‘light’ and ‘boom’ sound like the work of a meteor.

Meteors are bits of space rock or gravel that burn and create light when they hit the atmosphere.

“Some very bright ones are known to explode,” creating a sound, said Phillip Ianna, a professor emeritus of astronomy at the University of Virginia.

Meteors typically burn up in the atmosphere. Much ‘less often’, a small piece of the rock will hit Earth.

Steve Chesley, an astronomer with NASA [ National Aeronautics and Space Administration ], said the Sunday [ March 29, 2009 ] phenomenon could be the work of a meteor the size of a ‘television set’ or ‘small car’. “These kinds of things hit the [ atmosphere ] once a month,” Chesley said. They usually fall over water or less populated areas and attract less attention. NASA doesn’t track such small objects, Chesley said, and focuses instead on big ones – ‘space rocks half the length of a football field or more’ – that are headed toward Earth. “It’s the big ones we’re worried about, and we need to find them decades in advance,” Chesley said.

The object on Sunday [ March 29, 2009 ] had to be ‘unusually bright’ to be ‘seen in urban areas’ where ‘artificial lights drown out’ most celestial objects, said David Hagan, a staff scientist with the Science Museum of Virginia.

At Record [ Greene County Record ] press time, on Tuesday, ‘experts were still offering various guesses’ as to ‘what the occurrence could have been’.

“One thing’s for sure,” says Kilgus, “It’s something I won’t soon forget. I was glad I saw ‘whatever it was’, but I kind-of ‘wonder what in the world is going on’,” she said. “I thought maybe it was a meteor, but whatever it was, it was ‘strange’.”

Article Reference:

http://www.greene-news.com/gcn/news/local/article/experts_offer_guesses_on_mysterious_fire_ball_sighting/38145/

– – – –

Now, the State of Virginia is infamous for its street and highway names – names that change within a few miles of each other – creating instant headaches for any outsider trying to provide location directions. As close as can be determined – extrapolated from the newspaper description of the eyewitness ( Judy Kilgus ) account – Google Maps provides a “car” route that coincides with eyewitness Judy Kilgus’ claim she was “3-miles” outside “Culpeper, Virginia” on “Route 522″ ( also known as ) “Sperryville Pike,” returning from “Salem, Virginia.” Interestingly, outsiders would ‘not know’ there are actually ‘two ( 2 ) towns’ named “Salem” in Virginia!

The particular town of “Salem, Virginia” the eyewitness ( Judy Kilgus ) mentions is near “Culpeper, Virginia,” and ‘that’ “Salem, Virginia” is ‘not easy to locate online’ at Google Maps.

After some ‘prodding’, Google Maps will eventually map ‘from’ “13300 Hunts Shade Drive, Salem, VA,” providing its most direct route. That route quickly exits ( just outside Salem, Virginia ) off “U.S. Route 522″ ( also known as ) “Sperryville Pike,” turns onto “Norman Road” ( also known as ) “Hudson Mill Road” ( also known as ) “Reva Road,” and leads to ‘yet another alias highway name’ for “U.S. Route 29″ – an address at 1199 “29th Infantry Division Memorial Highway” in Madison, VA 22727 ( the 7-Eleven store ) where eyewitness Judy Kilgus claimed detecting a suspicious “eerie odor” outside that particular “7-Eleven” store ( Madison, Virginia ) on “U.S. Route 29.”

That description explains how so many State of Virginia unidentified flying object ( UFO ) flightpath reports can so easily hide ‘landing site locations’ from much of the public observing them there.

The Greene County Record newspaper account from eyewitness Judy Kilgus does not ‘detail the location’ ( e.g. NorthEast to SouthWest, etc. ) of where she was at 9:44 p.m. during the UFO fireball sighting.

Was Kilgus already traveling from Salem, Virginia – going through Madison, Virginia ( 16-miles from Salem, Virginia ) – enroute to her home ( Stanardsville, Virginia )?

Did she sight, on Sunday March 29, 2009 at 9:44 p.m., the UFO fireball ‘while she was still in Salem, Virginia’?

Did her sighting, ‘prompt her to leave’ Salem, Virginia earlier than she had planned, which placed her driving in a direction of her ‘home’ ( Stanardsville, Virginia )? If so, did the UFO sighting crash in the direction of where she resided ( Stanardsville, Virginia )?

Without UFO ‘flightpath trajectory’ information ( e.g. NorthEast to SouthWest, etc. ) about this UFO fireball sighting, perhaps only U.S. Fish & Game or U.S. Forest Service government officials located the UFO fireball landing / crash site. If so, it has undoubtedly been covered-up by now.

Was the March 29, 2009 9:44 p.m. central Virginia UFO fireball sighting really only just a “meteor” that U.S. Space Command ( SPACOM ) government experts claim they were ‘unable to track’? What if it was some kind of rocket or missile launched from another location within the United States?

Could the sighting have been one of many U.S. Air Force Special Programs project spacecraft undergoing ‘classified flight tests’ for the Central Intelligence Agency ( CIA ), conducted primarily by the U.S. Department of Defense ( DoD ) National Reconnaissance Office ( NRO )?

Were Virginians used as an unsuspecting audience for part of a test flight of the new U.S. Department of Defense ( DoD ), U.S. Navy, National Reconnaissance Office ( NRO ) ‘stealth starlight surveillance dirigible’ – a lighter-than-air vessel believed designed from the 1990 patented Aereon lighter-than-air ship ( similar in design to the NASA VentureStar, X-31 and X-33 spaceplanes ) [ Image References: http://unwantedpublicity.media.officelive.com/Gallery.aspx ] – that may emit what is tantamount to ‘artificially produced starlight’, flickering akin to ‘aircraft landing lights’ on an aircraft the public would only suspect appears about ready to land but doesn’t?

May even more mysteries continue hiding amidst ‘seemingly peaceful night skies’ where reconnaissance and other spacecraft are now capable of being missioned to commence activation of ‘classified artificial landing strategies’ cleverly resembling a ‘fireball meteor flight’ – complete with firework flares jettisoned to additionally resemble ‘meteor crash explosions’ too?

Research into conflicting news report information suggests that one ( 1 ) UFO fireball incident may have actually been confused with another UFO fireball on the same date and around the same time of night, a ‘dual event’ in Virginia.

One ( 1 ) eyewitness report – in the central Virginia geographic region – may have been mixed with ‘another press report’ using its central Virginia eyewitness interview in conjunction with another news wire service report.

Were simultaneous sighting reports along the eastern seaboard of the United States – up and down the Atlantic Ocean coastline from North Carolina ( far northeast ) through Virginia ( far east ) and Maryland ( far east ) in the region of the Chesapeake Bay – really the ‘same’ event as what was seen in Madison, Virginia, as reported by the Stanardsville, Virginia newspaper?

A few miles south of eyewitness Judy Kilgus’ UFO sighting near the 7-Eleven store ( Madison, Virginia ), also on “U.S. Route 29″ ( also known as “South Seminole Trail” ), at the east corner of 2055 Boulders Road ( Albemarle County ), sits the U.S. National Ground Intelligence Center ( NGIC ).

Adjacent to NGIC ( National Ground Intelligence Center ) is a construction site for the new U.S. Defense Intelligence Agency ( DIA ) complex buildings as well.

Just a few miles further south – again on “South Seminole Trail” ( Virginia U.S. Route 29 ) – is the NORTHROP GRUMMAN Sperry Marine division, where a classified secret-sensitive multi-story windowless bricked-up test site building is perched behind an innocent-looking single-story red brick office building – also on “South Seminole Trail” (aka) “U.S. Route 29″ (aka) “Emmett Road” – where all these spooky government offices are popping up all around Charlottesville, Virginia.

 

A few miles further north, on the west side of “South Seminole Trail” ( Virginia U.S. Route 29 ), is an old building with a single-story red brick face whose other three ( 3 ) sides are painted white, complete with pyramid-shaped surveillance cameras mounted high up on the building. While it only ‘appears vacant’, inside is a honeycombed-out floor plan shared by a U.S. Central Intelligence Agency annex with the DIEBOLD CORPORATION. You would never know it driving by the facility, or even driving up onto the terraced hill it sits atop and around the building, because you will only see a few vehicles outside, where everyone enters through the north-facing single glass door and everything appears just normal. Again, just a few miles north of this facility are the U.S. National Ground Intelligence Center ( NGIC ) buildings on the east side of Seminole Trail ( Virginia State Highway 29 ), about 95-miles southwest of Washington, D.C.

In any event, loud noise explosions – especially when they are conducted over U.S. population areas – should have prompted the public release of more concrete information, especially about what the object hit during its crash landing.
Submitted for review and commentary by,

Concept Activity Research Vault ( CARV ), Host
E-MAIL: ConceptActivityResearchVault@Gmail.Com
WWW: http://ConceptActivityResearchVault.WordPress.Com

References

http://www.space.com/news/090330-rocket-debris.html
http://www.space.com/news/090331-likely-meteor.html
http://www.foxnews.com/story/0,2933,511857,00.html
http://unwantedpublicity.media.officelive.com/Gallery.aspx


NASA UFO Alien Encounters

[ PHOTO ( above ): NASA Space Shuttle STS-115 official mission photo of an “ExtraTerrestrial BioSynthetic Entity” ( EBSE ), a Möbius waveform propulsion system spaceborne infant also known as part of a “Space Serpent,” which a NASA astronaut refers to as an “eel” ( click to enlarge ) ]

by, Concept Activity Research Vault ( CARV )

December 1, 2011 14:22:08 (PST) Updated ( Originally Published: January 27, 2011 )

CALIFORNIA, Los Angeles – December 1, 2011 – NASA? We may ‘all’ have a problem! Over the past several decades, multiple space flight missions from the United States and Russia have officially documented strange encounters with astronauts occurring in outer space.

Not just ‘unidentified flying objects’ ( UFO ), but other unreferenceable object encounters such as serpent-like entities that appear to be biosynthetic lifeforms, plus other spaceborne phenomena that astronauts referred to as their “visitors.”

While all this initially sounds totally unbelievable, official audio and video film clip transmissions ( see below ) between space-missioning astronauts and their mission ground control prove NASA UFO alien encounters are officially very real.

Several prominent United States astronauts have now gone on public record revealing their personal experiences with official U.S. government encounters surrounding nonreferenceable objects and entities ( UFOs & Aliens ):

http://www.youtube.com/watch?v=XkOCwATi3tw

Introduction

The following additional ‘official video clips and audio broadcasts’ ( below ) will reveal some amazing things for you if these two ( 2 ) ‘basic principles’ are remembered while viewing them:

1.  Meteorites ‘do not suddenly change direction’ and ‘do not make left-hand turns, right-hand turns or U turns’ like a ’piloted spacecraft’; and,

2. Debris in outer space, known as ‘space junk’ ( e.g. pieces of rocket boosters, expended satellites, etc. ), does not exit the Earth’s gravitational pull ( which extends – at much lesser strength – into outer space ), nor does it travel at ‘high rates of speed’, but will either ‘float around in outer space’ and/or ‘return ( in a decaying orbit ) to Earth’s atmosphere’, where it may either burn up or fall to the surface of the Earth.

A few of the documentary motion picture film clips ( below ) are in ‘full color’, ‘black and white’, or ‘tinted green’ screen resolutions – many of which are comprised of ‘time-elapsed monitoring film sped up’, so ‘objects being monitored may not be moving as fast as they appear’; other films, however, may have objects moving at real speed ( e.g. ‘slow docking arms and astronaut maneuvers’, or ‘fast electronic movements’ of ‘camera lens housings’ that ‘spin around quickly’ to ‘catch a view of other objects moving in any direction at any time’ ).

Nevertheless, always keep in mind the two ( 2 ) ‘outer space principles’ that only ‘spacecraft make course corrections’ – turning and going in different directions – ‘not meteorites, comets, asteroids or space debris’.

Now, please enjoy watching ‘these specific four ( 4 ) videos’ ( below ), especially selected after careful analysis out of ‘hundreds of others found with flaws’:

By 1971, a NASA Apollo 14 rocket carried a Lunar Excursion Module ( LEM ) camera providing the following photo ( below ):

[ PHOTO ( above ): official NASA LEM camera photo of moon in background with UFO lights in right foreground ( click to enlarge ) ]

In the above photo, was there a ‘piece of lighted external equipment’ outside the NASA spacecraft? No. Was there any ‘distortion in the camera lens or reflection’? No.

In 1967, four ( 4 ) years before the aforementioned 1971 LEM launch, PACIFIC OPTICAL ( El Segundo, California ) generated thousands of LEM camera rose quartz glass lenses polished to perfection and tested for ‘any possible distortion factors’ using special ‘lightwave measurement instruments’ set according to ‘government standards for approval’ long before being installed within those cameras used aboard the LEM, within an incredible number of NASA space rocket missions, and on U.S. Air Force National Reconnaissance Office ( NRO ) missions. How are those facts known? I personally know those particular camera lenses ‘do not capture any distortion whatsoever from glare or image reflections’ because I ‘personally worked on those lenses’, knew what they consisted of, as well as their unique characteristics also used in weaponized missiles and unmanned aerial vehicles.

As far back as 1966, when I was still attending high school, I also worked the graveyard shift at the McDonnell Douglas Space Systems Center ( now BOEING ), where within a special area then known as “The Quad” a famous defecting Russian astrophysicist secretly worked for the U.S. government on the Mercury mission space capsules. Yes, and there were two ( 2 ) such capsules! Not much comes as much of a mystery to me, especially after four ( 4 ) decades of researching intelligence for the U.S. government.

The aforementioned films and broadcasts document previously unknown ‘new hazards surfacing in space’, believed to be one of the primary reasons why ‘all NASA spacecraft insurance policy coverage was cancelled’ as NASA lost its direct U.S. government funding, when it became apparent to the U.S. government intelligence directorate ( CIA Science and Technology Division ) that private-sector corporate market bids were needed on future space mission contracts – swinging the axe into the heart of the NASA empire and giving it its final “tally-ho.”

That decision made perfect economic sense to the U.S. government because its ‘secret private-sector enterprise now takes on far more financial risk than it does just space mission challenges’. Shareholder investment returns will consequently realize less profit from having to absorb the costs associated with higher insurance policy coverages directly related to ‘new risk management assessments’ ( see official films and broadcasts above ), where insurance company policy directives issued on ‘future mission spacecraft’ place more financial burden onto the backs of ‘investors’ while lessening the ever-growing financial burdens on U.S. Department of the Treasury foreign debt repayment plans. The Federal Reserve Board decided NASA was just costing too much money, and something had to be done about getting rid of the NASA burden without jeopardizing U.S. global space superiority. What began during 1985 as an intelligence mission to bury secret technology effortings within closely guarded private-sector enterprises satisfies corporate promises of long ago.

It’s simply a case of one hand washing another at the expense of the people, where a new space tax may have been pulled from the tomorrow-shelf concepts of George Orwell for those living in futuristic societies. Unfortunately, such activity already broke plenty of pencil lead in private-sector workshop sessions at FANX III ( Ft. George G. Meade, Maryland ) in 1998, supporting private-sector funded public insurance investment programs as yet another economic means by which U.S. Social Security Administration insurance may likely receive its axe because of the ever-looming baby boomer debt crisis the U.S. government faces dead ahead as next on its big agenda. NASA was likely but a precursor government primer for what’s in store next for the ‘little people’. Up, up and away, that beautiful balloon – not so beautiful a balloon anymore.

 

Submitted for review and commentary by,

 

Concept Activity Research Vault ( CARV ), Host
E-MAIL: ConceptActivityResearchVault@Gmail.Com
WWW: http://ConceptActivityResearchVault.WordPress.Com

 

NASA Elenin Coming


 

[ PHOTO ( above ): April 15, 2011 Comet Elenin position and trajectory Earth dates ( click to enlarge ) ]

by, Concept Activity Research Vault ( CARV )

May 17, 2011 16:42:08 ( PST ) Updated ( May 16, 2011 11:20 )

CALIFORNIA, Los Angeles – May 17, 2011 – After much controversy over a celestial body referred to as “Elenin,” said to be entering our solar system, NASA admits there is a ‘comet’ it calls “Elenin” that ‘is’ in fact ‘coming very soon’ toward Earth, one that will simply go around our Sun and slingshot off into outer space; however, there seem to be a few missing pieces to the NASA public report.

The first ( 1st ) problem is the ‘planetary or solar body slingshot fly-by effect’, which NASA spacecraft have used for decades to ‘increase speed’ and consequently propel satellites toward destinations – ‘faster than we can engineer a rocket motor to do’ – using various forms of ‘staring plane mosaic’ technology incorporated into ‘electronic telemetry guidance systems’. Comet Elenin will also be experiencing the ‘slingshot fly-by effect’, but with ‘no electronic telemetry guidance systems’ to guide it in any particular direction.
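The slingshot fly-by effect can be sketched numerically: in the planet's rest frame a fly-by only rotates the velocity vector, and transforming back to the Sun's frame is what changes the heliocentric speed. The function below is a simplified two-dimensional illustration of that principle under stated assumptions ( incoming velocity parallel to the planet's motion ), not a NASA trajectory model; all names and numbers are hypothetical.

```python
# Idealized 2D gravity-assist ("slingshot") sketch.
# In the planet's rest frame the encounter only rotates the
# velocity vector; the speed change appears in the Sun's frame.
import math

def slingshot_speed(v_in, u_planet, turn_deg):
    """Heliocentric speed (km/s) after a fly-by that turns the
    planet-frame velocity by turn_deg degrees.
    Assumes v_in is parallel to the planet's orbital motion."""
    w = v_in - u_planet                      # speed in planet frame
    theta = math.radians(turn_deg)
    wx, wy = w * math.cos(theta), w * math.sin(theta)
    return math.hypot(wx + u_planet, wy)     # back to Sun's frame

# A full reversal behind a fast planet yields v_out = 2*u - v_in:
print(slingshot_speed(10.0, 30.0, 180.0))    # 50 km/s, up from 10
```

In the limiting head-on case the object leaves with up to twice the planet's orbital speed added, which is why fly-bys outperform any rocket motor; a comet gets the same physics with no guidance system steering the outcome.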

The second ( 2nd ) problem is that NASA’s Jet Propulsion Laboratory ( JPL ) Near Earth Object ( NEO ) Program Office solar system plotting system ( see below ) pictures Elenin ‘on a direct collision course’ with “Mercury” around May 21, 2011, but NASA reports ‘nothing publicly’ about Elenin colliding with, hitting, or glancing off of “Mercury”; so, you be the judge:

[ IMAGE ( above ): May 16, 2011 Elenin solar system slingshot pathway  ( NOTE: distance indicated from Earth ) – View #1 ( click to enlarge ) ]

[ IMAGE ( above ): May 16, 2011 Elenin solar system slingshot path to Mercury ( NOTE: distance indicated from Earth ) – View #2 ( click to enlarge ) ]

[ IMAGE ( above ): May 21, 2011 – Elenin solar system slingshot Mercury collision path ( NOTE: distance indicated from Earth ) – View #3 ( click to enlarge ) ]

NASA news ( below ) and NASA plots for May 16, 2011 do not appear to match NASA actual plots for May 21, 2011 ( above ) when looking at “Mercury.”

– –

Source: NASA Space.Com

Comet Elenin Trajectory 22,000,000 Miles Close To Earth

Don’t Fear Comet Headed Our Way — It’s A Wimp

May 10, 2011 6:37:23 PM ET Updated ( May 10, 2011 22:37:23 )

A comet first discovered just 6-months ago will be making a visit to the inner solar system soon, but don’t expect to be completely dazzled. This comet is a bit of a wimp, NASA says.

Comet Elenin ( also known by its astronomical name: C/2010 X1 ) was first detected on December 10, 2010 by observer Leonid Elenin in Lyubertsy, Russia, who found the comet while using the remote-controlled ISON observatory near Mayhill, New Mexico, USA.

At the time of its discovery, the comet was about 401 million miles from Earth.

Over the past 4-1/2 months, the comet has closed the distance to Earth’s vicinity as it makes its way closer to perihelion ( its closest point to the Sun ).

As of May 4, 2011 the Comet Elenin distance is about 170 million miles.

“That is what happens with these long-period comets that come in from way outside our planetary system,” said NASA Jet Propulsion Laboratory ( JPL, located in Pasadena, California ) Near-Earth Object ( NEO ) Program Office manager Don Yeomans in a statement. “They make these long majestic speedy arcs through our solar system and sometimes put on a great show, but not Elenin; right now that comet looks kind of wimpy.”

The comet doesn’t offer much of a view and is quite dim to behold.

“We’re talking about how a comet looks, as it safely flies past us,” said Yeomans. “Some cometary visitors, arriving from beyond the planetary region like the Hale-Bopp comet in 1997, have really lit-up the night sky – where you can see them easily with the naked eye – as they safely transit the inner solar system. But Elenin is trending toward the other end of the spectrum, you’ll probably need a good pair of binoculars, clear skies and a dark secluded location to see it – even on its brightest night.”

Comet Elenin, An Icy Run

Comet Elenin should be at its brightest shortly before the time of its closest approach to Earth on October 16, 2011, when its closest point will be 22 million miles from Earth.

Even at such a distance, the Elenin comet will ‘not’ be able to shift tides or tectonic plates here on Earth – as some Internet rumors have suggested.

Some have even wondered if the comet could possibly be pushed closer to Earth than usual.

“Comet Elenin will not encounter any ‘dark bodies’ that could perturb its orbit, nor will it influence Earth in any way,” said Yeomans. “It will get no closer to Earth than 35 million kilometers.”

Not only is the Elenin comet far away but it is also on the ‘small side’ for comets, said Yeomans.

“So you’ve got a modest sized icy dirtball getting no closer than 35 million kilometers,” said Yeomans.

“It [ Elenin, the comet ] will have an immeasurably minuscule influence on our planet. By comparison, my sub-compact automobile exerts a greater influence on the ocean tide than comet Elenin ever will.”
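The two closest-approach figures quoted above – 22 million miles and 35 million kilometers – are mutually consistent, as a quick conversion shows ( a trivial sanity-check sketch, not NASA source code ):

```python
# Sanity check: do the two closest-approach figures agree?
KM_PER_MILE = 1.609344            # exact international mile
miles = 22_000_000                # closest approach, in miles
km = miles * KM_PER_MILE          # convert to kilometers
print(f"{km / 1e6:.1f} million km")
```

The conversion gives about 35.4 million kilometers, matching the "35 million kilometers" figure to the precision quoted.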

This Fall – Cosmic Comet Show

But just because the Elenin comet ‘will not change much’ here on Earth, ‘does not mean skywatchers should not pay attention’.

“This comet [ Elenin ] may not put on a great show, just as certainly it will not cause any disruptions here on Earth, but there is a cause to marvel,” said Yeomans. “This intrepid little traveler [ Elenin, the comet ] will offer astronomers a chance to study a ‘relatively young comet’ that ‘came here from well beyond our solar system planetary region’. After a short while, it [ Elenin ] will be headed back out again, and we will not see or hear from Elenin for thousands of years. That’s pretty cool.”

NASA detects, tracks and characterizes asteroids and comets – passing relatively close to Earth – using both ground-based and space-based telescopes, and its Near Earth Object ( NEO ) Observations Program ( called: SpaceGuard ) discovers these objects, characterizes a subset of them and predicts their paths to determine if any could be potentially hazardous to Earth.

References

http://ssd.jpl.nasa.gov/sbdb.cgi?sstr=Elenin;orb=1;cov=1;log=0;cad=1#cad
http://ssd.jpl.nasa.gov/sbdb.cgi#top
http://ssd.jpl.nasa.gov
http://www.jpl.nasa.gov/news/news.cfm?release=2011-138
http://www.jpl.nasa.gov/asteroidwatch/newsfeatures.cfm?release=2011-129
http://www.jpl.nasa.gov/news/news.cfm?release=2010-144
http://www.space.com/11617-comet-elenin-wimpy-solar-system.html

– –

How big is comet Elenin? What does comet Elenin look like real close? Well, those just seem to be the third ( 3rd ) and fourth ( 4th ) problems, because NASA is ‘not reporting anything about those two ( 2 ) additional items’ either.

The only public cross-section comparative information analysis on comet Elenin was found in a privately produced video by a ‘good ole boy’ whose factual references appear fairly represented ( below ) – less his having ‘overlooked his own mis-typed input’ of “1,700,000,000,000” ( 1.7 trillion miles ) where he ‘thought’ he was typing “170,000,000” ( 170 million miles ) in his diligent attempt at a calculation demonstration:
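The scale of that typo is easy to quantify: 1,700,000,000,000 is exactly ten thousand times 170,000,000, so any figure the video derives from it is off by a factor of 10,000 ( a minimal illustrative check ):

```python
# Compare the mis-typed figure against the intended one.
typed = int("1,700,000,000,000".replace(",", ""))   # 1.7 trillion
meant = int("170,000,000".replace(",", ""))         # 170 million
print(typed // meant)   # error factor: 10000
```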

Submitted for review and commentary by,

Concept Activity Research Vault ( CARV ), Host
E-MAIL: ConceptActivityResearchVault@Gmail.Com
WWW: http://ConceptActivityResearchVault.WordPress.Com

 

Secret Locations


[ IMAGE ( above ): Special motion picture film and photographic imaging equipment caught this near-still image, from which Paul Bennewitz obtained certain information about ExtraTerrestrial Vehicles ( ETV ) passing into and out of this particular solid rock face mountain at Wirt Canyon near Dulce, New Mexico, USA ( click on image to enlarge ) ]

by, Kentron Intellect Research Vault ( KIRV )

March 17, 2011 11:08:42 (PST)

NEW MEXICO, Santa Fe – March 17, 2011 – Paul Philip Schneider, a mechanical engineer, was no slouch but a patriotic ex-military American who was found garroted to death shortly after publicly exposing a work-site area near the Archuleta Mesa ( between Colorado and New Mexico ). An event occurred there while he worked under ‘contract’ for MORRISON-KNUDSEN INC., when upon instructions from the Los Alamos Scientific Laboratory ( LASL ) Geosciences Division – now known as Los Alamos National Laboratory ( LANL ) – he was sent down an ‘old nuclear bomb borehole’ ( see further below about “Nuclear Fracturing” ) he claimed was located in a ‘remote area of northern New Mexico’, where his mission was to discover why huge plumes of ‘black soot’ were being forcefully shot up into the atmosphere at an active drill site where his team was located.


Schneider, by his own accounts, claimed the black soot atmosphere smelled like a sulphuric gas lingering in the air all around the drill site, but whoever heard of a ‘carbon atmosphere’ of ‘black sulphur gas’?

It is still seriously doubted if Phil Schneider knew what was actually happening down in the cavern he was ordered into through the borehole.

Did Schneider, know:

– What those plumes of ‘black soot’ actually consisted of, and ‘why’ they were being ‘forcefully shot out of the borehole’ with such ‘pressure’?

– How a ‘huge cavern came to be made’ at the bottom of the borehole?

– What the ‘cavern had been filled with and why’?

– Why boreholes were ‘specifically directed to be drilled precisely where they were’?

From what has been gathered through in-depth research, Schneider was apparently sent by KNUTSON to perform his role as a ‘mechanical engineering consultant’, and totally by accident discovered something beyond his wildest imagination – and many others’ too.

Schneider was lowered down one of the boreholes with an ‘armed black beret military team’ that surrounded him, but he could not understand ‘why’ he had such escorts – except for some unknown military security reason he wasn’t cleared to know – and being ex-military, Schneider knew better than to ask any questions, but to ‘do what was asked of him by his superior’.

Now comes the point within this report where the information becomes incredibly bizarre; however, through in-depth research, Schneider’s claims ( below ) have been pulled somewhat back into ‘more detailed perspective’ by ‘incredibly related facts’ that existed long ago.

Schneider and the other armed U.S. federal military employees discovered a group of what Schneider claims were ExtraTerrestrial Biological Entities ( EBEs ) that he referred to as alien “Grey talls” ( 7-foot extraterrestrial aliens with a bluish-purple bioskin color ).

Once lowered through the borehole, a deep underground cavern appeared, and this is when Schneider claims he instantly got off several rounds of his own ammunition ( from his sidearm pistol ) before being shot by some ‘blue beam of light’ – leaving him cut wide open in the chest and with several of his fingers cut off – apparently aimed at him by the first ( 1st ) alien he encountered down there.


Schneider further claimed that nearly sixty ( 60 ) U.S. government personnel and contractors lost their lives during this August 1979 ‘unpublicized event’, which Schneider described as having been involved in an underground cavern filled with alien beings.

Hard to believe, I know, because at first I totally scoffed at Schneider’s claims as being utterly ridiculous, until some unrelated research touched on earlier portions of what Schneider claimed.

Years earlier ( 2002 ) I was exchanging communications with someone related to Paul Philip Schneider on an entirely ‘different subject’ related to work his ‘father’, Oscar Schneider, was involved in with the U.S. Navy, and current research now dovetails back to what I was researching years ago surrounding a U.S. National Security Agency ( NSA ) ‘precious metals’ scientific recycling process that involved baking ore, resulting in low-yield nuclear energy transmutation of material into high-yield elements.

It appeared from further studies that the claims of Oscar Schneider’s son, Paul Philip Schneider, required further detailed clarifications, which he either could not provide earlier due to secrecy agreements with the U.S. government or may have felt he should ‘not’ provide too many specifics of, for fear of breaching a “business agreement” he made with the U.S. government; however, based upon current research beyond Schneider’s claims, incredible further facts go far beyond this report, sufficient to at least warrant a suitable public release of the information contained herein.

In 2002, Schneider’s intimately close friend sent me “Rhyolyte 47″ ( classified ) ‘official U.S. Navy documents’ pertaining to Paul Philip Schneider’s ‘father’, U.S. Navy Captain Oscar Schneider, who from his ‘military professional medical position’ became directly involved with “Project Blue Ship” ( also known as ) “The Philadelphia Experiment.”

Phil Schneider mentions that upon drilling “boreholes,” the drill-site teams were experiencing ‘huge plumes of black soot’ or “black dust” ( that stank ) being ‘rapidly ejected under pressure’ into the atmosphere – which also lingered near the ground – from within these underground hole locations they were being told to drill at.

None apparently knew ‘why’ or ‘how’ these odd occurrences were taking place, or why they were drilling where they were.

Initial research indicates that “Nuclear Fracturing” ( 1962 – 2005 ) had been used earlier in ‘underground nuclear blasts’ for what was said to be “oil experimentations,” which was a U.S. government cover for underground testing of nuclear warheads subsequent to Russia having exploded what it called the “Tsar Bomba,” a 100-megaton design nuclear bomb that the U.S. later claimed only yielded a 50-megaton blast, with a Circular Error Probable ( CEP ) range for death from irradiation up to 60-miles away from ground zero.

Did previously ‘large U.S. oil reserves once exist’ until subsequently becoming nuclear blasted underground, creating both a ‘high pressure black oil carbon soot atmosphere’ and simultaneously creating these ‘huge underground caverns’?

Were “Grey” alien “talls” ( 7-foot ) provided with a ‘government secret atmospheric environment’ consisting of ‘large underground cavern living’, where they were the only ones able to breathe a high-carbon pressurized atmosphere consisting of ‘gamma-irradiated particle elements’ derived from earlier U.S. secret military underground nuclear projects publicly called “Nuclear Fracturing”? If so, could an extraterrestrial alien agreement have been reached secretly with the U.S. government?

Was the LANL / LASL Geosciences Division ‘borehole drillings’ “Hot Rock” Project – publicly revealed as only seeking ‘new geothermal energy alternatives’ – only a ruse?

By ‘drilling boreholes’ ( in highly ‘specific remote locations’ where contractors were ‘told to drill’ ), did that ‘actually and instantly deplete’ all of the carbon-dense atmosphere used by these Grey alien “talls,” who were thereby being killed down inside these caverns? If so, why would the U.S. government destroy what it must have worked so hard to build for these aliens?

Might another “agenda” have entered, determining a takeover of those Grey tall ( 7-foot + ) aliens’ underground existence? If so, why were these “Grey talls” suddenly considered a “threat?”

There are only rumored mentions of a percentage-reduction number of Grey talls ( aliens ) being sighted on Earth, whereas “Grey shorts” ( 4-foot to 5-foot tall ) are now but rarely sighted; so, were nearly 80% of the “Grey talls” actually “wiped out” by ‘another alien species’ – as rumored?

Is there any truth to any of this and are there any other correlations worth researching to such deep underground military bases and caverns albeit for aliens or humans?

Why have ‘sophisticated landing strips’ been built on “Church of Scientology” ( COS ) ‘remote properties’ purchased by its “Church of Spiritual Technology” entity, which has also built ‘sophisticated underground vaults’ and ‘sophisticated long tunnel systems’ ‘guaranteed to last 1,000 years’, equipped with ‘sophisticated nuclear blast-proof doors’ built to withstand any direct nuclear hit? All of these have been substantiated through my research and documented by satellite photos, facility diagrams, and land parcel details and grant information on many of these COS / CST ‘remote installation properties’.

Related points to initially consider:

– Schneider claimed his work was on a Los Alamos National Laboratory ( LANL ) Project, which my research indicates was actually performed under the name “Los Alamos Scientific Laboratory ( LASL ) Geosciences Division” within its geothermal ( volcanic magma ) “Hot Rock” Project during August 1979 or ‘likely earlier’;

[ IMAGE ( above ): Archuleta Mesa ( top ) where ( due south ) is Dulce, New Mexico ( click to read, enlarge image ) ]

– Schneider of KNUTSON worked at this Archuleta Mesa / Kit Carson National Forest (aka) Carson National Park work-site during the August 1979 alien event;

– Schneider was possibly a ‘sub-contracted engineer’ ( like Keith Millheim of CER GEONUCLEAR CORP. ) working for KNUTSON ( Tulsa, Oklahoma ), performing what that industry refers to as “high-risk construction contracts;”

– New Mexico site on the Archuleta Mesa / Kit Carson National Forest – aka – Carson National Park ( New Mexico near Colorado border ); research land grants near Edith, Colorado and Blue Lake, New Mexico ( Indian land );

– KNUTSON, which Schneider claimed to have been working for, may have been a ‘specially named adjunct’ working out of Area 51 / S4 ( Groom Lake, Nevada ) while Schneider travelled to work at the Archuleta Mesa ( Edith, Colorado ) site. Could Schneider have been covering up the exact location as actually being in the Kit Carson National Forest Indian land of Blue Lake, New Mexico, near Taos, New Mexico – where the phenomena of the “Taos hum” is, and all the UFO sightings northeast of Taos, which would place the UFO sightings directly over the Indian land of Blue Lake, New Mexico;

– Check both work and work-site connections between EG&G CHANDLER ENGINEERING ( Tulsa, Oklahoma ) and KNUTSON ( Tulsa, Oklahoma ) on work-site contracts that took place during 1979 ( or ‘earlier’ ), in the area of the Kit Carson National Forest ( New Mexico ) and/or Archuleta Mesa ( New Mexico or southern Colorado ), performing an industry-referenced “high-risk construction contract;” and,

– Google Earth shows a ‘gigantic cave-in’ or ‘explosion hole from something’, scattered atop which is an unusual ‘broken and caved-in lattice pattern’ of ‘very long beams / tubes’ [ cavern / cave / tunnel support beam cylinders ] that appear ‘snapped like twigs’ and sunken into a remote cave-in area nearest Edith, Colorado in the Archuleta Mesa area.

====

Part 2

COS CST & IGSS Brief Note –

Church of Scientology ( COS ) and Church of Spiritual Technology ( CST ) New Mexico vault tunneling construction performed by INTERNATIONAL GROUND SUPPORT SYSTEMS ( Santa Fe, New Mexico ).

Note all ‘sub-contracts’, ‘consulting engineers’, ‘consulting architects’, ‘insurance companies’, and any other ‘related construction support firms’ for connections, between:

– INTERNATIONAL GROUND SUPPORT SYSTEMS ( Santa Fe, New Mexico );
– CER GEONUCLEAR CORPORATION of EG&G ( Las Vegas, Nevada 89114 );
– MORRISON-KNUDSEN INC. – aka – KNUTSON ( Las Vegas, Nevada );
– MORRISON KNUDSEN ENGINEERS – aka – KNUTSON ( Tulsa, Oklahoma );
– CHANDLER ENGINEERING of EG&G ( Tulsa, Oklahoma );
– Others ( e.g. AUSTRAL OIL, et al. ).

====

Part 3 –

Nuclear Fracturing Projects –

Atoms for Peace program fails oilpatch testing. Atoms for Peace was a grand idea aimed at benefiting the world, and it was tested in the oilpatch. It never reached its potential in ‘industrial use’; however, it did set off an active alternative: accelerating the nuclear arms race between the U.S. and Russia.

– –

UNITED NATIONS ( UN ) – UNITED STATES President Dwight D. Eisenhower –

It kicked off on November 28, 1953, when the United Nations ( UN ) General Assembly asked the Disarmament Commission for suggestions to halt nuclear proliferation; 10-days later, U.S. President Dwight D. Eisenhower addressed the UN General Assembly saying:

“The United States would seek more than the mere reduction or elimination of atomic materials for military purposes. It is not enough to take this weapon out of the hands of soldiers. It must be put into the hands of those who will know how to strip its military casing and adapt it to the arts of peace.”

– –

OPERATION PLOWSHARE ( 1957 – 1977 ) –

In the United States, that effort kicked off OPERATION PLOWSHARE – named for the Biblical passage about turning swords into plowshares. The United States had high hopes for Operation Plowshare, anticipating twenty-seven ( 27 ) programs using nuclear devices for peaceful purposes.

– –

– PANAMA – – NICARAGUA – – COSTA RICA –

Among those plans were proposals to use more than one-hundred ( 100 ) nuclear explosions to blast a 37 mile ( 60 kilometer ) long canal across the Isthmus of Panama at San Blas.

Another plan would have used at least 250 blasts for a 140 mile ( 225 kilometer ) canal along the Nicaragua / Costa Rica border.

– –

– ALASKA –

PROJECT CHARIOT Cape Thompson, Alaska

Another, PROJECT CHARIOT, which almost went into action, would have used several hydrogen bombs to blast out a harbor at Cape Thompson, Alaska on the Chukchi Sea, about 75 miles ( 121 kilometers ) from the Russian border at the Bering Strait.

– –

– NEVADA –

PROJECT SEDAN ( 06JUL62 @ 10:00 a.m. ) Yucca Flats, Nevada

Concerns over the native population reduced that to a 104 kiloton proof-of-concept blast called SEDAN at Yucca Flats, Nevada on July 6, 1962. The blast displaced 12,000,000 ( 12 million ) short tons of dirt and released a 12,000 ft. high ( 3,660 meter ) radioactive cloud.

– –

– NEW MEXICO –

PROJECT GASBUGGY ( 10DEC67 )
Kit Carson National Forest ( New Mexico ) / Archuleta Mesa adjacencies: Dulce, NM; Blue Lake, NM; and, Edith, CO.
LAWRENCE LIVERMORE NATIONAL LABORATORY ( Livermore, California )
U.S. ATOMIC ENERGY COMMISSION ( AEC )
EL PASO NATURAL GAS

Native Americans of the Tiwa tribe continue cultural traditions to this day. Around 1903, by Executive Order, U.S. President Theodore Roosevelt took over 48,000 acres of Tiwa tribal land ( in the “Sangre de Cristo Mountains” of New Mexico ), making it the “Kit Carson National Forest” (aka) “Carson National Forest” in New Mexico. In 1970, by Executive Order, U.S. President Richard Nixon returned part of that land to the Tiwa tribe.

A ‘tribal disaster’ ( elsewhere ) was mentioned in an official U.S. Administration letter. Pull letter from existing files under Knights of Malta member Thomas Sawyer.

Kit Carson National Forest, Sangre de Cristo Mountains, Archuleta Mesa, and other names narrow the area of ‘work-site location(s)’ requiring ‘further research’:

– Jemez Plateau;
– Jemez Mountains;
– Fenton Hill, New Mexico <?>
– UNION OIL ( Baca location ) or UNOCAL Geothermal Division ( Los Angeles, CA );
– Mayrsville <?> could be “Marysville” < sp ? >;
– Coso;
– LASL Geosciences Division;
– ERDA ( Energy Research and Development Administration ) National Laboratories Division of Geothermal Energy ( DGE );
– Drilling Third ( 3rd ) Borehole ( code name: EE-1 )

Most of the real testing took place in the oilpatch, starting with PROJECT GASBUGGY – the first test aimed at releasing natural gas from tight sands and the first use of a nuclear device for industrial purposes.

It took place on December 10, 1967 in the Carson National Forest of New Mexico – about 90 miles northwest of Santa Fe, New Mexico and 25 miles southwest of the town of Dulce, New Mexico.

The U.S. ATOMIC ENERGY COMMISSION ( AEC ) was the oversight agency as LAWRENCE LIVERMORE NATIONAL LABORATORY ( LLNL ) and EL PASO NATURAL GAS ( EPNG ) conducted the test. They set off a 29 kiloton blast 4,240 ft ( 1,293 m ) underground in tight gas sands. According to official figures, the blast created a cavity 80 feet ( 24 m ) wide and 335 feet ( 102 m ) high filled with gas. Unfortunately, the gas was ‘too radioactive’ to use. They didn’t stop there.

– –

– COLORADO –

PROJECT RULISON ( 10SEP69 ) Rifle, Colorado; CER GEONUCLEAR CORP. ( Las Vegas, Nevada ); Keith K. Millheim ( president of Strategic Worldwide LLC The Woodlands, Texas ) under CER GEONUCLEAR CORP. contract; AUSTRAL OIL ( Denver, Colorado ); TELEDYNE ISOTOPES ( Palo Alto, California ); ATOMIC ENERGY COMMISSION ( AEC ).

The industry moved on to Project Rulison, probably the best known of the nuclear blasts for peace. This blast, also under the control of the ATOMIC ENERGY COMMISSION ( AEC ), was conducted by CER GEONUCLEAR CORP. ( Las Vegas, Nevada ) and AUSTRAL OIL ( Denver, Colorado ).

In between international assignments, Keith K. Millheim ( president of Strategic Worldwide LLC, The Woodlands, Texas ) was seconded by CER GEONUCLEAR CORP. to help design the drilling and testing of the Project Rulison Nuclear Gas Stimulation Experiment.

This 43 kiloton blast – 2.6 times the size of the bomb dropped on Hiroshima, Japan – was lowered 8,426 ft ( 2,568 m ) underground from a drill site on the southwest flank of Doghead Mountain in Battlement Creek Valley in Garfield County, Colorado, approximately 40 miles northeast of Grand Junction, Colorado and 11 miles southwest of Rifle, Colorado.

During hearings in the area, testimony revealed that the government believed the blast would fracture the tight Mesaverde gas sands and release more gas than conventional fracturing. If they could find a better way of fracturing, they might have the key to releasing more of the 317 Tcf of gas in place in tight sands, according to an article by Chester McQueary ( a Parachute, Colorado resident ) writing for the High Country News. Also during the hearings, David Evans of the Colorado School of Mines testified that the industry would have to set off 13,000 similar blasts to get the kind of recoveries the government wanted.

According to McQueary, one ( 1 ) week before the blast the AEC set up a 5 mile ( 8 kilometer ) quarantine zone around the well site as workers lowered the bomb into the hole. It even paid some homeowners to leave for the day. Winds that could have carried radiation north to Rifle, Colorado or west to Grand Junction, Colorado delayed the test until September 10, 1969. Protesters entered the quarantine area in twos and threes so they couldn’t be easily removed, and the KWSR radio station in Rifle carried a countdown for the blast. Finally, security forces simply left the protesters, McQueary among them, in the quarantine area. He felt a “mighty thump” that lifted him 8 inches into the air; the blast measured 5.5 on the Richter scale in Golden, Colorado.

Again, about 30 days later, the test team found the ‘gas was too radioactive to use’, but that wasn’t the end of the ‘nuclear program in natural gas’.
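The yield comparison and depth conversion above can be checked with simple arithmetic. This is only a sketch: the Hiroshima yield is not stated in the text, and the value used here ( roughly 16.5 kilotons, inside the commonly published 13 – 18 kiloton range ) is the assumption implied by the article’s 2.6× figure:

```python
# Rough arithmetic check of the Project Rulison figures quoted above.
# Assumption: Hiroshima yield ~16.5 kt, the value the 2.6x ratio implies.
RULISON_KT = 43.0
HIROSHIMA_KT_ASSUMED = 16.5
M_PER_FT = 0.3048  # exact foot-to-meter conversion factor

ratio = RULISON_KT / HIROSHIMA_KT_ASSUMED
depth_m = 8426 * M_PER_FT  # emplacement depth of the device, in meters

print(round(ratio, 1))   # matches the article's 2.6x comparison
print(round(depth_m))    # matches the article's ( 2,568 m ) conversion
```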

– –

– COLORADO –

PROJECT RIO BLANCO ( 1971 – 17MAY73 )

EG&G CER GEONUCLEAR CORP. ( Las Vegas, Nevada – 1971 – 1973 ); EQUITY OIL CO. ( Salt Lake City, Utah ); C.F. KNUTSON ( 1973 – 1975 ); D.L. Coffin ( 1968 – 1971 ); F.A. Welder ( 1968 – 1971 ); R.K. Glanzman ( 1968 – 1971 ); X.W. Dutton ( 1968 ).

<?> Virginia Glanzman ( USGS – Denver, Colorado ) <?>

The next test also took place in the Piceance Basin of Colorado, this one on May 17, 1973. It was called Project Rio Blanco and was scheduled in Rio Blanco County about 75 miles northwest of Grand Junction, Colorado and 30 miles ( 48.3 kilometer ) southwest of Meeker.

EG&G CER GEONUCLEAR CORP. ( Las Vegas, Nevada ) and EQUITY OIL CO. ( Salt Lake City, Utah ) conducted the blast, in which the operators set three ( 3 ) 33 kiloton devices between 5,838 feet and 6,689 feet ( 1,781 meters and 2,040 meters ) to blast out a huge cavern in the tight Mesaverde below the Green River oil shale. The blasts went off, but the caverns didn’t connect.

– –

– NEVADA –

Nevada Test Site ( NTS ) ( 1993 – 1995 )
Nellis Air Force Base, Test Range Site, S-1 ( Nevada )
Area 51

LAWRENCE LIVERMORE NATIONAL LABORATORY ( LLNL – Livermore, California ) DESERT RESEARCH INSTITUTE ( P.O. Box 60220, Reno, Nevada 89506-0220 ) D.K. Smith; B.K. Esser; J.L. Thompson; J.I. Daniels; R. Andricevic; – 1994 – L.R. Anspaugh  ” ; R.L. Jacobson  ” ; I.Y. Borg; – 1976 – R. Stone  ” ; H.B. Levy  ” ; L.D. Ramspott  ” .

– –

– WYOMING – PROJECT WAGON WHEEL

AUSTRAL OIL
PRESCO INC. ( The Woodlands, Texas – V-P: Kim R.W. Bennetts )
EL PASO NATURAL GAS
U.S. ENVIRONMENTAL PROTECTION AGENCY ( EPA )

The federal agency had planned to move next to PROJECT WAGON WHEEL, but that test never took place. Wagon Wheel was scheduled to loosen up the tight formations on the Pinedale Anticline in Sublette County, Wyoming – an area operators have only recently started developing with large numbers of wells.

Unlike PROJECT GASBUGGY, which was a cost-is-no-object technical test, PROJECT WAGON WHEEL was supposed to ‘test the profitability of atomic fracturing’. It was the biggest test planned to date, calling for five ( 5 ) 100-kiloton nuclear explosions to be set off 5 minutes apart at depths from 9,220 feet to 11,570 feet ( 2,812 meters to 3,529 meters ) in a field discovered by EL PASO NATURAL GAS.

The operators planned to wait four ( 4 ) to six ( 6 ) months before testing the well, and thought radiation released in testing would be lower than normal background radiation. But the project gathered a lot of opposition. Among the statements that killed it was testimony by Dr. Ken Perry, a geologist with the University of Wyoming.

Looking at plans for 40 to 50 nuclear explosions per year for full area development, he said southwestern Wyoming would become the ‘earthquake center of the world’, according to Adam Mark Lederer in a thesis entitled, “Using Public Policy Models to Evaluate Nuclear Stimulation Programs: Wagon Wheel in Wyoming.”

After the PROJECT RIO BLANCO blast, officials plugged and abandoned three ( 3 ) wells in the area, but left three ( 3 ) wells open on the RB-E-01 drill pad so they could monitor the wells. When they finally abandoned the surface facilities, the radiation was no different than the background radiation in the area except for tritium readings, which exceeded government criteria in several samples.

In 2002, officials decided to lift all restrictions on surface activity. They did mandate that there be no penetration from the surface down to 1,500 feet ( 458 meters ) within 100 feet ( 30 meters ) of the well bore, and no intrusion from 1,500 feet to 7,500 feet ( 458 meters to 2,288 meters ) within a 600 foot ( 183 meter ) radius of the well bore. The situation was similar at the Rulison well site, but with a recent twist.
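The two depth bands and radii described above amount to a simple rule. The sketch below is illustrative only: the function name is invented here, and the handling of exact boundary values is an assumption, not part of any official criteria.

```python
def drilling_prohibited(depth_ft: float, distance_ft: float) -> bool:
    """Return True if a penetration at the given depth and radial
    distance from the well bore falls inside the 2002 Rio Blanco
    exclusion zones as described above (boundary handling assumed)."""
    if depth_ft <= 1500 and distance_ft < 100:
        return True   # surface band: no penetration within a 100 ft radius
    if 1500 < depth_ft <= 7500 and distance_ft < 600:
        return True   # deep band: no intrusion within a 600 ft radius
    return False      # outside both exclusion zones

# Illustrative checks:
print(drilling_prohibited(1000, 50))    # True  (shallow, inside 100 ft)
print(drilling_prohibited(5000, 500))   # True  (deep band, inside 600 ft)
print(drilling_prohibited(5000, 700))   # False (outside the 600 ft radius)
```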

Following the $6,500,000 ( USD ) Rulison well, also called the AUSTRAL OIL Hayward #25-95 in Section 25-7s-95w, the government continued monitoring the subsurface as it had at Rio Blanco. In both cases it wanted to monitor the possible migration of radiation. The Environmental Protection Agency ( EPA ) also conducts annual sampling of deep monitoring wells and water sources in the area. Except for deep radiation at both sites, the areas are clean.

Recently, however, PRESCO INC. ( The Woodlands, Texas ) sought permission for 40-acre spacing over an area that would include the Rulison well site.

It had already drilled a well in the prolific Rulison field 1.5 miles ( 2.4 kilometers ) from the test site with no sign of radioactivity. Approval would have allowed the company to drill multiple wells near the Hayward well bore. Kim R.W. Bennetts, vice president of exploration and production for the company, said even if it drilled into the cavity, which it didn’t plan to do, very little radiation remained. Bennetts also said the company couldn’t, and wouldn’t, sell radioactive gas. Its planned wells in the area would be 1,200 feet ( 366 meters ) to the northeast and 1,600 feet ( 488 meters ) to the southeast of the Rulison test well.

After a February 10, 2004 meeting of the Colorado Oil & Gas Conservation Commission, the company then received approval to drill only one well on 40-acre spacing within a half-mile radius of the Rulison test well.

The company even had support from the county until it raised its plans from one ( 1 ) well to four ( 4 ) wells. The Colorado commission said it would approve the wells on a case-by-case basis. Currently, with no plans by any agency to continue work with ‘nuclear devices in the oilpatch’, it looks as if that research has come to an end.

While it lasted, however, it was a daring period of research into ‘advanced methods of releasing huge volumes of natural gas’, and that research continues today.

====

PART 4

EG&G –

EG & G INC. (aka) EG and G INCORPORATED
45 William Street
Wellesley, Massachusetts 02181 U.S.A.
Telephone: (781) 237-5100
Fax: (781) 431-4255
WWW: http://www.egginc.com

Statistics:

Public Company

Incorporated: 1947 as Edgerton, Germeshausen & Grier, Inc.
Employees: 13,000
Sales: $1.41 billion ( 1998 )
Stock Exchanges: New York
Ticker Symbol: EGG

NAIC: 334413 Semiconductor & Related Device Manufacturing; 334419 Other Electronic Component Manufacturing; 334513 Instruments & Related Product Manufacturing for Measuring, Displaying & Controlling Industrial Process Variables; 334519 Other Measuring & Controlling Device Manufacturing; 54133 Engineering Services; 54199 All Other Professional, Scientific, & Technical Services; 54171 Research & Development in the Physical, Engineering & Life Sciences

Company Perspectives:

Our vision is that we can create value in an environment of ever-accelerating change. Value creation is our singular aim and ultimate measure of success. We believe that the increasing drive to create value represents the surest and most consistent avenue for us to benefit our customers, employees, stockholders and constituent communities. Our value creation model focuses on growth primarily derived from internal development.

Company History:

EG & G Incorporated is a diversified technology company that develops and provides products for public and private customers in the medical, aerospace, telecommunications, semiconductor, photographic, and other industries. The company’s operations are broken into five business units: Instruments; Life Sciences; Engineered Products; Optoelectronics; and Technical Services. Its Instruments operation is based on x-ray imaging systems and provides screening and inspection systems for use in airport and industrial security, and environmental, food, and nuclear industry monitoring. The Life Sciences unit develops systems for biochemical research and medical diagnostics. The Engineered Products unit designs and produces pneumatic systems, seals, and bellows for aerospace, semiconductor, and power generation markets. EG & G’s Optoelectronics division specializes in optical sensing devices for industrial and medical applications. The company’s final unit, Technical Services, provides engineering, research, management, and support services to governmental and industrial clients.

Nuclear Management and Monitoring: 1940s – 1950s

EG & G was established by three nuclear engineers from the Massachusetts Institute of Technology shortly after the end of World War II. These engineers, Harold E. Edgerton, Kenneth J. Germeshausen, and Herbert E. Grier, had been involved in the American effort to construct an atomic bomb during the war. So valued were their contributions that after the war the government asked them to establish a company to manage further development of the country’s nuclear weapons. The three established a small partnership called Edgerton, Germeshausen & Grier on November 13, 1947, and quickly began collecting contracts to advise the government on nuclear tests in Nevada and on South Pacific islands.

One of the first employees of the new company was Bernard J. O’Keefe, another MIT graduate who had worked for Dr. Grier during the war. O’Keefe served with the 21st Bomber Command in the Mariana Islands during the war, and is said to have personally wired the bomb that later destroyed the Japanese city of Nagasaki. O’Keefe was sent to Japan after its surrender to investigate that country’s progress with nuclear technology and recruit promising Japanese scientists for other atomic projects. A specialist in the design and development of electronic instrumentation and controls, O’Keefe quickly gained an important position in the growing firm.

Inconvenienced by the length of the company’s name, employees soon began to rely on the simple acronym EG & G, which later became its official name. In order to maintain close contact with MIT and its excellent nuclear and electronic engineering programs, EG & G set up its headquarters in Bedford, Massachusetts, in northwest suburban Boston.

EG & G was involved in the U.S. effort to build a more powerful nuclear weapon, the hydrogen bomb. During one test, Grier and O’Keefe were present at a Nevada test site to personally witness an H-bomb detonation. After the weapon failed to explode, Grier and O’Keefe flipped a coin to determine who should scale the 300-foot test tower and disarm the bomb. Although O’Keefe lost, he won the special distinction of being the first man to disarm a live H-bomb.

O’Keefe had a second brush with disaster in 1958 when he witnessed an H-bomb detonation at Bikini Atoll in the South Pacific. There, shifting winds in the upper atmosphere caused a radioactive cloud of fallout to shower his bunker.

These experiences taught O’Keefe the awesome destructive power of nuclear weaponry and the dangers of radioactive fallout. As an engineer and manager he was bound to perform his company’s contracts, but grew personally opposed to the use of nuclear weapons. This sharpened his sense of responsibility toward the emerging form of warfare, a quality that was not lost upon the government’s Atomic Energy Commission.

As a result of EG & G’s experience with detonations, and O’Keefe’s concern for nuclear non-proliferation, the company became increasingly involved in distant monitoring projects, particularly as they related to Soviet nuclear tests. By observing changes in the atmosphere, EG & G was able to determine the incidence and strength of Soviet tests and provide important data on the progress of Moscow’s weapons program. In the process, EG & G gained highly specialized knowledge in environmental sciences. These skills had numerous applications outside the weapons industry, in such areas as pollution control and environmental management.

Exploring Commercial Markets: 1960s

As early as 1960, O’Keefe and the company’s three founders had considered establishing a new environmental analysis business, which would lessen EG & G’s dependence on low-margin government contracts and permit the company to enter new commercial markets. But at the time, neither public concern nor legislation placed a high value on such endeavors.

Three years later, the United States, the Soviet Union, and the United Kingdom signed a protocol that banned nuclear tests in the atmosphere, above ground, in the water, or in outer space.

With this document, EG & G appeared to lose a major portion of its business. However, the protocol did not prevent underground tests, which were far more complicated.

EG & G remained the only company with the proper supervisory credentials to manage this type of nuclear testing.

The company was forced to develop geologic analytical capabilities and become a ‘tunneling’ and ‘mining operation’ as well.

Furthermore, the government had also laid plans to establish a kind of NASA equivalent for oceanography ( what became NOAA ).

Eager to take a place in this organization, EG & G invested heavily in oceanographic research.

While the underwater NASA never materialized, the efforts enabled O’Keefe to further cultivate new commercial markets for EG & G, including excavation and water transmission.

During this time the company’s three ( 3 ) founders moved further into retirement, taking ceremonial “executive chairman emeritus” positions.

As a result, O’Keefe became the de facto head of the company.

EG & G also pursued a strong acquisition campaign, taking over thirteen ( 13 ) companies between 1964 and 1967, when a strong environmental movement began to form in the United States.

With legislation still years away, EG & G began laying plans to play an important role in the environmental projects it was sure would result.

EG & G was divided into four ( 4 ) main operating divisions.

EG & G INTERNATIONAL, primarily concerned with oceanography, was the smallest.

The EG & G standard products and equipment division, which produced a variety of machines and electronic devices, grew fastest during the 1960s.

The EG & G nuclear detonation and monitoring business segment remained its largest.

The EG & G nuclear technology group, its most innovative and interesting, involved the design of nuclear rocket engines ( ion engines ) for interplanetary propulsion.

EG & G CER GEONUCLEAR CORPORATION projects included “nuclear landscaping” – controlled nuclear-explosion blasts carving out:

Harbors; Canals; and, Passageways ( Tunnels, Pipelines, etc. ).

The EG & G CER GEONUCLEAR CORPORATION unit also participated in tests of nuclear explosions ‘used to fracture layers of rock’ for access to otherwise inaccessible reserve locations of oil and gas for exploitation. Although feasible, these EG & G ‘public works’ projects failed to gain public support.

In fact, opposition to nuclear technology in general ‘increased’ as people grew wary of the safety of nuclear energy.

In addition, nuclear excavation would have required an unlikely waiver of the 1963 Nuclear Test Ban Treaty.

With the evaporation of good commercial prospects for its nuclear engineering expertise, EG & G was forced to rely again on military projects. Despite efforts to step up mechanical and electrical engineering work (partly by acquiring a spate of small research companies), EG & G mustered only four percent annual growth during the late 1960s.

Failed Initiatives: 1970 – 1975

Interest in nuclear power increased dramatically during the 1973 – 1974 Arab oil embargo, during which Americans sought to reduce their costly dependence on imported oil.

Realizing that the world’s oil exporting nations stood to permanently lose their largest customer, the United States, King Faisal of Saudi Arabia promptly called for an end to the embargo. Nonetheless, while Americans regained access to Arab oil, the end of the embargo was disastrous for the U.S. nuclear energy industry and for EG & G.

The end of the embargo removed one of the great justifications for nuclear power, and gave anti-nuclear activists time to properly organize legislative battles.

While EG & G was being locked out of yet another promising commercial application of its technologies, it attempted projects in other fields.

Some years earlier, in an effort to develop a new process for purifying nuclear isotopes, EG & G developed a flash tube that was ideal for photocopiers, but by the time an application could be developed, XEROX had already saturated the market with conventional designs.

In another ill-timed move, the company bet that environmental laws would cause demand for the unconventional Wankel engine to rise.

EG & G purchased a Texas automobile testing agency in hopes of winning large emission monitoring contracts, but the oil embargo destroyed the market for the clean but gas-eating Wankel engine as automobile environmental legislation was abandoned.

During this time, with the encouragement of the government, EG & G established a minority-dominated subsidiary, EG & G Roxbury, in a neighborhood of Boston, hoping to help strengthen the economic structure of the community. The project floundered, however, when bureaucrats failed to properly support the program, causing only a few sales to be made from the subsidiary. After a few years of disastrous results, the entire program was wound up.

On the Upswing: 1976 – 1980s

In 1976, the EG & G environmental division, languishing after the oil embargo, evolved into a comprehensive resource-efficiency operation providing complete oceanographic, atmospheric, and geophysical analysis rather than concentrating on environmental compliance. By conserving resources, operations could more easily achieve pollution- and waste-reduction targets.

The EG & G port-development business, unable to use nuclear devices to carve custom-designed harbors, instead became a world leader in oceanographic studies and channel engineering, designing numerous tanker ports in the Persian Gulf and bauxite harbors in South America.

In 1979, U.S. President Jimmy Carter asked Bernard O’Keefe to serve as Chairman for the Synthetic Fuels Corporation of the U.S. government. Having already been asked to serve on a transition team for then presidential candidate Ronald Reagan, O’Keefe refused President Carter’s offer.

With the election of U.S. President Ronald Reagan in 1980, the United States took a sudden turn toward military armament programs.

EG & G experienced a resurgence in its flagging nuclear testing business and was tapped to develop a number of new nuclear weapons systems, including the:

MX nuclear underground mobile railroad missile; and, Strategic Defense Initiative ( SDI ) – Star Wars Program.

A self-described “card carrying member of the military-industrial complex,” Bernard O’Keefe wrote in his book “Nuclear Hostages” that the United States and the Soviet Union were deadlocked in a nuclear arms race neither could control. Ironically, EG & G remained deeply involved in a number of Reagan administration projects O’Keefe opposed, including:

MX nuclear underground mobile railroad missile; Neutron bomb; and, Europe nuclear missile weapon stationings.

Nevertheless, EG & G’s pre-tax operating profit doubled from the new business.

EG & G also became involved in the space shuttle program, checking the spacecraft’s electrical components, loading its fuel, and managing the Cape Canaveral, Florida space center during shuttle missions.

EG & G’s site-management abilities won it a position with the elite U.S. Department of Energy ( DOE ) Nuclear Emergency Search Team ( NEST ), which investigated nuclear extortion threats.

The company also won a contract to manage the U.S. government’s troubled Rocky Flats installation outside Denver, Colorado, a facility that manufactured nuclear weapon triggers and had been widely criticized for mismanagement under ROCKWELL INTERNATIONAL.

EG & G maintained its momentum throughout the 1980s, winning contracts from diverse governmental agencies, including the:

U.S. Department of Energy ( DOE ); U.S. Army; U.S. Air Force; U.S. Department of Defense; and, U.S. Customs Service ( now part of the U.S. Department of Homeland Security ).

In 1988 the company hit a record high for both sales and earnings.

O’Keefe retired from EG & G during this period of strong growth, and was succeeded by John M. Kucharski.

Rapid Diversification: 1990s

Under U.S. President George Bush, and with the subsequent collapse of the Soviet military threat, the number of EG & G nuclear test projects decreased significantly.

As such, EG & G was under pressure to cultivate profitable new commercial ventures to offset the loss of revenue from military contracts.

The company responded rapidly, entering new commercial markets via a series of acquisitions.

One of the first acquisitions of the 1990s was ELECTRO-OPTICS, the optoelectronics business of GENERAL ELECTRIC ( GE ) Canada.

GENERAL ELECTRIC ELECTRO-OPTICS designed and produced advanced semiconductor emitters and detectors for defense, space, telecommunications, and industrial applications.

Other new ventures followed quickly:

WALLAC Group ( Finland-based ), which produced analytical and diagnostic systems;

IC SENSORS, a maker of sensing devices for industrial, automotive, medical, and aerospace uses; and,

NoVOCs Inc., an environmental remediation specialist.

In 1994, facing legal pressure from activist groups, EG & G announced that it would discontinue its nuclear-related endeavors as its various existing contracts expired. That same year, the company undertook a major reorganization to accommodate its newly acquired interests and the discontinuation of its nuclear business.

One important area of focus for the company was its Instruments division, which was rapidly becoming a leader in the field of weapons and explosives screening systems. After providing x-ray machines and metal detectors for the Democratic and Republican national conventions in 1992, the company won a contract to supply state-of-the-art explosives detection systems for U.S. federal courthouses across the nation. A subsequent contract with the Federal Aviation Administration ( FAA ) called for ten ( 10 ) of EG & G’s most advanced explosives detection systems for screening checked baggage in airports.

In 1998, EG & G president and CEO John M. Kucharski was replaced by Gregory Summe ( former president of ALLIEDSIGNAL INC. Automotive Products Group ), known for his ability to streamline and consolidate technology businesses.

He assumed his new position with two ( 2 ) goals:

1. improving operational efficiency; and,

2. restructuring the EG & G portfolio to sharpen its focus on identified high-growth markets.

One of his first efforts toward better operational efficiency was to consolidate all EG & G’s business into five ( 5 ) independent strategic business units:

Life Sciences; Instruments; Engineered Products; Optoelectronics; and, Technical Services.

The company also began repositioning its portfolio by liquidating assets that fell outside these growth areas and making acquisitions that strengthened the EG & G position within its identified markets.

This strategy led to the largest acquisition in the company’s history:

LUMEN TECHNOLOGIES.

Lumen, purchased for $250,000,000 ( USD ) in December 1998, was known globally as a producer of ‘specialty lighting’.

The Lumen acquisition served to strengthen the company’s existing position in the medical lighting market, while at the same time allowing it entry into the areas of video and entertainment lighting.

Looking to the Future –

The consolidation efforts that Summe’s management team initiated in 1998 were expected to continue, with the goal of streamlining sites, functions, and processes so as to reduce operating costs and improve quality, consistency, and response time.

The company intended to continue its focused acquisition strategy.

It also planned to continue aggressively developing and marketing new products in its various divisions. Some of the products expected to be introduced were high-volume, cost-effective systems for drug screening, and a Point of Care system that allowed diagnosticians to determine whether or not a patient had suffered a heart attack in just 15-minutes.

In addition to introducing new products, the company also anticipated an increased emphasis on product line extensions and renewals.

EG & G Principal Subsidiaries ( Primary List – Public ):

EG & G Alabama Inc.; EG & G ASTROPHYSICS ( England ); EG & G ATP GmbH ( Germany ); EG & G ATP GmbH & Co. Automotive Testing Papenburg KG ( Germany ); EG & G Automotive Research Inc.; EG & G CALIFORNIA INC.; EG & G Benelux BV ( Netherlands ); EG & G Canada Investments Inc.; EG & G Canada Limited; EG & G DEFENSE MATERIALS INC.; EG & G do Brasil Ltda.; EG & G E.C. ( UK ); EG & G Emissions Testing Services Inc.; EG & G ENERGY MEASUREMENTS INC.; EG & G Exporters Ltd. ( U.S. Virgin Islands ); EG & G Florida Inc.; EG & G GmbH. ( Germany ); EG & G HOLDINGS INC.; EG & G Hong Kong Ltd.; EG & G IC Sensors Inc.; EG & G Idaho Inc.; EG & G Information Technologies Inc.; EG & G Instruments GmbH. ( Germany ); EG & G Instruments International Ltd.; EG & G Instruments Inc.; EG & G International Ltd.; EG & G Japan Inc. ( USA ); EG & G Judson Infrared Inc.; EG & G KT AEROFAB INC.; EG & G Langley Inc.; EG & G Ltd. ( UK ); EG & G Management Services of San Antonio Inc.; EG & G Management Systems Inc.; EG & G Missouri Metals Shaping Company Inc.; EG & G Mound Applied Technologies Inc.; EG & G Omni Inc.; EG & G Pressure Science Inc.; EG & G Singapore Pte Ltd.; EG & G SPECIAL PROJECTS INC.; EG & G Star City Inc.; EG & G S.A. ( France ); EG & G SpA ( Italy ); EG & G Technical Services of West Virginia Inc.; EG & G Vactec Philippines Ltd.; EG & G Ventures Inc.; EG & G Watertown Inc.; ANTARCTIC SUPPORT ASSOCIATES ( Columbia ); B.A.I. GmbH. ( Germany ); Benelux Analytical Instruments S.A. ( Belgium; 92.3% ); Berthold A.G. ( Switzerland ); Berthold Analytical Instruments Inc.; Berthold France S.A. ( 80% ); Berthold GmbH & Co. KG ( Germany ); Biozone Oy ( Finland ); EC III Inc. ( Mexico; 49% ); Eagle EG & G Inc.; Eagle EG & G Aerospace Co. Ltd.; Heimann Optoelectronics GmbH ( Germany ); Heimann Shenzhen Optoelectronics Co. Ltd. ( China ); NOK EG & G Optoelectronics Corp. ( Japan; 49% ); PRIBORI Oy ( Russia ); PT EG & G Heimann Optoelectronics ( Singapore ); RETICON CORP.; Reynolds Electrical & Engineering Inc.; Science Support Corporation; SEIKO EG & G CO. LTD. ( Japan; 49% ); SHANGHAI EG & G RETICON OPTOELECTRONICS CO. LTD.; Societe Civile Immobiliere ( France; 82.5% ); THE LAUNCH SUPPORT CO. L.C.; VACTEC INC.; WALLAC ADL AG ( Germany ); WALLAC ADL GmbH ( Germany ); WALLAC A/S; WALLAC Holding GmbH ( Germany ); WALLAC Norge AS ( Norway ); WALLAC Oy ( Finland ); WALLAC SVERIGE AB ( Sweden ); WALLAC INC.; WELLESLEY B.V. ( Netherlands ); WRIGHT COMPONENTS INC.; ZAO PRIBORI.

====

PART 5

Research –

Keywords:

BaneBerry; Buggy; Cannikin; Dribble; Gnome; HardTack II; Hood; Pile Driver; PlumbBob; Roller Coaster; Rulison; Sedan; Shoal; Strategic Petroleum Reserve Operation ( SPRO ); Vela Uniform; Nuclear Rocket Development Station ( NRDS ); Kit Carson National Forest ( 10DEC67 ) – GasBuggy; Kit Carson National Forest ( AUG79 ) – Schneider’s Borehole Geophysics; Four-Dimensional Process Monitoring.

– –

– Pittman Station ( Henderson, Nevada ) USAF personnel support for Area 51.

– PAX = Pittman Air Station ( USAF )

– PAD = Area 51 Test Site

– Personnel Response Phrase For Area 51 ( Nevada ) Access Entry: “I work for ‘EG And G’ at the ‘Site’.”

– –

DESERT RESEARCH INSTITUTE ( DRI ), New Mexico.

– –

Places not likely to be visited by many of us are those secretly hidden underground – and even more so underwater – since at least the 1930s.

Superpower nations have reigned supreme on the strength of financial support for programs and projects worked on by citizens holding ‘deep underground secrets’ – secrets hidden for decades by governments that assured such workers their work was being done to protect ‘people’ and ‘property’, all “in the interest of national security.”

Rarely do ‘government contract worker ants’ ( citizens ) ever realize ‘what it is’ we need to be protected from. All anyone can think is that it must be something bad we could imagine, while never being able to comprehend anything ‘unimaginable’ we need protection from. Do any of us really want to know ‘what it is’ we are supposedly being protected against?

This report covers ‘unexplainable encounters’ governments decline to publicly explain – events buried over time and matters long forgotten from most memories, that is, until recent events began unfolding, putting old stories thought of as ‘myths’ back into perspective today.

Some noteworthy facts still remain ‘classified’ and ‘locked away’ by high-level authorities. Some, never revealed on television, in motion picture films, books, or newspapers, may now be reviewed through the selected intelligence bits ‘n pieces ( below ), which may begin to dawn on some of us who recall having once heard something about one or more of them but could not recall the source or details.

Legacy reports and more ( below ):

====

1975 – California ( Southern and Northern ) and Oregon ( Southern )

California Floats On Ocean?

John J. Williams of CONSUMERTRONICS CO. ( Alamogordo, New Mexico, USA ), said:

“Some time ago, I heard ( on a television interview show ) a man briefly mention that parts of California ( and neighboring states ) are floating on the Pacific Ocean.  He was a high ranking U.S. Navy officer aboard a top secret nuclear submarine that has been ( and is ) ‘exploring’ and ‘mapping’ enormous caverns and passageways underneath the Western U.S. for over 10-years now. A friend of mine finally tracked down the man who is now quietly living in retirement and asked that no details pointing to him be revealed as he does not want publicity and government attention. After writing this article, I destroyed my files on him.  This is his story.”

Williams explained that ‘not all’ areas in-question are actually ‘resting’ or “floating” on the ocean.

Many subterranean cavities are located beneath the western United States – not limited to just “California” – and consist of very large water-filled aquasystem passageways, explored up to several hundred miles inland by nuclear-powered submarine, particularly in regions of southern California, northern California, and southern Oregon.

Williams continues, “…When this U.S. Navy officer retired ( several years ago ), in spite of about 10-years of intensive U.S. Naval Oceanographic study, the U.S. Navy had still not gotten even a handle on their [ subterranean waterway passageway cavities ] exacts [ oceanographic coordinates ] and dimensions. Today, the story may be different. He [ U.S. Navy officer, retired ] made the following statements from his observations:

1. These passageways are labyrinths with widths from a few [ feet ] up to thousands of feet wide, averaging roughly 100-feet across [ wet caverns ];

2. Much like dry caverns, heights and depths vary a great deal, and in some cases two [ 2 ] or more ‘caverns or passageways pass over or under each other’ at different depths;

3. Most of the underwater entrances lie just off the Continental Shelf;

4. Most of the underwater entrances are ‘too small for submarine investigation’, but many that are large enough lie in waters that are too deep [ ultra deep sea ];

5. Some of the caverns ( in southern California ) are topped by oil while others are filled with gases believed approximate to our atmosphere [ in very ancient times ];

6. The San Joaquin Valley [ California ] is essentially a portion of the original cavernous area that collapsed eons ago due to its sheer weight;

7. What is being passed-off as the “San Andreas Fault” are large unsupported chambers [ caverns ] that are in the process of collapsing.  When the ‘big one’ [ earthquake ] finally hits, many scientists in the know believe that most of California will break off like a cold Hershey bar [ chocolate candy bar ] and slide into the ocean.

8. [ this item was later deleted due to the individual’s fear that disclosure may in-part – due to recent ( 1985 ) international events – disturb a resolution to the feared problem, a similar scenario to that portrayed in a James Bond motion picture film depicting underground caverns, silicon valley technologies, nuclear weapons, and the San Andreas fault. ];

9. A well known U.S. nuclear submarine lost its way within these passageways, resulting in its disappearance, but was publicly reported as lost amidst the open sea elsewhere using a recovery-effort cover [ GLOMAR ( Global Marine ) Explorer ].

Williams continued, “I have no reason to doubt the man.  I can’t tell for sure whether or not these caverns and passageways exist or to their extent.  The story does sound a bit fantastic but I have no reason to doubt the man [ retired U.S. Navy officer ].  I have seen copies of documentation at least proving he was a high ranking U.S. Navy officer with nuclear-powered submarine duty, and a distinguished scientist. His scientific background and reputation are impeccable.  He definitely cannot be labeled as a crackpot, lunatic or publicity seeker.

I would very much like more information on this topic …”

Upon further inquiries, by ‘inner earth’ researchers, John J. Williams responded with the following when asked whether or not he [ Williams ] had received any replies to his request for more information about the alleged subterranean aquasystem passageways below California:

“Since publishing our article on the vast cavern network under much of California, we have received many responses and inquiries.  Some of these responses appear to have been from knowledgeable sources.  Note that the material sent to us for this article was written by someone of very high repute whose credentials I personally checked out.  Due to an agreement with him, I cannot reveal his identity.”

John J. Williams continued, “One response was from a retired submarine U.S. Navy Commander claiming to have spent many years in the waters off California and that such caverns do ‘not’ exist. Another response was from an anonymous person who cited unpublished oil company seismographic data and stated, ‘Although most of the caverns you depict in your drawing are smaller, larger or located somewhat differently than the actual caverns, you are essentially correct … My information is more up-to-date than what you apparently relied upon.’  He ( or she ) did not supply any maps to pin down our differences, just some written descriptions, however some knowledgeable person could probably deduce his ( or her ) overall ‘map’ from the voluminous seismographic data sent.  I am in the process of looking for this input; it’s been several years now and it may have all been thrown out … Incidentally, the oil company seismic data had much data around the City of Fresno in the Central California area, if that helps any.”

1970s – California, County of Los Angeles, City of Long Beach

“One incident which may lend credence to California floating on the ocean was a newspaper story that made headlines ( in recent years ) in The Press Telegram newspaper of Long Beach, California, involving an oil discovery beneath Long Beach, California. As oil companies pumped oil out of the ground, the entire City of Long Beach began to sink up to 26-feet into the Pacific Ocean, and dikes had to be built to keep the seawater out.  The problem was ( temporarily ) resolved by ‘water injection’ ( i.e. pumping an equivalent amount of water into the ground caverns to replace the amount of oil removed ) in order to keep the City of Long Beach, California afloat.”

1963 April 23 – SubOceania

One note of interest, in connection with the account of John J. Williams, was a statement made by Virginia Louise Swanson, a prominent investigator of California’s elusive creature known as “Bigfoot,” a huge hairy creature that walks upright. Virginia L. Swanson has performed considerable studies on cave connections in relationship to the Bigfoot phenomena and its ability to hide so well, and refers to these California dry cavernous openings saying, “Somewhere I got the idea that a big portion of Death Valley [ California ] is located on a shelf of ‘false bedrock’.  A certain type of earthquake would collapse all of it down to an enormous series of caverns that would open-up into another Grand Canyon [ Arizona ].”

According to our knowledge, the only nuclear-powered submarines to ever disappear – under mysterious circumstances – were the USS SCORPION and USS THRESHER.

It is uncertain whether the retired U.S. Navy officer, whom John J. Williams spoke of, was referring to the USS SCORPION or the USS THRESHER, although the USS THRESHER disappearance drew more publicity. As a flagship of the World’s most advanced class of nuclear attack submarine, the USS THRESHER was designed to operate at greater sea depths and more silently ( noise-reduction propeller screws ) than any of its predecessor submarine vessels. The USS THRESHER was also endowed with highly significant advanced sonar equipment and fire-control systems, and was the most advanced submarine in the World at the time of its disappearance. It could, therefore, have easily been an ideal choice as a U.S. Naval Oceanographic underwater global exploratory vessel on a top secret mission to the caverns mentioned by the U.S. Navy officer earlier interviewed by John Williams.

On April 10, 1963 the USS THRESHER – under command of U.S. Navy Lieutenant Commander John W. Harvey, with a total of 129 men comprising the ‘crew’, ‘civilian technicians’, and ‘observers’ – according to official government reports, disappeared without any explanation, trace, or clue as to the fate of the vessel or its occupants. Nothing was ever recovered, and no indications of any oil slicks, radiation, floating debris, or similar signs of wreckage were ever seen.

It is interesting to note that, at the time almost all reports stated the USS THRESHER “disappeared” or was “lost” but no reports indicated it was “sunk” [ or “buried” or “captured” ].

One woman, whose husband was on the ill-fated USS THRESHER, reported she believed her husband was still alive.

Theologically speaking, the possibility of a long distance connection or “communion” on a deep emotional level between a husband and a wife may not always be consigned to the realm of the occult or psychic phenomena.  Many religions believe the very spiritual natures of a husband and a wife are united upon consummation of a marriage and thus become as Christian teachings indicate, “one flesh”.

The actual words of this woman, interviewed by William Carson and Jeannie Joy – two [ 2 ] writers [ columnists for Search Magazine ] devoted to pursuing strange events – shortly after the USS THRESHER disappearance, were as follows:

“My husband was on the submarine THRESHER when it disappeared.  I don’t consider myself a widow.  I don’t believe my husband is dead.  No, it’s not a matter of just not being able to believe it, to accept reality, I just can’t get over the conviction that he’s still alive somewhere.  I love my husband very much.  I know he loved – loves me.  We were very close.  We could always tell when something was wrong with each other.  Intuition, I guess.  I should have felt something the instant there was trouble, if he was really in serious trouble and knew it – a matter of life and death – but I didn’t.”

“What do you believe really happened?” Carson and Joy asked the attractive young woman.

“Most people think I’m crazy when I say this, but I believe the THRESHER was captured.”

“By whom?”

“I can’t say for sure, but there ‘was’ a Russian submarine spotted near there that day ( near where it reportedly vanished, 220-miles off Boston harbor ), only I can’t imagine how even the Russians could ‘capture’ a vessel like the THRESHER without leaving the slightest evidence!”

1989 – California, County of Inyo, City of Deep Springs ( east of Owens Valley and Bishop, CA ) and Nevada, Las Vegas

The following account, concerning an area just east of Owens Valley near Bishop, California, was related by Val Valerian in his ‘Leading Edge’ newsletter ( December 1989 – January 1990 issue ) article entitled “Deep Springs, California”:

“Deep Springs, California is an area that is becoming known as the site for very strange events.  According to the information released both on the air of KVEG AM radio and from other sources, the area is full of strange people wandering around in black suits. There have also been rumors that there is an underground facility in the area.

Checking with gravity anomaly maps proved that there are large cavities under the ground in that area. The wildest claims relative to the area have stated that alien life-forms are being released there … Deep Springs Lake has been probed and it appears bottomless.

Divers have traveled along an underground river 27-miles toward the Las Vegas, Nevada area before having to turn around.”

1963 – California, County of Inyo-Kern, Bishop & Casa Diablo

In the April 1963 issue of Search Magazine, investigative reporters William Carson and Jeannie Joy, in their regular column ( “Prying Into The Unknown” ) relayed the following information:

“It has always been a mystery to us, in the first place, how Mr. and Mrs. P.E. [ names excised for privacy ] can find and afford the time to do the sort of things most of us only dream of doing.  After knowing them for more than 15-years, it is inconceivable to suspect their integrity or sanity – and yet they impose the following excise upon our credulity.

While exploring for petroglyphs in the Casa Diablo vicinity of Bishop, California, Mr. & Mrs. P.E. came upon a circular hole in the ground, about 9-feet in diameter, which exuded a sulfurous steam and seemed recently to have been filled with hot water.  A few feet from the surface the shaft took a tangent course which looked easily accessible and, upon an impulse with which we cannot sympathize, the dauntless E.’s – armed only with a flashlight – forthwith crawled down into that hole. At a depth we’ve failed to record the oblique tunnel opened into a horizontal corridor whose dripping walls, now encrusted with minerals, could only have been carved by human hands, countless ages ago; of this the E.’s felt certain.

The end of the short passage was blocked by what seemed to be a huge doorway of solid rock which, however, wouldn’t yield.  Their flashlight was turned to a corner where water dripped from a protuberance, which proved to be a delicately carved face, distorted now by the crystallized minerals, and from whose gaping mouth water issued.

As Mr. and Mrs. E. stood there in silent awe – wondering what lay behind that immovable door – the strangest thing of all happened … but our chronology will not be incorrect if we wait until they return to the surface before revealing this, for now the water began gushing from the carved mouth and from other unseen ducts elsewhere in that cave and rising at an alarming rate. They hurried to the surface, and in less than 30-minutes there was only a quite ordinary appearing pool of warm mineral water on the desert floor.

‘Do you know,’ Mrs. E. said to her husband, ‘while I stood down there I heard music – the strangest, most weird music I’d ever heard.  But it seemed to come from everywhere at once or from inside my own head.  I guess it was just my imagination.’

Mr. E. turned pale. ‘My God,’ he said, ‘I thought it was my imagination but I heard it too – like music from some other world!’

Why do they call that rock formation – near where the E’s had their strange experience in Bishop, California – “Casa Diablo” [ known in English as ] “Devil House”? And why was that area [ County of Inyo ] named by the Indians as “Inyo” [ known in English as ] “dwelling place of the great spirit”?”

1991 – New Madrid Fault Area ( Central United States & Northeast Mexico )

– Typhlichthys Fish – Interwoven Immense Cave Systems ( Central United States & Southern United States – New Madrid Fault Area ) – Sotano de las Golondrinas Cave Hole ( Aquismo, S.L.P., MEXICO )

Erich A. Aggen Jr., in his article “Top Secret: Alien UFO Bases” ( Search Magazine issue Summer 1991 ), presented the following revelations concerning the UFO subterranean connection:

“A great deal of UFO research has also led to the conclusion that various … species of aliens have set up secret underground bases in the United States and other countries.  It is logical to assume that such bases have also been established elsewhere in the solar system.  If such bases exist, where would we find them?  Existing information allows us to make a few educated guesses.

Earth bases, underground: the dark, cavernous world beneath our feet is the source of many baffling mysteries. Clandestine UFO bases may be hidden deep within the earth in natural and/or artificial caverns.

As a former member of the National Speleological Society ( NSS ), I am well aware of the vast extent of cave systems within the United States.

In my own native state of Missouri, for example, there are over 2,500 known caves and dozens of new ones being discovered every year.  Many of these caves are intricately linked together by numerous passageways and interconnecting chambers.

One particular species of blind white ( albino ) cave fish, known as the TYPHLICHTHYS, has been found in many widely separated cave systems over several states.

Typhlichthys fish have been found in caves that make a great arc through the states of:

Kentucky; Indiana; Illinois; Missouri ( ‘under’ the Mississippi River ); Arkansas; and, Oklahoma

These states rest above one [ 1 ] immense cavern system that comprises a large area of ‘both’ the Central United States and the Southern United States, where many caves possess rooms hundreds of feet in length, width and height – huge natural caverns only reached and explored with the utmost skill and perseverance.

There are only a few thousand National Speleological Society ( NSS ) members in the U.S., and only a few hundred of this number are active spelunkers. With so few spread over such a large area, only a very small fraction of the tens of thousands of known caves in the U.S. have been carefully mapped and explored, while thousands of other caves remain undiscovered and unexplored.

Extensive evidence indicates caves in the U.S. may be connected with caves in other parts of the world.

In the Municipio de Aquismo, S.L.P. of Mexico, the cave known as ” Sotano de las Golondrinas ” [ known in English as ] “Basement of the Swallows ” reaches a depth of 1,100 feet ( 334 meters ).
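The quoted depth invites a quick unit sanity check. Below is a minimal conversion sketch in Python; the `feet_to_meters` helper and variable names are illustrative only, and the 1,100-foot figure is taken from the article as-is, not independently verified:

```python
# Sanity-check the quoted cave depth by converting feet to meters.
# The factor 0.3048 is exact by definition (1 international foot = 0.3048 m).

FEET_TO_METERS = 0.3048  # exact conversion factor


def feet_to_meters(feet: float) -> float:
    """Convert a length in feet to meters."""
    return feet * FEET_TO_METERS


depth_ft = 1_100  # depth quoted in the article for Sotano de las Golondrinas
depth_m = feet_to_meters(depth_ft)
print(f"{depth_ft} ft is about {depth_m:.0f} m")
```

Run as written, this gives roughly 335 meters for 1,100 feet, so the article's parenthetical 334-meter figure appears to reflect rounding in whichever source it was copied from.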

The Sotano de las Golondrinas cave is actually a giant ‘sink hole’ or ‘hole in the ground’. Atop Sotano de las Golondrinas is a near-circular opening hundreds of feet in diameter that is impossible to climb down because the walls of the opening are too smooth and “belled-out,” so the only way to reach the bottom is to secure – at its top – a special rope over 1,100 feet long dropped into the sinkhole.

Underground explorers [ spelunkers ] must descend into the Sotano de las Golondrinas yawning hole one [ 1 ] person at a time, using special cave rappelling gear and climbing techniques.  At the bottom of the Sotano de las Golondrinas hole are numerous ‘leads’ ( openings ) that feed into multiple different crevices, passageways, crawlways and rooms never mapped or investigated.

The Sotano de las Golondrinas cave entrance is located in one of the most primitive and uncivilized areas of Mexico, and local inhabitants are afraid to approach the cave because they believe it is full of ‘evil spirits’ luring people to their deaths.  They tell stories of people mysteriously disappearing – never to be heard from again – while passing near the cave entrance.

These stories may be based more on fact than fiction as they are similar in some respects to UFO [ unidentified flying objects ] abduction reports.

Because of the Sotano de las Golondrinas hole’s huge size, remote location and unique geological structure, it would be an ideal UFO [ unidentified flying object ] base.  Naturally camouflaged caves, in other parts of the world, may serve as excellent natural bases, way-stations or depots for UFOs.

1968 December – Nevada and Canada

In December 1968, the “SCHOONER EXPERIMENT” – an underground nuclear test – was conducted, which substantiates the theory that caves in North America and South America are intimately linked.

The Schooner Experiment was a 35-kiloton nuclear bomb exploded under the Nevada desert; however, 5-days later and 1,000 miles away in Canada, test radiation levels rose up to 20 times greater than at the Nevada test site.  The only way the radioactive dust could have traveled that far is through an interconnected system of caves extending all the way from Nevada to Canada!”

1932 – California, Death Valley, Panamint Mountains

Bourke Lee authored the book ‘Death Valley Men’ ( MacMillan Co., N.Y. 1932 ), wherein the chapter “Old Gold” described a conversation Bourke Lee had with a small group of Death Valley, California residents speaking of Native American Paihute Indian legends. Two ( 2 ) men ( Jack and Bill ) began describing their experience of accidentally discovering an ‘underground city’ after one of them fell through the bottom of an old mine shaft near Wingate Pass. They began following a natural underground quay-like tunnel system ( apparently formerly lit by subterranean gas light ), traveling 20-miles northward on an incline that took them to a higher level and an exit out onto a ledge about halfway down the slope of the eastern face of the Panamint Mountains. Back down within the tunnel system, they came across a huge ancient underground cavern city that contained many sights: several perfectly preserved mummified bodies ( mummies ) still wearing thick arm bands; gold spears; a ‘large polished round table’ within another huge ancient chamber; giant statues comprised of gold; stone vaults where ‘drawers’ were filled with a variety of precious gem stones and gold bars; stone doors perfectly counterweighted and easy to open; and ‘extremely heavy stone wheelbarrows’ designed with scientific counterweight construction for perfect balance so they could easily be manipulated, amongst other amazing sights.

From the ledge, they could see Furnace Creek Ranch and its Arroyo ( a water wash ) below them in Death Valley where they realized it was formerly filled with water so, they concluded that having previously seen the underground city system of huge archway openings could have been ancient waterway docks for large boats.

Bourke Lee was further informed that they brought some of the treasure out of the caverns and tried to set up a deal with certain people, including scientists associated with the SMITHSONIAN INSTITUTE ( Washington, D.C. ), to garner assistance in further exploring and publicizing this ancient underground city as one of the wonders of the World. Their efforts ended in disappointment, however, when a friend of theirs stole the treasure ( their ‘evidence’ ), and they were consequently scoffed at and rejected by scientists: when the discoverers went to show the ‘mine’ entrance, an apparent cloud-burst had by then brought such severe rains that entire hillsides were washed down over the countryside, rearranging landscape areas and obscuring the entrance location.

The last time Bourke Lee heard from his friends, Bill and Jack, they were preparing to climb the eastern face of the Panamint Mountains to locate the ancient tunnelled-out ledge opening – located half way up the side of that steep slope. Bourke Lee never saw or heard from his discoverer friends ( Jack and Bill ) again.

During their initial lengthy conversation with Bourke Lee, the two ( 2 ) discoverers ( Jack and Bill ) – who had previously ‘revealed secrets of the underground city’ to others – discussed many things, including:

An alleged ‘subterranean race’ living in deep underground caverns beneath the ‘former seabed floor area’ of what is now the desert area of Death Valley, California.

There was another conversation about a remarkable Native American ‘Paihute tribe legend’ similar to an ancient myth of Greece.

The Paihute legend surrounds the death of the wife of a tribal Chief who, according to Native American tradition, took a ‘spiritual journey to the underworld’ to locate her; amazingly, upon returning with her, he forbiddenly ‘looked back’ and was then prevented from bringing his wife the remainder of the way back from the dead.

This would not be the same as a more tangible earlier report from the Native American Navaho Indian Oga-Make, who conveyed that a Native American Paihute Indian tribal Chief was alleged to have been ‘physically’ taken into the Native American “Hav-musuv” tribe subterranean cities beneath the Panamint Mountains.

Paihute Indian legends of the Hav-musuvs indicate these ancient Panamint Mountain dwellers abandoned their ancient city within the mountains by migrating deeper into larger caverns below.

Could these reports coincide with the Paihute legends of the Hav-musuvs?

Bourke Lee, discourse ( below ):

“… The ‘professor’, ‘Jack’ and ‘Bill’ sat in the little canvas house in ‘Emigrant Canyon’, and heard the legend all the way through.

The professor said, ‘That story, in its essentials, is the story of Orpheus and Eurydice.’ ( Greek mythology )

‘Yes,’ I said, ‘it’s also a Paiute legend. Some indians told that legend to ‘John Wesley Powell’ in the sixties.’ ( 1860s )

‘That’s very interesting,’ said the professor.  ‘It’s so close a parallel to Orpheus and Eurydice that the story might well have been lifted bodily from the Greeks.’

Jack said, ‘I wouldn’t be surprised.  I knew a Greek.  I forgot his name, but he ran a restaurant in almost every mining town I ever was in.  He was an extensive wanderer.  The Greeks are great travelers.’

Bill said, ‘They don’t mean restaurant Greeks.  The Greeks they’ve talked about have been dead for thousands of years.’

‘What of it?’ asked Jack, ‘maybe the early Greeks were great travelers, too.’

The professor said, ‘It’s very interesting.’

‘Now! About that tunnel,’ said Bill, with his forehead wrapped in a frown, ‘You said this indian went through a tunnel into a strange country, didn’t you?’

‘Yes,’ I said, ‘I think I called it a cave or a cavern, but I suppose a miner would call it a tunnel. Why?’

‘Here’s a funny thing,’ said Bill, ‘This Indian trapper living right across the canyon has a story about a tunnel, and it’s not 1,000 years old either. Tom Wilson told me that his grandfather went through this tunnel and disappeared. He was gone 3-years, and when he came back he said he’d been in a strange country living among strange people. That tunnel is supposed to be somewhere in the Panamints ( Panamint Mountains ) not awful far from where we’re sittin. Now! What do you make of that?’

Jack said, ‘I think Tom’s grandfather was an awful liar.’

I said, ‘Tom’s grandfather lived when the Paiutes ( Native American indian tribe ) were keeping their tribal lore alive. He probably knew the old legend.  Powell ( John Wesley Powell ) heard it in Nevada only 65-years ago.’

‘It’s very interesting,’ said the professor.

‘I got an idea about it,’ said Bill ( thoughtfully ), ‘Tom’s grandfather might have wandered into some tunnel all goofy from chewin’ jimson weed and then come out an found some early whites ( pioneer caucasian settlers ) and stayed with them. Tom told me that the people spoke a queer language and ate food that was new to his grandfather and wore leather clothes. They had horses and they had gold. It might have been a party in Panamint Valley, or even early explorers or early settlers in Owens Valley ( California ). How about that?’

Jack said, ‘Yeah. The Spaniards ( Spain ) were in here, too. So it might have been Spaniards ( Spanish ) or the early Greeks ( Greece ). And, where is this tunnel? And why did Tom’s grandfather have trouble speaking the language? This is an entirely different story than the one Buck told.  We are arriving at no place at all with these Indians and Greeks. To return – for a moment – to our discussion of geology, professor, ‘Have you been in Nevada much?’

From that point forward the conversation went onto another subject.

1970′s ( early ) – California, County of Kern, Garlock, Goler, and Mojave

An area in the [ southern California ] Mojave Desert region that may connect to the U.S. Western Region subterranean ( subsurface ) drainage network involves “Red Mountain” ( also known as the “Iron Mountain Range” ) [ 1-mile northwest from the old ghost town of Garlock, California ], one [ 1 ] of whose [ southeastern ] peaks lies in the “El Paso Mountains” [ about 20-miles ] northeast of Mojave, California. Many bizarre accounts are connected with this mountain, which apparently got its name in-part from the many old mines found there, along with numerous natural cavities that open out to the surface in many different areas.

The area has allegedly been the site of certain activity concerning Native American Indian ritual and occult practices, as well as the site of alleged secret government activity, some of which reportedly involves the observation and monitoring of strange [ biological ] creatures and ‘automatons’ [ half and half, “Man-Machines” (aka) “Manchines” ] said to stealthily ( night cloakers ) emerge from seemingly out of nowhere ( with a faint sound of electronic whirring ) and travel up and down and into the [ canyon ] areas on occasion.

Just exactly what these ‘bionic creatures’ are is uncertain, but some accounts indicate that they are dangerous!

Could it also be a ‘magnetic’ zone due to the high iron content [ within the “Iron Mountain” range ]?

During the early 1970′s, on ‘no less than’ two ( 2 ) separate occasions, U.S. federal government employees mysteriously disappeared from “Red Mountain” range areas and did not report back for work the following day.

The second [ 2nd ] occasion was when a U.S. government employee went missing while investigating the previous disappearance of the U.S. government employee from the first [ 1st ] occasion.

While all of the facts surrounding the missing United States federal employees were not released to the public, portions of information discovered early-on in the case were documented by a few additional facts:

… [ EDITED-OUT ] …

[ Some of this particular report is only held ‘in-part’ as ‘proprietary information’ by Kentron Intellect Research, and ‘officially’ in-full by the U.S. Geological Survey ( USGS ), the U.S. Department of Justice Federal Bureau of Investigation ( FBI ), and other U.S. government authorities governing release of sensitive information. ]

– West Virginia, County of Webster ( Northern )

Some years ago, a woman by the name of Joan Howard – at the time living in eastern Canada, although originally from Britain ( UK ) – wrote a manuscript in which she described her own paranormal experiences with small “alien” entities.

Joan had experienced several UFO-type abduction / encounters at a very young age, while she still lived in Britain ( UK ), and claimed to have had ‘psychic contact’ with [ biological ] beings that claimed to be of extraterrestrial origin.

These experiences were accompanied by a great deal of occult manifestations – such as poltergeist phenomena, psychic dreams, encounters with invisible entities, etc.

Joan even admitted that she often doubted the claims of these [ biological ] ‘beings’ – their actions were manipulative and just did not seem to coincide with their claims of being here as some kind of ‘group of cosmic saviors’ to ‘lead humanity’ into a ‘New Age’ of ‘enlightenment’.

She also warned other researchers to retain a “keen analytical mind” when dealing with alien entities, so as not to fall under possible deception or manipulation.

Perhaps, as she suggested to others, they [ alien beings ] ‘might actually be here’ to ‘prepare for a future invasion of this planet’ and were merely ‘using her for various purposes to help prepare the way’, and that all of their ‘benevolence’ talk was just that – talk!

She ‘did’ describe vivid “dreams” in which she saw ‘alien craft hovering over major cities blasting frightened and terrified people in the streets’ with powerful ‘beam weapons’ – ‘dreams’, which she suggested, might be somewhat ‘prophetic’ in nature.

She described the [ biological ] entities as being small, or dwarfs, yet was unsure whether they were human or not – although they ‘did’ attempt to pass themselves off as some type of ‘evolved human species’ – something which the ‘Grays’ [ biological alien beings ] have apparently done in order to break down any natural enmity which might prevent their ‘contactees’ or ‘abductees’ from receiving the lies intentionally fed to them as part of their program of conquest and control.

Joan Howard, incidentally, wrote a privately published book, “The Space – Or Something – Connection,” which is referred to here because it dealt with some experiences her husband had shortly after she came to America.

In fact, she devoted an entire chapter of her book ( “The Space – Or Something – Connection” ) to her husband’s account, which involved incidents that took place while he was doing field work for a certain company requiring a great deal of activity outdoors. Her husband and his co-workers travelled through some relatively unpopulated terrain in West Virginia, between Newville, West Virginia ( Braxton County ) and Helvetia, West Virginia ( Randolph County ), generally around northern Webster County, West Virginia, where – through mountains of rolling hill forests and wilderness – he encountered some very strange things and heard accounts of strange cave-related incidents from the locals.

At one point, her husband claimed, their group ran across what appeared to be a pipe sticking up from the ground – far away from the nearest town – where there was no other sign of civilization or anything man-made for miles on either side; yet here was this large pipe or tube sticking straight up from the ground.

The most remarkable thing about the pipe was that a flame of fire was shooting straight up out of it, as if it were burning off some type of gas – they never found out just what it was – but it was ‘within this same general area’ that they explored ‘caverns’ containing unexplained issues.

One ( 1 ) of the caves displayed strange hieroglyphic writings on its walls, according to some men, while others also claimed to hear faint voices behind the walls of the cave, in addition to faint sounds coming from beneath the cave floor – as though machines were moving around within ‘underground’ depths.

Her husband claimed that, after a long work day in the field, two ( 2 ) men fell asleep one evening at the mouth of one particular cave – a cave that, a great distance inside, contained an unexplored but apparently very deep chasm ( hole ). The following morning, one ( 1 ) of the two ( 2 ) men awoke ( in front of that cave ) but found his partner had disappeared – no trace was ever found of that missing employee.

That particular cave had been known as a place of unusual occurrences, and a place to stay away from. Some even went so far as to call it “Satan’s Lair.” Whatever the case circumstances may actually be, all the aforementioned information may provide some additional awareness of what may have surrounded that employee’s disappearance.

One of the most remarkable accounts Joan Howard’s husband heard involved a man claiming that, while exploring the labyrinth depths of a particular cavern in the same area of north Webster County, West Virginia ( USA ), he suddenly came face to face with an attractive woman completely void of any hair on her head. The woman spoke in a language completely foreign to the man, whereupon – after unsuccessfully trying at great lengths to communicate with each other – they departed and went their separate ways.

1989 – California ( Southern )

On November 3rd, 1989, Ken Hudnell, a well-known Los Angeles, California radio talk show host, announced – over broadcast airwaves – his intention to take a group to visit ‘one of the ancient underground cities’ that had an ‘entrance’ located 60-miles from Anaheim, California. ( The Leading Edge Magazine )

1962 – California ( City of Mojave ), Nevada ( Carson City ), Utah ( Zion Canyon ) and Arizona ( Page )

During the 1940s, one of the few specialized publications that grew out of the Palmer – Shaver ( Richard S. Shaver ) controversy was The Hidden World ( issue A-8 ), which reported on a letter released by Charles Edwards ( aka ) Chuck Edwards, a researcher. It concerned what many people ( especially those throughout southern California ) have believed for decades, based on having been officially told that the United States Western Regional area subterranean drainage ultimately all flows into the Pacific Ocean. Later, in 1962, Chuck Edwards released some of his own discoveries surrounding the “Western Subsurface Drainage Network” ( i.e. southern California, Nevada, and Utah ): it ‘does not’ “ultimately flow into the Pacific Ocean,” but actually flows ‘underground’ through a ‘vast subterranean network drainage system’ dumping elsewhere.

Addressed to Richard S. Shaver, Chuck Edwards’ letter reply is quoted ( below ):

“This letter is in reply to your January 31 letter.  Please forgive me for not answering sooner.  Enclosed is some material I hope that you can glean something of value [ from ].  Please be as candid as you have been in the past and if I am far off base don’t hesitate to tell me. …

Our foundation has located a vast system of underground passages in the Mother Lode country of California. They were first discovered in 1936, ignored by all even with our best efforts to reveal them.

Recently a road crew blasted out an opening verifying our claims. One [ of the chambers is ] 200-feet long, 70-feet wide and 50-feet high.

We have disclosed what we believe to be a vast subterranean drainage system ( probably traversing the Great American Desert country for a distance of more than 600-miles ).

We believe this system extends out like the five [ 5 ] fingers of your hand to such landmarks as Zion Canyon in Utah and the Grand Canyon [ Arizona ]; another runs south from the Carson Sink in Nevada, and yet another follows [ below ] the western slope of the same range – joining its counterpart and ending somewhere in the Mojave Desert [ southern California ].

We believe – contrary to orthodox geologists – that the existence of this underground system drains all surface waters running into Nevada ( none, with the exception of the Amargosa, runs out ) and accounts for the fact that it is a Great American Desert. The hairy creatures that you have written about have been seen in several of these areas. Certainly there has been much ‘saucer’ [ UFO ] activity in these parts. For 2-years I have collected material pertinent to these creatures, and if you have any opinions along these lines I would appreciate hearing them.

So much for now.  I hope that I am still your friend.

Much of my time has been devoted [ to ] helping a farmer near Portland [ Oregon ] who has made a fantastic discovery of incredible stone artifacts. He has several tons of them. They predate anything yet found ( or accepted ) let us say that for now.

We are making slow but steady progress in getting through the wall of orthodoxy.

– Chuck Edwards”

1946 – California ( Northern ), Mt. Lassen, Cascade Mountains, Cascadia Fault, Oregon, Washington and Canada ( British Columbia )

Following the Sierra Nevada [ mountain ] range from here [ California ] into the northern territories, one arrives at the Cascade Range [ mountains ], consisting mostly of dormant or extinct volcanic mountains rising at intervals through northern California, Oregon, Washington, and into southwestern Canada.

The Cascade Range [ Cascadia Fault zone ] is not without its own peculiar accounts of subterranean recesses occupied by unknown beings – both human and non-human – apparently rediscovering portions of ancient antediluvian underground networks, which some say were inhabited by a race of intelligent [ biological ] but war-like hybrid reptiles genetically resembling humanoid shapes in some instances.

There are many unanswered questions as to just how the subsurface world was used, or exactly what role it may have played in relation to ancient legends of subterranean inhabitant races, but the following account may place some of these mysteries in a clearer and broader perspective.

Around September 1946, Ralph B. Fields submitted his account to Amazing Stories Magazine ( December 1946 issue, pp. 155-157 ), with the assurance that it actually happened and his facts were true, as follows:

“In beginning this narrative and the unexplained events that befell my friend and myself, I offer no explanation, nor do I even profess to offer any reason.  In fact I have yet to find a clue that will even in part offer any explanation whatever. Yet as it did happen, there must be some rhyme or reason to the whole thing. It may be that someone can offer some helpful information to a problem that just should not exist in these times of enlightenment.

To begin with, if we had not been reading an article in a magazine telling us about the great value of guano ( i.e. old cave bat excrement / dung droppings, highly valued as fertilizer ) that had accumulated over a great number of years, we would have continued to wend our merry way through life without ever having a thing to worry about. But having read the article, and as we were at the time living near a small town called Manten in Tehama County, California, we thought that would be a good country to explore for a possible find of this kind.

After talking it over for some time, and as we had plenty of time just then, we decided to take a little trip up the country just back of us.

As we were almost at the foot of Mount Lassen, that seemed the best place to conduct our little prospecting tour.

Collecting a light camping outfit, together with a couple of tents to sleep in, we started out on what we expected to be a 3 or 4-day jaunt up the mountain … I guess we covered about 10 or 12-miles on the 3rd day and it was fast approaching time to begin to look for a place to spend the night and the thought was not very amusing as it had turned a little colder and we were well over 7,000 feet above sea level.

We soon found a sheltered place, beneath a large outcrop of rock, and set about making a camp.

As I was always the cook, and Joe the chore boy, I began getting things ready to fix us some grub. Joe began digging around for some dead scrub brush to burn.

I had things all ready and looked around for Joe and his firewood, but I could see no signs of him.

I began calling for him, and he soon came into sight from around the very rock where we were making our camp.

And I knew he was laboring under some great excitement and his face was lit up like a Christmas tree.

He had found a cave.

The entrance was on the other side, of that very rock.

He was all for exploration right away.

But I argued that we had better wait till morning.

But he argued that, in a cave it was always night and we would have to use flashlights anyway, so what would be the difference?

Well, we finally decided that we would give it at least a once-over after we had a bite to eat.

It wasn’t much to call a ‘cave’ – at first – as it had a very small entrance, but back about 20-feet it widened out to about 10-feet wide and around 8-feet high. And it did reach back a considerable distance, as we could see at least 100-yards, and it appeared to bend off to the left. The floor sloped slightly down. We followed to the bend and again we could see a long way ahead and down … At this point we became a little afraid as we were some way into the mountain …

I don’t know how far we went, but it must have been 1-mile or 2-miles, as we kept on walking and the cave never changed its contour or size. Noticing this I mentioned it to Joe.

And we discovered an amazing thing. The floor seemed to be worn smooth as though it had been used for a long time as a path or road. The walls and ceiling of the cave seemed to be cut like a tunnel. It was solid rock and we knew that no one would cut a tunnel there out of rock as there had been no sign of mining operations ( tailings ). And the rock in the walls and ceiling was run together like it had been melted. Or fused from a great heat.

[ UPI NOTE INSERT ( here ): It is believed the U.S. government possesses PlasMole ( aka ) “Terron Drive” ( ref.: Paul Peter Schneider ) tunneling machines that travel at 5-mph and can bore – by melting – a 50-foot hole through solid rock. ]

While we were busy examining the cave in general, Joe swore he saw a light way down in the cave.

We started down the cave once more and found a light. Or should I say the light found us as it was suddenly flashed into our faces. We stood there blinded by it for a minute until I flashed my light at it’s source and saw we were confronted by three [ 3 ] men. These men looked to be about 50 or a little younger. They were dressed in ordinary clothing such as is worn by most working men in the locality. Levi type pants and flannel shirts and wool coats. They wore no hats. But their shoes looked strange as their soles were so thick that they gave the impression of being made of wood.

There stood three [ 3 ] men looking at us in a cave, 1-mile or so in the depths of old Mount Lassen … One of them spoke to us. He asked what we were looking for … we came to the conclusion that we had better retreat. Turning to go we were confronted by two [ 2 ] more of them. One of the strangers told us, ‘I think maybe you had better come with us.’ … So we permitted the five [ 5 ] to escort us deeper into the depths of old Mt. Lassen … They had led us farther down and I guess we had gone a couple more miles when we came to the first thing that really amazed us.”

We came to a place where the cavern widened out a little and we saw some kind of machine, if it can be called that. Though I had no chance to examine it closely at the time, I did later and it was a very strange contrivance. It had a very flat bottom, but the front was curved upward something like a toboggan. The bottom plate was about 8-inches thick and it was the color of pure copper. But it was very hard tempered. Although I have had a lot of experience in metals and alloys, I had no opportunity to examine it closely enough to determine just what it was. I doubt very much if I could.

It had a seat in the front directly behind a heavy dashboard affair and there was a dial shaped in a semi-circle with figures or markings on it. I had not the slightest idea what they stood for, but they were very simple to remember. If there was a motor, it was in the rear.

All I could see was two [ 2 ] horseshoe or magnet-shaped objects that faced each other with the round parts to the outside. When this thing was in operation, a ‘brilliant green arc’ seemed to leap between the two [ 2 ] and to continue to glow as it was in operation. The only sound it gave off was a hum or buzz that sounded like a battery charger in operation.

The seat in the front was very wide.

The only method of operation was a ‘black tear-shaped object’, which hung from the panel by a chain.

One [ 1 ] of these men – sitting in the middle – took this thing and touched the sharp end to the first [ 1st ] figure on the ‘left side’ of the dial.

When he touched the first [ 1st ] figure, the contraption seemed to move almost out from under us, but it was the smoothest and quietest take-off I ever experienced. We seemed to float. Not the slightest sound or vibration.

And after we had traveled for 1-minute he touched the ‘next [ 2nd ] figure’ on the dial and our speed increased at an alarming rate.

But when he had advanced the black object over past the center of the dial, our speed increased until I could hardly breathe.

I can’t begin to estimate the distance we had traveled or our speed, but it was terrific.

The two [ 2 ] horseshoe objects in the rear created a green light that somehow shone far ahead of us, lighting up the cavern for a long way.

I soon noticed a black line running down the center of the cavern and our ‘inner-mountain taxi’ seemed to follow that.

I don’t know how long we continued our mad ride, but it was long enough for us to become used to the terrific speed and we had just about overcome our fear of some kind of wreck when we were thrown into another spasm of fear. Another machine of the same type was approaching us head on. I could see that our captors were very nervous, but our speed continued.

As the other machine became closer our speed slowed down very fast and we came to a smooth stop about 2-feet from the front of the other machine.

Our machine had no sooner stopped than our captors leaped from the machine and started to dash away.

A ‘fine blue light’ leaped from the other machine in a ‘fine pencil beam’ and its sweep caught them, and they fell to the cavern floor and lay still.

The figures dismounted from the other machine and came close to us.

Then I noticed they carried a strange object in their hands. It resembled a ‘fountain pen flashlight with a large round bulb-like affair on the back end and a grip’ – something like a German luger pistol.

They pointed them at us. After seeing what had happened to our erstwhile captors I thought that our turn was next, whatever it was.

But one [ 1 ] spoke to us.

“Are you surface people?”

I guess we are, as this is where we came from very recently.

“Where did the horlocks find you?”

If you mean those guys, I pointed to the five [ 5 ] motionless figures, back there a few hundred miles – I pointed toward the way we had come in our wild ride.

“You are very fortunate that we came this way,” ‘he’ told us, “You would have also become horlocks and then we would have had to kill you also.”

That was the first time I had realized that the others were dead. They put their strange weapons away and seemed friendly enough, so I ventured to ask them the who and why and everything we had run into.

I told them of our search for guano and how we had encountered the five [ 5 ] horlocks, as he called them, and asked ‘him’ about the machines, their operation and could we get out again?

He smiled and told us, “I could not tell you too much as you would not understand. There are so many things to explain and you could not grasp enough of what I could myself tell you. The ‘people on the surface’ are ‘not ready to have the things’ that ‘the ancients’ have left. Neither I nor any one in any of the caverns know why these things work, but we do know how to operate some of them. However, ‘there are a great many evil people here who create many unpleasant things for both us and the surface people’. They are safe because ‘no one on the surface believes us or them’. That is why I am telling you this. No one would believe that we exist. We would not care, but there are many things here that ‘the outer world must not have until they are ready to receive them’, as ‘they would completely destroy themselves, so ‘we must be sure that they do not find them’. As for the machine, I don’t know how it works, but I know some of the principles of it. It works simply by gravity. And it is capable of reverse. The bottom plate of it always is raised about 4-inches from the surface of the floor. That is why there is no friction and has such a smooth operation. This ‘object suspended from this chain is pure carbon’. It is the key to the entire operation. As I told you before, I cannot explain why it runs, but it does. We want you two [ 2 ] to ‘return to where you came’ and ‘forget about us’. We will show you ‘how to operate the sled’ and ‘we want you never again to enter the cave’. If you do – and you do not encounter the horlocks – we will have to do something about you ourselves so, ‘it would not be advisable to try to return at all events’. One thing I can tell you. ‘We never could permit you to leave another time’.”

He explained to us the operation of the machine and in some way reversed its direction. So thanking them, we seated ourselves in the sled, as he had called it, and were soon on our way back.

Our return trip was really something we enjoyed, as I was sure not to advance the carbon far enough on the dial to give us such terrific speed, but we soon found ourselves where we started from.  The sled slid to a smooth stop and we jumped out and started up the cave afoot.

We must have walked a long way coming in, for we thought we never would come to the surface.  But at last we did.  And it was late afternoon when we emerged.

We lost no time in making our way down the mountain, and Joe tells me that he isn’t even curious about what is in that cave. But I am.

What is the answer to the whole thing?  I would like to know.

We had been told enough for me to believe that down there – somewhere – there are things that might baffle the greatest minds of this Earth. Sometimes I’m tempted to go back into that cave if I could find it again, which I doubt, but, then I know the warning I heard in there might be too true, so I guess I had better be of the same mind as Joe.  He says: ‘What we don’t know don’t hurt us’.

Regardless of Joe’s opinion, however, there is reason to believe that influences from these nether regions can and do affect “us” in a profound way, and even the men whom Ralph and Joe encountered, whoever they were, admitted this fact.

====

Submitted for review and commentary by,

Kentron Intellect Research Vault ( KIRV )
E-MAIL: KentronIntellectResearchVault@gmail.com
WWW: http://KentronIntellectResearchVault.WordPress.Com

Secret ET Technologies

[ NOTE: The video ( above ), amidst its computer graphic interface ( CGI ), manipulates many of the actual ‘image document layout photographs of symbolics technology’ and ‘laboratory premise photos of components and sub-structures’ ( removed from the U.S. government classified laboratory Project CARET ), plus ‘select photos’ of unidentified flying objects ( UFO ) bearing similar symbolics, worked on by an individual using the alias name “Isaac” who publicly released partial details about this story. ]

Secret ET Technologies
by, Concept Activity Research Vault ( CARV )

November 22, 2010 12:37:42 ( PST ) Update ( Published: October 23, 2010 )

USA, California, Menlo Park – November 22, 2010 – What some perceived as chicken footprints may more likely be extraterrestrial symbolic construct technologies. Years ago, an individual using the alias name “Isaac” conveyed a multi-page report ( “CARET” ), laboratory photographs, and detailed personal encounters ( from at least 1984 through 1987 ) concerning what was believed to be a U.S. Department of Defense ( DoD ) Defense Advanced Research Projects Agency ( DARPA ) Program recruiting people to work on a specific Project ( believed Phase II or Phase III ), studying what was ‘officially briefed’ to Isaac – amongst his task team – as highly complex relatives of ‘extraterrestrial’ structures, materials, components, and construct language symbolics.

This Unwanted Publicity Intelligence Annex report ( herein ) will provide more detailed analysis of the Isaac-provided information that news media organizations half-heartedly carried to the public years ago.

The extraterrestrial materials were highly complex compared to anything Isaac’s group had ever seen before; supercomputers lumbered under the task of processing the extremely complex substrates and geometric symbolics, amongst other secret-sensitive items, that Isaac and others analyzed and deciphered.

Within the building, amongst other secret-sensitive items, combinatoric studies – not limited to extremely complex substrates, symbolics, and more – developed an extremely complex ‘primer’, from which Isaac’s report takes its name: “Commercial Applications Research for ExtraTerrestrial Technology” ( C.A.R.E.T. or CARET ).

Isaac’s personal accounting ( further below ) reports the aforementioned work was conducted within what at first appeared to be only an upscale industrial office complex ‘building’, presumably located in Santa Clara County, California.

Isaac describes his facility’s adjacencies as multi-compartmentalized ‘individual government contractor offices’ – believed assigned to various sensitive tasks for the United States government – whereupon, amongst other compartmentalized secret characteristics ( he never mentioned the facility being a ‘self-sealing building’ ), five ( 5 ) underground floors were hidden.

Five ( 5 ) stories down, the sub-surface levels – not reported by Isaac, but easily ascertained – must have included:

– One [ 1 ] underground level dedicated parking for ‘secondary staff’ and/or ‘special visitor’ ( e.g. military officials, etc. ) standard passenger vehicles; and,

– Two [ 2 ] underground level dedicated parking for ‘equipment delivery’ trucks; and,

– Three [ 3 ] underground level dedicated parking for ‘militarized troop personnel’ vans and/or buses.

Careful observation, combining all of the aforementioned, initially brings a ‘few new questions’ followed by a few ‘remote suppositions’ ( immediately below ):

1. Could Isaac’s seemingly ‘personal account’ have actually been ‘cleverly ghost written’ for ‘someone else’?

2. Could Isaac have actually been a ‘female’?

3. Could the ‘name’ of the author, “Isaac,” have actually been derived from the ‘name’ of a ‘male sibling or spouse’?

4. Could Isaac’s seemingly ‘personal account’ have actually ‘taken place geographically elsewhere’?

“Isaac’s” ‘reports’ ( further below ) begin his ‘personal accountings’ by ‘laying a foundation scene’ surrounding the Santa Clara County, California “Silicon Valley” industrial technology history. Isaac then simply includes a report ‘cover page’ entitled “Palo Alto CARET Laboratory,” so for all intents and purposes readers may ‘instantly gravitate to the assumption’ that Isaac’s ‘personal account took place in Palo Alto, California’. Yet “Isaac” mentions – but does not detail – only a very few ‘building characteristics’, and uses only the most ‘general of terms’. Might Isaac have ‘purposely laid such a foundation’ after ‘altering the true facility name on the report’ to be known only as the “Palo Alto CARET Laboratory” or “PACL,” when the ‘building’ may have actually been ‘remotely located elsewhere’, albeit within or under a ‘temporary U.S. government contract project’ and/or an ‘adjunct’ of yet another larger organization?

Was Isaac’s ‘reported building’ just a stand-alone, upscale, city street-side industrial office building made of normal iron-reinforced concrete / cement walled tilt-up construction?

At the time of Isaac’s ‘personal account’, the former ROCKWELL SCIENCE CENTER PALO ALTO LABORATORY ( 444 High Street, Suite #400, Palo Alto, California 94301 ) existed near a plethora of ‘other such organization buildings’ performing secret-sensitive work in the Silicon Valley area of northern California.

Plenty of such ‘remotely located buildings’ exist.

To name a ‘few’, there are ‘buildings remotely situated’ at the U.S. National Laboratory in Los Alamos, New Mexico, and although ‘such buildings and private contractors are geographically situated there’, funding secrets are hidden under the ‘administrative domain auspices’ of the ‘University of California’.

But where would “Isaac’s” reported ‘armed military personnel’ so easily appear from in such a ‘building’?

Other ‘remotely situated buildings’ also exist – under U.S. government contract to private companies – on military reservations, such as the United States Air Force Research Laboratory ( AFRL ), which oversees “PHILIPS Laboratory” secret-sensitive work performed and tested ‘near but not within’ Kirtland Air Force Base, New Mexico, but secretly hidden on that huge ‘reservation’.

[ PHOTO ( above ): PHILIPS Laboratory at Kirtland Air Force Base, New Mexico ( USA ) NOTE: click to enlarge photo details. ]

In southern California, the Edwards Air Force Base reservation holds unique offerings, amongst other secrets: after a vehicle passes the ‘entrance sign’, it must be driven an additional 20-miles before even reaching the ‘main gate’ to gain ‘official admittance’, and then only with ‘further restricted movement’. Scattered all around that ‘reservation’ are a plethora of ‘remotely situated buildings’ in ‘use’ by ‘private business U.S. government contract holders’ performing, amongst other things, U.S. government secret-sensitive work within a complex of buildings, such as those seen inside Area 51 ( also known as ) the Lazy G Ranch ( also known as ) The Ranch ( Nevada, USA ).

[ PHOTO ( above): Area 51 (aka) Lazy G Ranch (aka) The Ranch ‘main gate’ ( Nevada, USA ) circa 1970s. NOTE: click to enlarge photo details. ]

[ PHOTO ( above): Area 51 (aka) Lazy G Ranch (aka) The Ranch ( Nevada, USA ) circa 1970s. NOTE: click to enlarge photo details. ]

But even Area 51 (aka) the Lazy G Ranch (aka) The Ranch, located in the Nevada desert, cannot be compared ( here ) to its secret-sensitive ‘sister reservation’, known only as a ‘proving ground’ named “Dugway” ( Utah, USA ).

In 1968, the U.S. Navy had private contractors build its secret-sensitive China Lake Naval Weapons Station ( near Trona, California ), whereon that ‘reservation’ holds one ( 1 ) building, with eight ( 8 ) subterranean story floor levels, stuck out in the middle of the southern California desert. If the U.S. Navy is a military ship-sailing and aircraft-flying defense organization, what is it doing with an 8-story subterranean building in the middle of a desert?

From within Isaac’s given parameter basics describing the reported ‘upscale city industrial office building’ complex – with five ( 5 ) subterranean stories – the closest resemblance ‘within the Santa Clara County, California Silicon Valley area’ was the ‘unknown building predecessor’ of what later became known as the ‘first privately-owned and operated business’ belonging to the U.S. Central Intelligence Agency ( CIA ), named QIC ( believed known as ) QUANTUM INTERFACE CENTER ( formerly known as ) IN-Q-IT CORPORATION ( formerly known as ) IN-Q-TEL ( affectionately nicknamed ) CIA-IN-Q-TEL. There, the CIA business performed ‘special technology’ research and development ( R&D ) – although this was ‘never fully reported’ – on ‘applications’ for what would later also be known as “Commercial Off The Shelf” ( C.O.T.S. / COTS ) product development of secret-sensitive technologies. These ‘products’, in-essence, would accumulate in plenty for later distribution, to be eventually traded for ‘other valuable considerations’ ( the very little press coverage reported it as “… products to be sold to …” ) ‘in-exchange’ for what a few ‘private companies’ ( e.g. the ‘foreign based company’ PHILIPS, and a few select others ) ‘could’ offer or ‘were already providing’ under U.S. government contract(s). Such an arrangement could ‘then’ secretly return U.S. Congress ‘budget approved’ U.S. Department of the Treasury funds – by re-routing or mirroring bank wire transferred monies back into the U.S. Central Intelligence Agency ( CIA ) private business – which could then re-route those monies, as deemed fit, secretly into yet other intelligence projects and programs outside U.S. Congressional scrutiny.

But could all this ‘really happen’?

The webpage links ( above ) show who was initially put in charge and which senior executives were selectively chosen from key private industries to lead the private U.S. Central Intelligence Agency business. So, it really should come as no surprise to the few who understand the mechanics behind international stock market trading and international bank wire transfers between domestic and foreign operations of the United States Federal Reserve System.

[ PHOTO ( above ) : U.S. Central Intelligence Agency ( CIA ) business IN-Q-TEL CORPORATION logo. ]

A few such secret-sensitive self-sealing rad-hard ( anti-radiation-hardened concrete / cement via ‘gamma radiation saturation’ ) buildings can reposition themselves – ‘prior to the onslaught of’ a U.S. national emergency – via remote triggering that submerges their entire mass into specially covered multi-story holes dug in the ground beneath them.

So, did Isaac’s ‘reported building’ have such ‘additional capacities’ or ‘more’?

One might consider these to be distinct possibilities, based on what the “CARET” report entailed and according to “Isaac’s” ‘personal accounts’ surrounding it.

Courtesy: Unwanted Publicity Information Group

====

My Experience With The CARET Program And Extra-Terrestrial Technology

by, Isaac [ alias moniker used by the original author ]

June 2007

This letter is part of a package I’ve assembled for Coast to Coast AM [ a nightly broadcast radio program in the United States of America ] to distribute to its audience. It is a companion to numerous ‘document’ and ‘photo’ scans and should not be separated from them.

You can call me Isaac, an alias I’ve chosen as a simple measure of protection while I release what would be called tremendously ‘sensitive information’ even by today’s standards.

‘Sensitive’ is not necessarily synonymous with ‘dangerous’, though, which is why my conscience is clear as I offer this material up for the public.

My government [ United States of America ] has its reasons for its continual secrecy, and I sympathize with many of them, but the truth is that I’m getting old and I’m not interested in meeting my maker one day with any more baggage than necessary.

Furthermore, I put a little more faith in humanity than my former bosses do, and I think that a release of at least some of this information could help a lot ‘more’ than it could ‘hurt’, especially in today’s world.

I should be clear before I begin, as a final note:

I am not interested in making myself vulnerable to the consequences of betraying the trust of my superiors and will not divulge any personal information that could determine my identity.

However, my intent is not to deceive, so ‘information that I think is too risky to share’ will be simply ‘left out’ rather than obfuscated in some way ( aside from my alias, which I freely admit is not my real name ).

I would estimate that with the information contained in this letter, I could be narrowed down to one [ 1 ] of maybe 30 to 50 people at best, so I feel reasonably secure.

Some Explanation for the Recent Sightings –

For many years I’ve occasionally considered the release of at least some of the material I possess, but the recent wave of photos and sightings has prompted me to cut to the chase and do so now.

I should first be clear that I’m not directly familiar with any of the crafts seen in the photos in their entirety. I’ve never seen them in a hangar or worked on them myself or seen aliens zipping around in them. However, I have worked with and seen many of the parts visible in these crafts, some of which can be seen in the Q3-85 Inventory Review scan found at the top of this page.

More importantly though, I’m very familiar with the ‘language’ on their [ craft(s) ] ‘undersides’ [ under bellies ] seen clearly in photos by Chad, Rajman, and – ‘another form’ – in the Big Basin photos.

One question I can answer – for sure – is why they are suddenly here.

These crafts have probably existed – in their current form – for decades, and I can say – for sure – that the technology behind them has existed for decades before that.

The ‘language’, in fact – I’ll explain shortly why I keep putting that in quotes – was the subject of ‘my work’ in years past. I’ll cover ‘that’ as well.

The reason they [ extraterrestrial craft(s) ] are suddenly ‘visible’, however, is ‘another matter’ entirely.

These crafts – assuming they’re anything like the hardware I worked with in the 1980s ( assuming they’re better, in fact ) – are equipped with technology that enables invisibility. That ‘ability’ can be controlled both ‘on board’ the craft, and ‘remotely’.

However, what’s important in this case is that this ‘invisibility’ can also be ‘disrupted’ by ‘other technology’. Think of it like ‘radar jamming’.

I would bet my life savings ( since I know this has happened before ) that these craft are ‘becoming visible’ and then ‘returning to invisibility’ arbitrarily – probably unintentionally – and undoubtedly for only ‘short periods’ due to the ‘activity of a kind’ of ‘disrupting technology’ [ sonic flocculation ] being ‘set-off elsewhere’ but ‘near-by’.

I’m especially sure of this in the case of the Big Basin sightings where the witnesses themselves reported seeing the craft just ‘appear’ and ‘disappear’.

This is especially likely because of the way the witness described one [ 1 ] of the appearances being only a ‘momentary flicker’, which is consistent with the ‘unintentional’, ‘intermittent triggering’ of such a ‘device’.

It’s no surprise that these sightings are all taking place in ‘California’ ( USA ), and especially the Saratoga Bay / South Bay area.

Not far from Saratoga are Mountain View, California ( USA ) and Sunnyvale, California ( USA ), home to Moffett Field [ formerly, a United States Army Air Corps military airfield / United States Air Force Base ( USAFB ) ] and the [ National Aeronautics and Space Administration ] NASA Ames Research Center.

Again, I’d be willing to bet – just about anything – that the device capable of hijacking the cloaking of these nearby craft was inadvertently triggered, probably during some kind of experiment, at the exact moment they were being seen.

Miles away, in Big Basin, the witnesses were in the right place – at the right time – and saw the results of this disruption with their own eyes.

God knows what else was suddenly appearing in the skies at that moment, and who else may have seen it.

I’ve had some direct contact with this device, or at least a device capable of the same thing, and this kind of mistake is not unprecedented.

I am personally aware of at least one [ 1 ] other incident in which this kind of technology was accidentally set off, resulting in the sudden visibility of normally invisible things.

The only difference is that these days, cameras are a lot more common!

The technology itself is ‘not’ ours, or at least it was ‘not’ in the 1980s.

Much like the technology in these crafts themselves, the device capable of remotely hijacking vehicle cloaking comes from a non-human source too.

Why we were given this technology has never been clear to me, but it’s responsible for a lot.

Our having access to this kind of device, along with our occasionally haphazard experimentation on them, has led to everything from cloaking malfunctions like this to full-blown crashes.

I can assure you that most ( and, in my opinion, all ) incidents of UFO crashes or that kind of thing had more to do with our meddling with extremely powerful technology at an inopportune time than with mechanical failure on their part.

Trust me, those things don’t fail unless something even more powerful than them makes them fail ( intentionally or not ). Think of it like a stray bullet. You can be hit by one at any time, without warning, and even the shooter did ‘not’ intend to hit you.

I can assure you heads are rolling over this as well.

If anyone notices a brilliant but sloppy ‘physicist’ patrolling the streets of Baghdad [ Iraq ] in the next couple weeks, I’d be willing to guess how he got there. ( I kid – of course – as I certainly hope that has ‘not’ actually happened in this case ).

I would now like to explain how it is that I know this.

The CARET Program –

My story begins the same as it did for many of my co-workers, with graduate and post-graduate work at university in electrical engineering. I had always been interested in computer science, which was a very new field at the time, and my interest was piqued by my first exposure to a Tixo during grad school.

In the years following school I took a scenic route through the tech industry and worked for the kinds of companies you would expect, until I was offered a job at the United States Department of Defense [ DoD ] and things took a very different turn.

My time at the DoD [ United States Department of Defense ] was mostly uneventful but I was there for quite a while. I apparently proved myself to be reasonably intelligent and loyal.

By 1984 these qualities along with my technical background made me a likely candidate for a new program they were recruiting for called “CARET.”

Before I explain what CARET was, I should back up a little.

By 1984, Silicon Valley had been a juggernaut of technology for decades. In the less than 40 years since the appearance of Shockley’s transistor, this part of the world had already produced a multi-billion-dollar computer industry and made technological strides that were unprecedented in other fields – from hypertext and online collaboration in 1968 to the Alto in 1973.

Private industry in Silicon Valley was responsible for some of the most incredible technological leaps in history and this fact did not go unnoticed by the US government and military.

I don’t claim to have any special knowledge about Roswell [ New Mexico, USA incident believed to be an extraterrestrial flying object ( UFO ) crash ] or any of the other alleged early UFO events, but I do know that whatever the exact origin, the ‘military’ was hard at work trying to understand and use the ‘extraterrestrial artifacts’ it had in its ‘possession’.

While there had been a great deal of progress overall, things were not moving as quickly as some would have liked.

So, in 1984, the CARET program was created with the aim of harnessing the abilities of private industry in Silicon Valley and applying them to the ongoing task of understanding extra-terrestrial technology.

One of the best examples of the power of the tech sector was XEROX PARC, a research center in Palo Alto, California [ USA ].

XPARC was responsible for some of the major milestones in the history of computing.

While I never had the privilege of working there [ XEROX PARC ( Palo Alto, California, USA ) ] myself, I ‘did’ know many of the people who did, and I can say that they were among the brightest engineers I ever knew.

XPARC served as one [ 1 ] of the models for the CARET program’s first incarnation, a facility called the PALO ALTO CARET LABORATORY ( PACL ) – lovingly pronounced, “packle” during my time there.

This [ Palo Alto CARET Laboratory ] was where [ Palo Alto, California, USA ] I worked, along with numerous other civilians, under the auspices of military brass who were eager to find out how the tech sector made so much progress so quickly.

My time at the DoD [ U.S. Department Of Defense ] was a major factor behind why I was chosen, and in fact about 30+ [ 30 or more ] others – who were hired around the same time – had also been at the Department [ U.S. Department Of Defense ] about as long but this was not the case for everyone.

A couple of my co-workers were plucked right from places like IBM [ INTERNATIONAL BUSINESS MACHINES ], and at least two [ 2 ] of them came from XPARC [ XEROX PARC ( Palo Alto, California, USA ) ] itself.

My DoD [ U.S. Department Of Defense ] experience did make me more eligible for positions of management, however, which is how I have so much of this material [ documents, photos, etc. ] in my possession to begin with.

So, in other words, civilians ( like myself ) who had – at most – some decent experience working for the DoD [ U.S. Department Of Defense ] but no actual military training or involvement were suddenly finding ourselves in the same room as highly classified extra-terrestrial technology.

Of course they spent about 2-months briefing us all before we saw or did anything, and did their best to convince us that if we ever leaked a single detail about what we were being told, they’d do everything short of digging up our ancestors and putting a few slugs in them too – just for good measure.

It seemed like there was an armed guard in every corner of every room.

I’d [ I had ] worked under some pretty hefty NDAs [ Non-Disclosure Agreements ] in my time but this was so far out of my depth. I didn’t think I was going to last 2-weeks in an environment like that. But amazingly things got off to a good start.

They wanted us, plain and simple, and our industry had shown itself to be so good at what it did that they were just about ready to give us carte blanche.

Of course, nothing with the military is ever that simple, and as is often the case they wanted to have their cake and eat it too. What I mean by this is that despite their interest in picking our brains and learning whatever they could from our way of doing things, they still wanted to do it ‘their way’ often enough to frustrate us.

At this point I’m going to gloss over the emotional side of this experience, because this letter isn’t intended to be a memoir, but I will say that there’s almost no way to describe the impact this kind of revelation has on your mind.

There are very few moments in life in which your entire world view is turned forever upside down, but this was one of them.

I still remember that turning point – during the briefing – when I realized what he’d just told us, and that I hadn’t heard him wrong, and that it wasn’t some kind of joke.

In retrospect, the whole thing feels like it was in slow motion, from that ‘slight pause’ he took – just before the term “extra-terrestrial” came out for the first time – to the way the room itself seemed to go off kilter as we collectively tried to grasp what was being said.

My reflex kept jumping back and forth between trying to look at the speaker, to understand him better, and looking at everyone else around me, to make sure I wasn’t the only one that was hearing this.

At the risk of sounding melodramatic, it’s a lot like a child learning his parents are divorcing. I never experienced that myself, but a very close friend of mine did when we were boys, and he confided in me a great deal about what the experience felt like. A lot of what he said would aptly describe what I was feeling in that room.

Here was a ‘trusted authority figure’ telling you something that you just don’t feel ready for, and putting a burden on your mind that you don’t necessarily want to carry. The moment that first word comes out, all you can think about is what it was like only ‘seconds ago’, and knowing that life is never going to be as simple as it was ‘then’.

After all that time at the DoD [ U.S. Department Of Defense ], I thought I at least had some idea of what was going on in the world, but I’d never heard so much as a peep about this.

Maybe one day I’ll write more on this aspect, because it’s the kind of thing I really would like to get off my chest, but for now I’ll digress.

Unlike traditional research in this area, we weren’t working on new toys for the air force.

For numerous reasons, the CARET people decided to aim their efforts at ‘commercial applications’ rather than ‘military’ ones.

They basically wanted us to turn these ‘artifacts’ into something they could ‘patent’ and ‘sell’.

One of CARET’s most ‘appealing promises’ was the revenue generated by these product-ready technologies, which could be funneled right back into ‘black projects’. Working with a ‘commercial application’ in mind was also yet another way to keep us in a familiar mind state. Developing technology for the military is very different from doing so for the ‘commercial sector’, and not having to worry about the difference was another way that CARET was very much ‘like private industry’.

CARET shined in the way it let us work the way we were used to working. They wanted to recreate as much of the environment we were used to as they could without compromising issues like security. That meant we got ‘free rein’ to set up our own ‘workflow’, ‘internal management structure’, ‘style manuals’, ‘documentation’, and the like. They wanted this to look and ‘feel like private industry’, ‘not the military’. They ‘knew’ this was ‘how to get the best work out of us’, and they were right.

But things didn’t go as smoothly when it came to matters like access to classified information.

They were exposing what is probably their single biggest secret to a group of people who had never even been through basic training and it was obvious that the gravity of this decision was never far from their minds.

We started the program with a small set of ‘extra-terrestrial artifacts’ along with ‘fairly elaborate briefings’ on ‘each’ as well as ‘access to a modest amount of what research had already been completed’.

It wasn’t long before we realized ‘we needed more’ though, and getting them to provide even the smallest amount of new material was like pulling teeth.

CARET stood for “Commercial Applications Research for Extra-Terrestrial Technology”, but we often joked that it should have stood for “Civilians Are Rarely Ever Trusted.”

PACL [ PALO ALTO CARET LABORATORY ] was located in Palo Alto [ California, USA ], but unlike XPARC [ XEROX PARC ( Palo Alto, California, USA ) ], it wasn’t at the end of a long road in the middle of a big complex surrounded by rolling hills and trees.

PACL was hidden in an ‘office complex’ – owned entirely by the military but ‘made to look like an unassuming tech company’.

From the street, all you could see was what appeared to be a normal ‘parking lot’ with a ‘gate’ and a ‘guard [ security ] booth’, and a 1-story building inside with a ‘fictitious name’ and ‘[ fictitious ] logo’.

What was ‘not visible’ – from the street – was that ‘behind’ the very ‘first set of doors’ was enough ‘armed guards’ to invade Poland, plus five [ 5 ] additional underground stories [ levels ].

They wanted to be as close as possible to the kinds of people they were looking to hire, and be able to bring them in with a minimum of fuss.

Inside, we had everything we needed. State of the art hardware and a staff of over 200 computer scientists, electrical engineers, mechanical engineers, physicists and mathematicians.

Most of us were civilians, as I’ve said, but some were military, and a few of them had been working on this technology already.

Of course, you were never far from the barrel of a ‘machine gun’ – even ‘inside the labs’ themselves ( something many of us never got used to ) – and ‘bi-weekly tours’ were made by ‘military brass’ to ensure that not a single detail was out of line. Most of us underwent extensive searches on our way into and out of the building. There it was, probably the biggest secret in the world, in a bunch of parts spread out on laboratory tables in the middle of Palo Alto so you can imagine their concern.

One ‘downside’ to CARET was that it was ‘not’ as ‘well-connected’ as ‘other operations’ undoubtedly ‘were’.

I ‘never got to see’ any ‘actual extra-terrestrials’ ( not even photos ), and in fact ‘never even saw’ one [ 1 ] of their ‘complete vehicles’ – ’99% of what I saw’ was ‘related to the work at-hand’, all of which was conducted within a very narrow context on ‘individual artifacts only’. The remaining ’1% came from people’ I met through the program, many of whom were ‘working more closely’ with “the good stuff” or had in the past.

In fact, what was especially amusing about the whole affair was the way that our ‘military management’ almost ‘tried to act’ as if the ‘technology’ – we were essentially ‘reverse engineering’ – was ‘not extra-terrestrial’ at all.

Aside from the word “extra-terrestrial,” itself, we rarely heard any other terms like “alien” or “UFO” or “outer space” or anything. ‘Those aspects’ were ‘only mentioned briefly’ when absolutely ‘necessary to explain something’.

In many cases it was necessary to ‘differentiate’ between the different ‘races’ and ‘their’ respective ‘technology’, and they did ‘not’ even use the word “races.” They were referred to simply as different “sources.”

The Technology –

A lot of the technology we worked on was what you would expect, namely ‘anti-gravity’. Most of the ‘researchers’ ( on the staff ) – with ‘backgrounds’ in ‘propulsion’ and ‘rocketry’ – were ‘military’ men, but the ‘technology’ we were dealing with was so ‘out of this world’ that it didn’t really matter all that much what your background was because none of it applied.

All we could hope to do was use the ‘vocabulary’ of our respective fields as a way ‘to model’ the extremely bizarre ‘new concepts’ we were very slowly ‘beginning to understand’ as best we could.

A ‘rocket engineer’ doesn’t usually rub elbows much with a ‘computer scientist’, but inside PACL [ PALO ALTO CARET LABORATORY ( Palo Alto, California, USA ) ], we were all ‘equally mystified’ and were ready to ‘entertain any and all ideas’.

The ‘physicists’ made the most headway, initially, because out of all of our skills, theirs ‘overlapped the most’ with the ‘concepts behind this technology’ ( although that isn’t saying much! ). Once they [ physicists ] got the ball rolling though, we began to find that many of the ‘concepts found in computer science’ were applicable as well, albeit in very vague ways.

While I didn’t do a lot of work with the antigrav [ anti-gravity ] ‘hardware’, myself, I was occasionally involved in the ‘assessment’ of ‘how’ that ‘technology’ was meant to ‘interface’ with its ‘user’.

The antigrav [ anti-gravity ] was amazing, of course, as were the ‘advances’ we were making with ‘materials engineering’ and so on.

But what interested me most then, and still amazes me most to this day, was something completely unrelated.

In fact, it was this ‘technology’ that immediately jumped out at me when I ‘saw’ the Chad and Rajman ‘photos’, and even more so in the ‘Big Basin photos’.

The “Language” –

I put the word Language in quotes because calling what I am about to describe a “language” is a misnomer, although it is an easy mistake to make.

Their [ extraterrestrial ] ‘hardware’ was ‘not’ operated in quite the same way as ours.

In our technology, even today, we have a combination of ‘hardware and software’ running almost everything on the planet.

Software is more abstract than hardware, but ultimately it needs hardware to run it.

In other words, there’s no way to write a computer program on a piece of paper, set that piece of paper on a table or something, and expect it to actually do something.

The most powerful ‘code’ in the world still ‘does not actually do anything’ until a piece of ‘hardware interprets it [ software ]‘ and ‘translates’ its ‘commands’ into ‘actions’.

But ‘their [ extraterestrial ] technology’ is ‘different’.

It really did operate like the magical piece of paper sitting on a table, in a manner of speaking.

They had something akin to a ‘language’ that could quite literally ‘execute’ itself – at least in the ‘presence’ of a very specific type of ‘field’ [ ‘field presence execution’ ].

The ‘language’, a term I am still using very loosely, is a ‘system’ of ‘symbols’ ( which does admittedly very much resemble a written language ) along with ‘geometric forms’ and ‘[ geometric ] patterns’ that fit together [ ‘interlocking’ ] to ‘form diagrams’ that are themselves ‘functional’.

Once they [ interlocking symbolic format diagrams ] are ‘drawn’ – so to speak – on a suitable ‘surface’ made of a suitable ‘material’ and in the ‘presence’ of a certain type of ‘field’, they immediately begin performing the desired tasks. It really did seem like magic to us, even after we began to understand the principles behind it.

I worked with these ‘symbols’ – more than anything [ else ] – during my time at PACL [ PALO ALTO CARET LABORATORY ( Palo Alto, California, USA ) ], and ‘recognized them’ the moment I saw them in the ‘photos’.

They appear in a very simple ‘form’ on Chad’s ‘craft’, but appear in the ‘more complex diagram form’ on the ‘underside’ of the ‘Big Basin craft’ as well.

Both are unmistakable, even at the small size of the Big Basin photos.

An example of a diagram in the style of the Big Basin craft is included with this in a series of scanned pages from the [ mistitled ] “Linguistic Analysis Primer”.

We needed a copy of that diagram to be utterly precise, and it took about a [ one – 1 ] month [ 30-days ] for a team of six [ 6 ] to ‘copy’ that ‘diagram’ into our drafting program!

Explaining everything I learned about this technology would fill up several volumes, but I will do my best to explain at least ‘some’ of the ‘concepts’ – as long as I am taking the time to write all this down.

First of all, you wouldn’t open-up their [ extraterrestrial ] ‘hardware’ to find a CPU here, and a data bus there, and some kind of memory over there.

Their [ extraterrestrial ] ‘hardware’ appeared to be ‘perfectly solid’, and consistent, in terms of ‘material’ – from one side to the other. Like a rock or a hunk of metal.

But upon [ much ] closer inspection, we began to learn that it was actually one [ 1 ] big ‘holographic computational substrate’ – each “computational element” ( essentially, individual ‘particles’ ) can ‘function independently’ but are ‘designed to function together’ in tremendously ‘large clusters’.

I say it’s ‘holographic’ because you can ‘divide it up into the smallest chunks’ you want and still find a scaled-down but complete representation of the whole system.

They produce a ‘non-linear computational output’ when ‘grouped’.

So four [ 4 ] elements, working together, is actually more than four [ 4 ] times ‘more powerful than’ one [ 1 ].

Most of the internal “matter” in their [ extraterrestrial ] ‘crafts’, usually everything – except the outermost housing – is actually ‘this [ extraterrestrial ] substrate’ and can ‘contribute to computation’ at ‘any time’ and in ‘any state’.

The ‘shape’ of these [ extraterrestrial ] “chunks” of ‘substrate’ also had a profound ‘effect’ on its [ extraterrestrial ] ‘functionality’, and often served as a “shortcut” to achieve a goal that might ‘otherwise’ be more ‘complex’.

So back to the language.

The language is actually a “functional blueprint.”

The ‘forms’ of the ‘shapes’, ‘symbols’ and ‘arrangements’ thereof are themselves ‘functional’.

What makes it all especially ‘difficult to grasp’ is that every ‘element’ of each “diagram” is ‘dependent on’ and ‘related to’ every ‘other element’, which means ‘no single detail’ can be ‘created’, ‘removed’ or ‘modified’ independently.

Humans like written language because each element of the language can be understood on its own, and from this, complex expressions can be built.

However, their “language” is entirely ‘context sensitive’, which means that ‘a given symbol’ could mean as little as a ’1-bit flag’ in ‘one [ 1 ] context’, or – quite literally – contain the entire human genome or a galaxy star map in another.

The ability for a single, small symbol to contain, not just represent, tremendous amounts of data is another counter-intuitive aspect of this ‘concept’.

We quickly realized that even ‘working in groups’ of ten [ 10 ] or more on the ‘simplest of diagrams’, it was virtually impossible to get anything done. As each new feature was added, the ‘complexity of the diagram grew exponentially’ to unmanageable proportions.

For this reason we began to develop computer-based systems to manage these details and achieved some success, although again we found that a threshold was quickly reached beyond which even the supercomputers of the day were unable to keep up.

Word was that the ‘extraterrestrials could design’ these ‘diagrams’ as ‘quickly’, and [ as ] easily as a human programmer could write a [ computer language ] Fortran program.

It’s humbling to think that even a ‘network of supercomputers’ was ‘not’ able to ‘duplicate’ what they could do in their [ extraterrestrial ] own heads.

Our entire system of language is based on the idea of assigning meaning to symbols.

Their [ extraterrestrial ] technology, however, somehow ‘merges’ the ‘symbol’ and the ‘meaning’, so a subjective audience is not needed.

You can put whatever meaning you want on the symbols, but their behavior and functionality will not change, any more than a transistor will function differently if you give it another name.

Here’s an example of how complex the process is.

Imagine I ask you to incrementally add random words to a list such that no two [ 2 ] words use any of the same letters, and you must perform this exercise entirely in your head, so you can’t rely on a computer or even a pen and paper.

If the first [ 1st ] in the list was, say, “fox”, the second [ 2nd ] item excludes all words with the letters F, O and X.

If the next word you choose is “tree”, then the third [ 3rd ] word in the list can’t have the letters F, O, X, T, R, or E in it.

As you can imagine, coming up with even a third [ 3rd ] word might start to get just a bit tricky, especially since you can’t easily visualize the excluded letters by writing down the words.

By the time you get to the fourth [ 4th ], fifth [ 5th ] and sixth [ 6th ] words, the problem has spiraled out of control.

Now imagine trying to add the billionth [ 1,000,000,000 ] word to the list ( imagine also that we’re working with an ‘infinite alphabet’ so you don’t run out of letters ) and you can imagine how difficult it is for even a computer to keep up.

Needless to say, writing this kind of thing “by hand” is orders of magnitude beyond the capabilities of the brain.
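The word-list exercise above is easy to state in code. The following is a minimal sketch – the candidate words and the greedy keep-or-skip rule are illustrative assumptions, not anything from the CARET material – showing how the set of excluded letters snowballs as the list grows:

```python
def build_disjoint_list(candidates):
    """Greedily keep words so that no two kept words share any letter."""
    used = set()    # letters already consumed by kept words
    kept = []
    for word in candidates:
        letters = set(word)
        if letters & used:   # shares a letter with an earlier word; skip it
            continue
        kept.append(word)
        used |= letters      # these letters are now off-limits for later words
    return kept

# "fox" excludes f, o, x; "tree" then also excludes t, r, e,
# so "rot" ( r, o, t ) can no longer be added, while "plum" still can.
print(build_disjoint_list(["fox", "tree", "rot", "plum"]))
# → ['fox', 'tree', 'plum']
```

Checking a single candidate is trivial; the difficulty the letter describes is doing this mentally, where each new entry must be validated against the union of every prior entry’s letters, a constraint set that only ever grows.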

My background lent itself well to this kind of work though. I’d spent years ‘writing code’ and ‘designing’ both ‘analog’ and ‘digital’ circuits, a process that at least visually resembled these diagrams in some way.

I also had a personal affinity for ‘combinatorics’, which served me well as I helped with the ‘design of software’ running on ‘supercomputers’ that could juggle the often trillions [ 1,000,000,000,000 ] of rules necessary to create a ‘valid diagram’ of any ‘reasonable complexity’.

This overlapped quite a bit with ‘compiler theory’ as well, a subject I always found fascinating, and in particular ‘compiler optimization’, a field that back then was ‘not’ half [ 50% ] of what it is today.

A running joke among the linguistics team was that Big-O notation couldn’t adequately describe the scale of the task, so we’d substitute other words for “big”.

By the time I left I remember the consensus was “Astronomical-O” finally did it justice.

Like I said, I could go on for hours about this subject, and would love to write at least an introductory book on it if it were not still completely ‘classified’, but that’s not the point of this letter so I’ll try to get back on track.

The last thing I’d like to discuss is how I got copies of this material, what else I have in my possession, and what I plan to do with it in the future.

My Collection –

I worked at PACL [ PALO ALTO CARET LABORATORY ( Palo Alto, California, USA ) ] from 1984 to 1987, by which time I was utterly burned out.

The sheer volume of details to keep in mind while working with the diagrams was enough to challenge anyone’s sanity, and I was really at the end of my rope with the military attitude towards our “need to know”. Our ability to get work done was constantly hampered by their reluctance to provide us with the necessary information, and I was tired of bureaucracy getting in the way of research and development [ R&D ].

I left somewhere in the middle of a 3-month bell curve in which about a quarter of the entire PACL [ PALO ALTO CARET LABORATORY ( Palo Alto, California, USA ) ] staff left for similar reasons.

I was also starting to disagree with the direction the leadership wanted to take as far as the subject of extra-terrestrials went.

I always felt that at least some form of disclosure would be beneficial, but as a lowly CARET ‘engineer’ I wasn’t exactly in the position to call shots.

The truth is, our management didn’t even want us discussing – even among ourselves – non-technical aspects of this subject ( such as ethical or philosophical issues ), as they felt it was enough of a breach of security to let civilians like us anywhere near this kind of thing in the first place.

So, about 3-months before I resigned ( which was about 8-months before I was really out, since you don’t just walk out of a job like that with a 2-week notice ), I decided to start taking advantage [ remove PACL work documents, etc. ] of my ‘position’ [ a ‘situational position’ wherein PACL ‘security inspections’ on his ( Isaac ) ‘person’ had become lessened or ‘weak’ upon his ( Isaac ) ‘departures from the PACL facility’ ].

As I mentioned earlier, my DoD [ United States Department Of Defense ] experience got me into an internal management role sooner than some of my colleagues, and after about one ( 1 ) year of that kind of status, the outgoing [ departing ] searches [ security inspections ] each night became slightly less rigorous.

Normally, we were to empty out any containers, bags or briefcases, then remove our shirt and shoes and submit to a kind of frisking. Work was never allowed to go home with you, no matter who you were.

For me, though, the briefcase search [ security inspection ] eventually became all there was [ all that the security inspection became on him ( Isaac ) ].

Even before I [ Isaac ] actually decided to do it [ remove PACL work documents, etc. ], I was sure that I would be able to sneak certain materials out with me.

I wanted to do this [ remove PACL work documents, photos, etc. ] because I knew the day would come when I would want to write something like this, and I knew I’d regret it until the day I died if I didn’t at least leave the possibility open to do so.

So I started photocopying [ PALO ALTO CARET LABORATORY ( Palo Alto, California, USA ) ] documents and reports by the dozen.

I would then [ starting 3-months before he ( Isaac ) resigned from PALO ALTO CARET LABORATORY ( Palo Alto, California, USA ) ] put the papers [ documents, etc. ] under my shirt around my lower back, tucked far enough into my belt to ensure they wouldn’t fall out.

I could do this [ ‘physically able to do’ but ‘not authorized to do’ ] in any one of a few ‘short windowless hallways’ on some of the ‘lower floors’, which were among the few places that did ‘not’ have an ‘armed guard watching’ my every move.

I would walk in one end [ of the ‘short windowless hallways’ ] with a stack of papers large enough that when I came out the other end [ of the ‘short windowless hallways’ ] with some of them [ documents, photos, etc. ] in my shirt – there would ‘not’ be a visible [ observational ] difference in what I was holding.

You absolutely cannot be too careful if you’re going to pull a stunt like this.

As long as I walked carefully they would ‘not’ make a crinkling noise [ paper flex rustling upon movement ].

In fact, the more papers I took, the less noise they made, since a thicker stack was stiffer and did not flex [ and rustle ] as much upon movement.

I’d often take upwards of 10 to 20 pages at once [ each time ].

By the time I was done, I had made out with [ unlawfully removed documents, photos, etc. away from PALO ALTO CARET LABORATORY ( Palo Alto, California, USA ) ] ‘hundreds’ [ 200+ or more ] of ‘photocopies’, as well as a few ‘originals’ and a ‘large collection’ of ‘original photographs’.

With this ‘initial letter’, I have attached high resolution scans of the following:

– One [ 1 ] page from a [ PALO ALTO CARET LABORATORY ( Palo Alto, California, USA ) ] “inventory review” with a ‘photo’ – it appears to depict one ( 1 ) of the ‘parts’ found in the Rajman sighting, and ‘parts’ very similar to those of the Big Basin craft;

– The first [ 1st ] nine ( 9 ) pages of one ( 1 ) of our [ PALO ALTO CARET LABORATORY ( Palo Alto, California, USA ) ] ‘quarterly’ research ‘reports’;

– Scans of the ‘original photographs’ used ‘in that report’ – since the ‘photocopies obscure’ most of the ‘details’; and

– Five [ 5 ] pages from a ‘report’ on our [ PALO ALTO CARET LABORATORY ( Palo Alto, California, USA ) ] ongoing analysis of the “language” ( inappropriately titled “linguistic analysis” ) depicting the kind of diagram – just barely visible on the underside of the Big Basin craft.

This material is the most ‘relevant’ and ‘explanatory’ I could find on ‘short notice’.

Now that these are up [ on the internet ], ‘if’ I decide to release more in the future, I’ll be able to take my time and better search this rather large collection of mine that I’ve sadly never organized.

I’m not sure what I’ll be doing with the rest of the collection in the future.

I suppose I’ll wait and see how this all plays out, and then play it by ear.

There are certainly risks involved in what I’m doing, and if I were to actually be identified and caught, there could be rather serious consequences.

However, I’ve taken the proper steps to ensure a ‘reasonable level of anonymity’, and am quite secure in the fact that the information I’ve so far provided is by ‘no means unique’ among the many CARET participants [ who had access to it ].

Besides, part of me has always suspected that the [ United States of America ] government ‘relies on the occasional leak’ – like this – and actually wants them to happen, because it ‘contributes to a steady slow-paced path towards revealing’ the ‘truth’ of this ‘matter’.

Since Leaving CARET –

Like I said, I left PACL in 1987, but have kept in touch with a great many of my friends and co-workers from those days.

Most of us are retired by now, except – of course – for those of us who went on to get ‘teaching jobs’, but a few of us ‘still hear things’ [ ‘are still told of these matters’ ] through the grapevine.

As for CARET itself, I’m not sure what’s become of it.

Whether or not it’s still known by the same name, I’m quite sure it’s ‘still active’ in ‘some capacity’, although who knows where.

I heard from a number of people that PACL [ PALO ALTO CARET LABORATORY ( Palo Alto, California, USA ) ] closed up shop a few years after I left, but I’ve still yet to get a clear answer on why exactly that happened.

But I’m sure ‘the kind of work we did there’ [ PALO ALTO CARET LABORATORY ( Palo Alto, California, USA ) ] is ‘still going’ strong.

I’ve heard from a lot of friends that there are multiple sites like PACL in Sunnyvale, California ( USA ) and Mountain View, California ( USA ) also disguised to look like ‘unremarkable office space’.

But this is all second-hand information so you can make of it what you will.

Around 2002, or so, I came across Coast to Coast AM [ radio program in the United States of America ] and have been hooked ever since.

I admit, I don’t take most of the [ radio program ] show’s content as anything more than entertainment, but there have been occasions when I could be sure a guest was clearly speaking from experience or a well-informed source.

For me, there’s just something very ‘surreal about hearing all this speculation’ and ‘so-called inside information’ about UFOs [ Unidentified Flying Objects ] ( and the like ) while [ my ( Isaac ) ] being ‘personally able to verify’ at least ‘some of it’ as being true or false. It’s [ Coast to Coast AM radio program ( USA ) ] also a ‘nightly’ [ time period, when Coast to Coast AM radio is broadcast ] reminder of how hectic things were in those days, which helps me enjoy my retirement all the more.

Knowing I’m not part of that crazy world anymore really is something I enjoy on a daily basis, as much as I miss some of it.

Conclusion –

What I’ve shared so far is only a very small portion of what I have, and what I know.

Despite the very sheltered and insulated atmosphere within CARET, I did ultimately learn a great deal from various colleagues, and some of what I learned is truly incredible.

I’d also like to say that for what it’s worth, during my time [ 1984 – 1987 ] there [ PALO ALTO CARET LABORATORY ( Palo Alto, California, USA ) ] I never heard anything about invasions, or abductions, or many of the more frightening topics that often pop up on Coast to Coast AM [ radio program ( USA ) ].

That’s not to say that none of it is true, but in my time working alongside some of the most well-connected people in this field, it never came up.

So at the very least I can say my intent is not to scare anyone.

My view on the extra-terrestrial situation is very much a positive, albeit still highly secretive, one.

One thing I can definitely say is that if they wanted us gone, we would have been gone a very, very long time ago, and we wouldn’t even have seen it coming.

Throw out your ideas about a space war or anything silly like that. We’d be capable of fighting back against them about as much as ants could fight back against a stampede of buffalo.

But that’s okay, we are the ‘primitive race’, they [ extraterrestrials ] are the ‘advanced races’, and that’s just the way it is.

The ‘other advanced races let them live through their primitive years’ back in ‘their day’, and there is no reason to think it will be any different for us.

They [ extraterrestrials ] are not in the market for a new planet, and even if they [ extraterrestrials ] were there are way too many planets out there for them [ extraterrestrials ] to care about ours enough to take it by force.

To reiterate my take on the recent sightings, I would guess that experimentation – done in the last couple of months – on a device that, among other things, is capable of interfering with various crafts’ onboard invisibility, has resulted in a sudden wave of sightings.

It may ‘not’ explain ‘all’ of the recent events, but like I said, I’d bet my life that ‘is’ exactly what happened at Big Basin – at least – and it’s probably related in some way to the Chad, Rajman and Tahoe [ Lake Tahoe, California / Nevada ( USA ) ] sightings [ of the unidentified flying object ( UFO ) ].

So, despite all the recent fanfare over this, I’d say this does ‘not’ mean much.

Most importantly, they are ‘not suddenly’ “here,” they [ extraterrestrials ] have been here for a long time, but just [ have ] happened to turn ‘intentionally visible’ for brief periods ‘recently’.

Lastly, there are so many people selling books, and DVDs, and doing lectures and all that so, I would like to reiterate the fact that I am ‘not’ here to ‘sell’ anything.

The material I’m sharing is ‘free to distribute’ provided it’s all kept intact and unmodified, and this letter is included.

I tend to question the motives of anyone charging money for their information, and will assure you that I [ Isaac ] will never do such a thing.

And in the future, just to cover all the bases, anyone claiming to be ‘me’ [ Isaac ] who ‘is’ selling a DVD or book is most certainly ‘not going to be me’ [ Isaac ].

Any future releases from me [ Isaac ] will come from the e-mail address I’ve used to contact Coast to Coast AM [ USA radio program ], and will be sent to them [ Coast to Coast AM ( USA radio program ) ] only.

I’d like to make this clear as well to ensure that people can be sure that any future information comes from the same source, although I must be clear:

At this time I do not have any future plans for additional information. Time will tell how long I will maintain this policy, but do not expect anything soon.

I’d really like to let this information “settle” for a while and see how it goes.

If I find out I’m getting an IRS [ United States Department of the Treasury, Internal Revenue Service ( IRS ) ] audit tomorrow, then maybe this wasn’t too smart.

Until then, I’m going to take it slow.

I hope this information has been helpful.

– Isaac

– –

One of the documents ( in the form of high resolution scans of the original ) uploaded was called “PALO ALTO CARET LABORATORY Q-4 1986 RESEARCH REPORT” – here are some excerpts:

1. OVERVIEW –

This document is intended as a primer for the tentative findings of the Q4 1986 research phase ( referred to herein as “Q-4 1986” ) at the Palo Alto CARET Laboratory ( aka PACL ). In accordance with the CARET program mission statement, the goal of this research has been achieving a greater understanding of extraterrestrial technology within the context of commercial applications and civilian use. Examples of such applications, in no particular order, include transportation, medicine, construction, energy, computing and communication.

The ultimate goal of this research is to provide a core set of advanced technologies in a condition suitable for patent review.

2. EXTRACTION –

The process of converting raw artifacts of extraterrestrial origin to usable, fully-documented human technology is termed extraction. The extraction process ultimately consists of two phases:

First [ 1st ] is the establishment of a complete theoretical and operational understanding of the artifact; and,

Second [ 2nd ] is a distillation of the artifact’s underlying principles into usable, product-oriented technology.

Suggestions of specific product applications on behalf of PACL have been encouraged, but are not considered mandatory or essential.

– –

From: Isaac Subject: Re: “Drones” Date: June 27, 2007

Isaac:

“There are a few misconceptions that I have noticed so far and would like to clear them up, and will also answer your questions:

1) I realize now that I did not make this clear, but I should clarify that I am not responsible for the blacking out of the Q4-86 report. Most of the copies I was able to make came from documents that were already archived, which meant that they had already been censored for use by outside parties that needed access to some, but not all, of CARET’s information. I’m trying to share this information, not hide it, but if I did feel that if a given topic was too sensitive for some reason, I would make it clear that I had personally covered it up and probably try to give a reason why.

2) I do not understand the question about why the diagram would be “formatted for 8.5 x 11”… As I mention in my letter, the diagram is a reproduction, not the original. We had a team of technical artists painstakingly copy the diagram from its original source, which was a slightly curved panel not unlike the one seen in the Big Basin craft, although this one was apparently inside the craft, not on the outside. We copied it into a drafting program over the course of about a month.

Our software was understandably primitive by today’s standards, but it was still orders of magnitude more powerful than a pencil and paper would have been. This made a task that would have otherwise been nearly impossible relatively feasible, albeit extremely time-consuming. I can assure you, “they” did not make anything particularly convenient for us. One of the reasons we chose to reproduce that particular diagram was because out of all the diagram-artifacts we had access to, it was on the flattest surface.

Since the geometry of the forms is extremely important, curvature of the surface it’s printed on must be “corrected” if it is to be reproduced in a surface with a different contour (such as a flat page). This can be done in a number of ways, by either using a mathematical model to reverse the effect of the surface curves on the diagram’s shapes, or by methods of physical measuring that allow precise measuring of irregular surfaces. In either case, however, it adds a significant new dimension of labor to an already extremely labor-intensive task, so it’s avoided whenever possible. We really just needed one or two accurately copied diagrams to serve as convenient examples for our own work in decoding and reproducing it, so luckily this was not something we had to do often. Some experimentation was being done on ways to “scan” the diagrams as well, using an almost completely automated process that could automatically account for curved surfaces, but during my time there, very little progress was made on this front.
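The curvature correction described above can be illustrated with a toy sketch. This is purely a hypothetical illustration — the letter never specifies the actual mathematical model used — assuming the simplest case of a panel curved like a section of a cylinder, where "unrolling" the surface preserves arc length along the curved direction:

```python
def unroll_cylinder(points, radius):
    """Map (theta, z) points on a cylindrical panel of the given radius
    to flat (x, y) page coordinates. Unrolling preserves arc length, so
    distances measured along the curved surface survive on the flat page."""
    return [(radius * theta, z) for theta, z in points]

# A shape drawn on the curved panel keeps its true dimensions when unrolled;
# naively projecting straight down instead would compress shapes toward the
# panel's edges, distorting the geometry the diagram depends on.
panel = [(0.0, 0.0), (0.5, 0.0), (0.5, 1.0), (0.0, 1.0)]  # 1 x 1 square on a radius-2 panel
flat = unroll_cylinder(panel, radius=2.0)  # -> [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
```

An irregular (non-cylindrical) surface would instead require the physical-measurement approach the letter mentions, since no single closed-form unrolling exists for it.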

3) I think the confusion over the quality of the documents stems from the fact that he (critic) is under the impression they (CARET document) were typeset. They were not. First of all, I’m no guru when it comes to graphics or design, but being in close contact with numerous people from places like XPARC will give you enough background to know the lay of the land. What’s first important to note is that systems capable of desktop publishing had been in development for many years before CARET, mostly starting with the Xerox Alto (in 1973), which XPARC developed themselves.

In fact, I once remember hearing from someone related to the original Alto team that Boeing (I believe) used the Alto to lay out and print the documentation for one of their planes (or something to that effect, I heard the story years ago). The joke was apparently that there was so MUCH documentation that the plane itself could essentially be filled with the pages. Furthermore, laser printing itself had also been around for many years (albeit in an extremely expensive form), and was also developed within XPARC (more or less). Other systems, such as PERQ and Lilith, also came out around the late 70′s and while none of them turned into major commercial products, they were not uncommon among large companies and [mostly] universities and were put to very productive use.

These systems were also the inspiration for the Apple Lisa and Macintosh, which was of course perhaps the biggest factor in the consumer-level desktop publishing boom of the late 80′s and early 90′s. By 1984, there were quite a few options available for producing these kinds of documents, they were just ABSURDLY expensive, so they weren’t on every street corner. Obviously it was nowhere near as turnkey and simple as it is today, but it was a very crude approximation of the same process with similar tools. We just had far less features and everything was a hell of a lot slower. But the point I’m trying to make is that while our method of documentation was somewhat advanced for its time, and also somewhat uncommon, it was hardly unattainable by a sufficiently motivated, financed, and well-connected organization.

I had very little contact with the technical writers for the most part, but I do know that we were using this kind of technology for both page layout and printing. CARET was expected to produce a massive amount of detailed, well-formatted documentation that could be easily modified and re-used for numerous drafts and revisions, and we would not have been able to keep up using traditional page layout and typesetting techniques. The mid-1980′s were a very transitional period for these fields, and I would suggest that people do not assume we were using run-of-the-mill standards.

One of the things I appreciated most about CARET was that if the technology was available, and we needed it to work better or more effectively, it was given to us with little debate. But typesetting and digital page layout are apples and oranges, so I think most of this is a moot point anyway.

The bottom line is that many people both inside and outside the engineering world frequently underestimate how long we’ve had a lot of the technology we have. 99% of the algorithms we use today were developed decades ago, they just didn’t have the same practical applications immediately available. Most of the engineers of the 60′s and 70′s would have been right at home with today’s developments and technologies. The only difference is that things have gotten smaller and faster. In the vast majority of technologies, that is the only thing that REALLY changes from one era to the next. If I told the average person that we had speech-synthesizing technology in 1936, they probably wouldn’t believe me.

I could show you a prototype of a simple drafting/design system that was operated by a light pen directly on a screen from the 1960′s. You could draw a shape freehand, then immediately rotate it, modify it, duplicate it, or whatever. You could draw lines connecting different objects, then erase them by simply drawing a squiggly line over it. The computer could interpret the squiggles as a sign to erase something, all in real time. And this was half a century ago, and decades before CARET. Think about that for a moment. The point is, most of what we have today is much older than we think. The only differences are that it’s faster, cheaper, and a marketing team has given it a glossy finish and found a commercial application for it. But if you take away some of the speed, power, ubiquity and consumer appeal, you’ll find a lot of today’s technology scattered throughout much of the 20th century. I hope this is helpful.

Isaac”

– –

From: Isaac Subject: Re: “Drones” Date: June 27, 2007

Isaac:

“1) While I wasn’t a major player in the (CARET) organization, I was hardly ‘some worker.’ My middle-management position is the only reason I was able to make out with what I did. Bear in mind that even someone in my position would never get the chance to leave with even the smallest of actual artifacts, but paperwork smuggling was feasible for anyone who wasn’t subjected to the frisking.

Also, let’s not forget that paperwork only proves so much. I’ll be the first to agree that everything I’ve provided could be faked, I suppose. It is, after all, just a series of images. While the powers that be obviously don’t want this material leaking if they can help it, they’re certainly aware that scans of documents aren’t in the same league as UFOs landing on the White House lawn. I’m not the first person to leak a document or a photo, and I won’t be the last. The information I’ve shared is very unlikely to change the world, and this is the reason I’m not worried about being literally murdered if I’m identified. I’ll face consequences to be sure, but it’s not the kind of thing they kill for.

2) Of course the manual doesn’t look anything like typical government and military documents. The entire purpose of CARET was to recreate the look and feel of silicon valley private enterprise, populate it with private industry engineers, and let it tackle the problem of extraterrestrial technology research. Style manuals were among the numerous things we brought with us from the ‘outside world.’ I’m not sure what else can be said about this. I agree it’s uncommon for non-standard documents to come out of this kind of research, but it’s even more uncommon for people like myself ( and even more so for many of my co-workers ) to be brought into this kind of project in the first place. Most of us were decidedly not military men. I find that a lot more bizarre than the fact that we were able to design our reports a certain way. CARET was an exception to many of the usual rules.

3) If he (one of many critics who emailed Earthfiles and which I shared with Isaac) believes the pictures are fake, I certainly can’t do or say anything to prove otherwise. He sounds very sure of himself.

4) Most importantly, be very wary of anyone who claims to ‘know the mind’ of extraterrestrials. The comments he’s made are, to put it lightly, naive and extremely presumptuous. Firstly, he’s referring to ‘the aliens’ as if there is a single collective group of them. The universe is not split into ‘humans’ and ‘non-humans,’ any more than Earth is split up into ‘Spanish’ and ‘non-Spanish’ or something equally arbitrary. There are numerous races – and again, like our own races of humans here on earth, they do things in very different ways.

His comment that ‘the aliens don’t do this or that’ is akin to saying ‘humans don’t speak Japanese.’ Well, many humans don’t, but Japanese humans certainly do. The point is not that his statement is right or wrong, but simply that it’s phrased illogically. He then goes on to suggest that the design of the drones is wasting space, which is again, alarming in its arrogance. We had some of the brightest minds in the world spending years just to understand a single facet of their technology, while this individual claims to be able to assess basically every detail of a given design after looking at a single photo and conclude that it’s inefficient. I’m not even sure such a statement should be dignified with a response, and I’m sure you can understand why.

To be honest, whoever this person is, I wrote him off as soon as he said ‘the aliens would never design as these pictures depict.’ That’s about as presumptuous (if not ignorant) as a statement on this subject can be, at least coming from a fellow human. Unless there’s an alien engineer on the other side of this email, there’s simply no way such statements could have merit. I’m really only writing this as a courtesy to you.

At best, he’s been exposed to technology from a radically different race, and at worst, he doesn’t know what he’s talking about. This individual may have access to real information, and he might not. If he is a fellow ‘whistle blower,’ then I’m not interested in attacking him. If he’s not, and is simply making things up, then I’m even less interested. Whatever he is or isn’t is not for me to say, but judging by the way he talks about this issue I have my doubts.

It’s a big world and these are complicated issues. A sense of humility and the admission we don’t know everything is one of our greatest assets.

Isaac”

====

Submitted for review and commentary by,

Concept Activity Research Vault ( CARV ), Host
E-MAIL: ConceptActivityResearchVault@Gmail.Com
WWW: http://ConceptActivityResearchVault.WordPress.Com

Advanced Stealth – Part 2
by, Concept Activity Research Vault ( CARV )
April 21, 2009
CALIFORNIA, Los Angeles – April 21, 2009 – The Los Angeles Times article ( below ) mentions NORTHROP GRUMMAN prototype development testing of lighter-than-air stealth-type dirigibles in the Washington, D.C. Metroplex Area ( New Jersey, Pennsylvania, Maryland, Virginia, and Washington, D.C. ).
Might central Virginia’s NORTHROP GRUMMAN SPERRY MARINE DIVISION ( U.S. Route 29 / South Seminole Trail, Charlottesville, Virginia ) test site monitor new stealth technology equipped aerial vehicles?
Imagine airship vessels and craft whose surface skin material can be remotely controlled to illuminate like an artificial star, or like an approaching aircraft with landing lights shining brightly downward, in reaction to ground-based InfraRed-like invisible light pumped onto a stealth dirigible ( lighter-than-air, pressurized helium gas, aerial platform, LEO vehicle ). Such a surveillance vessel’s external surface skin would have the technological capability to gather invisible light and convert that hidden light into a refracted visible white-light spectrum, like an extremely large floodlight / landing-light / strobe-light emission emanating from a buoyant airborne surveillance platform. Imagine the central Virginia Greene County western quadrant foothills – near Rockingham County – with an artificial star overhead: a snooping blimp that doesn’t look like a blimp, just an innocent-looking star in the night sky.
Conventional technologies would normally comprise bulky and heavy items such as arrays of refractive mirrors, heat diffusers, high-energy collectors / converters, and high-energy power supplies – and even more that could not possibly be launched inside a lighter-than-air dirigible airship platform; at least not a blimp that would still wish to remain invisible or unobservable from the ground.
If the craft, vehicle, or vessel surface skin were made from a new electronically controlled sensitive material – like the electronic textile materials the U.S. Department of Defense ( DoD ) DARPA MEMS Biofutures Program has been working on for over 10-years – it might be a material so advanced that it could, on demand / command, ‘react’ to any spectrum of invisible light ( e.g. a blue-green laser communication beamlight ) and be controlled from a long-distance wireless control station, which might make anything possible. What about super-secret electronic textiles and their hidden capabilities?
What about ‘transparent airships’ and other new advanced stealth technologies?
– – – –
Scientists Create Observable Fiber Webs That See
July 2006
In a radical departure from conventional optic lens type materials, MIT scientists developed a sophisticated optical system comprised of mesh type webbed light detection fibers.
Having a number of advantages over its conventional lens predecessors, this new fiber construct is currently capable of measuring the direction, intensity and phase of light ( a property used to describe a light wave ) without the lenses, filters, or detector arrays that are the classic elements of conventional optical systems such as eyes or cameras.
Researchers expect the new system will be capable of much more.
Potential applications range from ‘improved space telescopes’ to ‘clothing that provides situational awareness’ to soldiers, or ‘those visually impaired’.
The transparent fiber-webs could even enable huge computer screens to be activated with beams of light instead of the touch of a finger. “We could use light to enhance interaction with computers and even gaming systems,” said Professor Yoel Fink of the Department of Materials Science and Engineering and the Research Lab of Electronics, leader of the team. “It’s intriguing–the idea of touching with light.”
The scientists report the work in the June 25 online edition of Nature Materials, and it is featured on the cover of the July print issue of the magazine.
The human eye, digital and film cameras, and even the Hubble space telescope rely on lenses and detector surfaces ( like the retina ) to create images. But while these systems deliver excellent images, they are constrained by their size, weight, fragility and limited field of view.
In contrast, the fiber webs are flexible and lightweight. Plus, a fiber web in the shape of a sphere can sense the entire volume of space around it, according to Fink.
“When you’re looking at something with your eyes, there’s a particular direction you’re looking in,” says Ayman Abouraddy a research scientist in Fink’s lab. “The field of view is defined around that direction. Depending on the lens, you may be able to capture a certain field of view around that direction, but that’s it. Until now, most every optical system was limited by an optical axis or direction.”
In addition to having an unlimited field of view, the fiber sphere can also detect the direction of incoming light. Light enters the transparent sphere at one point and exits at another, providing a directional reference back to the light source.
Fink’s team has also created a flat, two-dimensional web of fibers and placed two such webs in parallel. These constructs, which can measure the intensity of incoming light, are capable of generating rough images of objects placed near them, such as the shape of a letter “E” cut stencil-like from paper and lit from behind. The image shows up on a computer screen, reconstructed from a light intensity distribution measured by the webs.
The fibers used in the webs are about 1 millimeter in diameter. They consist of a photoconductive glass core with metal electrodes that run along the length of the core, all surrounded by a transparent polymer insulator.
The fibers can detect light anywhere along their length, producing a change in current in an external electrical circuit. While one fiber on its own cannot detect the exact location of an incoming beam of light, when many fibers are arrayed in a web, their points of intersection provide the exact coordinates of the beam. A computer assimilates the data generated by the web and translates it for the user.
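The localization principle described here — a single fiber constrains the beam only to a line, while crossed fibers pin it to a point — can be sketched in a few lines. This is an illustrative model only, not the researchers' actual processing code: assume two orthogonal sets of fibers laid out as a grid, where each fiber merely reports whether it saw light somewhere along its length.

```python
def locate_beams(lit_rows, lit_cols):
    """Given the indices of horizontal (row) fibers and vertical (column)
    fibers that registered light, return candidate (col, row) beam
    coordinates. Each lit fiber alone gives only a line; the intersections
    of lit rows and lit columns pin down points."""
    return [(c, r) for r in sorted(lit_rows) for c in sorted(lit_cols)]

# One beam hitting row 3 and column 5 is located unambiguously:
print(locate_beams({3}, {5}))  # -> [(5, 3)]
```

Note that two simultaneous beams light two rows and two columns, yielding four candidate intersections, which hints at why a real system would also exploit the per-fiber intensity data the article mentions to resolve the ambiguity.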
If the fibers were woven into a textile, for instance, an embedded computer could provide information on a small display screen or even audibly.
Improving the imaging power of the fiber webs will require reducing the diameter of the fibers and creating denser webs. Fink says he’s not certain whether the new technology will one day replicate human vision. “Just the idea of imaging with a transparent object is a true eye opener,” he said.
Reference(s)
– – – –
SOURCE: The Los Angeles Times ( Los Angeles, California, USA )
Scientists Create Observable Fiber Webs That See
July 2006
In a radical departure from conventional optical lens materials, MIT scientists have developed a sophisticated optical system composed of mesh-like webs of light-detecting fibers.
Having a number of advantages over its conventional lens predecessors, the new fiber construct can already measure the direction, intensity and phase of light ( a property used to describe a light wave ) without the lenses, filters, or detector arrays that are the classic elements of conventional optical systems such as eyes or cameras.
Researchers expect the new system will be capable of much more.
Potential applications range from improved space telescopes to clothing that provides situational awareness to soldiers or the visually impaired.
The transparent fiber-webs could even enable huge computer screens to be activated with beams of light instead of the touch of a finger. “We could use light to enhance interaction with computers and even gaming systems,” said Professor Yoel Fink of the Department of Materials Science and Engineering and the Research Lab of Electronics, leader of the team. “It’s intriguing–the idea of touching with light.”
The scientists report the work in the June 25 online edition of Nature Materials, and it is featured on the cover of the July print issue of the journal.
The human eye, digital and film cameras, and even the Hubble space telescope rely on lenses and detector surfaces ( like the retina ) to create images. But while these systems deliver excellent images, they are constrained by their size, weight, fragility and limited field of view.
In contrast, the fiber webs are flexible and lightweight. Plus, a fiber web in the shape of a sphere can sense the entire volume of space around it, according to Fink.
“When you’re looking at something with your eyes, there’s a particular direction you’re looking in,” says Ayman Abouraddy, a research scientist in Fink’s lab. “The field of view is defined around that direction. Depending on the lens, you may be able to capture a certain field of view around that direction, but that’s it. Until now, most every optical system was limited by an optical axis or direction.”
In addition to having an unlimited field of view, the fiber sphere can also detect the direction of incoming light. Light enters the transparent sphere at one point and exits at another, providing a directional reference back to the light source.
Fink’s team has also created a flat, two-dimensional web of fibers and placed two such webs in parallel. These constructs, which can measure the intensity of incoming light, are capable of generating rough images of objects placed near them, such as the shape of a letter “E” cut stencil-like from paper and lit from behind. The image shows up on a computer screen, reconstructed from a light intensity distribution measured by the webs.
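The article does not say how the web measurements become a picture; one crude approach consistent with an image “reconstructed from a light intensity distribution” is back-projection, sketched hypothetically below ( the function and the bar-shaped test pattern are invented for illustration, not MIT’s actual reconstruction code ):

```python
def backproject(row_profile, col_profile):
    """Crude image reconstruction from a crossed-fiber web's per-fiber
    intensity totals: estimate pixel (i, j) as the product of the light
    seen by row fiber i and column fiber j.  Rough, but enough to
    recover the outline of a simple backlit stencil."""
    return [[r * c for c in col_profile] for r in row_profile]

# A backlit vertical bar lights every row fiber but only column fiber 1:
for row in backproject([1, 1, 1], [0, 1, 0]):
    print(row)  # each row prints [0, 1, 0] -- a vertical bar
```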
The fibers used in the webs are about 1 millimeter in diameter. They consist of a photoconductive glass core with metal electrodes that run along the length of the core, all surrounded by a transparent polymer insulator.
The fibers can detect light anywhere along their length, producing a change in current in an external electrical circuit. While one fiber on its own cannot detect the exact location of an incoming beam of light, when many fibers are arrayed in a web, their points of intersection provide the exact coordinates of the beam. A computer assimilates the data generated by the web and translates it for the user.
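The row-and-column intersection idea above is easy to picture in code; the sketch below is purely illustrative ( a hypothetical square web with a single beam – not the team’s actual software ):

```python
def locate_beam(row_hits, col_hits):
    """Locate a single light beam on a crossed-fiber web.

    Each fiber reports only that light struck *somewhere* along its
    length; with horizontal and vertical fibers woven into a grid,
    the intersection of the one lit row fiber and the one lit column
    fiber pins down the beam's exact coordinates."""
    rows = [i for i, hit in enumerate(row_hits) if hit]
    cols = [j for j, hit in enumerate(col_hits) if hit]
    if len(rows) != 1 or len(cols) != 1:
        raise ValueError("expected exactly one beam on the web")
    return rows[0], cols[0]

# A 5 x 5 web with a beam crossing at row 3, column 1:
print(locate_beam([False, False, False, True, False],
                  [False, True, False, False, False]))  # -> (3, 1)
```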
If the fibers were woven into a textile, for instance, an embedded computer could provide information on a small display screen or even audibly.
Improving the imaging power of the fiber webs will require reducing the diameter of the fibers and creating denser webs. Fink says he’s not certain whether the new technology will one day replicate human vision. “Just the idea of imaging with a transparent object is a true eye opener,” he said.
Reference(s)
– – – –
SOURCE: Los Angeles Times ( Los Angeles, California, USA )
Pentagon Plans Blimp To Spy From New Heights
by, Julian E. Barnes
March 13, 2009
USA, Washington, D.C. – The Pentagon said Thursday that it intends to spend $400,000,000.00 ( USD ) to develop a giant dirigible that will float 65,000 feet above the Earth for 10-years, providing unblinking and intricate radar surveillance of the vehicles, planes and even people below.
 “It is absolutely revolutionary,” Werner J.A. Dahm, chief scientist for the U.S. Air Force, said of the proposed unmanned airship – describing it as a cross between a satellite and a spy plane.
 The 450-foot-long craft would give the U.S. military a better understanding of an adversary’s movements, habits and tactics, officials said. And the ability to constantly monitor small movements in a wide area – the Afghanistan and Pakistan border, for example – would dramatically improve military intelligence.

“It is constant surveillance, uninterrupted,” Dahm said. “When you only have a short time view – whether it is a few hours or a few days – that is not enough to put the picture together.”

The project reflects a shift in Pentagon planning and spending priorities under Defense Secretary Robert M. Gates, who has urged the military services to improve intelligence and surveillance operations while cutting high-tech weaponry costs.

If successful, the dirigible – brainchild of the Air Force and the Pentagon research arm [ DARPA ] – could pave the way for a fleet of spy airships, military officials said.

However, it marks a return to a form of flight that has stirred anxiety and doubt since the 1937 Hindenburg disaster, in which 36 people were killed when that airship went up in flames in New Jersey.

The military has used less sophisticated tethered blimps, called ‘aerostats’, to conduct surveillance around military bases in Iraq.

But flying at 65,000-ft., the giant airship would be nearly impossible to see, beyond the range of any hand-held missile, yet safe from most fighter planes.

And its range would be such that the spy craft could operate at the distant edges of any military theater, probably out of the range of surface-to-air [ SAM ] missiles as well.
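The article gives no numbers for that range, but the geometric line-of-sight approximation d ≈ √( 2 R h ) lets a reader sanity-check the 65,000-ft claim; the sketch below is an illustrative estimate, not a figure from the source:

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius, meters

def horizon_km(altitude_ft):
    """Approximate distance (km) to the geometric horizon for a
    platform at the given altitude, using d = sqrt(2 * R * h)."""
    h_m = altitude_ft * 0.3048  # feet -> meters
    return math.sqrt(2 * EARTH_RADIUS_M * h_m) / 1000

# At the airship's stated 65,000-ft station: roughly 500 km line of
# sight in every direction (actual radar range would differ in practice).
print(round(horizon_km(65_000)))  # -> 502
```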

The Air Force’s intelligence, surveillance and reconnaissance abilities have improved dramatically in the last 5-years with the expansion of the Predator and other drones [ Unmanned Aerial Vehicles ( UAV ) ]. Although such craft can linger over an area for a long time, they do not watch constantly.

The giant airship’s military value would come from its radar system, whose giant antenna would allow the military to see farther and in more detail than it can now.

“Being able to observe threats [and] understand what is happening is really the game-changing piece here,” Dahm said.

The dirigible will be filled with helium and powered by an innovative system that uses solar panels to recharge hydrogen fuel cells. Military officials said those underlying technologies, plus a very lightweight hull, were critical to making the project work.

“The things we had to do here were not trivial; they were revolutionary,” said Jan Walker, a spokeswoman for the Defense Advanced Research Projects Agency [ DARPA ], the Pentagon research arm.

The U.S. Air Force has signed an agreement with DARPA to develop a demonstration dirigible by 2014. The prototype will be one-third ( 1/3rd ) as long as the planned surveillance craft, which is known as ISIS ( Integrated Sensor Is Structure ) because the radar system will be built into the structure of the ship.

While the military says the craft is closer to a blimp than to a zeppelin, which has a rigid external structure, officials usually call the project an airship. Blimps get their shape from pressurized helium gas.

The Pentagon has not yet selected a contractor to build the prototype. Earlier work was done by NORTHROP GRUMMAN in Redondo Beach, California; Baltimore, Maryland; and other locations – and by LOCKHEED MARTIN in Palmdale, California; Akron, Ohio; and Denver, Colorado.

Reference

http://www.latimes.com/news/nationworld/nation/la-na-spyblimp13-2009mar13,0,4608400.story

– – – –

 

Submitted for review and commentary by,

 

Concept Activity Research Vault ( CARV ), Host
E-MAIL: ConceptActivityResearchVault@Gmail.Com
WWW: http://ConceptActivityResearchVault.WordPress.Com

 

Robot Combat Intelligence

[ PHOTO ( above ): W-88 miniature nuclear bomb property of USA ( click to enlarge ) ]

Robot Combat Intelligence
by, Concept Activity Research Vault ( CARV )

January 18, 2011 21:08:42 ( PST ) ( Originally Published: February 1, 2002 )

DISTRICT OF COLUMBIA, Washington – January 18, 2011 – Over 12-years ago, the United States realized too late that its ‘miniature nuclear weapons technology delivery system’ ( W-88 ) secrets had already been stolen ( from the vault of its insurance carrier ), after the People’s Republic of China ( PRC ) rapidly produced its own version. Only a select few realized that a secret U.S. decision then took futuristic concepts into development for U.S. global military applications – deploying technologies that seemed conceivable only in the science-fiction motion picture films ( e.g. STAR TREK, STAR WARS, MATRIX, and more ) that had shocked audiences worldwide. In 1999, secret U.S. defense endeavors – forged with several universities and U.S. government contract private sector organizations – were led by the U.S. Department of Defense ( DoD ) Defense Advanced Research Projects Agency ( DARPA ), which created even newer, more advanced multiple-Program stratagems employing various forms of ‘combinatoric’ technologies developed to globally deploy U.S. military dominance with various and sundry secret-sensitive devices and systems far beyond many imaginations.

DARPA SIMBIOSYS Program –

The DARPA SIMBIOSYS Program entails, amongst other things, multi-functional microbiological nanotechnology robot ( android ) devices, primarily for military applications – where the technology remained until just a few years ago, when it began being applied in some medical arenas.

To understand what is ‘current’, one must first look briefly at DARPA Programs ‘past’ ( 1999 – 2002 ), which alone is enough to ‘still send chills down many people’s spines today’. Once one realizes what DARPA was doing 12-years ago, it is not all that unfathomable to comprehend where DARPA has taken – and will continue taking – many.

SIMBIOSYS ( 1999 – 2002 ) –

In 1999, DARPA SIMBIOSYS developed a combined quantitative understanding of various biological phenomena characteristics, opening the DARPA door to what amounts to MicroElectroMechanical Systems ( MEMS ) integrating microphotonics in, amongst many things, electro-optic spatial light modulators ( SLM ). Combined with very short pulse solid-state lasers, these provide powerful new capabilities: secure communication up-links ( multi-gigabits per second ); ‘aberration free’ 3-D imaging and targeting performed at very long ranges ( greater than 1,000 kilometers away ); innovative design system integration of MEMS spatial light modulators ( SLM ) providing quantum leaps in wavefront control for photonics and high speed electronics; and even ‘flexible cloth-like smart materials’. DARPA wants this hardware placed into production devices and systems applications, optimizing both the U.S. and its specially selected few foreign nation U.S. friendlies ( Israel ) to hold future warfaring battlespace management superiority over other foreign nation threats.

DARPA SIMBIOSYS includes classes of biological molecules ( e.g. antigens, antibodies, DNA, cytokines, enzymes, etc. ) for analyses and diagnoses studies, from:

1. Biochemical sensors, sensing ‘details from environments’; and,

2. Biochemical sensors, sensing ‘details from human body fluids’.

Specific examples under each of those two ( 2 ) groups are left to the discretion of the Principal Investigator ( PI ).

Bio-molecule importance selection criteria include:

1. Microsystem sensors, for automated sampling and analyses, extendibility;

2. Bio-molecule simulants, to the extent they represent materials relevant to the U.S. Department of Defense ( DoD ); and,

3. Bio-detection high degree of sensitivity and specificity processing, etc.

DARPA SIMBIOSYS emphasis is at the ‘molecular level’ for ‘sensing’ and ‘detection’.

The SIMBIOSYS Program precludes human cell and human tissue based sensing because other DARPA programs already address those issues.

SIMBIOSYS Goals –

SIMBIOSYS Program ‘stimulates multi-disciplinary research’ – bringing together biologists, chemists, engineers, physicists, computer scientists and others to address difficult and pressing challenges in advancing micro and nano-biotechnology.

The SIMBIOSYS Program goal is to ‘utilize phenomena’ in ‘bio-fluidic transport’, ‘molecular recognition’ and ‘signal transduction’ from joint studies in modeling and experiments.

The SIMBIOSYS Program joint effort expects results in new hardware device, process and production communities that will begin utilizing new models, rules, methods and processes – together enabling design and development of enhanced-performance next-generation bio-microdevices.

DARPA Advanced Projects –

DARPA is focusing on, amongst many, these advanced projects:

1. Bioengineering artificial intelligence ( AI ) systems sized from nanometers and meters up to large-scale robotic systems deployed globally;

2. Biological hybrid devices and systems, inspired from computational algorithms and models;

3. Biosynthesized composite materials incorporating synthetic enzymes and pathways from biochemical cellular engineered concepts for application productions;

4. Neural phenomena control over system science computation measurement application interfaces addressing humans;

5. Micro-scale reagents biochemically engineered;

6. Biosynthesis signal processing control platform studies;

7. Molecular biological population level behavior dynamic simulation modeling complexes; and,

8. Subcellular and cellular device physics effects within biological component systems, using real-time non-destructive observation study techniques.

[ PHOTO ( above ): legacy MicroFlyer, only a Microelectronic Aerial Vehicle – MAV ( click to enlarge ) ]

Bioengineered MicroBots Developed & Deployed –

Battlefields now require ‘unmanned combat aerial vehicles’ ( UCAV ) and ‘advanced weapons’ that self-navigate and self-reconfigure, with autonomous communication systems accomplishing time-critical commands; however, while many such systems use Commercial Off The Shelf ( COTS ) products, that is not the case for developed and deployed bioengineered microrobots.

MicroBot AMR Control By MARS –

The DARPA mobile autonomous robot software ( MARS ) Project is designed to develop and transition currently unavailable software technologies for programming the operations of autonomous mobile robots ( AMR ) in partially known, changing and unpredictable environments.

The DARPA SIMBIOSYS Program aims to provide new software removing humans from combat, conveyance, reconnaissance, and surveillance processes by:

1. Extending military hardware range;

2. Lowering manpower costs;

3. Removing human physiology constraints for swifter concept, design, engineering, development, and deployment successes; and,

4. Researchers demonstrating autonomous navigation of humanoid robots, unmanned military vehicles, autonomous vehicles and interactions between humans.

DARPA indicates that robots – to be meaningful – must be fully integrated into human lives in military, commercial, educational and domestic usages, and must be capable of interacting in more natural human ways.

DARPA funded research and development of robots given human-like bodies and intelligence, so that humanoid interaction could provide new ways of engaging the human world.

COG Robot –

DARPA funded Massachusetts Institute of Technology ( MIT ) researchers who, employing a set of sensors and actuators ( from small microcontrollers for joint-level control, up to larger audio-visual digital signal network pre-processors controlling different levels of its heterogeneous hierarchical network ) approximating human body sensory and motor dynamics, created the robot named COG – which eventually allowed DARPA further development of deployable, modular, reconfigurable and autonomous robots.

[ PHOTO ( above ): legacy Biomorphic Explorers – Snakes and Bats ( click to enlarge ) ]

CONRO Robots –

CONRO robots, developed through DARPA, employed autonomous capabilities, of:

1. Self-repair; and, 2. Morphogenesis ( changing shapes ).

Examples, amongst many, included design styled:

Snake robots, able to move ‘in-to’ and ‘out-of’ tight spaces; and,

Insect robots, able to move faster ( covering more ground meeting military mission swifter needs ).

[ PHOTO ( above ): legacy Spider, and Payload biochemical delivery simulation ( click to enlarge ) ]

CONRO robots were designed and equipped to perform two ( 2 ) missions:

1. Reconnaissance ( activity detection, monitoring, and reporting – surveillance ); and,

2. Deliver small ‘military payloads’ ( bio-chemical weapons, etc. ) into ‘enemy occupied remote territory locations’ ( away-from friendly warfighters ).

CONRO robots are comprised of multiple SPIDERLINK modules.

In 1999, DARPA built both ‘snakes’ and ‘hexapods’ as ‘initially tethered’ prototypes termed 1-DOF, equipped with abilities to both ‘dock’ and ‘gait ambulate’ based on applied computational algorithms.

In 2000, DARPA had twenty ( 20 ) autonomous self-sufficient ‘modules’ – without mentioning what those resembled – built and designated 2-DOF, after:

1. Hormone based control theory was developed and tested;

2. Hormone-controlled hexapods and snakes implemented motions ( for 2-DOF );

3. Quadrupeds, hexapods and snakes implemented locomotion with centralized control ( for 2-DOF );

4. Morphing self-repair ‘modules’ delivering small payloads used ‘miniature cameras’ that were designed and tested; and,

5. Snake head and snake tail docking capabilities were implemented in laboratory two dimensional ( 2-D ) testing.
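The ‘hormone based control’ in the list above is, in essence, a message relayed module-to-module with each module acting locally; the toy sketch below illustrates the idea ( the phase-per-hop scheme and every parameter are invented for illustration, not CONRO’s actual algorithm ):

```python
import math

def hormone_gait(num_modules, step, wavelength=4, amplitude=30.0):
    """Toy 'hormone' gait for a snake of docked modules: a message
    injected at the head carries a phase; each module bends by
    amplitude * sin(phase), then relays the message minus one hop's
    worth of phase.  The result is a traveling wave of bending with
    no central controller."""
    angles = []
    phase = 2 * math.pi * step / wavelength    # phase injected at the head module
    for _ in range(num_modules):
        angles.append(round(amplitude * math.sin(phase), 1))
        phase -= 2 * math.pi / wavelength      # each hop shifts the hormone's phase
    return angles

# Advancing the step shifts the bend pattern one module down the chain,
# so the body wave travels rearward and the snake ambulates.
print(hormone_gait(4, step=0))
print(hormone_gait(4, step=1))
```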

CONRO DARPA Near-Term Milestones:

1. Modules’ reconfigurability ( morphogenesis ) robust automation designed and demonstrated ( for 2-DOF );

2. Topology ‘discovery’ ( automatic topography recognition ) demonstration;

3. Gait reconfiguration ( morphogenesis ) automation for ambulating a ‘given’ ( programmed instruction ) topology designed and demonstrated;

4. Gait reconfiguration ( morphogenesis ) automation for ambulating a ‘discovery’ ( automatic topography recognition ) designed and demonstrated;

5. Wireless ( radio frequency, infra-red, etc. ) control of miniature cameras demonstrated;

6. Pointing ( waving, mousepad, etc. ) control of miniature cameras demonstrated; and,

7. Large scale deployment of CONRO robots demonstrated.
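Milestone 2’s topology ‘discovery’ can be pictured as a flood-fill over docked-neighbor queries; the sketch below is generic and hypothetical ( the module layout is invented, and real modules would answer such queries over their docking links ):

```python
def discover_topology(links, start=0):
    """Flood-fill topology discovery for a modular robot: starting
    from one module, repeatedly query each reached module for its
    docked neighbors until the whole connected structure is mapped.
    'links' maps a module id to the ids docked to it -- the only
    local knowledge an individual module would have."""
    known, frontier = {start}, [start]
    edges = set()
    while frontier:
        module = frontier.pop()
        for neighbor in links[module]:
            edges.add(tuple(sorted((module, neighbor))))
            if neighbor not in known:
                known.add(neighbor)
                frontier.append(neighbor)
    return known, edges

# A six-module body: module 0 is the spine, modules 1-5 dock onto it.
links = {0: [1, 2, 3], 1: [0, 4], 2: [0, 5], 3: [0], 4: [1], 5: [2]}
modules, edges = discover_topology(links)
print(sorted(modules))  # -> [0, 1, 2, 3, 4, 5]
```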

[ PHOTO ( above ): DARPA BioBot named Blaberus ( click to enlarge ) ]

Deployer Robot ( DR ) –

Deployer Robots ( DR ) ‘support’ and ‘deploy’ distributed ‘teams of other smaller robots’ termed “Joeys” ( singular, “Joey” ) that perform either ‘hazardous tasks’ or ‘tedious tasks’.

Deployer Robots ( DR ) have two ( 2 ) roles, that:

1. Carry and launch given numbers of smaller Joey robots ( Joeys ); and,

2. Command and control ( C2 ) – after launching – Joey robots ( Joeys ).

[ PHOTO ( above ): legacy CyberLink HID test USAF personnel with DARPA robots ( click to enlarge ) ]

Robot Loop Pyramid –

Robot-in-the-Loop ( RIL ) concept, augments Human-in-Loop ( HIL ), building a ‘pyramid of robots’ – supervised by one ( 1 ) person.

‘Launch’ and ‘Command and Control’ ( C2 ) – of different Joey robots ( multiple, i.e. Joeys ) – two ( 2 ) goals are handled independently, as:

1. ‘Launch’ of robots, via grenade-sized Joey robot clusters, developed under the DARPA Deployer Robot ( DR ) Program as smaller Joeys become available; and,

2. ‘Command and Control’ ( C2 ), investigated using ‘larger robots’ developed for the DARPA ITO sister Software for Distributed Robotics ( SDR ) Program – enabling full leverage of both the Deployer Robot ( DR ) Program and the Software for Distributed Robotics ( SDR ) Program in developing algorithms that exploit heterogeneous interaction between a ‘smart’ highly mobile ‘Deployer Robot’ ( DR ) and a ‘team’ of Joey robots that are more powerful, less computational and less mobile.

[ PHOTO ( above ): legacy Virtual Combiman digital glove waving battlespace management ( click to enlarge ) ]

DARPA key universal elements of robot deployment examined:

1. Emplacement – Launching and dynamically situating the Joeys for mission goals;

2. Operations – Maintaining the infrastructure to support the distributed front, including communications and error detection and recovery ( e.g., getting back on course after positional drift ); and,

3. Recovery – Collecting Joey robots data to analyze after delivery into a format useful for the human operator.

DARPA Deployer Robot ( DR ) Program development acquired and refitted two ( 2 ) Urban Robot Upgrades ( URU ) into new Deployer Robot ( DR ) types.

DARPA investigated five ( 5 ) alternate launch strategies, but selected only one ( 1 ): grenade barrel launch delivery of robots into a three ( 3 ) story building.

The grenade barrel launcher was designed, equipped and developed with:

1. Grenade magazine, containing ‘multiple Joey robots’ for ejection – supporting full mobility integrity of the Deployer Robot ( DR );

2. Sensor mast ( collapsible ) – for Deployer Robot ( DR ) interaction with launched Joey robots on arrival at the destination location; and,

3. Communication ( 916 MHz ) link between the Deployer Robot ( DR ) and Joey robots.

DARPA SDR Program –

DARPA Software for Distributed Robotics ( SDR ) Program development designed and built Joey robot prototypes ( approximately 3-1/2 inch cube ) for ultimate fabrication in a production lot quantity of 120 Joey robot units.

DARPA Software for Distributed Robotics ( SDR ) Program leverage and adaptation controls swarms of Joey robots.

DARPA Near-Term Milestones:

1. Launch propulsion mechanisms ( CO2 cartridge, .22 caliber shell, or other ) deployment testing of Joey robots into battlefield areas;

2. Launcher ( of multiple Joey robot deployment ) mechanism built on-board first ( 1st ) Deployer Robot ( DR ) named Bandicoot;

3. Sensor mast ( collapsible ) built and installed on-board second ( 2nd ) Deployer Robot ( DR ) named Wombat;

4. Radio Frequency ( RF ) development protocols for interaction between Deployer Robot ( DR ) and Joey robots;

5. Infra-Red ( IR ) deployment protocols for interaction mechanisms between Deployer Robot ( DR ) and Joey robots;

6. Human Interface Device ( HID ) operator remote control unit ( ORCU ) development for Deployer Robot ( DR ).

DARPA SIMBIOSYS began over 12-years ago. All the photographs ( above ) are almost one decade ( 10-years ) old.

Current careful research on this subject further provides more information about where the U.S. stands today.

Submitted for review and commentary by,

Concept Activity Research Vault ( CARV ), Host
E-MAIL: ConceptActivityResearchVault@Gmail.Com
WWW: http://ConceptActivityResearchVault.WordPress.Com

Reference

http://web.archive.org/web/20021214110038/http://groups.msn.com/AnExCIA/rampdintell.msnw

Israel OrphaHell Teva Pharmaceutical Anthrax In Iraq

OrphaHell Project Israel Anthrax Iraq War

 [ PHOTO: Biohazard under microscope Teva Pharmaceuticals ( Israel ) eyed by CIA Director George Tenet points Weapons of Mass Destruction ( WMD ) toward now-former Iraq President Saddam Hussein. ]

Israel OrphaHell Teva Pharmaceutical Anthrax In Iraq
by, Kentron Intellect Research Vault ( KIRV )

1997

Secret government TELEX analysis and research reveals: the business of ORVET B.V. (aka) ORPHAHELL B.V. (aka) TEVA PHARMA B.V. – Israel’s TEVA PHARMACEUTICALS – that supplied anthrax to Iraq; information supplied by a doctor informant of the CIA that made the Iraq War case appear to have originated from British intelligence; and the names of nations friendly to the United States that secretly hid that anthrax delivery.

Intercepted secret government communications and research now reveal: Iraq nuclear, chemical and biological Weapons of Mass Destruction ( WMD ); former Iraq President Saddam Hussein’s strategic contingency plans; the name of the Israel company ( ORPHAHELL B.V., via TEVA PHARMACEUTICALS ) that supplied the anthrax Iraq purchased; and the names of U.S.-friendly nations that hid secret deliveries of Israel anthrax to Iraq – backed-up by detailed in-field intelligence reports sent out of Iraq to a Persian Gulf based U.S. Naval Support Group ( NSG ) floating platform for the U.S. National Security Agency ( NSA ) at Fort George G. Meade, Maryland.

Uncensored contents from several of these back-channel intelligence traffic messages describe being routed to a long list of top secret U.S. foreign intelligence posts around the world plus key U.S. government agencies located in the United States.

The TELEX messages, believed sent by a Central Intelligence Agency ( CIA ) agent obtaining secret Iraq government information from a high-level Iraq military informant, were actually U.S. government ‘raw intelligence’ so sensitive it was still classified “yet to be evaluated” – yet it was then purposely supplied to foreign intelligence agencies located in Tel Aviv, Israel and in Britain.

Answers to questions as to ‘why’ the United Kingdom began bombing Iraq [ Britain Bombs Iraq http://web.archive.org/web/20020209205305/communities.msn.com/AnExCIA/archives.msnw ], followed by U.S. air strikes and more, should now clear up many mysteries.

British Intelligence Storyline On Iraq –

The British government became alarmed by evidence obtained from inside Iraq that, in violation of United Nations Security Council Resolutions outlawing Weapons of Mass Destruction ( WMD ), Saddam Hussein continued to develop Weapons of Mass Destruction ( WMD ) – despite his denials – and that with Weapons of Mass Destruction ( WMD ) comes the ability to inflict real damage upon the regional stability of the world.

According to the published words of Britain Prime Minister Tony Blair, “I believe people will understand why the Agencies cannot be specific about the sources, which have formed the judgements, and why we cannot publish everything we know…I and other Ministers have been briefed in detail on the intelligence and are satisfied as to its authority…I believe the assessed intelligence – has established beyond doubt – is, that Saddam has continued:

– Producing Chemical and Biological Weapons;
– Efforts to Develop Nuclear Weapons; and,
– Extended the Range of his Ballistic Missile program.

The picture presented to me by the Joint Intelligence Committee in recent months has become more – not less – worrying.”

An independent and well-researched overview report of evidence provided by the United States International Institute for Strategic Studies ( IISS ) on September 9, 2002 suggested Iraq could assemble nuclear weapons within months of obtaining fissile material from foreign sources.

The British Joint Intelligence Committee ( BJIC ), chaired by the UK Cabinet Office, consists of three ( 3 ) UK intelligence and security agency chiefs plus key government department senior officials.

For over 60-years the British Joint Intelligence Committee has provided regular assessments on a wide range of British foreign policy and British international security issues to successive British Prime Ministers and their senior colleagues.

The British Joint Intelligence Committee work, like the material it analyzes, is largely secret.

Significant information was made available to the British Government from secret intelligence sources, described in more detail within a document released on September 2, 2002 by the United Kingdom ( UK ) – based in large part on the work of the UK Joint Intelligence Committee ( JIC ), the heart of British intelligence.

This 55-page photo-filled British intelligence document used to be downloadable, after completing a free MSN online membership registration, from the public internet website named Unwanted Publicity [ http://groups.msn.com/UnwantedPublicity ] – until that website was administratively disabled by the Microsoft Corporation on August 30, 2008.

Foolish Intelligence Practices Corrected –

Shortly after 2000, subsequent to U.S. President George W. Bush Jr. entering the Office of the United States President, the University of Massachusetts Center For Intelligent Information Retrieval ( CIIR ) [ http://ciir.cs.umass.edu/research/ ] website became restricted to the general public, but only a few knew why.

In 1997 the Center For Intelligent Information Retrieval ( CIIR ) [ http://ciir.cs.umass.edu/research/irlab.html ] website formerly held webpage links supplied by the U.S. federal government “GovBot” [ ( inactive ) http://www.GovBot.gov ( inactive ) ] search engine website, which provided extensive internet database documentation derived from many U.S. government facilities – not limited to the U.S. Army ( Fort Belvoir, Virginia, USA ), U.S. government contractors, and other U.S. government sources – that the world public could freely ‘data mine’ over internet connections, until 2000, when the U.S. government restricted CIIR access to the general public and revised the U.S. federal government online information database search engine, re-naming it “FirstGov” [ http://www.FirstGov.gov ].

During 1997, the legacy CIIR website database incorporated a publicly accessible online ‘optical character recognition’ ( OCR ) tool feature that anyone could use to convert CIIR webpage ‘image documents’ into easy-to-read ‘text documents’ – which explains where and how classified TELEX back-channel intelligence traffic messages were sourced.

Left unexplained is ‘why’ U.S. government classified intelligence image documents – although many pertained to legacy events – were nevertheless easily intercepted and optically translated into readable text format with tools freely available to the world across the public access internet.

Also left unexplained, is ‘why’ the U.S. federal government used the University of Massachusetts Center For Intelligent Information Retrieval ( CIIR ) to store image document copies of U.S. government classified documents made freely available to the world across public access internet.

Image documents consisting of classified back-channel intelligence communication traffic messages in TELEX format, dated between March 1990 and May 1992, revealed the Netherlands company ORVET B.V., which research reveals is a foreign front company of Israel’s TEVA PHARMACEUTICALS, used in the clandestine supply of 4,000 vials of anthrax purposely supplied and shipped by Israel through its Netherlands subsidiary to Jordan – hiding delivery of the anthrax to Iraq, then under the control of Saddam Hussein.

Finally Evaluated Intelligence –

Although that March 1990 thru May 1992 intelligence back-channel message traffic mentioned ORVET B.V. ( Mijdrecht, Netherlands ), research now reveals that in 1979 the company known as ORPHAHELL B.V. ( Mijdrecht, Netherlands ) [ http://www.tevaeurope.com/index.cfm?fuseaction=main.showpage&selid=151 ] was acquired by TEVA PHARMACEUTICALS ( Petach Tikva, Israel ) [ http://www.tevapharm.com/about/officers.asp ] – 11-years before TEVA changed its subsidiary name from ORPHAHELL B.V. to ORVET B.V. – and then ‘again’ changed its name, ‘after’ the back-channel intelligence traffic communication TELEXs, to TEVA PHARMA B.V. ( Mijdrecht, Netherlands ) [ http://www.tevaeurope.com/index.cfm?fuseaction=main.showpage&selid=148 ].

Although the original 1997 private party retrieval source of these classified TELEXs – for the X-CIA FILES website report entitled, “Iraqi Re-Match Set – Bush vs. Hussein” datelined “IRAQ, Tuwaitha – July 30, 2001” – was erased from internet archives by the U.S. government, these TELEXs have been resurrected in their original format to provide the following:

British Intelligence Relied On From CIA Intelligence Iraq Informant Doctor – continues…

Full report ( continues ), at: http://www.indymedia.org/en/2008/06/907625.shtml

Submitted for review and commentary by,

Concept Activity Research Vault ( CARV ), Host
E-MAIL: ConceptActivityResearchVault@Gmail.Com
WWW: http://ConceptActivityResearchVault.WordPress.Com


Secret IT Directorate

[ PHOTO ( above ): Former U.S. Central Intelligence Agency Headquarters ]

Secret IT Directorate
by, Concept Activity Research Vault ( CARV )

November 22, 2011 11:45:08 ( PST ) Updated ( Originally Published: October 25, 2010 )

USA, Menlo Park, California – November 22, 2010 – Some may not recall the ‘first public announcement ( 2000 )’ of the United States Central Intelligence Agency ( CIA ) ‘private business corporation’ referred to as the IN-Q-TEL CORPORATION INTERFACE CENTER (aka) QIC. It was formerly known as the IN-Q-TEL CORPORATION, and ‘that company name’ was itself formerly the IN-Q-IT CORPORATION ( ‘not’ to be confused with INTUIT CORPORATION, the business behind the software applications QuickBooks and TurboTax ); the CIA, however, ‘reversed’ its previous private business name change decisions, so the company is known as the IN-Q-IT CORPORATION ( IN-Q-IT ) today. Clear as mud, right?

Some wonder whether “IN-Q-IT” is even ‘really’ the ‘true name’ of this CIA ‘private business’ company today, or whether – within the intelligence community pea ‘n shell game of names – other company subsidiary names may have developed; for now, though, the IN-Q-IT CORPORATION is ‘currently known’ as the U.S. Central Intelligence Agency ( CIA ) ‘venture capital’ private business corporation.

It is important to understand precisely ‘what this CIA private business was supposed to be accomplishing’ versus ‘what the CIA actually did’ with its private business – and, more recently, ‘what it has become’ and ‘what it is supposed to be accomplishing’ today and for the future.

The curious state of affairs sees no one knowing anything more about the CIA QIC ( IN-Q-TEL Interface Center ) private business corporation than the few who did when it began, and ‘now’ no one is even required to give the public an accounting to justify anything surrounding it. Why? Because it was meant to be a ‘private business’ company, ‘not’ a U.S. government entity, and ‘that’ was ‘how’ the CIA created it to remain – outside anyone’s purview – for a rather ‘complex’ reason.

The only method by which an ‘even more complete’ and ‘even more accurate’ assessment may be formulated, for an ‘even more thorough’ understanding, is quite involved and may at times be highly complex. One must not only review the ‘multiple facet areas’ this CIA private business company was originally designed to tackle – and, more so, what it was supposed to accomplish – but, from within ‘both’ of those areas, go on to ‘realize precisely’ what ‘were’ and ‘still are’ today’s “problem sets” facing the CIA, and just ‘how’ it is juggling them all.

Some believe ‘members’ of U.S. Congressional ‘oversight’ committees and subcommittees had to attend ‘special educational lessons’ designed by the CIA. Did key members of Congress attend what basically amounted to a CIA ‘school’?

Many believed the CIA Congressional school to be non-existent, which left other, less palatable theories to develop: that the U.S. Congress had simply tired of too numerous complex CIA oversight reviews – so much so that Congress relegated its own authority over to the CIA, in what some believed tantamount to the CIA ‘fox’ guarding its own global-sized intelligence ‘chicken coop’.

Some may now be enlightened to understand what the United States Central Intelligence Agency ( CIA ) decidedly phrased as its own “radical departure” away from what it perceived as ‘inefficient economic budget support’ for solving its own ‘quantum complexities’ within ‘highly specific areas’ – still ‘classified’ in the interest of ‘national security’ – burdensomely producing an exponential growth of new “problem sets.” The CIA would only explain these publicly – in the most general of terms – as experienced within un-named areas of ‘science and technology research and development applications’. The CIA decided to establish ‘limits upon’ those areas and ‘simultaneously’ establish them as ‘marketable derivatives’ it called “products,” from which CIA Office of Science and Technology ( S&T ) oversight could ‘manage distribution of information knowledge’ – but on an ‘in-exchange’ contractual agreement basis with ‘cooperative’ “private sector” ‘individuals’, ‘businesses’, ‘institutions’ and ‘organizations’. The CIA would thereby ‘establish who held proprietary keys’ to the ‘special skill sets’ of already protected ‘intellectual property rights’ in “existing” technology, and establish global CIA proprietary rights over all “emerging” ( new upcoming future ) technology, through sole marketeering of ‘special talents’ and ‘special services’ from which incredible amounts of ‘profit’ could also be harnessed ( absorbed ) by the CIA.

To many of entrepreneurial independent spirit, this CIA QIC private business corporation appeared, in essence, out of nowhere – like a new Borg structure infringing on the private freedoms of what few had once experienced of the global marketplace past – and to others the CIA QIC tenor was too Godfather-like, making people and entities an offer they couldn’t refuse. While the CIA foresaw such rumors and speculation coming, in reality, what ‘was’ the ‘CIA’ doing by opening up ‘its own private business corporation’?

Visionary dreams may be able to see the United States Central Intelligence Agency ( CIA ) ‘shed’ its ‘government skin’ to become the ‘world’s largest multi-national corporation’, holding the ‘world’s largest monopoly’ on information technology ( IT ) research and development direction of much of the world’s finest talent resources, i.e. private ‘individuals’, ‘businesses’, ‘institutions’ and ‘organizations’ independently operating outside U.S. Congress ‘oversight, budget justification and related constraints’. With such clever restructuring in place, the CIA would cease to exist as the public knows it today – technically, by legal definition, becoming a wholly-owned ‘non-profit organization’ no longer requiring U.S. Department of the Treasury tax dollar funding. Set free, a new type of CIA would exist with ‘self-determined financing’ and stock market trading profits derived from a host of private sector corporate ‘mergers and acquisitions’ ( M&A ).

While surface dreams of such visionaries might at first appear ingenious, ‘how’ was ‘all’ of this ‘actually assembled’?

Before 1999, it took the CIA Office of Legal Counsel less than 1 year of researching various United States laws to locate legacy technicality provisions, approved by the U.S. Congress, allowing the CIA to exercise its own “radical departure” plan.

By 2000, ‘reality’ saw the fetal stages of this CIA private business venture plan developing, after which the public heard nothing further about its progress.

Some believed an ‘initial public offering’ ( IPO ) – paying dividends to private individual investors, with corporate trading of shares of ‘stock’ – could have been misconstrued as potentially the world’s largest ‘insider trading scheme’ headache for the United States Securities and Exchange Commission ( SEC ), whose predictive analysts could only imagine managing the envelopment of multiple new technology companies trading on ‘stock exchange’ floors that could potentially carry forward ‘mutual profit secrets’, paying more funds than anticipated into the CIA private business plan – a “radical departure” away from what had otherwise long been understood as the status quo of world trading – where embarrassing implications might turn ‘terrorist fund reduction measures’ into ‘profits derived from CIA-led secret private business developments in high technology products’. Could such a “radical departure” plan backfire, or morph otherwise ‘unsophisticated terrorists’ – utilizing improvised munition missions – into a new, more powerful community of ‘uncooperative competitive business terrorists’? Perhaps.

Implications of a CIA private business group of subsidiary businesses trading stock on ‘open stock market exchanges’ around the world could create an entirely ’new form of intelligence blowback’ of staggering global socio-economic business proportions for future generations.

Today, no clear overall picture exists of what still remains cloaked in secrets – whether ‘government’ or ‘private’ – while outside both domains this CIA private business enterprise continues growing. But, in which directions?

Prior to 2001, the new CIA plan became a ‘high-directional, multi-tiered, simultaneous growth-oriented economic support expansion’ for and of ‘key-critical secret-sensitive advanced “information technology” ( IT ) derivatives ( “products” )’ that could ‘only be implemented’ by the CIA “identifying” ( targeting ) and “partnering” ( obtaining ) ‘existing information’ and ‘information tasking ability quotients’ from “global” ( worldwide ) “private sector” masses – an “infinite” ( unlimited ) supply of ‘private individuals’, ‘private businesses’ and ‘institutions’ to become ‘dedicated taskers’ of controlled CIA “problem sets.”

The CIA private business futures would depend on successfully exercising better economic sense to its maximum potential, immediately alongside highly specific advanced technological enhancements. The United States ‘intelligence community’ would be grown under a new ‘broad term secrecy’ – commonly known but hidden within what the CIA termed only as “information technology” ( IT ) – that would necessarily require CIA-controlled ‘targeting’, ‘shaping’ and ‘acquisition’ of a plethora of private business sector information technology ( IT ) application research and development.

Truly a “radical departure,” as the CIA publicly alluded to when describing its ‘new mission focus’ – solving CIA “problem sets.”

CIA-controlled ‘special technology’ research and development ( R&D ) was on ‘applications’ that later became known as ” Commercial Off The Shelf ” ( C.O.T.S. / COTS ) ‘products’. These were, in essence – during early-stage informational development – the crux of what the CIA wanted presented on its ‘table’, whereupon the CIA would legitimatize and manage ‘mass information exchanges’ it would ‘trade’ for ‘other valuable considerations’, but only to a select few ‘private companies’ ( e.g. LOCKHEED, LUCENT, PHILIPS, AT&T, et al. ). These, in return, would be ‘capable of offering’ – through ‘United States government qualified’ contractual agreement exchanges only – whatever the CIA deemed the companies ‘could place of further interest’, or could ‘further the duration of continuing to provide’ what these select private ‘individuals’, ‘companies’, ‘institutions’ and ‘organizations’ were ‘already providing under U.S. government contract agreement harvests’.

The public, however, only understood ‘press reports’ that kept all of the aforementioned very ‘simple’ – indicating, in the vaguest of terms, that ‘products would eventually be sold’ “through the private sector.”

Secret-sensitive ‘products’ that were, in actuality, ‘technological breakthroughs’ were to be traded between CIA-selected and CIA-controlled business stock holdings, and the CIA IN-Q-TEL INTERFACE CENTER ( QIC ) would privately – and thereby secretly – manipulate all technology funds derived from what the CIA QIC publicly referred to as its ‘partners’ and ‘other vendors’, which would remain ‘outside the purview of U.S. Congress government budget oversight’, where all private companies enjoy unfettered privileges of privacy.

By utilizing U.S. Department of the Treasury government tax funds – for U.S. government contract agreement funding to ‘private business partners’ – the United States Federal Reserve System follows in CIA lock-step by ‘mirroring’ private bank wire transfer monies directed, and then redirected, through a long chain of foreign-corporation-named offshore bank accounts secretly routed back into the U.S. Central Intelligence Agency ( CIA ) IN-Q-IT CORPORATION (aka) QIC private business enterprise handling ‘venture capital’. There, new project funding amounts may be decidedly broken down into smaller amounts, or pooled into much larger amounts, for dispersal to other clandestine secret-sensitive intelligence programs, projects and/or operations that gain the strength to easily remain ‘outside U.S. Congress intelligence oversight board committee and subcommittee scrutiny’.

Director of Central Intelligence ( DCI ) orchestral management arrangements within the IN-Q-IT INTERFACE CENTER ( QIC ) realize, from foreign historical perspectives, what happens when a private business exercises ‘en masse privatized mind teams’ against today’s fallibilities: in keeping with human frailties, the CIA’s inherent procedural compartmentalization-of-secrets rule ( no “talking around” ) requires those outside each task to be unable to quickly assemble an ‘overall picture’ of what overall CIA plans consist of – at least that is how it is supposed to work, but it rarely does. That alone, in and of itself, became a “problem set” to solve, and drastic measures needed taking. Hence, the CIA “radical departure” plan has another design, serving to counteract intelligence information leaks.

Ingenious is a very small word to describe even one ( 1 ) facet of this CIA private business plan, where the public has its limited understanding confusingly ‘shifted from what it perceives to be government secrets’, moved rapidly back and forth between ‘private sector secrets’, in what only a few perceive to be a ‘new wave intelligence form’ or ‘combinatoric intelligence structuring’ – producing a shield ( shell ) to protect even more secrets beneath what has become a new, globally flexible CIA layered support group.

As an extremely large ‘black budget’ intelligence missions funding source, the CIA IN-Q-TEL INTERFACE CENTER would only be the recipient of limited and toned recommendations supplied by the QIC Board of Advisors ( CIA headquarters ), as measures to be reviewed for eventual implementation by the QIC Board of Trustees for the essential discretionary manipulation of sophisticated:

1. Technologies [ i.e. XEROX PARC RESEARCH CENTER, et al. ];

2. Vendors [ i.e. LOCKHEED, et al. ];

3. Debt [ i.e. TELECREDIT INC., et al. ];

4. Capital [ i.e. MARSH & MCLENNAN CAPITAL INC. ];

5. Stock Market Trading [ i.e. GOLDMAN SACHS & CO. ]; plus,

6. More.

The report ( below ) shows ‘who’ were initially placed in ‘experienced authoritative positions’ and ‘who’ were chosen as ‘senior level executive advisors’ – all selectively chosen by the CIA, pulled from ‘key critical private businesses’ to ‘guide’ the private U.S. Central Intelligence Agency business venture.

Such should really come as no surprise, at least to those understanding the mechanics of international business, trading and finance, where all domestic and foreign bank account transactions are mirrored under oversight by the United States Federal Reserve System ( FED ) and the U.S. Securities and Exchange Commission ( SEC ), the latter two ( 2 ) of which are ‘overseen’ by the U.S. Central Intelligence Agency ( CIA ).

This information was derived from – outside ‘market sensitive’ ( stock market trading ) material – a Critical Sensitive National Security report [ August 13, 2001 ] of the U.S. Congress, House of Representatives Subcommittee on Oversight and Investigations ( Subcommittee ) of the Committee on Financial Services, relying on information supplied by, amongst others, the SEC Divisions of Enforcement and Corporation Finance; the Offices of the Chief Accountant, General Counsel, and Compliance, Inspections and Examinations; the Office of the Comptroller; the Office of Economic Analysis; and the U.S. Central Intelligence Agency ( CIA ).

But, is all this ‘really going on’? See full report ( below ).

– – – –

Courtesy: Unwanted Publicity Information Group

Source: X-CIA Files [ now defunct MSN Group website ]

CIA Sends Hi-Tech Tsunami Warnings
by, X-CIA Files – Staff Writer [ AnExCIA@bluewin.ch ]

March 12, 1999

USA, California, Menlo Park – The first ‘publicly open contract’ between a so-called ‘private firm’ and the U.S. Central Intelligence Agency ( CIA ) involves a private corporation – conceived and funded by CIA members – known as the In-Q-Tel Corporation, formerly known as the In-Q-It Corporation.

In-Q-Tel is actually one of many CIA private-sector business partners – not at all an uncommon partnership, like those the CIA has had for decades with the MITRE CORPORATION ( USA ), BELL LABORATORIES ( Canada ), and the TRW Power Thrusting Division ( Hawthorne, California, USA ), remote control center for CIA maneuvering of Tracking and Data Relay Satellites ( TDRS ) and the Killer HUGHES ( KH-11 ) anti-satellite satellite ( space-borne destructive laser platform ) series.

The publicly revealed partnership between the CIA and the IN-Q-TEL CORPORATION has sent another tsunami warning to Japan at its NIPPON ELECTRIC CORPORATION ( NEC ) high-technology monopoly, which has been right on the heels of the advanced high-technology developments seen within the RESEARCH TRIANGLE PARK ( North Carolina, USA ).

There is, though, some doubt and controversy between what the CIA says In-Q-Tel is, versus what In-Q-Tel says it is.

The CIA claims ( on its website ) that In-Q-Tel is a “‘non-profit’ organization.”

IN-Q-TEL states it has had ’profitability in-mind for quite some time’.

Let’s see what the facts reveal ( below ), allowing casual observers to make up their own minds about just what IN-Q-TEL ‘really is’:

The In-Q-Tel Hierarchy

CIA IN-Q-TEL INTERFACE CENTER ( QIC ) players are stacked up on a list that reads like something out of a Robert Ludlum novel – filled with international intrigue and high-tech corporatarchy.

CIA IN-Q-TEL INTERFACE CENTER ( QIC ) Board of Trustees is a Who’s-Who of ‘big corporate’ America:

– Gilman Louie, CEO of IN-Q-TEL CORPORATION; most recently HASBRO INTERACTIVE Chief Creative Officer and General Manager of the GAMES.COM group ( responsible for creating the HASBRO internet game site ); previously Chairman of the Board of MICROPROSE, CEO and Chairman of SPECTRUM HOLOBYTE, and CEO of SPHERE INC.; serves on the Boards of Directors of numerous software firms;

– Lee Ault, Chairman, former Chairman and CEO of TELECREDIT INC.;

– Norman Augustine, former Chairman and CEO of LOCKHEED MARTIN CORPORATION;

– John Seely Brown, Chief Scientist, XEROX CORPORATION and President, XEROX PARC RESEARCH CENTER;

– Michael Crow, Executive Vice Provost of Columbia University;

– Stephen Friedman, Senior Principal of MARSH & MCLENNAN CAPITAL INC., and former Chairman of GOLDMAN SACHS AND CO.;

– Paul Kaminski, former U.S. Department Of Defense ( DoD ) Undersecretary for the U.S. Defense Acquisition and Technology Office, President and CEO of TECHNOVATIONS INC., and Senior Partner in GLOBAL TECHNOLOGY PARTNERS;

– Jeong Kim, President of CARRIER NETWORK, part of the LUCENT TECHNOLOGIES GROUP, and former founder of YURIE SYSTEMS;

– John McMahon, former Deputy Director of U.S. Central Intelligence Agency ( CIA ), former President and CEO of LOCKHEED MISSILE & SPACE COMPANY, and consultant to LOCKHEED-MARTIN CORPORATION;

– Alex Mandl, Chairman and CEO of TELIGENT, and former President and CEO of AT&T; and,

– William Perry, former U.S. Department Of Defense Secretary and currently the Berberian Professor at Stanford University.

New CIA Use Of In-Q-Tel Interface Center ( QIC )

In-Q-Tel is a new non-profit corporation funded by the CIA to seek Information Technology ( IT ) solutions to the Agency’s most critical needs. A unique venture, it was formed to enable the Agency ( CIA ) to have access to ‘emerging and developing information technology’ in a timely manner.

QIC ( IN-Q-TEL INTERFACE CENTER ) is the ‘interface center’ linking the IN-Q-TEL CORPORATION to the Agency ( CIA ).

QIC – CIA Function

QIC develops a problem set for In-Q-Tel, partners with In-Q-Tel in the solution acceptance process and manages the Agency’s relationship with In-Q-Tel.

QIC plans and evaluates the partnership program, protects CIA security and CIA counter-intelligence interests and communicates the QIC / In-Q-Tel venture to the World.

CIA – QIC Function

The CIA, working in partnership with IN-Q-TEL, created the Agency’s ( CIA ) new found organization QIC.

QIC goals now are to be the leading source of commercial, high-impact IT solutions for the Agency ( CIA ), and to be heralded as the single most important contributor to the Intelligence Community by the year 2001. QIC will create and use the full range of corporate processes needed to manage QIC (aka) the ” CIA-In-Q-Tel Partnership ” by delivering CIA-accepted IT solutions.

CIA Goals With QIC

Eventually, IN-Q-TEL will take on a life funded by the high-technology consumer public. QIC ( IN-Q-TEL Interface Center ) however, works comprehensively and collaboratively with Agency ( CIA ) IT specialists, customers, IN-Q-TEL experts, Agency ( CIA ) managers, the Chief Information Officer, the Chief Technology Officer, Chief Financial Officer, Agency directorates, and Executive Board to develop an annual coordinated and approved critical ‘problem set’ for IN-Q-TEL.

QIC leads Agency participation in the partnership’s solution transfer planning, including resources, technology demonstration, and prototype testing and evaluation.

At the same time, QIC works with In-Q-Tel to assure that it addresses issues regarding the transfer of IT solutions into the Agency. QIC also works with Agency customers and their managers to create an environment conducive to the implementation and acceptance of partnership solutions and follow-on initiative.

In-Q-Tel Background

On September 29, 1999 the Central Intelligence Agency (CIA) was treated to something different. In many of the nation’s leading newspapers and television news programs a story line had appeared that complimented the Agency for its creativity and openness.

The media was drawn to a small corporation in Washington, D.C. that had just unveiled its existence and the hiring of its first CEO, Gilman Louie, who described the Corporation, called the “IN-Q-IT CORPORATION“, as having been formed “…to ensure that the CIA remains at the cutting edge of information technology advances and capabilities.”

With that statement the Agency ( CIA ) launched a new era in ‘how it obtains cutting-edge technologies’.

In early January 2000, the name of the corporation ( IN-Q-IT CORPORATION ) was changed to IN-Q-TEL CORPORATION.

The ‘origins of the concept’ that has become IN-Q-TEL are traceable to Dr. Ruth David, former CIA Deputy Director for Science and Technology.

She and CIA Science And Technology Deputy Director, Joanne Isham, were the first senior Agency ( CIA ) officials to understand that the information revolution required the CIA to forge ‘new partnerships’ with the ‘private sector’ and ‘design a proposal for radical change’.

The timing of the proposal was fortuitous.

CIA Director of Central Intelligence ( DCI ), Mr. George Tenet, had just launched his own Strategic Direction Initiative ( SDI ), which included technology as one of its areas for review.

The study made a direct link between Agency ( CIA ) ‘future technology investments’ and ‘improving’ its ‘information gathering’ and ‘analysis capabilities’.

By the summer of 1998, the Agency ( CIA ) had assembled a few senior Agency ( CIA ) ‘staff employees with an entrepreneurial bent’ and ‘empowered them’ to take Dr. Ruth David’s original concept and flesh it out.

Aided by a ‘consulting group’ and a ‘law firm’, they ( CIA ) devoted the next 4-months to making the rounds in Silicon Valley ( California ) – and elsewhere – putting the concept through the wringer. Much of the ‘time was spent listening’.

Many they met with were often critical of one aspect or another of the concept.

But, whether they were ‘venture capitalists’, Chief Executive Officers ( CEO ), Chief Technical Officers ( CTO ) or members of Congress and staffers, all eagerly immersed themselves in spirited debates that enriched the Agency ( CIA ) team and ‘drove the concept in new directions’.

By the end of 1998, the Agency ( CIA ) team reached a point at which the concept seemed about right.

Though ‘it had changed considerably’ from that which had been proposed initially by Dr. Ruth David, it remained true to its core principles.

It was time to hand the ‘product’ of the Agency ( CIA ) work over to someone in the ‘private sector’ with the ‘experience’ and passion necessary ‘to start the Corporation’.

To the delight of the DCI and Agency ( CIA ) team, Norman Augustine, a former CEO of LOCKHEED-MARTIN and 4-time recipient of the Department of Defense highest civilian award, the Distinguished Service Medal, accepted the challenge.

By February 1999, the Corporation was established as a legal entity, and in March [ 1999 ] it [ IN-Q-TEL CORPORATION ] received its first [ 1st ] contract from the Agency ( CIA ). In-Q-Tel was in business, charged with ‘accessing information technology ( IT ) expertise and technology wherever it existed’ and bringing it to bear on the ‘information management’ challenges facing the Agency ( CIA ).

In-Q-Tel Creation

As an information based agency, the CIA must be at the cutting edge of information technology in order to maintain its competitive edge and provide its customers with intelligence that is both timely and relevant.

Many times the Agency and the federal government have been the catalysts for technological innovations. Examples of Agency ( CIA ) inspired breakthroughs include the LOCKHEED AEROSPACE designed U-2 ( Dragon Lady ) and SR-71 ( Blackbird ) reconnaissance aircraft and the CORONA ‘surveillance’ satellites, while the ‘parent of the Internet’ [ ARPANET, of the Advanced Research Projects Agency ( ARPA ) ] was led forward by the Defense Advanced Research Projects Agency ( DARPA ).

By the 1990s, however – especially with the advent of the World Wide Web – the ‘commercial market’ was setting the pace in IT innovation.

And, as is the nature of a market-based economy, the ‘flow of capital’ and ‘talent’ has irresistibly ‘moved to the commercial sector’ where the prospect of huge profits from ‘initial public offerings‘ ( IPO ) and ‘equity-based compensation‘ has become ‘the norm’.

In contrast to the remarkable transformations taking place in Silicon Valley ( California ) and elsewhere, the Agency ( CIA ) – like many large Cold War era ‘private sector corporations’ – felt itself being ‘left behind’. It ( CIA ) was not connected to the creative forces that underpin the digital economy.

And, of equal importance, many in Silicon Valley ( California ) knew little about the Agency ( CIA ) IT ( information technology ) needs.

The opportunities and challenges posed by the information revolution to the Agency ( CIA ) core mission areas of ‘clandestine collection’ and ‘all-source analysis’ were growing daily.

Moreover, the [ CIA ] challenges are not merely from foreign countries, but also ‘transnational threats’.

Faced with these realities [ by 1997 ], the leadership of the CIA made a critical and strategic decision in early 1998.

The Agency’s leadership recognized that the CIA did not, and could not, compete for IT ( information technology ) innovation and talent with the same speed and agility as those in the ‘commercial marketplace’, whose businesses are driven by “Internet time” and ‘profit’.

The CIA mission ‘was’ intelligence collection and analysis, not IT innovation.

The leadership also understood that, in order to extend its reach and access a broad network of IT innovators, the Agency had to step outside of itself and appear not just as a buyer of IT but also as a seller.

The CIA had to offer Silicon Valley ( County of Santa Clara, California ) something of value, a business model that the Valley [ Silicon Valley ] understood; a model that ‘provides’ – for those who joined hands ( became partner affiliates ) with IN-Q-TEL – the ‘opportunity to commercialize’ their ‘innovations’. In addition, IN-Q-TEL ‘partner companies’ would also ‘gain another valuable asset’, access to very difficult CIA ‘problem sets’ that could become ‘market drivers’.

Once the Agency ( CIA ) leadership crossed these critical decision points, the path leading to IN-Q-TEL formation was clear.

In-Q-Tel – Close-Up

In-Q-Tel founder, Norm Augustine, established it as an independent non-profit corporation.

Its Board of Trustees, which now has 10 members, functions as any other board, initially guiding and overseeing the Corporation’s startup activities and setting its strategic direction and policies.

The CEO, who was ‘recruited’ by the Board [ Board of Trustees for IN-Q-TEL CORPORATION ], reports to them [ Board of Trustees for IN-Q-TEL CORPORATION ], and ‘manages’ IN-Q-TEL.

The Corporation [ IN-Q-TEL ] has offices in two ( 2 ) locations:

1. Washington, D.C.; and,

2. Menlo Park, California [ Silicon Valley ].

It [ IN-Q-TEL ] employs a ‘small professional staff’ and a ‘smaller group’ of ‘business consultants’ and ‘technology consultants’.

In-Q-Tel’s mission is to foster the development of new and ‘emerging information technologies’ and pursue ‘research and development’ ( R&D ) that produce solutions to some of the most difficult IT [ information technology ] problems facing the CIA.

To accomplish this, the Corporation [ IN-Q-TEL ] will network extensively with those in ‘industry’, the ‘venture capital’ community, academia, and any ‘others’ who are at the ‘forefront of IT [ information technology ] innovation’.

Through the business relationships that it establishes, In-Q-Tel will create environments for collaboration, product demonstration, prototyping, and evaluation.

From these activities will flow the IT solutions that the Agency ( CIA ) seeks and, ‘most importantly’, the ‘commercial opportunities’ for ‘product development’ by its ‘partners’.

To fulfill its mission, In-Q-Tel has designed itself to be:

– Agile, to respond rapidly to Agency needs and commercial imperatives;

– Problem driven, to link its work to Agency program managers;

– Solutions focused, to improve the Agency’s capabilities;

– Team oriented, to bring diverse participation and synergy to projects;

– Technology aware, to identify, leverage, and integrate existing products and solutions;

– Output measured, to produce quantifiable results;

– Innovative, to reach beyond the existing state-of-the-art in IT; and,

– Over time, self-sustaining, to reduce its reliance on CIA funding.

At its core, In-Q-Tel is designed to operate in the market place on an equal footing with its commercial peers and with the speed and agility that the IT world demands.

As an example, it [ IN-Q-TEL ] can ‘effect the full range of business transactions‘ common to the industry – it is ‘venture [ venture capital ] enabled’, can ‘establish joint ventures‘, ‘fund grants [ grant funding ]‘, sponsor open competitions, ‘award sole source contracts‘, etc. And, ‘because of the many degrees of freedom granted to it‘ [ IN-Q-TEL ] by the Agency ( CIA ), IN-Q-TEL ‘does not require Agency ( CIA ) approval for business deals it negotiates‘.

As such, In-Q-Tel represents a different approach to government R&D.

It [ IN-Q-TEL ] ‘moves away from the more traditional government project’ office model in which the program is managed by the government.

Instead, the Agency ( CIA ) has invested much of the decision-making in the Corporation [ IN-Q-TEL ].

Hence, In-Q-Tel will be judged on the outcomes produced, i.e. the solutions generated, and not by the many decisions it makes along the way.

In-Q-Tel – IT Space

As with many aspects of the In-Q-Tel venture, the Agency took a different approach in presenting its IT needs to the Corporation. It bounded the types of work that In-Q-Tel would perform – its IT operating “space” – by two ( 2 ) criteria:

In the first [ 1st ] instance, it made the decision that In-Q-Tel would initially conduct only unclassified IT work for the Agency ( CIA ).

Second [ 2nd ], to attract the interests of the private sector, it recognized that IN-Q-TEL would ‘principally invest in areas’ where there was both an Agency ( CIA ) need and ‘private sector interest’.

Whereas in the past, much of the commercial computing world did not focus on those technologies useful to the CIA, the intersection zone between intelligence and private sector IT needs has grown tremendously in recent years.

Many of the underlying technologies that are driving the information revolution are now directly applicable to the intelligence business. Examples of commercial applications that also support intelligence functions are:

1. Data warehousing and data mining; 2. Knowledge management; 3. Profiling search agents [ Search Engines and User search requests ]; 4. Geographic information systems [ Satellite Communication Information Systems ]; 5. Imagery analysis and pattern recognition; 6. Statistical data analysis tools; 7. Language translation; 8. Targeted information systems; 9. Mobile computing; and, 10. Secure computing.
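Several of the listed applications rest on simple, well-understood algorithms. As a loose illustration only – a toy "profiling search agent" with invented data, not any actual Agency or commercial system – documents can be ranked by how often they match the terms in a user's interest profile:

```python
from collections import Counter

def score_documents(profile_terms, documents):
    """Rank documents by how often they mention the terms in a user's
    interest profile -- a toy stand-in for a 'profiling search agent'."""
    scores = {}
    for doc_id, text in documents.items():
        words = Counter(text.lower().split())
        scores[doc_id] = sum(words[term] for term in profile_terms)
    # Highest-scoring documents first (stable sort keeps ties in input order)
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical document collection
docs = {
    "a": "satellite weather report",
    "b": "mobile computing for secure field operations",
    "c": "imagery analysis and pattern recognition for imagery archives",
}

ranked = score_documents(["imagery", "pattern"], docs)
print(ranked[0][0])  # the best match for this profile is document "c"
```

Real systems of the era combined this kind of term weighting with relevance feedback and metadata filtering, but the core idea of scoring content against a stored user profile is the same.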

Information Security or INFOSEC, a critical enabling technology for all intelligence information systems, is now a mainstream area of research and innovation in the commercial world, due in no small part to the exponential growth in Internet e-commerce.

Thus, there are a number of commercially available security technologies:

1. Strong encryption; 2. Secure community of interests; 3. Authentication and access control; 4. Auditing and reporting; 5. Data integrity; 6. Digital signatures; 7. Centralized security administration; 8. Remote users or traveling users; and, 9. Unitary log-in.
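Several items on this list – notably data integrity and authentication – have since become one-liners in commodity libraries. A minimal sketch using Python's standard hmac and hashlib modules (purely illustrative; the key and message here are invented):

```python
import hashlib
import hmac

def sign(key: bytes, message: bytes) -> str:
    """Produce a keyed digest: anyone holding the shared key can verify
    both the integrity and the origin of the message."""
    return hmac.new(key, message, hashlib.sha256).hexdigest()

def verify(key: bytes, message: bytes, tag: str) -> bool:
    # compare_digest avoids timing side channels during comparison
    return hmac.compare_digest(sign(key, message), tag)

key = b"shared-secret"
msg = b"report: baseline test results"
tag = sign(key, msg)

assert verify(key, msg, tag)             # untampered message verifies
assert not verify(key, msg + b"x", tag)  # any modification is detected
```

A keyed digest like this covers integrity and authentication; the other listed capabilities (strong encryption, access control, auditing) are separate mechanisms layered alongside it.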

It is, no doubt, the case that the commercial investments flowing into information security outpace the spending made by the Intelligence Community.

Thus, In-Q-Tel will be poised to ‘leverage the investments of others to the benefit of the Agency ( CIA )‘.

Having bounded In-Q-Tel’s IT space with these two ( 2 ) criteria – ‘unclassified work’ with ‘commercial potential’ – the Agency defined a set of strategic problem areas for the Corporation.

These four ( 4 ) areas have the added and obvious benefit of spanning the needs of all the Agency’s directorates and, hence, its core business functions of collection and analysis:

1. Information Security: Hardening and intrusion detection, monitoring and profiling of information use and misuse, and network and data protection;

2. Use of the Internet: Secure receipt of information, non-observable surfing, authentication, content verification, and hacker resistance;

3. Distributed Architectures: Methods to interface with custom / legacy systems, mechanisms to allow dissimilar applications to interact, automatic handling of archived data, and connectivity across a wide range of environments; and,

4. Knowledge Generation: Geospatial data fusion, multimedia data fusion or integration, and computer forensics.

Information Technology ( IT ) In-Q-Tel – CIA Occupancy

The creation of In-Q-Tel will no doubt raise questions with some who believe that it or the Agency ( CIA ) has other motives.

It is, therefore, important to highlight ‘what In-Q-Tel is not’ and what it [ IN-Q-TEL ] will not do.

First, it is not a front company for the Agency ( CIA ) to conduct any activities other than those spelled out in its Articles of Incorporation and its Charter Agreement.

As a non-profit – 501(c)(3) – corporation, it will operate in full compliance with Internal Revenue Service ( IRS ) regulations and, as with all similar non-profits, its IRS filings will be a matter of public record.

In-Q-Tel is ‘openly affiliated with the Agency ( CIA )’, as was made obvious to the world during its press rollout on September 29, 1999.

Of equal importance, it will not initiate work in areas that lead to solutions that are put into so-called “black boxes” – that is, innovations that the government subsequently classifies. To do so would undercut In-Q-Tel credibility with its business partners to the detriment of the Agency.

Finally, IN-Q-TEL ‘is a solutions company‘, ‘not a product company‘.

Working through its business partners, it will demonstrate solutions to Agency problems but will not generate products for use by Agency components.

In-Q-Tel ‘inspired products‘ will be ‘developed through separate contractual arrangements‘ involving Agency ( CIA ) ‘components‘ and ‘other vendors‘.

In-Q-Tel – Structure & Staffing

Central to the In-Q-Tel business model are speed, agility, market positioning, and leveraging.

These attributes, taken together, have helped shape the evolving structure of the Corporation. It is one that intends to emphasize the “virtual” nature of the Corporation while minimizing “brick and mortar” costs, i.e. it will operate by facilitating data sharing and decision-making via seamless communications, using a private network with broadband connectivity to the Agency and its partners, while limiting direct infrastructure investments in laboratories and related facilities by leveraging the facilities of others.

To facilitate this intent, the In-Q-Tel Board and CEO decided to hire a small staff composed of people with strong technical and business skills.

At present, the Corporation has about ten ( 10 ) staff employees and, it is expected that, by the end of the year 2000, the total will number about thirty ( 30 ).

The CEO is currently designing In-Q-Tel’s management structure, but the parameters he has set for it indicate that it will be very flat and aligned for rapid decision-making.

How In-Q-Tel Works

One of the great leaps of faith the Agency took in this venture was to recognize, early on, that private sector businessmen were better equipped than it was to design the Corporation and create its work program.

The Agency’s critical role was to develop the initial concept, help form the best Board possible, give IN-Q-TEL a challenging problem set, and then design a ‘contractual vehicle‘ that ‘gave‘ the Corporation [ QIC ] the ‘necessary degrees of freedom to design itself and operate in the market place‘.

All of this was accomplished in less than 1-year, to include the design of In-Q-Tel’s initial work program. In-Q-Tel’s current work program is built on a process of discrete, yet overlapping, elements – IT roadmapping, IT baselining, and R&D projects.

The underlying philosophy now driving the In-Q-Tel program is to gain an understanding of the many players occupying In-Q-Tel’s IT space – by roadmap analysis – and, concurrently, test and validate the performance and utility of existing products and technologies – by baseline testing – against current Agency needs.

If the test results are successful, the Agency ( CIA ) has the ‘option’ of quickly ‘purchasing’ the ‘products’ directly ‘from the vendor’.

However, in those ‘cases where there are no existing products or technologies‘, or where a gap exists between the baseline test results and the Agency ( CIA ) needs, IN-Q-TEL will launch R&D projects.

In this way, the Agency ( CIA ) obtains near-term solutions through the evaluation of those products considered “best-in-class” and can target its R&D projects more precisely – that is, to where ‘commercial‘ or ‘other government [ contract ] IT investments [ $ ] are small‘ or nonexistent.

With its first [ 1st ] year budget of about $28,000,000 ( USD ), In-Q-Tel has focused its initial efforts on the IT roadmap and baseline elements of the program.

The roadmap project seeks, first, to ‘identify those in industry, government, and academia who occupy the same IT space as In-Q-Tel’ and, secondarily, to ‘spot existing technologies of potential interest’.

The results will also help In-Q-Tel leverage the technical advances made by others, assess the overall direction and pace of research, avoid duplicating work done by other government entities, and highlight [ identify and target ] potential business partners. The roadmap will be updated and refined by In-Q-Tel throughout the life of its work program.

Two ( 2 ) Team Incubators & Twenty ( 20 ) Hi-Tech Firms [ Businesses ]

These twenty ( 20 ) firms are executing the baseline-testing element of the In-Q-Tel work program. They were selected by an independent review panel of national IT experts convened by In-Q-Tel to evaluate multiple proposals.

Each of the two ( 2 ) teams is working on one ( 1 ) or more ‘incubator concepts’ derived by In-Q-Tel from the Agency ( CIA ) ‘problem set‘ enumerated above. The incubator teams will operate for about one ( 1 ) year. As the In-Q-Tel work program grows, it is possible that other baseline incubator teams will be established.

The R&D part of the program, which In-Q-Tel manages, will soon become the core of its activities, with a growing percentage of its funds directed towards a portfolio of research projects. In-Q-Tel is formulating its research thrusts based on the information and test results gathered under the roadmap and baseline work, aided by extensive interactions with the private sector and the Agency.

The design of the research projects will be set by In-Q-Tel and will vary to meet the mutual interests of the Agency ( CIA ), In-Q-Tel, and its prospective business partners.

As mentioned earlier, In-Q-Tel will ‘draw from a broad range of R&D competition‘ models to attract the business partners it seeks.

In some cases, it may assemble teams of companies that each has a necessary part, but not the whole, of the solution In-Q-Tel seeks.

In ‘other projects’ IN-Q-TEL might be a co-investor in a fledgling company with another business partner such as a venture capital firm.

Or, it could take a more traditional route, using a request for proposal.

In essence, In-Q-Tel will use whatever model most efficiently and effectively meets the needs of all parties to a transaction, with a constant eye towards leveraging its resources and solving the Agency’s IT needs.

Common to most or all of the R&D agreements that In-Q-Tel intends to use will be the subject of intellectual property (IP), or more precisely said, the ownership of IP and the allocation of IP generated revenues.

In the area of IT R&D, a deal is typically not struck until all of the parties’ IP rights are clearly established.

In-Q-Tel’s acceptance within the IT market place depends heavily on its ability to negotiate industry standard IP terms.

Recognizing this, the Agency ( CIA ) agreement with In-Q-Tel allows it and/or its partners to retain title to the innovations created and freely negotiate the allocation of IP derived revenues.

The only major stipulation is that the Agency ( CIA ) retain traditional “government purpose rights” to the ‘innovations‘.

Contract Model – In-Q-Tel

Before the partnership between In-Q-Tel and the Agency became a reality, the Agency ( CIA ) had to develop a new contract vehicle that granted the Corporation [ QIC ] the degrees of freedom it needed to operate in the market place.

Most Agency ( CIA ) contracts, including those in R&D, are based on the Federal Acquisition Regulations ( FAR ); however, the FAR is often viewed by industry as overly burdensome and inflexible. And, it has been the experience of the U.S. Department of Defense ( DoD ) that smaller companies often will not contract with the government because of the extra costs they would incur to be FAR compliant.

Because the Agency ( CIA ) wanted to encourage such companies to work with In-Q-Tel, it took a different approach and designed a non-FAR agreement with the IN-Q-TEL CORPORATION.

It [ CIA ] also adopted elements from the old internet godfather, i.e. the Advanced Research Projects Agency ( ARPA ), and its model based on “Other Transactions ( OT )” authority granted to the DoD [ U.S. Department of Defense ] by Congress [ U.S. Congress ].

OT [ Other Transaction ] agreements ‘permit authorized government agencies’ [ e.g. CIA ] to design R&D agreements outside the FAR.

The hoped for result is to spur greater flexibility and innovation for the government. In addition, it permits well-managed businesses, large and small, to perform R&D for the government, using their existing business practices and procedures.

Using an ARPA model OT agreement as a guide, the Agency ( CIA ) designed a 5-year Charter Agreement that describes the broad framework for its relationship with IN-Q-TEL, sets forth general policies, and establishes the terms and conditions that will apply to future contracts. In addition, a short-term funding contract was negotiated that includes In-Q-Tel’s “Description of Work”.

Together these documents define the metes and bounds of the Agency ( CIA ) relationship with In-Q-Tel and permit IN-Q-TEL to negotiate agreements with its partners, absent [ without ] most government flow down requirements.

In-Q-Tel – Advancements

The In-Q-Tel venture is one that has challenged the Agency to think creatively and quickly to address the fundamental changes that the information revolution is having on its core business.

It responded by setting aside traditional policies and practices in many areas and establishing a new partnership with industry and academia, based on shared interest and mutual benefit.

But, one cannot ignore that this venture involves risk, both for the Agency and In-Q-Tel. From the Agency’s perspective there are three ( 3 ) major areas that will require constant attention:

1. Managing its relationship with IN-Q-TEL;

2. Solution transfer; and,

3. Security.

Perhaps the most important of the three is the first, managing the relationship without stifling In-Q-Tel’s competitive edge.

IN-Q-TEL is a small independent corporation ‘established to improve the mission performance of a much larger government Agency‘ [ the CIA ].

The imperatives that led to In-Q-Tel have many parallels in industry. In fact, the IT sector is replete with examples of a large corporation seeking to improve its competitiveness by either purchasing a small start-up company or forming a subsidiary.

The ‘parent corporation‘ [ ? ] sees in ‘its offspring‘ traits that it no longer possesses – speed, agility, and expertise. But, for these traits to be realized, ‘the start-up‘ must operate unencumbered from the ‘parent corporation‘ [ ? ], whose natural tendency is to rein in and control it.

Similarly, the Agency ( CIA ) will have to restrain its natural inclination to micromanage IN-Q-TEL and, instead, allow the Corporation [ QIC ] the freedom to prosper. It must have continuous insight into In-Q-Tel’s activities, but must understand that In-Q-Tel is responsible for its own operations, including the design and management of the work program.

Acceptance by Agency ( CIA ) components of In-Q-Tel inspired solutions will be the most important measure of success in this venture. It is also likely to be the hardest.

While there is every expectation that In-Q-Tel will become commercially successful and seed innovative solutions, if they are not accepted and used by Agency line managers, then the overall venture will be judged a failure.

Although In-Q-Tel has a critical role in the solution transfer process, the burden rests with the Agency, since the challenges are as much managerial and cultural as they are technical.

The Agency ( CIA ) Chief Information Officer ( CIO ), directorate heads, and component directors will all have to work closely with IN-Q-TEL to overcome bureaucratic inertia and identify eager recipients of the innovations that the Corporation develops.

Agency ( CIA ) “product champions” for each IN-Q-TEL project should be identified early and should participate fully in its formulation, testing, and evaluation. Incentives should be considered for those Agency ( CIA ) components that commit to projects with unique risks or that require extensive personnel commitments.

These and other strategies will be employed to ensure that the return on the Agency’s investment in In-Q-Tel translates into measurable improvements in its mission performance.

The open affiliation between the CIA and In-Q-Tel is yet another unique aspect and challenge for this venture. Although the Corporation [ QIC ] will be doing only unclassified work for the Agency ( CIA ), the nature of its IT research and its association with a U.S. intelligence agency will undoubtedly attract the interest of foreign persons, some with questionable motives.

The obvious security ramifications of this scenario were well considered in the decisionmaking process that led to In-Q-Tel’s formation. It was ultimately decided that the risks are manageable and, in many ways, are similar to those faced by any high-tech company trying to protect its IP and trade secrets.

IN-Q-TEL and the Agency ( CIA ) will be working closely to ensure that the Corporation [ QIC ] operates with a high degree of security awareness and support.

In-Q-Tel has a critical role in meeting these three ( 3 ) challenges. However, its most persistent challenge will be developing and sustaining a reputation as a business that:

1. sponsors leading edge research;

2. produces discoveries; and,

3. profitably commercializes those discoveries.

Once it has established a record of accomplishment in these areas, the high caliber IT talent the Agency hopes to reach through In-Q-Tel will be drawn to the Corporation.

In-Q-Tel Future

Those of us at the Agency who helped to create In-Q-Tel are endlessly optimistic about its prospects for success. The early indicators are all positive. Among them is the caliber of the people who stand behind and lead the Corporation and the initial reaction from industry and the trade press to its formation.

The IN-Q-TEL Board of Trustees is at least the equal of any large corporation’s board. Its members are committed to the Agency ( CIA ) mission and to the new R&D model that IN-Q-TEL represents, and they have invested much of their time in its formation.

The Agency and the nation are in their debt.

The Board [ IN-Q-TEL Board of Trustees ] also recruited an outstanding CEO who brings with him the ‘experiences’ and ‘contacts’ of his Silicon Valley [ California ] base and an established reputation for starting and growing new IT companies.

The favorable press coverage of In-Q-Tel combined with the industry “buzz” engendered by the Board and CEO have brought a flood of inquiries by those interested in doing business with the Corporation. And, most importantly, its work program is already beginning to achieve results that the Agency ( CIA ) can use and that its ( CIA ) partners can commercialize.

Judging by the record to date, the road ahead appears promising. But, In-Q-Tel’s fate also rests in part on those institutions charged with oversight of the Agency and its budget.

Congress has supported the Agency as it launched this new venture. The U.S. Congress “seeded the venture with start-up funding” when it was still in its conceptual phase, but asked hard questions of the Agency throughout the design and formation of In-Q-Tel.

Members understood that starting an enterprise such as IN-Q-TEL is ‘not risk free‘. As with all R&D efforts in government and industry, there will be some home run successes but also some failures. That is the price the Agency must be prepared to pay if it wants to stay on the leading edge of the IT revolution.

With In-Q-Tel’s help plus the continued support of Congress [ U.S. Congress ] and Office of Management and Budget ( OMB ), as well as from the traditional Agency ( CIA ) ‘contractor community‘ and ‘others‘, an “e-CIA” of the next century [ 21st Century ] will evolve quickly, to the benefit of the President and the national security community.

Notes Of Interest

For the next one ( 1 ) or two ( 2 ) years [ 1999 – 2000 ], IN-Q-TEL ‘will accept work only from the CIA‘.

All solutions that it provides to the CIA will be made available to the entire Intelligence Community.

The relationship is codified in a 5-year Charter Agreement with the CIA and a 1-year funding contract that is renewable annually. As stipulated in the Charter Agreement, “…the Federal Government shall have a nonexclusive, nontransferable, irrevocable, paid-up license to practice or have practiced for or on behalf of the United States the subject invention throughout the world for Government purposes”.

The Agency ( CIA ) component that has day-to-day responsibility for guiding the CIA relationship with IN-Q-TEL, including the ‘design and implementation of the contract’ and the ‘problem set’, is the IN-Q-TEL INTERFACE CENTER ( QIC ) which resides inside the CIA Directorate of Science and Technology.

Circa: 2002 – 2008

IN-Q-TEL INCORPORATED (aka) IN-Q-IT CORPORATION 2500 Sand Hill Road – Suite 113 Menlo Park, California 94025 – 7061 USA TEL: +1 (650) 234-8999 TEL: +1 (650) 234-8983 FAX: +1 (650) 234-8997 WWW: http://www.inqtel.com WWW: http://www.in-q-tel.org WWW: http://www.in-q-tel.com

IN-Q-TEL INCORPORATED (aka) IN-Q-IT CORPORATION P.O. Box 12407 Arlington, Virginia 22219 USA TEL: +1 (703) 248-3000 FAX: +1 (703) 248-3001

– –

IN-Q-TEL focus areas, include:

– Physical Technologies; – Biological Technologies; – Security; and, – Software Infrastructure.

– –

IN-Q-TEL

Investments –

Strategic Investments, Targeted Returns

In-Q-Tel is ‘building’ a ‘portfolio of companies’ that are ‘developing innovative solutions’ in ‘key technology areas’.

Similar to many ‘corporate strategic venture’ firms, In-Q-Tel seeks to ‘optimize potential returns’ for our clients – the CIA and the broader Intelligence Community – by investing in companies of strategic interest.

In-Q-Tel engages ‘start-ups’, ‘emerging’ and ‘established’ companies, universities and research labs.

In-Q-Tel structures attractive win-win relationships through ‘equity investments’, as well as ‘strategic product development funding’, ‘innovative intellectual property arrangements’, and ‘government business development guidance’.

An Enterprising Partner –

In-Q-Tel ‘portfolio companies’ value a ‘strategic relationship’ with a ‘proactive partner’.

Companies that work through the In-Q-Tel due diligence process know their technologies have the potential to address the needs of one of the most discriminating enterprise customers in the world.

In-Q-Tel takes a hands-on approach, working closely with our ‘portfolio companies’ to help ‘drive their success’ in the ‘marketplace’ and to ‘mature [ ‘grow’ ] their technologies’.

In-Q-Tel ‘investment goals’ are focused on ‘return’ on technology – a ‘blend of factors’ that will ‘deliver strategic impact’ on the Agency [ CIA ] mission:

– Effective ‘deployments’ of innovative technologies to the CIA; – Commercially successful ‘companies that can continue’ to ‘deliver’ and ‘support’ innovative technologies; and, – Financial ‘returns to fund further technology investments’ to ‘benefit the Intelligence Community’.

Investing In Our National Security –

In just a few short years, In-Q-Tel has ‘evaluated’ nearly two thousand [ 2,000 ] ‘proposals’:

75% [ 1,500 ] of which have come from companies that had never previously considered working with the government.

To date, In-Q-Tel has ‘established strategic relationships’ with more than twenty ( 20 ) of these ‘companies’.

Read more about our ‘portfolio companies’ and ‘technology partners’, or learn how to submit a business plan to In-Q-Tel.

Areas Of Focus –

IN-Q-TEL focuses on next generation technologies for gathering, analyzing, managing and disseminating data. Learn more about our areas of focus:

Knowledge Management: [ http://web.archive.org/web/20020630223724/http://www.inqtel.com/tech/km.html ];

Security and Privacy: [ http://web.archive.org/web/20020630223724/http://www.inqtel.com/tech/sp.html ];

Search and Discovery: [ http://web.archive.org/web/20020630223724/http://www.inqtel.com/tech/sd.html ];

Distributed Data Collection: [ http://web.archive.org/web/20020630223724/http://www.inqtel.com/tech/dd.html ]; and,

Geospatial Information Services: [ http://web.archive.org/web/20020630223724/http://www.inqtel.com/tech/gi.html ].

Submit A Business Plan –

“In-Q-Tel also has garnered a reputation in the tech and VC [ Venture Capital ] worlds for being hard-nosed during due diligence. Unlike some venture firms, In-Q-Tel is staffed with hard-core techies who know how to put a program through the ringer. They’ve also got one of the roughest testing domains: the computer systems of the CIA.” – Washington Business Journal ( November 19, 2001 )

– View our criteria [ http://web.archive.org/web/20020630223724/http://www.inqtel.com/submit/index.html ] for submission, and apply for consideration online.

Media Resources –

– Investment Portfolio: [ http://web.archive.org/web/20020630223724/http://www.inqtel.com/news/attachments/InQTelInvestmentPortfolio.pdf ].

Reference

http://web.archive.org/web/20020630223724/www.inqtel.com/invest/index.html

– – – –

Circa: 2002

IN-Q-TEL

Investments –

Technology Partners ( 2002 ) –

INKTOMI [ http://www.inktomi.com ] ( Leading Edge Search and Retrieval Technology )

INKTOMI, based in Foster City, California ( USA ), has offices elsewhere in North America, Asia and Europe.

INKTOMI business divisions, involve:

Network Products – comprised of industry leading solutions for network caching, content distribution, media broadcasting, and wireless technologies; and,

Search Solutions – comprised of general Web search and related services, and ‘enterprise’ search.

Inktomi ‘develops’ and ‘markets’ network infrastructure software essential for ‘service providers’ and ‘global enterprises’.

Inktomi’s ‘customer’ and ‘strategic partner’ base of leading companies, includes:

MERRILL LYNCH; INTEL; AT&T; MICROSOFT; SUN MICROSYSTEMS; HEWLETT-PACKARD; COMPAQ; DELL; NOKIA; AMERICA ONLINE ( AOL ); and, YAHOO.

SCIENCE APPLICATIONS INTERNATIONAL CORPORATION ( SAIC ) Lead System Integrator ( LSI ) – SAIC LSI [ http://www.saic.com/contractcenter/ites-2s/clients.html  ]

SCIENCE APPLICATIONS INTERNATIONAL CORPORATION ( SAIC ), founded in 1969 by Dr. J. R. Beyster – who remained with SAIC for 30 years, until at least November 3, 2003 – has had as part of its management, and on its Board of Directors, many well known former U.S. government personnel, including:

– Melvin Laird, Secretary of Defense in the Richard Milhous Nixon Presidential Administration;

– William Perry, Secretary of Defense in the William Jefferson Clinton Presidential Administration;

– John M. Deutch, U.S. Central Intelligence Agency ( CIA ) Director of Central Intelligence ( DCI ) in the William Jefferson Clinton Presidential Administration;

– U.S. Navy Admiral Bobby Ray Inman, U.S. National Security Agency ( NSA ) and U.S. Central Intelligence Agency ( CIA ) – various employed capacities in ‘both’ Agencies – in the Gerald Ford Presidential Administration, Jimmy Carter Presidential Administration, and Ronald Reagan Presidential Administration;

– David Kay, who led the search for Weapons of Mass Destruction ( WMD ) for the United Nations ( UN ) following the 1991 U.S. Persian Gulf War, and for the George W. Bush Presidential Administration following the 2003 U.S. invasion of Iraq.

In 2009, SCIENCE APPLICATIONS INTERNATIONAL CORPORATION ( SAIC ) moved corporate headquarters to Tysons Corner at 1710 SAIC Drive, McLean, Virginia ( USA ).

SCIENCE APPLICATIONS INTERNATIONAL CORPORATION ( SAIC ) is a scientific, engineering and technology ‘applications company’ with numerous ‘state government clients’, ‘federal government clients’, and ‘private sector clients’.

SCIENCE APPLICATIONS INTERNATIONAL CORPORATION ( SAIC ) works extensively, with:

U.S. Department of Defense ( DOD ); U.S. Department of Homeland Security ( DHS ); U.S. National Security Agency ( NSA ); U.S. intelligence community ( others ); U.S. government civil agencies; and, Selected commercial markets.

SCIENCE APPLICATIONS INTERNATIONAL CORPORATION ( SAIC ) Subsidiaries –

SAIC VENTURE CAPITAL CORPORATION; SCICOM TECHNOLOGIES NOIDA ( INDIA ); BD SYSTEMS ( BDS ); BECHTEL SAIC COMPANY LLC; BECK DISASTER RECOVERY ( BDR ); R.W. BECK; BENHAM; CLOUDSHIELD; DANET; EAGAN MCALLISTER ASSOCIATES INC.; HICKS & ASSOCIATES; MEDPROTECT LLC; REVEAL; SAIC-FREDERICK INC.; NATIONAL CANCER INSTITUTE ( NCI ); SAIC INTERNATIONAL SUBSIDIARIES; SAIC LIMITED ( UK ); CALANAIS ( SCOTLAND ); VAREC; APPLIED MARINE TECHNOLOGY CORPORATION; EAI CORPORATION; and, Others.

In 1991, SCIENCE APPLICATIONS INTERNATIONAL CORPORATION ( SAIC ) received transference of the U.S. Department of Defense ( DOD ), U.S. Army ( USA ), Defense Intelligence Agency ( DIA ) ‘Remote Viewing Program’ renamed the STARGATE Project.

In January 1999, SCIENCE APPLICATIONS INTERNATIONAL CORPORATION ( SAIC ) consultant Steven Hatfill saw SAIC vice president Joseph Soukup internally commission ( with no outside client ) William C. Patrick – a retired leading figure in the legacy U.S. bioweapons program – to produce a report ( 28 pages, delivered February 1999 ) on the possibilities of a terrorist anthrax attack via United States postal mailings, prior to the 2001 anthrax attacks in the United States.

In March 2001, the U.S. National Security Agency ( NSA ) had SCIENCE APPLICATIONS INTERNATIONAL CORPORATION ( SAIC ) in ‘concept definition’ phase for what later became known as the NSA TRAILBLAZER Project, a “Digital Network Intelligence” system intended to ‘analyze data’ carried across computer ‘networks’.

In 2002, the U.S. National Security Agency ( NSA ) chose SCIENCE APPLICATIONS INTERNATIONAL CORPORATION ( SAIC ) to produce a ‘technology demonstration platform’ for the NSA TRAILBLAZER Project, a contract worth $280,000,000 ( USD ).

TRAILBLAZER Project participants, included:

BOEING; COMPUTER SCIENCES CORPORATION ( CSC ); and, BOOZ ALLEN HAMILTON.

In 2005, TRAILBLAZER – believed by some observers ( http://www.PhysOrg.Com et al. ) to be a continuation of the earlier THINTHREAD data mining program – saw U.S. National Security Agency ( NSA ) Director Michael Hayden inform a U.S. Senate hearing that the TRAILBLAZER program was several hundred million dollars over budget and years behind schedule.

From 2001 through 2005, SCIENCE APPLICATIONS INTERNATIONAL CORPORATION ( SAIC ) was primary contractor for the $600,000,000 ( USD ) TRILOGY Program, a three ( 3 ) part program intended to replace obsolete FBI computers with a then state-of-the-art ‘secure high-speed computer network system’, installing 500 computer network servers, 1,600 scanners, and thousands of desktop computers in FBI field offices.

In December 2003, SAIC delivered to the U.S. Department of Justice ( DOJ ) Federal Bureau of Investigation ( FBI ) its “Virtual Case File” ( VCF ), a $170,000,000 ( USD ) software system – an FBI ‘critical case management system’ designed to speed the tracking of terrorists and improve communications amongst agents fighting criminals.

However, nineteen ( 19 ) different government managers were involved in 36 contract modifications – averaging 1.3 FBI changes every day, totaling 399 changes over 15 months – after which the FBI continued arguing changes ( through its own intermediary, AEROSPACE CORPORATION ) until the U.S. Department of Justice ( DOJ ) Inspector General ( IG ) criticized the FBI handling of the SAIC software, whereupon in February 2005 SAIC ‘recommended’ the FBI at least ‘begin using’ the SAIC TRILOGY VCF ‘case management system’.

On September 27, 2006 during a special meeting of SCIENCE APPLICATIONS INTERNATIONAL CORPORATION ( SAIC ) stockholders, employee-owners voted by a margin of 86% to proceed with the initial public offering ( IPO ), upon completion of which SCIENCE APPLICATIONS INTERNATIONAL CORPORATION ( SAIC ) also paid – to existing stockholders – a ‘special dividend’ of $1,600,000,000 to $2,400,000,000 ( USD ).

On October 17, 2006 SCIENCE APPLICATIONS INTERNATIONAL CORPORATION ( SAIC ) conducted an initial public offering ( IPO ) of 86,250,000 shares of common stock priced at $15.00 per share. Underwriters – BEAR STEARNS and MORGAN STANLEY – exercised over-allotment options for an additional 11,025,000 shares, seeing the IPO raise $1,245,000,000 ( USD ).

SCIENCE APPLICATIONS INTERNATIONAL CORPORATION ( SAIC ) had approximately 46,000 total employees, 16,000 employees were in McLean, Virginia ( USA ) and another 5,000 employees were in San Diego, California ( USA ).

SRA INTERNATIONAL INC. ( SRA ) [ http://www.sra.com/about-us/profile.php ]

SRA INTERNATIONAL INC., founded in 1978 and headquartered in Fairfax, Virginia, has additional offices across the United States.

SRA INTERNATIONAL INC. is a leading provider of information technology services and solutions to clients in national security, health care and public health, and civil government markets, requiring:

– Strategic Consulting; – Systems Design, Development, and Integration; – Outsourcing; and, – Operations Management.

SRA INTERNATIONAL INC. also delivers business solutions, for:

– Text Mining; – Data Mining; – Disaster and Contingency Response Planning; – Information Assurance; – Environmental Strategies; – Environmental Technology; – Enterprise Systems Management; and, – Wireless Integration.

SRA INTERNATIONAL INC. ORIONMagic ®

– –

Circa: 2002

IN-Q-TEL

Investments –

Portfolio Of Companies ( 2002 ) – Partial List

ARCSIGHT [ http://www.arcsight.com ] ( Security Management Software for The Enterprise )

ArcSight, founded in May 2000, is located in the heart of Silicon Valley, California ( USA ).

ArcSight is a leading supplier of enterprise software that provides the security “air traffic control system” for large, geographically dispersed organizations. These organizations are augmenting their network infrastructure with a wide variety of security devices such as firewalls, intrusion detection and identity management systems that produce a barrage of uncoordinated alarms and alerts that overwhelm the security staff.

With its ‘centralized view’ of ‘all security activity’ combined with ‘real time analysis’ of ‘events’, by both ‘operating at the perimeter and inside’ the organization, ArcSight provides a ‘single solution’, for:

Event capture; Log aggregation; Real time correlation; Incident investigation; and, Reporting.

ArcSight ‘separates’ the ‘true threats and attacks’ from the ‘millions of false alarms and non-threatening activities’ that occur each day, focusing attention and resources on high-priority problems.
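
Real time correlation of this kind can be illustrated with a minimal sketch – the event data, field names and thresholds below are all hypothetical, far simpler than ArcSight's actual engine:

```python
from collections import defaultdict

# Hypothetical event records: (timestamp_seconds, source_ip, event_type)
events = [
    (0, "10.0.0.5", "login_failed"),
    (10, "10.0.0.5", "login_failed"),
    (20, "10.0.0.5", "login_failed"),
    (25, "10.0.0.5", "login_success"),
    (30, "10.0.0.9", "login_failed"),
]

def correlate(events, threshold=3, window=60):
    """Flag sources whose failed logins reach `threshold` within `window` seconds."""
    alerts = []
    failures = defaultdict(list)          # source_ip -> recent failure timestamps
    for ts, src, kind in events:
        if kind != "login_failed":
            continue
        failures[src].append(ts)
        # keep only failures inside the sliding window
        failures[src] = [t for t in failures[src] if ts - t <= window]
        if len(failures[src]) >= threshold:
            alerts.append((src, ts))
    return alerts

print(correlate(events))   # [('10.0.0.5', 20)]
```

A single noisy device produces three uncoordinated alarms here; the correlation step reduces them to one prioritized alert.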

The company has delivered enterprise, ‘security management solutions’ to leading ‘financial services’, ‘government’ and ‘manufacturing’ organizations while ‘attracting capital’ from ‘leading investors’, such as:

IN-Q-TEL; KLEINER PERKINS CAUFIELD & BYERS ( KPCB ); and, SUMITOMO CORPORATION.

ATTENSITY CORPORATION [ http://www.attensity.com ] ( Text Extraction for Threat Detection )

Attensity Corp., founded in 2000, is a privately held company with dual headquarters in Mountain View, California ( USA ) and Salt Lake City, Utah ( USA ).

Attensity Corp. provides enterprise, ‘analytic software’ and ‘services’, to:

Government agencies; and,

Fortune 500 companies.

Attensity has developed breakthrough text extraction technology that transforms information captured in free form text into structured, relational data.

Attensity enables government agencies to dramatically expand their analytical capabilities in the area of ‘threat detection’ by, powering:

Link analysis; Trending; Exception reporting; Other advanced analytics; and, Knowledge management applications.

Attensity technology is the culmination of nearly a decade [ 10-years ] of research in computational linguistics.
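
The transformation of free form text into structured, relational data can be illustrated with a minimal sketch – one hand-written pattern over invented sentences, nothing like the computational-linguistics depth Attensity describes:

```python
import re

# Invented free-text reports (hypothetical examples, not Attensity data)
reports = [
    "John Smith traveled from Cairo to Rome on 2002-05-01.",
    "Jane Doe traveled from Lima to Oslo on 2002-06-12.",
]

# One sentence shape extracted into a structured (person, origin, dest, date) row
PATTERN = re.compile(
    r"(?P<person>[A-Z]\w+ [A-Z]\w+) traveled from (?P<origin>[A-Z]\w+) "
    r"to (?P<dest>[A-Z]\w+) on (?P<date>\d{4}-\d{2}-\d{2})"
)

def extract(texts):
    """Return structured rows for every text the pattern matches."""
    rows = []
    for text in texts:
        m = PATTERN.search(text)
        if m:
            rows.append(m.groupdict())
    return rows

rows = extract(reports)
print(rows[0]["origin"])   # Cairo
```

The resulting rows are relational records, ready for link analysis or trending, which is the point of structuring free text.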

Attensity Corporation customers include:

IN-Q-TEL, a strategic venture group funded by the CIA; WHIRLPOOL; and, JOHN DEERE.

ATTENSITY CORPORATION investor, is:

IN-Q-TEL

BROWSE3D [ http://www.browse3d.com ] ( Advanced Web Navigation )

BROWSE3D, founded in 2000, is located in the Dulles Technology Corridor of northern Virginia.

The company’s first Knowledge Management product, the Browse3D Browser, enables Internet users to browse Web sites using a dynamic, interactive, 3 dimensional ( 3-D ) display environment.

One year later [ 2001 ] the Browse3D Browser was recognized as the Best Internet Software of 2001 at the COMDEX Fall Technology Show ( Las Vegas, Nevada, USA ).

Browse3D launched its ‘consumer product’ in January 2002.

For the past 2-years [ since 2000 ], Browse3D has been working to re-invent the online researcher’s tool set. A researcher’s ability to ‘harvest relevant online data’ is often limited by the tools available to view that data.

Future products and technologies promise additional improvements in the way users ‘find’, ‘organize’, ‘save’ and ‘exchange’ web-based ‘content’.

BROWSE3D early-stage venture funding provided, by:

IN-Q-TEL; and, angel investors.

CANDERA INC. [ http://www.candera.com ] ( Enterprise Storage )

Candera Incorporated, founded in 2000, is a development-stage, stealth-mode company headquartered in Milpitas, California ( USA ).

Candera Inc. is developing a new generation, purpose built, network based storage management platform that gives businesses unprecedented ‘control over’ and ‘visibility into’ their networked storage environments.

With the Candera Confluence solution, businesses can dramatically improve the utilization of their existing heterogeneous storage assets by consolidating them into a centrally managed storage pool. These can then be quickly and dynamically allocated to meet the needs of current and future network based applications, giving large enterprises a strategic advantage.

Candera is building the first [ 1st ] system, of a new generation of systems, that will enable customers to unleash the ultimate value of networked information storage.

CONVERA [ http://www.convera.com ] ( Mission Critical Enterprise Search and Categorization Software )

Convera RetrievalWare is a high-performance intelligent search system that allows broad flexibility and scalability for implementation across corporate intranets and extranets, enabling users to index and search a wide range of distributed information resources, including text files, HTML, XML, over 200 proprietary document formats, relational database tables, document management systems and groupware repositories. Convera RetrievalWare excels in distributed client environments and server environments with hundreds or thousands of users, documents, images and / or multiple media assets.

Advanced search capabilities include concept and keyword searching, pattern searching and query by example.
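
Keyword searching and query by example can be illustrated with a toy inverted index – the documents and scoring are invented, not Convera RetrievalWare internals:

```python
from collections import defaultdict

# Invented document collection: doc id -> text
docs = {
    1: "satellite imagery analysis for intelligence gathering",
    2: "customer service policy compliance handbook",
    3: "imagery and video assets for knowledge management",
}

index = defaultdict(set)                  # term -> set of doc ids containing it
for doc_id, text in docs.items():
    for term in text.split():
        index[term].add(doc_id)

def keyword_search(*terms):
    """Documents containing all terms (boolean AND over the inverted index)."""
    sets = [index[t] for t in terms]
    return sorted(set.intersection(*sets)) if sets else []

def query_by_example(text):
    """Rank documents by how many terms they share with the example text."""
    terms = set(text.split())
    scores = {d: len(terms & set(t.split())) for d, t in docs.items()}
    return sorted(scores, key=scores.get, reverse=True)

print(keyword_search("imagery"))          # [1, 3]
```

Real systems add stemming, concept expansion and pattern (fuzzy) matching on top of this same index structure.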

Convera is a leading provider of enterprise mission-critical ‘search’, ‘retrieval’ and ‘categorizing’ solutions.

More than 800 customers – in 33 countries – rely on Convera search solutions to power a broad range of mission critical applications, including enterprise:

Portals; Knowledge management; Intelligence gathering; Profiling; Corporate policy compliance; Regulatory compliance; Customer service; and, More.

DECRU [ http://www.decru.com ] ( Secure Networked Storage )

Decru, founded in April 2001, is headquartered in Redwood City, California ( USA ).

Decru solves the problem of secure data storage with a robust, wire-speed encryption appliance that fits transparently into any SAN or NAS storage environment, protecting data from both internal and external threats.

Markets include essentially any organization with a need to protect proprietary or confidential information ( e.g. government, technology, financial services, health care ).
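
The appliance model Decru describes – encrypt on the write path, decrypt on the read path, transparently to the application – can be sketched as follows. This is a toy illustration only: XOR stands in for real wire-speed AES and is NOT secure, and all names are hypothetical:

```python
import itertools

KEY = b"demo-key"   # hypothetical key; a real appliance would use AES, not XOR

def xor_transform(data: bytes, key: bytes) -> bytes:
    """Toy stand-in for encryption: XOR is symmetric, so it also decrypts."""
    return bytes(b ^ k for b, k in zip(data, itertools.cycle(key)))

class EncryptingStore:
    """Sits transparently between the application and the backing storage."""
    def __init__(self):
        self._disk = {}                                    # simulated SAN/NAS blocks
    def write(self, name, plaintext: bytes):
        self._disk[name] = xor_transform(plaintext, KEY)   # ciphertext at rest
    def read(self, name) -> bytes:
        return xor_transform(self._disk[name], KEY)        # decrypt on the way out

store = EncryptingStore()
store.write("report.txt", b"confidential")
assert store._disk["report.txt"] != b"confidential"   # at-rest bytes are transformed
print(store.read("report.txt"))    # b'confidential'
```

The application sees ordinary reads and writes; only the bytes at rest are protected, which is what makes the appliance "transparent" to existing storage environments.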

Investors, include:

IN-Q-TEL; NEA; GREYLOCK; and, BENCHMARK.

GRAVITON [ http://www.graviton.com ] ( Early Warning Detection and Notification System for Homeland Security Over Wireless Mesh Networks )

GRAVITON, founded in 1999, is located in La Jolla, California, USA.

Solomon Trujillo, former head of U.S. WEST ( baby bell telephone company ), leads GRAVITON.

GRAVITON is on the leading edge of a fledgling ( small ) industry, known as:

Machine to Machine Communications ( M2M ).

GRAVITON is developing an advanced integrated wireless sensor platform uniquely optimized for large-scale distributed sensor network applications, working with Micro Electro Mechanical Systems ( MEMS ) sensor and spread spectrum wireless technologies licensed exclusively to GRAVITON from Oak Ridge National Laboratory ( Tennessee, USA ) – managed by the U.S. Department of Energy ( DOE ).

GRAVITON products and solutions integrate wireless, sensor and data management technology enabling enterprises to efficiently and transparently monitor, control, send, receive, and update system information from devices anywhere in the world.

GRAVITON is supported and funded by a number of corporate partners and investors, including:

IN-Q-TEL; GLOBAL CROSSING; ROYAL DUTCH SHELL ( oil / petroleum ); MITSUI; SIEMENS; QUALCOMM; OMRON; MOTOROLA; and, SUN MICROSYSTEMS.

GRAVITON ‘primary’ financial investors, include:

MERRILL LYNCH.

GRAVITON ‘venture capital’ firms, include:

KLEINER PERKINS CAUFIELD & BYERS ( KPCB ); and, EARLYBIRD.

INTELLISEEK [ http://www.intelliseek.com ] ( Enterprise Intelligence Solutions )

INTELLISEEK, founded in 1997, has since 1998 been changing the way organizations ‘understand’, ‘gather’ and ‘use’ enterprise ‘intelligence’.

INTELLISEEK ‘knowledge discovery tools’ [ as of: 2002 ] enable the nation’s largest enterprises with up-to-the-minute consumer, industry information and ‘competitive intelligence’.

INTELLISEEK ‘Enterprise Search Server’™ ( ESS ) search platform provides a suite of intelligent applications that automate ‘knowledge discovery’ and ‘knowledge aggregation’ from hundreds of disparate, and often hard-to-locate data sources.

INTELLISEEK ‘Knowledge Management’ and ‘Search and Discovery’ solutions solve the fundamental problem of “information overload” by identifying and searching relevant, targeted and personalized content from the internet, intranets and extranets.

INTELLISEEK clients, include:

FORD MOTOR COMPANY ( FOMOCO ); NOKIA; and, PROCTOR AND GAMBLE.

Investors include:

IN-Q-TEL; FORD VENTURES; RIVER CITIES CAPITAL; GENERAL ATLANTIC PARTNERS LLC; FLAT IRON PARTNERS; BLUE CHIP VENTURE COMPANY; NOKIA VENTURES; and, Other private investors.

METACARTA [ http://www.metacarta.com ] ( Geospatial Data Fusion )

MetaCarta, established in 1999, was launched on more than $1,000,000 in funding from the U.S. Department Of Defense ( DOD ) Defense Advanced Research Projects Agency ( DARPA ) and private investors.

MetaCarta CEO John Frank holds a doctorate from the Massachusetts Institute Of Technology ( MIT ), where during 1999 – as a Hertz Fellow in physics working on his PhD – he conceived a new way to view ‘collections of text’ geographically, which later saw MetaCarta combine his interests in algorithms, information design, and scientific models of real world phenomena.

MetaCarta provides a new knowledge management platform that integrates ‘text data with geography’ into a ‘cohesive system’ for ‘problem solving’.

METACARTA Geographic Text Search ( GTS ) appliance, the software solution, redefines how people interact with information, enabling analysts to view text reports and geographic information in one ( 1 ) logical view through integration of text and geography delivering new information not obtainable from any other source.
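
Integrating text with geography rests on recognizing place names in a report and resolving them to coordinates. A minimal gazetteer-lookup sketch – the place list and report are invented, and the GTS appliance itself handles ambiguity far beyond substring matching:

```python
# Tiny hypothetical gazetteer: place name -> (latitude, longitude)
GAZETTEER = {
    "Baghdad": (33.31, 44.37),
    "Kabul": (34.53, 69.17),
    "Paris": (48.86, 2.35),
}

def geotag(text):
    """Return (place, coordinates) for each gazetteer name found in the text."""
    return [(place, coords) for place, coords in GAZETTEER.items() if place in text]

report = "Convoy departed Kabul and is expected in Baghdad within a week."
hits = geotag(report)
print(sorted(p for p, _ in hits))   # ['Baghdad', 'Kabul']
```

Each hit ties a span of text to a map location, which is what lets analysts view text reports and geographic information in one ( 1 ) logical view.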

MetaCarta CEO John Frank graduated from Yale University.

MOHOMINE [ http://www.mohomine.com ] ( Transforming Unstructured Multi-Language Data Into Actionable Information )

MOHOMINE, founded in 1999, is privately-held and located in San Diego, California, USA.

MOHOMINE technology has been deployed by United States national security organizations.

MOHOMINE mohoClassifier for National Security Organizations ™ reviews ‘text information’ in ‘cables’, ‘e-mails’, ‘system files’, ‘intranets’, ‘extranets’ and ‘internet’ providing ‘automated document classification’, ‘routing’ – based upon ‘learn-by-example pattern recognition’ technology – and ‘reports’ on user defined properties such as ‘topic’, ‘subject’, ‘tone’ ( ‘urgent’, plus others ), ‘author’, ‘source’ ( geographic locations, ‘country’, etc. ), and more.

MOHOMINE mohoClassifier users can easily set up ‘filters’ to automatically ‘identify’ and ‘prioritize’ ( ‘read first’ requirement ) documents, which are quickly processed out of large volumes of other data so that prioritized information is quickly routed to reach the proper people.
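
Learn-by-example classification can be illustrated with a minimal sketch – invented training messages and a simple term-count score, not the mohoClassifier's actual pattern-recognition technology:

```python
from collections import Counter

# Hypothetical labeled training messages (the "examples" to learn from)
training = [
    ("urgent", "immediate threat attack warning urgent"),
    ("urgent", "urgent alert respond immediately"),
    ("routine", "weekly status report attached"),
    ("routine", "routine maintenance schedule update"),
]

profiles = {}                         # label -> term counts across its examples
for label, text in training:
    profiles.setdefault(label, Counter()).update(text.split())

def classify(text):
    """Score each label by how often its examples used the message's terms."""
    terms = text.split()
    scores = {label: sum(counts[t] for t in terms)
              for label, counts in profiles.items()}
    return max(scores, key=scores.get)

print(classify("urgent warning respond now"))   # urgent
```

Adding more labeled examples changes the profiles and therefore the routing, which is the "learn-by-example" property: no hand-written rules are required.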

MOHOMINE customers, drawn from the Global 5000, currently [ as of 2002 ] number more than one hundred fifty ( 150 ) across numerous vertical industries, including:

CITICORP; WELLS FARGO; INTEL; TEXAS INSTRUMENTS; PFIZER; BOEING; ORACLE; PEOPLESOFT; and, NIKE.

MOHOMINE investors, include:

IN-Q-TEL; HAMILTON APEX TECHNOLOGY VENTURES; and, WINDWARD VENTURES.

QYNERGY CORPORATION [ http://www.qynergy.com ] ( Long-Lasting Power Solutions For Multiple Applications And Small-Tech )

QYNERGY CORP., founded in 2001, is located in Albuquerque, New Mexico.

QYNERGY technology originated at Sandia National Laboratories ( New Mexico, USA ) and at the University of New Mexico ( New Mexico, USA ).

QYNERGY Corp. develops leading-edge energy solutions based on QYNERGY proprietary QynCell ™ technology, a ‘materials science’ breakthrough – over other ‘battery’ or ‘portable energy’ devices – that gives QYNERGY several unique competitive advantages.

QYNERGY QynCell ™ is an ‘electrical energy device’ revolution, providing:

Long-lived Batteries – QynCell usable life is potentially over a period of ‘several decades’ ( 10-year multiples ), during which time the QynCell device ‘does not require external charging’;

Miniature and Micro Applications – QynCell™ technology is scalable, thus can be ‘miniaturized’, for:

Micro Electro Mechanical Systems ( MEMS ); MicroPower™ applications; Small microelectronics; and, Power-on-a-chip applications.

SAFEWEB [ http://www.safewebinc.com ] ( Secure Remote Access )

SAFEWEB, established in April 2000, is based in Emeryville, California, USA.

SAFEWEB built the world’s largest ‘online privacy network’; however, in 2001 its ‘free online service’ was discontinued so the company could focus on developing its ‘enterprise’ product.

SAFEWEB is a leading provider of innovative security and privacy technologies that are effective, economical and simple.

SAFEWEB Secure Extranet Appliance ( SEA ), the first [ 1st ] SAFEWEB enterprise security release – reduces the cost and complexity traditionally involved in securing corporate network resources.

SAFEWEB Secure Extranet Appliance ( SEA ), named Tsunami, is a fundamental ‘redesign of extranet architecture’ integrating disparate technologies into a ‘modular plug-in network appliance’ ( SEA Tsunami ).

SAFEWEB SEA Tsunami is an ‘all-in-one solution’ simplifying implementation of ‘extranets’ and ‘Virtual Private Networks’ ( VPN ), reducing Total Cost of Ownership ( TCO ) through an innovative architecture that lets companies build ‘secure extranets’ – in less than 1-hour – enabling remotely stationed ‘employees’, ‘clients’ and ‘partners’ to access ‘internal applications’ and ‘secure data’ from anywhere using a standard internet website browser.

SAFEWEB delivers, through established strategic partnerships, customized versions of its Secure Extranet Appliance ( SEA ) Tsunami technology to U.S. intelligence [ CIA, etc. ] and communications agencies [ NSA, etc. ].

SAFEWEB investors, include:

IN-Q-TEL; CHILTON INVESTMENTS; and, KINGDON CAPITAL.

STRATIFY INCORPORATED [  ] ( Unstructured Data Management Software )

In 1999, PURPLE YOGI was founded by former INTEL Microcomputer Research Laboratory scientists Ramana Venkata and Ramesh Subramonian.

PURPLE YOGI became known as STRATIFY INCORPORATED ( a privately-held company ).

In early 2001, ORACLE CORPORATION veteran and senior executive Nimish Mehta became president and chief executive officer ( CEO ).

STRATIFY INC., headquartered in Mountain View, California ( USA ), is [ 2002 ] the ‘emerging’ leader in ‘unstructured data management’ software.

STRATIFY Discovery System is a ‘complete enterprise software platform’ helping today’s [ 2002 ] organizations ‘harness vast information overload’ by ‘automating the process’ of ‘organizing’, ‘classifying’ and ‘presenting’ business-critical unstructured information usually found in ‘documents’, ‘presentations’ and internet website pages.

STRATIFY Discovery System platform ‘transforms unstructured internal and external data’ into ‘immediately accessible relevant information’, automatically organizing millions of documents into an easily navigated hierarchy.
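
Automatically filing documents into a navigational hierarchy can be sketched as keyword matching against a topic taxonomy – the taxonomy and document below are invented, and far simpler than the Stratify Discovery System's classification:

```python
# Hypothetical two-level taxonomy: topic -> subtopic -> keyword set
TAXONOMY = {
    "Finance": {"Earnings": {"revenue", "profit", "quarter"},
                "Markets": {"stock", "bond", "index"}},
    "Technology": {"Storage": {"disk", "array", "backup"},
                   "Networking": {"router", "bandwidth", "packet"}},
}

def file_document(text):
    """Place a document at the (topic, subtopic) whose keywords it shares most."""
    words = set(text.lower().split())
    best, best_score = None, 0
    for topic, subtopics in TAXONOMY.items():
        for sub, keywords in subtopics.items():
            score = len(words & keywords)
            if score > best_score:
                best, best_score = (topic, sub), score
    return best

print(file_document("Quarter revenue beat estimates, profit up"))
```

Run over millions of documents, this kind of assignment is what produces the browsable hierarchy the platform presents.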

STRATIFY INC. clients, include:

INLUMEN and INFOSYS TECHNOLOGIES LIMITED, named in 2001 as one ( 1 ) of The Red Herring 100.

STRATIFY INC. received funding, from:

IN-Q-TEL; H & Q AT INDIA (also known as ) H & Q ASIA PACIFIC; SOFTBANK VENTURE CAPITAL ( now known as ) MOBIUS VENTURE CAPITAL; SKYBLAZE VENTURES LLC; and, INTEL CAPITAL.

SRD [ http://www.srdnet.com ] ( Near Real Time Data Warehousing and Link Analysis )

SYSTEMS RESEARCH & DEVELOPMENT ( SRD ), founded in 1983, develops software applications to combat fraud, theft, and collusion.

SYSTEMS RESEARCH & DEVELOPMENT Non-Obvious Relationship Awareness ™ ( NORA ™ ) was originally developed for the casino gaming industry.

SYSTEMS RESEARCH & DEVELOPMENT NORA software is designed to identify correlations across vast amounts of structured data, from hundreds or thousands of data sources, in near real-time, and alert users to potentially harmful relationships between and among people.

SRD NORA software technology leverages SYSTEMS RESEARCH & DEVELOPMENT proven expertise in ‘aggregating’, ‘warehousing’ and ‘leveraging people data’ and ‘transaction data’ to strengthen corporate management and security systems.
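
Detecting a non-obvious relationship can be sketched as finding identifying attributes shared across records from different sources – the records below are invented, and the NORA product resolves identities with far richer matching than an exact phone-number join:

```python
from collections import defaultdict

# Hypothetical records from separate sources: (source, name, phone)
records = [
    ("casino_employees", "R. Smith", "555-0101"),
    ("hotel_guests", "Robert Smith", "555-0101"),
    ("vendor_list", "J. Jones", "555-0199"),
]

def non_obvious_links(records):
    """Alert on any phone number that appears in more than one record."""
    by_phone = defaultdict(list)
    for rec in records:
        by_phone[rec[2]].append(rec)
    return {phone: recs for phone, recs in by_phone.items() if len(recs) > 1}

links = non_obvious_links(records)
print(list(links))   # ['555-0101']
```

Here an employee and a guest who look unrelated by name turn out to share a phone number – the kind of potentially harmful relationship the software is meant to surface.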

SYSTEMS RESEARCH & DEVELOPMENT clients [ 2002 ], include:

U.S. Department of Defense ( DOD ); CENDANT; TARGET; MGM MIRAGE; MANDALAY BAY RESORT GROUP; and, Food Marketing Institute.

TACIT [ http://www.tacit.com ] ( Enterprise Expertise Automation )

TACIT, founded in 1997, is located in Palo Alto, California ( USA ) with regional sales offices in Virginia, Maryland, Pennsylvania and Illinois.

David Gilmour serves as president and chief executive officer ( CEO ).

TACIT Knowledge Systems is the pioneer and leader in ‘Enterprise Expertise Automation’.

TACIT products ‘automatically and continuously inventory’ the ‘skills’ and ‘work focus’ of an ‘entire organization’ for ‘dynamic location’ of ‘connections to expertise’ – when needed to make decisions, solve problems, and serve customers.
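
Continuously inventorying expertise can be sketched as building per-person term profiles from message traffic and matching questions against them – the messages below are invented, not Tacit's KnowledgeMail implementation:

```python
from collections import Counter

# Hypothetical message traffic to mine for skills and work focus
messages = [
    ("alice", "kerberos ticket renewal and ldap schema changes"),
    ("alice", "ldap replication lag after schema update"),
    ("bob", "quarterly budget forecast spreadsheet"),
]

expertise = {}                        # person -> term counts from their messages
for author, text in messages:
    expertise.setdefault(author, Counter()).update(text.split())

def find_expert(question):
    """Return the person whose message history best matches the question terms."""
    terms = question.split()
    return max(expertise, key=lambda p: sum(expertise[p][t] for t in terms))

print(find_expert("ldap schema problem"))   # alice
```

Because the profiles are rebuilt as new messages arrive, the inventory stays current without anyone filling in a skills form.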

TACIT products also include its award-winning flagship product KnowledgeMail™. In June 2000, TACIT was voted one of the “Hot 100 Private Companies” by Upside Magazine.

In 2000 and 2001, TACIT was named one ( 1 ) of the “100 Companies that Matter” by KM World [ Knowledge Management World ].

TACIT attracted a ‘world class advisory board’ along with interest from ‘venture capital’ firms and Fortune 500 ‘enterprise’ clients and customers, including:

IN-Q-TEL; JP MORGAN; CHEVRON-TEXACO ( petroleum and chemical ); UNISYS; HEWLETT-PACKARD; NORTHROP-GRUMMAN ( aerospace & defense ); and, ELI LILLY ( pharmaceuticals ).

TACIT investors, include:

IN-Q-TEL; DRAPER FISHER JURVETSON; REUTERS GREENHOUSE FUND; and, ALTA PARTNERS.

TRACTION SOFTWARE [ http://www.tractionsoftware.com ] ( Harvest and Use Information from All Sources )

TRACTION SOFTWARE, founded in 1996, is located in Providence, Rhode Island ( USA ).

TRACTION® Software is the leader in ‘Enterprise Weblog’ software, bringing together working ‘communications’, ‘knowledge management’, ‘content management’, ‘collaboration’, and the ‘writable intranet portal’.

TRACTION TeamPage™ product addresses the need for ‘unified on-demand view’ of ‘team content’ and ‘team communication’ from ‘all document sources’ in ‘context’ and over ‘time’.

TRACTION TeamPage deploys quickly and easily on an existing network and delivers a ‘capstone communication system’ by turning ‘e-mail’ and ‘web browser’ into powerful tools for end-users.

TeamPage targets ‘program teams’ and ‘product management teams’ in ‘government’ and ‘business’.

TRACTION also supports a wide range of applications and business processes, including but not limited, to:

Business Intelligence and Market Research;

Collection Highlighting and Media Distribution;

Investor Relations E-Mail and Public Relations E-Mail Triage and Response; and,

Tracking Exception Process and Reporting Exception Process.

TRACTION SOFTWARE investors, include:

IN-Q-TEL; SLATER CENTER FOR INTERACTIVE TECHNOLOGY; and, private investors.

ZAPLET INCORPORATED [ http://www.zaplet.com ] ( Enterprise Collaboration Tools For Email )

ZAPLET INC., founded in 1999, is located in Redwood Shores, California ( USA ).

ZAPLET INC. is an enterprise software and services company and creator of the Zaplet Appmail System™ collaboration software that brings application functionality directly to a user’s inbox to complete business processes.

ZAPLET INC. Appmail, using a server-based platform, combines the power, ‘centralized control’ and ‘robust security’ of traditional enterprise application systems with the convenience and ease-of-use of e-mail.

ZAPLET Appmail in-box becomes the gateway to a protected server where the application functionality and data securely reside.

Zaplet™ Appmail can be used, to:

Manage and streamline mission-critical business processes; Require no additional client-side upgrades; and, Instantly expand for work teams ‘beyond’ the ‘enterprise’.

ZAPLET INC. has received numerous awards, including:

Red Herring 100; Enterprise Outlook – Investors’ Choice; and, Internet Outlook – Investors’ Choice.

ZAPLET INC. customers, include leading companies, in:

Finance; Telecommunication; High technologies; and, Government.

ZAPLET INC. is backed by world class investors, including:

KLEINER PERKINS CAUFIELD & BYERS ( KPCB ); ACCENTURE TECHNOLOGY VENTURES; QUESTMARK PARTNERS L.P.; RESEARCH IN MOTION LIMITED ( RIM ); INTEGRAL CAPITAL PARTNERS; ORACLE CORPORATION; CISCO SYSTEMS INC.; and, NOVELL INC.

– –

Circa: 2010

IN-Q-TEL

Investments –

Portfolio of Companies ( 2010 ) – Partial List

3VR Security; AdaptivEnergy; Adapx; Arcxis; Asankya; Basis Technology; Bay Microsystems; CallMiner; Cambrios; Carnegie Speech; CleverSafe ( SAIC ); CopperEye; Destineer; Elemental Technologies; Ember Corporation; Endeca; Etherstack; FEBIT; FireEye; FluiDigm

Reference(s)

http://web.archive.org/web/20020630223724/http://www.inqtel.com/news/attachments/InQTelInvestmentPortfolio.pdf
http://www.iqt.org/technology-portfolio/orionmagic.html
http://defense-ventures.com/in-q-tel/

– – – –

Research References

“Information Technology Trends And Their Impact On CIA,” January 1999, declassified report of the U.S. Central Intelligence Agency by, CIA Chief Information Officer.

– – – –

 

Submitted for review and commentary by,

 

Concept Activity Research Vault ( CARV ), Host
E-MAIL: ConceptActivityResearchVault@Gmail.Com
WWW: http://ConceptActivityResearchVault.WordPress.Com
