AntiMatter Technology Problems

by, Paul Collin, host of Concept Activity Research Vault ( CARV ) on May 15, 2011 ( Originally Published: May 10, 2011 )

CALIFORNIA, Los Angeles – May 15, 2011 – The global scientific community is eyeing suspiciously a 1952 ‘experimental projects’ organization known as the Conseil Européen pour la Recherche Nucléaire ( CERN ) [ also known as the European Organization for Nuclear Research ], whose Large Hadron Collider ( LHC ) – a huge 17-mile ( 27-kilometer ) circumference high-energy particle collider – is conducting some extremely serious experiments involving what scientists and physicists call “CP-violation,” which deals with creating a variety of new subatomic particles believed to have never existed anywhere on Earth.

There is quite a bit of controversy concerning something called a “strangelet” ( strangelets ) and other particle creations within the CERN experiment which, because of conjectures in scientific theories, some professionals fear could create a ‘new subatomic particle’ that may upset the balance of Earth as we know it. What is even more frightening is that if something goes out-of-control, it may take anywhere between 1-year and 5-years before anyone notices a chain reaction has already been created – what some have already identified as a ‘micro black hole’ that could theoretically begin consuming Earth from within its own magnetic iron core. While sounding like ‘science fiction’, CERN experiments are apparently ‘definitely not’ something to be taken lightly.

This serious and highly controversial subject amongst scientists and physicists around the world is touched-on in this report, along with other related information, including video clips ( below ) for better understanding the many aspects of public knowledge not being addressed by mainstream news broadcasts.

CERN went even further, though, expanding its deep underground experiments with related experiments in outer space using what it calls the Alpha Magnetic Spectrometer ( AMS / AMS-02 ), now scheduled for launch aboard the U.S. Space Shuttle Endeavour STS-134 mission set for May 16, 2011. The AMS-02 is to be delivered to the International Space Station ( ISS ), where it will continue CERN-designated experiments.

Interestingly, during July 2010 the Alpha Magnetic Spectrometer ( AMS / AMS-02 ) was ‘not’ launched as the video clip ( above ) depicted. The Alpha Magnetic Spectrometer ( AMS / AMS-02 ), while being equated to the Hubble space telescope, actually holds far more technological advancements from CERN and is solely designed to focus on subatomic particles surrounding antimatter issues.

U.S. Space Shuttle Endeavour mission STS-134 was scheduled to launch on April 14, 2011 but was delayed until the end of April 2011, and then delayed yet again until May 16, 2011. Why so many delays and reschedulings?

Earth anti-matter issues are rarely addressed with the public by the mainstream news media. However, in light of the recent NASA public warning that it is expecting a ‘significant’ “solar flare” to erupt and come bound for Earth – something “we all need to be concerned about” – the Alpha Magnetic Spectrometer ( AMS ), having just recently been placed onboard the U.S. Space Shuttle Endeavour mission scheduled to deliver it to the International Space Station ( ISS ), is something the public really needs to take a closer look at.

AMS-02 onboard ISS

[ PHOTO ( above ): Alpha Magnetic Spectrometer ( AMS / AMS-02 ) in U.S. Space Shuttle Endeavour cargo bay April 2011 ( click to enlarge ) ]

– –

Source: Nature.Com

AntiUniverse Here We Come by, Eugenie Samuel Reich

May 4, 2011

A controversial cosmic ray detector destined for the International Space Station will soon get to prove its worth.

The next space-shuttle launch will inaugurate a quest for a realm of the Universe that few believe exists.

Nothing in the laws of physics rules out the possibility that vast regions of the cosmos consist mainly of anti-matter, with anti-galaxies, anti-stars, even anti-planets populated with anti-life.

“If there’s matter, there must be anti-matter. The question is, where’s the Universe made of antimatter?” says Professor Samuel C.C. Ting, a Nobel prize-winning physicist at the Massachusetts Institute of Technology ( MIT ) in Cambridge, Massachusetts. But most physicists reason that if such antimatter regions existed, we would have seen the light emitted when the particles annihilated each other along the boundaries between the antimatter and matter realms. No wonder the Professor Samuel C.C. Ting brainchild – a $2,000,000,000 ( two billion dollar ) space mission sold ‘partly on the promise of looking for particles emanating from anti-galaxies’ – is fraught with controversy.

Professor Ting’s project, however, has other ‘more mainstream scientific goals’ so most critics held their tongues last week as the U.S. Space Shuttle Endeavour STS-134 mission – prepared to deliver the Alpha Magnetic Spectrometer ( AMS version, known as the AMS-02 ) to the International Space Station ( ISS ) – was delayed ( because of problems ) until later this month ( May 2011 ).

Pushing The Boundaries –

Seventeen ( 17 ) years in the making, the Alpha Magnetic Spectrometer ( AMS ) is a product of former NASA administrator Dan Goldin’s quest to find remarkable science projects for the International Space Station ( ISS ) and of Ting’s fascination with anti-matter.

Funded by NASA, the U.S. Department of Energy ( DOE ), plus a sixteen ( 16 ) country consortium of partners, the Alpha Magnetic Spectrometer ( AMS ) has prevailed – despite delays, technical problems, and the doubts of many high-energy and particle physicists.

“Physics is not about doubt,” says Roberto Battiston, deputy spokesman for the Alpha Magnetic Spectrometer ( AMS ) and physicist at the University of Perugia, Italy. “It is about precision measurement.”

As the Alpha Magnetic Spectrometer ( AMS ) experiment headed to the Space Shuttle Endeavour launch pad, Roberto Battiston and other scientists were keen to emphasize the Alpha Magnetic Spectrometer ( AMS ) ‘unprecedented sensitivity’ to the gamut of cosmic rays that rain down on Earth, which should allow the Alpha Magnetic Spectrometer ( AMS ) to perform two ( 2 ) things:

1. Measure Cosmic Ray High-Energy Charged ‘Particles’ and ‘Properties’ ( thereof ), sent from:

– Sun ( Earth’s ); – Supernovae ( distant ); and, – Gamma ( γ ) Ray Bursts ( GRB ).

AND,

2. Detect AntiMatter ( errant chunks ), sent from the:

a. Universe ( far-away ).

On Earth, cosmic rays can only be detected indirectly – by the showers of ‘secondary particles’ produced when they slam into molecules of atmosphere in high regions above the Earth – but the Alpha Magnetic Spectrometer ( AMS ) in space will get an undistorted view.

“We’ll be able to measure ( solar ) Cosmic Ray Flux very precisely,” says collaboration member physicist Fernando Barão of the Laboratory of Instrumentation and Experimental Particle Physics ( in Lisbon, Portugal ). “The best place ( for detecting this ) is to be in ‘space’ because you don’t have Earth’s atmosphere that is going to destroy those cosmic rays.”

No matter what happens with the more speculative search for antimatter, the Alpha Magnetic Spectrometer ( AMS ) should produce a definitive map of the cosmic ray sky – helping to build a kind of ‘astronomy not dependent on light’.

The Alpha Magnetic Spectrometer ( AMS ) consists of a powerful permanent magnet surrounded by a suite of particle detectors.

Over the 10-years ( or more ) that the Alpha Magnetic Spectrometer ( AMS ) experiment will run, the AMS magnet will bend the paths of cosmic rays by an amount that reveals their energy and charge – and thereby their identity.

Some will be ‘heavy atomic nuclei’, while others ( made from anti-matter ) will reveal themselves by ‘bending in the opposite direction’ from their ‘matter’ counterparts ( see, e.g. cosmic curveballs ).
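
To make the bending argument concrete, here is a minimal sketch – my own illustration, not AMS flight software – of the underlying relation: in a magnetic field B, a particle of momentum p and charge q curves with gyroradius r = p / ( |q| · B ), and the sign of q sets the bending direction. The 0.15 tesla field below is an assumption roughly matching published AMS-02 permanent-magnet figures.

```python
# Illustrative sketch only -- not AMS reconstruction software.
E_CHARGE = 1.602176634e-19   # elementary charge, coulombs
C = 2.99792458e8             # speed of light, m/s

def gyroradius_m(momentum_gev_per_c: float, charge_e: float, b_tesla: float) -> float:
    """Radius of curvature in metres, for momentum in GeV/c and charge in units of e."""
    p_si = momentum_gev_per_c * 1e9 * E_CHARGE / C   # convert GeV/c to kg*m/s
    return p_si / (abs(charge_e) * E_CHARGE * b_tesla)

def bending_sense(charge_e: float) -> str:
    return "one way (matter)" if charge_e > 0 else "the opposite way (antimatter)"

# A 10 GeV/c proton versus an antiproton in a ~0.15 T permanent magnet: same
# radius (~222 m, i.e. a ~0.6 mm sagitta over a 1 m track), opposite curvature.
for q in (+1.0, -1.0):
    r = gyroradius_m(10.0, q, 0.15)
    print(f"charge {q:+.0f}e: r = {r:.0f} m, bends {bending_sense(q)}")
```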

By ‘counting positrons’ ( i.e. antimatter ‘electrons’ ), the Alpha Magnetic Spectrometer ( AMS ) could also ‘chase a tentative signal of dark matter’, the so-far ‘undetected stuff’ thought to account for ‘much of the mass of the Universe’.

In 2009, Russian and Italian researchers – with the Payload for Antimatter Matter Exploration and Light-nuclei Astrophysics ( PAMELA ) onboard a Russian satellite – published evidence of an ‘excess amount of positrons in the space environment surrounding Earth’ ( O. Adriani et al. Nature 458, 607–609; 2009 ). One potential source of this is the ‘annihilation of dark-matter particles’ within the ‘halo enveloping our Galaxy’.

Another speculative quest is to follow up on hints of ‘strange matter’, a ‘hypothetical substance’ that should be found in ‘some collapsed stars’, containing ‘strange quarks’ alongside the ‘up quarks’ and ‘down quarks’ found within ordinary nuclei.

NASA Alpha Magnetic Spectrometer ( AMS ) program manager Mark Sistilli says hints of ‘strange matter’ were seen during a 1998 pilot flight of the Alpha Magnetic Spectrometer ( AMS / AMS-01 ) aboard the Space Shuttle; however, NASA determined the results ‘too tentative to publish’.

Because the Alpha Magnetic Spectrometer ( AMS / AMS-02 ) was given “exploration mission” status, the Alpha Magnetic Spectrometer ( AMS ) ‘did not need to follow’ the “peer review” NASA would ‘normally have required’ for a “science mission.”

But Sistilli emphasizes the Alpha Magnetic Spectrometer ( AMS ) earned flying colors from committees convened by the U.S. Department of Energy ( DOE ), which is supplying $50,000,000 ( fifty million dollars ) of the funding.

Now their ( DOE ) confidence will be put to the test.

Reference

http://www.nature.com/news/2011/110504/full/473013a.html

While for some it may appear strangelet subatomic antimatter particle research is for advancing our knowledge by unlocking the secrets of life in the Universe, others are still asking NASA what it really knows about ‘why’ an ‘expected significant’ Solar Energetic Particle Event ( SEPE ) is something “we all need to be concerned about” on Earth.

With Solar Energetic Particle Event ( SEPE ) high-energy effects capable of disrupting Earth ground-based and space-based electrical components and electricity grid infrastructure systems for up to 10-years, many wonder why billions upon billions of dollars were – and still are – being pumped into the CERN project studying ‘strangelets’. People want to know why we do not have ‘more immediate information detection capabilities’ for the high-energy solar flare proton and electron ejections coming toward Earth soon, which NASA and other agencies ‘know far more about’ than they are willing to tell the public.

How advanced have government authorities grown from private-sector science and technology knowledge? The United States has already mapped internal magma flows of the Sun.

How could the U.S. government possibly ‘see inside the Sun’ to know when a ‘direct or near direct Earth facing’ Sun based Coronal Mass Ejection ( CME ) from a solar flare would occur in the future?

In layman’s terms, for government it was like looking through a clear glass Pyrex bowl positioned atop a stove burner, watching as water starts to boil inside it, and then predicting – based on the flame heating the water – when bubbles will come to the surface. Such prediction becomes possible when one takes into account a government ‘ground-based’ ( does ‘not’ require ‘space-based placement’ ) observatory telescope equipped with a “super lens” used for imaging ( observing ) ‘objects at great distances inside matter’ – a “superlens” that now even ‘defies light-speed’ and ‘matter’. ( Read Below )

– –

[ PHOTO ( above ): Antimatter photon ‘optic’ substrate structure material for ‘subsurface solar imaging plasma flows’ inside Sun enables plotting Coronal Mass Ejections ‘before solar surface eruptions’ ( click to enlarge ) ]

Source: U.S. Department of Energy, Lawrence Berkeley National Laboratory, Operated by the University of California

Optical Antimatter Structure Shows The Way For New Super Lens by, Aditi Risbud

April 21, 2009

A device, made from alternating layers of ‘air’ and ‘silicon photonic crystal’, behaves like a ‘super lens’ – providing the first experimental demonstration of optical antimatter.

Scientists at Berkeley Lab ( Berkeley, California, USA ) and the Institute for Microelectronics and Microsystems ( CNR ) in Naples, Italy have experimentally demonstrated – for the first time – the ‘concept of optical antimatter’ by ‘light traveling through a material without being distorted’.

By engineering a material that focuses light through its own internal structure, a beam of light can enter and exit ( unperturbed ) after traveling through millimeters of material.

For years, optics researchers have struggled to bypass the ‘diffraction limit’, a physical law restricting imaging resolution to about 1/2 the wavelength of light used to make the image.

If a material with a negative index of refraction ( a property describing how light bends as it enters or exits a material ) could be designed, this diffraction hurdle could be overcome.

Such a material could also behave as a superlens, useful in imaging equipment for observing objects with ‘details finer than allowed by the diffraction limit’.
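
As a back-of-envelope illustration of the two ideas in play – the diffraction limit and negative refraction – consider the following sketch ( numbers chosen for illustration, not taken from the Berkeley Lab experiment ):

```python
import math

def diffraction_limit_nm(wavelength_nm: float) -> float:
    """Smallest resolvable feature for conventional optics: about half the wavelength."""
    return wavelength_nm / 2.0

def refraction_angle_deg(n1: float, n2: float, incidence_deg: float) -> float:
    """Snell's law n1*sin(t1) = n2*sin(t2); a negative n2 flips the refracted ray
    to the same side of the surface normal -- the 'optical antimatter' behavior."""
    return math.degrees(math.asin(n1 * math.sin(math.radians(incidence_deg)) / n2))

print(diffraction_limit_nm(1550.0))           # ~775 nm limit for near-IR light
print(refraction_angle_deg(1.0, 1.5, 30.0))   # ordinary glass: about +19.5 degrees
print(refraction_angle_deg(1.0, -1.0, 30.0))  # n = -1 metamaterial: -30 degrees
```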

Despite the intriguing possibilities posed by a substance with a negative index of refraction, ‘this property is inaccessible through naturally occurring ( positive index ) materials’.

During the mid-1990s, English theoretical physicist Sir John Pendry proposed his clever ‘sleight of light’ using so-called metamaterials – engineered ‘materials’ whose underlying structure ‘can alter overall responses’ to ‘electrical fields’ and ‘magnetic fields’.

Inspired by the Sir John Pendry proposal, scientists have made progress in scaling metamaterials from microwave to infrared wavelengths while illuminating the nuances of light-speed and direction-of-motion in such engineered structures.

“We’ve shown a ‘completely new way to control and manipulate light’, ‘using a silicon photonic crystal’ as a ‘real metamaterial’ – and it works,” said Stefano Cabrini, Facility Director of the Nanofabrication Facility in the Molecular Foundry, a U.S. Department of Energy ( DOE ) User Facility located at Lawrence Berkeley National Laboratory ( LBNL ) providing support to nanoscience researchers around the world.

“Our findings will open-up an easier way to make structures and use them effectively as a ‘super-lens’.”

Through the Molecular Foundry user program, Cabrini and post-doctoral researcher Allan Chang collaborated with Vito Mocella, a theoretical scientist at the Institute of Microelectronics and Microsystems ( CNR ) in Naples, Italy to fabricate a 2 X 2 millimeter device consisting of alternating layers of air and a silicon based photonic crystal containing air holes.

Using high precision nanofabrication processes, the team designed the spacing and thicknesses of each layer to behave like the metamaterial Sir John Pendry had envisioned.

This device was then used to focus a beam of near-infrared ( IR ) light, essentially ‘annihilating’ 2 millimeters of ‘space’.

“Now that we have a prototype to demonstrate the concept, our next step will be to find the geometry and material that will work for visible light,” said Cabrini.

Along with possibilities in imaging, the researchers’ findings could also be used to develop hybrid negative-index and positive-index materials, Cabrini added, which may lead to novel ‘devices’ and ‘systems’ unachievable through either material alone.

“Self-collimation of light over millimeter-scale distance in a quasi zero average index metamaterial,” by Vito Mocella, Stefano Cabrini, Allan S.P. Chang, P. Dardano, L. Moretti, I. Rendina, Deirdre Olynick, Bruce Harteneck and Scott Dhuey, appears in Physical Review Letters ( available online ).

Portions of this work were supported by the U.S. Department of Energy ( DOE ) Office of Science, Office of Basic Energy Sciences under Contract No. DE-AC02-05CH11231.

The Molecular Foundry is one ( 1 ) of five ( 5 ) U.S. Department of Energy ( DOE ) Nanoscale Science Research Centers ( NSRC ) that are premier national user facilities for interdisciplinary research at the nanoscale. Together, the U.S. Department of Energy ( DOE ) Nanoscale Science Research Centers ( NSRC ) comprise a suite of complementary facilities providing researchers with state-of-the-art capabilities to fabricate, process, characterize and model nanoscale materials, which constitutes the ‘largest infrastructure investment’ of the National Nanotechnology Initiative ( NNI ).

U.S. Department of Energy ( DOE ) Nanoscale Science Research Centers ( NSRC ) are located at these six ( 6 ) locations:

– Argonne National Laboratory ( ANL ); – Brookhaven National Laboratory ( BNL ); – Lawrence Berkeley National Laboratory ( LBNL ); – Oak Ridge National Laboratory ( ORNL ); – Sandia National Laboratory ( SNL ); and, – Los Alamos National Laboratory ( LANL ).

For more information about the DOE NSRCs, please visit http://nano.energy.gov.

Berkeley Lab is a U.S. Department of Energy ( DOE ) National Laboratory located in Berkeley, California conducting ‘unclassified scientific research’ managed by the University of California.

References

http://www.lbl.gov
http://foundry.lbl.gov
http://newscenter.lbl.gov/feature-stories/2009/04/21/optical-antimatter

– –

If the public could keep its eyes open for one second, it would see what is coming at them before it hits them with a surprise that only government knows anything about. Governments, however, continue conveying their double-speak vernacular to citizens over a very long period of decades; perhaps a fact ‘known today’ may eventually come as no surprise to the many who would have otherwise been kept in the dark while only a few know far more about what awaits the masses.

Perhaps people may begin asking more questions of their country’s agencies spending so much money so quickly for some apparently ‘mysterious emergency purpose’ – and if not for some ‘mysterious emergency purpose’, why is so much money being spent on science and space projects while the general public is told about ‘serious government budget cutbacks’ that see so many people suffer?

If there is no ’emergency’, then people should know ‘why they are suffering financially more’ – just for the sake of ‘growing science experiment budgets’?

It might be a good idea for everyone to begin keeping their eyes open a little more often, and trained on something more than light-hearted mainstream media news and entraining entertainment broadcasts.

If people get real serious about ‘what they know’ as told on television news broadcasts, imagine how much more serious they will become when they learn about what they ‘were not told’.

Think about it. How Fast Is Technology Growing?

Just beginning to grasp something ‘new’?

Now think about something even newer than the Large Hadron Collider ( LHC ) at CERN.

Reference: https://web.archive.org/web/20120921064312/http://conceptactivityresearchvault.wordpress.com/2011/05/15/antimatter-technology-problems/

BNL Time Chamber

Running Time Backwards –

The Relativistic Heavy Ion Collider ( RHIC ) added its Solenoidal Tracker At RHIC ( STAR ), claiming to “reverse time” with ultra super computers reconstructing sub-atomic particle interactions from the particles emerging from each collision – STAR is believed to be able to “run time backward” in a process equated to examining final products coming out of a factory when scientists and physicists have no idea ‘what kinds of machines produced the products’. Basically, they are developing items so fast they do not know how the items were formed, much less what their capabilities are. Fact is, ’they could easily produce a monster’ and ‘not know what it is until after they are eaten by it’. Scary, really – like kids being given matches to play with.

They are being educated beyond their own intelligence, so much so and to the point where scientists and physicists ’cannot even grasp what it is they’re looking at’ – much less know what they are trying to manipulate to ‘see what it does next’ – and nevertheless they are conducting experiments like children playing with dynamite.

Think this is science fiction? Think they are mad scientists at play? Check the research reference links ( below ).

Think antimatter technology has advanced a lot since you began reading this report? Calculate ‘more’, because the public does not even know the half of it.

Reference: https://unwantedpublicityintel.wordpress.com/2015/09/22/time-foolery/ and, Research ( 26MAY05 ): https://web.archive.org/web/20120925015521/http://www.bnl.gov/rhic/news2/news.asp?a=2647&t=today

CERN has been operating since 2002, and the “SuperLens” was worked-on ‘before’ 2002, making ‘both’ now 10-years old today.

Want newer ‘news’?

Superlenses – created from perovskite oxides – are simpler and easier to fabricate than ‘metamaterials’.

Superlenses are ideal for capturing light travelling in the mid-infrared ( IR ) spectral range, opening even newer technology for highly sensitive imaging devices; and because this superlensing effect can be selectively turned ‘on’ and ‘off’, it opens yet another technology – ‘highly dense data storage writing’ for ‘far more advanced capability computers’.

Plasmonic whispering gallery microcavities, consisting of a silica interior coated with a thin layer of silver, ‘improve quality by better than an order of magnitude’ over current plasmonic microcavities, and pave the way for ‘plasmonic nanolasers’.

Expand your knowledge – begin researching the six ( 6 ) reference links ( below ) so that the next time you watch the ‘news’ you’ll begin to realize just how much you’re ‘not being told’ about what is ‘actually far more important’ – far more than you’re used to imagining.

Submitted for review and commentary by,

Paul Collin ( Concept Activity Research Vault ) E-MAIL: UnwantedPublicity@GMAIL.com  WWW: http://ConceptActivityResearchVault.WordPress.Com

References

http://www.bnl.gov/rhic/
http://www.bnl.gov/rhic/STAR.asp
http://www.bnl.gov/bnlweb/pubaf/pr/PR_display.asp?prID=1075&template=Today
http://newscenter.lbl.gov/news-releases/2011/03/29/perovskite-based-superlens-for-the-infrared/
http://newscenter.lbl.gov/news-releases/2009/01/22/plasmonic-whispering-gallery/
http://www.nsf.gov/awardsearch/showAward.do?AwardNumber=1018060

Earth Event Alerts

[ IMAGE ( above ): IBM Stratus and Cirrus supercomputers analyze Global Environmental Intelligence ( click to enlarge ) ]

by, Kentron Intellect Research Vault [ E-MAIL: KentronIntellectResearchVault@Gmail.Com ]

August 17, 2012 19:00:42 ( PST ) Updated ( Originally Published: March 23, 2011 )

MARYLAND, Fort George G. Meade – August 17, 2012 – IBM Stratus and IBM Cirrus supercomputers, as well as CRAY XK6m and CRAY XT5 ( Jaguar ) massively parallel supercomputers and vector supercomputers, are securely controlled via the U.S. National Security Agency ( NSA ) for analyzing Global Environmental Intelligence ( GEI ) data extracted from ground-based ( terrestrial ) monitoring stations and space-based ( extraterrestrial ) spaceborne platforms studying Earth Event ( Space Weather ) effects via High-Performance Computing ( HPC ), as well as for:

– Weather Forecasting ( including: Space Weather ); – U.S. Government Classified Projects; – Scientific Research; – Design Engineering; and, – Other Research.

[ IMAGE ( above ): CRAY XK6m supercomputers analyze Global Environmental Intelligence ( click to enlarge ) ]

CRAY INC.’s largest customers are U.S. government agencies, e.g. the U.S. Department of Defense ( DoD ) Defense Advanced Research Projects Agency ( DARPA ) and the U.S. Department of Energy ( DOE ) Oak Ridge National Laboratory ( ORNL ), which account for about 3/4 of CRAY INC. revenue – with other supercomputers used worldwide by academic institutions ( universities ) and industrial companies ( private-sector firms ).

CRAY INC. additionally provides maintenance, support services and sells data storage products from partners ( e.g. BlueArc, LSI and Quantum ).

Supercomputer competitors, of CRAY INC., are:

– IBM; – HEWLETT-PACKARD; and, – DELL.

On May 24, 2011 CRAY INC. announced its new CRAY XK6 supercomputer, a hybrid supercomputing system combining its Gemini InterConnect, AMD Opteron™ 6200 Series processors ( code-named: Interlagos ) and NVIDIA Tesla 20 Series GPUs into a tightly integrated, upgradeable supercomputing system scalable to more than 50 petaflops ( i.e. ‘quadrillions of computing operations’ per ‘second’ ) – a multi-purpose supercomputer designed for the next generation of many-core High Performance Computing ( HPC ) applications.
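
For readers wanting the arithmetic behind “petaflops”, a quick sanity check – plain unit conversion, nothing CRAY-specific beyond the article’s quoted figure:

```python
# One petaflop/s = 1e15 floating-point operations per second ("a quadrillion").
peak_petaflops = 50
ops_per_second = peak_petaflops * 10**15
print(f"{ops_per_second:.1e} operations per second")   # 5.0e+16

# At that peak rate, a job needing 10^18 (a quintillion) operations would take:
print(f"{10**18 / ops_per_second:.0f} seconds")         # 20 seconds
```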

The SWISS NATIONAL SUPERCOMPUTING CENTRE ( CSCS ) – located in Manno, Switzerland – is CRAY INC.’s first ( 1st ) customer for the new CRAY XK6 system. CSCS promotes and develops technical and scientific services in the field of High-Performance Computing ( HPC ) for the Swiss research community, and is upgrading its CRAY XE6m system ( nick-named: Piz Palu ) into a multiple-cabinet new CRAY XK6 supercomputer. The SWISS NATIONAL SUPERCOMPUTING CENTRE ( CSCS ) supports scientists working, in:

– Weather Forecasting; – Physics; – Climatology; – Geology; – Astronomy; – Mathematics; – Computer Sciences; – Material Sciences; – Chemistry; – Biology; – Genetics; and, – Experimental Medicine.

Data additionally analyzed by these supercomputers, include:

– Ultra-Deep Sea Volcanoes located in continental plate fracture zones several miles beneath ocean basin areas ( e.g. the Asia-Pacific Rim, also known as the “Pacific Ring of Fire”, where a circum-Pacific seismic belt of earthquakes frequently impacts areas far across the Pacific Ocean in the Americas ).

Global geoscience realizes Earth ‘ground movement shaking’ earthquakes hide a lot of what people are actually walking on top of: large geographic land mass areas known as ‘continental shelves’ or “continental plates” that move ( tectonics ) because of superheated pressurized extrasuperconducting magnetic energy properties released from within molten magma material violently exploding beneath the surface of the Earth down in ultra-deep seas.

[ IMAGE ( above ): Global Tectonic Plate Boundaries & Major Volcano HotSpots ( click to enlarge ) ]

Significant volcanoes are positioned like dots along this global 25,000-mile circular region known as the “Pacific Ring of Fire”, extending from south of Australia up the ‘entire eastcoast’ of Japan, China and the Kamchatka Peninsula of Russia, across the Aleutian Islands of Alaska, and then south down the ‘entire westcoast’ of North America and Latin America.

[ IMAGE ( above ): Ultra-Deep Sea Pacific Ocean Basin ( click to enlarge ) ]

The March 11, 2011 Tohoku-chiho Taiheiyo-oki Japan 9.0 earthquake held several secrets, including U.S. government contractors simultaneously monitoring a significant “moment magnitude” ( Mw ) Earth Event occurring parallel to the eastcoast of Japan beneath the Western Pacific Ocean, where an entire suboceanic mountain range was being split in half ( south to north ) 310-miles long and split open 100-feet wide ( east to west ) – details the public was unaware of and never told about.

Interestingly, the March 11, 2011 Japan island earthquakes have not yet stopped, as the swarm of 4.0, 5.0, 6.0 and 7.0 Richter scale earthquakes continues as a direct and proximate result of erupting ‘suboceanic volcanoes‘ moving these large “plates”, which are beginning to force yet others to slam into one another thousands of miles away.

Japan’s Western Pacific Ocean ‘eastcoast’ has a ’continental plate’ slamming point meeting the ’westcoast’ of North America near the Cascade mountain range ‘plate’, which reacts in one ( 1 ) of two ( 2 ) ways, i.e. ’seaward’ ( plate thrusting toward Japan ) or ‘landward’ ( plate thrusting toward the Pacific Northwest of the United States and/or Canada ).

What The Public Never Knew

Government leadership, globally, is acutely familiar with these aforementioned types of major Earth Events, including ‘monstrous plate tectonic pushing matches’, which usually collapse one or more ‘national infrastructures’ and typically spell ‘death’ and ‘serious injuries’ for populations in developed areas.

Extremely familiar with mass public panic resulting from Earth Event catastrophes, government ‘contingency actions’ – pre-approved by ‘governing bodies’ and/or ‘national leadership’ Executive Order Directives which, although not advertised, are a matter of public record – ‘immediately call’ upon ‘all military forces’ to carry out “risk reduction” ( ‘minimization of further damages and dangers’ ) through what is referred to as “mitigation” ( ‘disaster management’ ) within “National Disaster Preparedness Planning” ( ‘national contingency measures’ ), details citizens are unaware of. Government decision-makers know a “national emergency” can bring temporary suspension of Constitutional Rights and a loss of freedoms – a volatile subject few care to discuss because ’any significant natural disaster’ will result in government infringement on many civil liberties most populations are accustomed to enjoying.

Before 1-minute and 40-seconds had passed into the March 11, 2011 Tohoku, Japan earthquake ( Richter scale: M 9.0 ), key U.S. government decision-makers discussed the major Earth Event unfolding off Japan’s eastcoast Plate-Boundary subduction zone beneath the ultra-deep sea of the Western Pacific Ocean, where Japan’s monstrous volcano mountain range had split at least 100-feet wide open and cracked 310-miles long in a northern direction, headed straight for the Aleutian Islands of Alaska in the United States.

U.S. Military Contingent Standby “Red Alert” Notification

The U.S. Air Force ( USAF ) ‘subordinate organization’ Air and Space Operations ( ASO ) Communications Directorate ( A6 ) ‘provides support’ over ‘daily operations’, ‘contingency actions’ and ‘general’ Command, Control, Communication and Computer Intelligence ( C4I ) for the U.S. Air Force Weather Agency ( USAFWA ). Its 1st Weather Group ( 1ST WXG ) Directorate readied the 25th Operational Weather Squadron ( OWS ) at Davis-Monthan Air Force Base ( Tucson, Arizona ), responsible for conjunctive communication notification issuance of an Earth Event “Red Alert” immediately issued directly to U.S. Army Western Command ( WESTCOM ) with a “Standby-Ready” clause pausing Western Region ( CONUS ) mobilization of Active, Reserve and National Guard military forces at specific installations – based on the Japan Earth Event “moment magnitude” ( Mw ) Plate-Boundary consequential rebound expected to strike against the North America westcoast Plate-Boundary of the Cascadia Range, reactionarily triggering its subduction zone into a Cascadia ‘great earthquake’.

CALTECH Public News Suppression Of Major Earth Event

Officials, attempting to diminish any clear public understanding of the facts – the public only knowing a Richter scale level earthquake ‘magnitude’ ( never knowing or hearing about what a major Earth Event “moment magnitude” ( Mw ) entailed ) – only served-up ‘officially-designed double-speak psycho-babble terms’ unfamiliar to the public as ‘creative attention distraction’, announcing the “Japan earthquake experienced,” a:

– “Bilateral Rupture;” and,

– “Slip Distribution.”

The facts are that the Japan ‘earthquake’ would ‘never have occurred’ ‘unless’:

1ST – A “Bilateral Rupture” ( ‘suboceanic subterranean tectonic plate split wide open’ ) occurred; followed by,

2ND – “Slip Distribution” ( ‘tectonic plate movement’ ); then finally,

3RD – A “Ground Shaking” ( ‘earthquake’ ) response.

Officials failed to notify the public that a major Earth Event “moment magnitude” ( Mw ) on the “Pacific Ring of Fire” ( circum-Pacific seismic belt ) in the Western Pacific Ocean involved a huge:

1. Continental Plate Break Off;

2. Undersea Plate Mountain Range Crack Wide Open; plus,

3. Mountain Range Split Open 310-Miles Long.

There are some, lying at rest, who might ‘not consider’ the aforementioned three ( 3 ) major Earth Event occurrences significant – except those ‘still living’ on Earth.

Asia-Pacific Rim

This western Pacific Ocean huge ‘undersea mountain range’ moved ‘east’, crushing the smaller portion of its tectonic plate toward the continent of Asia, which commenced continuous streams of day and night significant earthquakes – still registering 5.0+ and 6.0+ Richter scale levels of magnitude, now and for over 12-days – throughout the area surrounding Japan, the point nearest where the tectonic plate meets the continent of Asia within the western Pacific Ocean, from where this ‘monstrous undersea mountain range’ suddenly split, sending the ‘eastern half’ – with the ‘tectonic plate’ broken beneath it – slamming into the continent of Asia.

Simultaneously pushed – with even greater force outward ( note: explosives, like from out of a cannon or from a force-shaped explosive, project blasts outward when the ‘initial explosive blast’ is blunted by a back-stop ) away from the Asia continent – was this ‘monstrous undersea mountain range’ split-off ( 310-miles / 500-kilometers long ) ‘western half’, slammed west up against the Americas’ ‘western tectonic plates’.

This ‘is’ the ‘major’ “Earth Event” that will have consequential global impact repercussions, ‘officially minimized’ by ‘focusing public attention’ on a ‘surface’ Earth Event earthquake of 9.0 Richter scale magnitude ( once ), while even further diminishing the hundreds of significant earthquakes still occurring 12-days after the initial earthquake.

Asia-Pacific Rim “Ring Of Fire”

Many are unaware the “Asia-Pacific Rim” is ( also known as ) the “Ring of Fire”, under which – in the ‘ultra-deep sea Pacific Ocean’ – exist ‘numerous gigantic volatile volcanoes’ positioned in an ‘incredibly large circle’ ( “ring” ) around a ‘huge geographic land mass area’ comprised of ‘tectonic plates’ that ‘connect’ the ‘Eastern Asias’ to the ‘Western Americas’.

Yellowstone National Park Super Volcano

Many people are still wondering ‘why’ the Japan earthquakes have not yet stopped, and why they are being plagued by such a long swarm of significant earthquakes still to this very day, nearly 60-days later. The multiple color video clips ( below ) provide information on unusual earthquake swarm patterns and reversals seen while studying the World’s largest supervolcano in Wyoming ( USA ), located at Yellowstone National Park, a global public attraction for viewing natural underground volcano steam vents known as geyser eruptions:

[ PHOTO ( above ): Major HotSpot at Yellowstone displays Half Dome cap of granite rock above unerupted volcano magma. ]

Ultra-Deep Sea Volcanoes

When huge undersea volcanoes erupt, they dynamically force incredibly large geographic land mass plates to move, whereupon movement is simultaneously and consequentially experienced on ‘surface land areas’ people know as ’earthquakes’, with ’aftermath measurements’ provided in “Richter scale level” measurements that most do not understand. These Richter scale measurements are only ‘officially provided estimates’, as ’officials are never presented with totally accurate measurements’ because many are ‘not obtained with any great precision for up to 2-years after the initial earthquake’.

Rarely are ‘precise measurements’ publicly provided, and at any time during that 2-year interim the public may hear their previously reported earthquake Richter scale level measurement was either “officially upgraded” or “officially downgraded.” Often this becomes apparent when one sees ’many other countries contradicting U.S. public news announcements’ about the magnitude of a particularly controversial earthquake. An example of this was seen surrounding the March 11, 2011 earthquake in Japan:

– Japan 1st public announcement: 9.2 Richter scale;

– United States 1st public announcement: 8.9 Richter scale;

– United States 2nd public announcement: 9.0 Richter scale; and,

– United States 3rd public announcement: 9.1 Richter scale.

What will the March 11, 2011 Japan earthquake be officially reported as in 2-years? Who knows?
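
Part of why reported magnitudes drift is that “moment magnitude” ( Mw ) is recomputed from the estimated seismic moment M0 as more waveform data arrives. A minimal sketch using the standard Hanks-Kanamori relation – the moment values below are illustrative assumptions near the Tohoku scale, not official figures:

```python
import math

def moment_magnitude(m0_newton_metres: float) -> float:
    """Hanks-Kanamori relation: Mw = (2/3) * (log10(M0) - 9.1), with M0 in N*m."""
    return (2.0 / 3.0) * (math.log10(m0_newton_metres) - 9.1)

# A ~40% revision of the estimated moment moves the headline number by 0.1:
for m0 in (2.8e22, 3.9e22, 5.6e22):
    print(f"M0 = {m0:.1e} N*m  ->  Mw = {moment_magnitude(m0):.1f}")
# prints Mw 8.9, 9.0, 9.1 -- the same progression seen in the announcements above
```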

Never publicly announced, however, are measurements of an earthquake’s ‘force strength pressure accumulation’ transmitted through suboceanic tectonic plates grinding against one another – a major Earth Event ‘geographic pushing process’ – as seen by U.S. NSA supercomputers from global ground-based and space-based monitoring analysis surrounding the “Asia-Pacific Rim Ring of Fire”, stretching from the ‘Eastern Asias’ to the ‘Western Americas’ and beyond.

This ‘domino plate tectonic principle’ results from combined amounts of ‘volcanic magmatic eruptive force strength’ and ‘tectonic plate accumulative pressure build-up’ against ‘adjacent tectonic plates’ causing ‘suboceanic, subterranean and surface land to move’, whereupon ‘how significantly such amounts occur determines the strength’ of both consequential ‘earthquakes’ and resultant ‘tsunamis’.

Waterway Tsunamis

When most of the public ‘hears about’ a “tsunami”, they ‘think’ of ‘high waves’ near “ocean” coastal regions bringing significant floods over residents of cities nearby. Few realize the vast majority of Earth’s population predominantly lives along ocean coastal regions, and that one ( 1 ) ‘gigantic tsunami’ could ‘kill vast populations’ living near oceans in the wake of popular beaches – a tough trade-off for some, while logical others choose ‘living further inland’. Yet even away from oceans, large bodies of water like ‘large lakes’ – where ‘tide levels are also affected by the gravitational pull of the moon’ – can see a ‘vast deep lake body’ bring a tsunami, dependent on which direction tectonic plates move: a ‘force directionalized earthquake’ creating a ‘tsunami’ with significant inundating floods over residents living in cities near those ‘large shoreline’ areas too.

What most of the public does not yet fully realize is that ‘large river bodies of water’, like the Mississippi River – a ‘north’ to ‘south’ directional river – could easily see ‘east’ to ‘west’ directional ‘tectonic plates’ move adjacent states along the New Madrid Fault subduction zone with significant ‘earthquakes’, tectonic plate movement easily capable of squeezing the side banks of even the Mississippi River, forcing huge amounts of water hurled out onto both ‘east’ and ‘west’ sides, resulting in ‘seriously significant inland flooding’ over residents living in ‘low lying’ states of the Central Plains of the United States.

Japan “Pacific Ring Of Fire” Earthquakes To Americas Cascadia Fault Zone

Japan accounts of a co-relative tsunami suggest the Cascadia Fault rupture occurred from one ( 1 ) single earthquake triggering a 9-Mw Earth Event on January 26, 1700, where geological evidence obtained from a large number of coastal sites from northern California ( USA ) up to southern Vancouver Island ( Canada ), plus historical records from Japan, show the 1,100 kilometer length of the Cascadia Fault subduction zone ruptured ( split, causing that earthquake ) in a major Earth Event at that time. While the sizes of earlier Cascadia Fault earthquakes are unknown, some “ruptured adjacent segments” ( ‘adjacent tectonic plates’ ) within the Cascadia Fault subduction zone over periods of time – ranging from as little as ‘hours’ to ‘years’ – as has historically happened in Japan.

Over the past 20-years, scientific progress in understanding Cascadia Fault subduction zone behavior has been made; however, only 15-years ago scientists were still debating whether ‘great earthquakes’ occurred at ‘fault subduction zones’. Today most scientists realize ‘great earthquakes’ actually ‘do occur in fault subduction zone regions’.

Now, scientific discussions focus on subjects, of:

– Earth crust ‘structural changes’ when a “Plate Boundary” ruptures ( splits ); – Related tsunamis; and, – Seismogenic zone ( tectonic plate ‘locations’ and ‘width sizes’ ).

Japan America Earthquakes And Tsunamis Exchange

Great Cascadia earthquakes generate tsunamis; most recently, at least a ’32-foot high tidal wave’ struck the Pacific Ocean westcoast of Washington, Oregon, and California ( northern portion of state ), and that Cascadia earthquake tsunami sent a consequential 16-foot high tidal wave onto Japan.

These Cascadia Fault subduction zone earthquake tsunamis threaten coastal communities all around the Pacific Ocean “Ring of Fire”, but have their greatest impact on the United States westcoast and Canada, struck within ’15-minutes’ to ’40-minutes’ after a Cascadia Fault subduction zone earthquake occurs.

Deposits from past Cascadia Fault earthquake tsunamis have been identified at ‘numerous coastal sites’ in California, Oregon, Washington, and even as far ‘north’ as British Columbia ( Canada ), where the distribution of these deposits – based on sophisticated computer software simulations for tsunamis – indicates many coastal communities in those same states and provinces are well within the flooding inundation zones of past Cascadia Fault earthquake tsunamis.

These westcoast communities are indeed threatened by future tsunamis from a Cascadia great earthquake event, whose ‘tsunami arrival times’ are dependent on the distance from the ‘point of rupture’ ( tectonic plate split, causing earthquake ) – within the Cascadia Fault subduction zone – to the westcoast “landward” side.
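
Those arrival-time figures follow from simple wave physics: in open water a tsunami travels at roughly the shallow-water wave speed v = sqrt( g × depth ). A rough sketch, with distances and depths chosen as illustrative assumptions rather than official hazard-model inputs:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def tsunami_speed_ms(depth_m: float) -> float:
    """Shallow-water wave speed: sqrt(g * depth)."""
    return math.sqrt(G * depth_m)

def arrival_minutes(distance_km: float, depth_m: float) -> float:
    return distance_km * 1000.0 / tsunami_speed_ms(depth_m) / 60.0

print(f"{tsunami_speed_ms(2500.0) * 3.6:.0f} km/h in 2,500 m of water")   # ~564 km/h
print(f"{arrival_minutes(100.0, 2500.0):.0f} min for 100 km at 2,500 m")  # ~11 min
print(f"{arrival_minutes(150.0, 1000.0):.0f} min for 150 km at 1,000 m")  # ~25 min
```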

Cascadia Earthquake Stricken Damage Zone Data

Strong ground shaking from a “moment magnitude” ( Mw ) “9″ Plate-Boundary earthquake will last 3-minutes or ‘more’, dominated by ‘long-period shaking’ and ‘further earthquakes’, where ‘ground shaking movement damage’ will occur as far inland as the cities of Portland, Oregon; Seattle, Washington; and Vancouver, British Columbia ( Canada ).

Tsunami Optional Wave Patterns “Following Sea”

Large cities within 62-miles to 93-miles of the nearest point of the Cascadia Plate-Boundary zone inferred rupture will not only experience ‘significant ground shaking’ but also ‘extreme duration ground shaking’ lasting far longer, in addition to far more powerful tsunamis carrying far more seawater because of their consequential “lengthened wave periods” ( ‘lengthier distances’ between ‘wave crests’ or ‘wave curls’ ) – bringing inland something akin to what fishermen describe as a deadly “following sea” ( swallowing everything within an ‘even more-so powerful waterpath’ ), the result of which causes ‘far more significant damage’ to many ‘tall buildings’ and ‘lengthy structures’ where ‘earthquake magnitude strength’ will be felt ‘strongest’ – all along the United States of America Pacific Ocean westcoast regional areas.

Data Assessments Of Recurring Cascadia Earthquakes

Cascadia ‘great earthquakes’ have a “mean recurrence interval” ( ‘time period occurring between one earthquake and the next’ ) – specific ‘at the point’ of the Cascadia Plate-Boundary – of between 500-years and 600-years; however, Cascadia Fault earthquakes in the past have occurred well within a 300-year interval, or even less time, since the Cascadia ‘great earthquake’ of the 1700s. Time intervals between ‘successive great earthquakes’ – a few centuries up to 1,000 years – have little ‘well-measured data’ as to ‘reoccurrence interval’, because the number of recorded Cascadia earthquakes has rarely measured over five ( 5 ). Data additionally indicates Cascadia earthquake intermittency, with irregular intervals when they did occur; plus, data lacks ‘random distribution’ ( ‘tectonic plate shift’ or ‘earth movement’ ) or ‘clustering’ of these Cascadia earthquakes over a lengthier period of time, so ‘more accurate assessments are unavailable’ for knowing anything more about them. Hence, because the Cascadia earthquake ‘recurrence pattern’ is so ‘poorly known’, knowing the probabilities of the next Cascadia earthquake occurrence is unfortunately unclear, with extremely sparse ‘interval information’ details.
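
To see why a 500-year to 600-year mean interval still leaves the near-term odds murky, here is a sketch that assumes – purely for illustration, since the recurrence pattern is, as noted above, poorly known – a memoryless Poisson process:

```python
import math

def poisson_probability(window_years: float, mean_interval_years: float) -> float:
    """P(at least one event in the window) = 1 - exp(-window / mean interval)."""
    return 1.0 - math.exp(-window_years / mean_interval_years)

for window in (50, 100, 300):
    p = poisson_probability(window, 500.0)
    print(f"P(>= 1 Cascadia great earthquake in {window} years) ~ {p:.0%}")
# ~10%, ~18%, ~45% -- under the stated (and debatable) assumptions
```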

Cascadia Plate-Boundary “Locked” And “Not Locked”

The Cascadia Plate-Boundary zone is ‘currently locked’ off the U.S. westcoast shoreline, where it has been accumulating plate tectonic pressure build-up from other tectonic plates crashing into it for over 300-years.

The Cascadia Fault subduction zone, at its widest point, is located northwest just off the coast of the State of Washington, where the maximum area of seismogenic rupture is approximately 1,100 kilometers long and 50 kilometers up to 150 kilometers wide. Cascadia Plate-Boundary seismogenic portion location and size data is key-critical for determining earthquake magnitude, tsunami size, and the strength of ground shaking.

The Cascadia Plate-Boundary “landward limit” – of only its “locked” portion, where ‘no tectonic plate shift has yet to occur’ – is located between the Juan de Fuca tectonic plate and the North America tectonic plate, where it came to be “locked” between Cascadia earthquakes; however, this “locked” notion has only been delineated from ‘geodetic measurements’ of ‘surface land deformation’ observations. Unfortunately, its “seaward limit” has ‘very few constraints’ up to ‘no constraints’ for travelling – on its so-called “locked zone” portion – which could certainly move at any time.

Cascadia Plate Continues Sliding

The Cascadia transition zone, separating its “locked zone” from its “continuous sliding zone” headed east into the continent of North America, is poorly constrained ( held-back ), so a Cascadia rupture may extend an unknown distance – from its now “locked zone” into its “continuously sliding transition zone.”

On some Earth crust faults near coastal regions, earthquakes may also bring ‘additional Plate-Boundary earthquakes’, ‘increased tsunami tidal wave size’, plus ‘intensification of local area ground shaking’.

Earth Event Mitigation Forces Global Money Flow

The primary ‘government purpose’ in ‘establishing international’ “risk reduction” is solely to ‘minimize global costs from damages’ associated with major magnitude Earth Events similar to – but even greater than – what happened on March 11, 2011 all over Japan.

Historical earthquake damages assist in predictive projections of damage, with loss studies suggesting disastrous future losses will occur in the Pacific Northwest from a Cascadia Fault subduction ‘great earthquake’. National ‘loss mitigation efforts’ – studying ‘other seismically active regions’ – plus ‘national cost-benefit studies’ indicate that ‘earthquake damage loss mitigation’ may effectively ‘reduce losses’ and ‘assist recovery’ efforts in the future. Accurate data acquisition, geological and geophysical research, and immediate ‘technological information transfer’ to ‘national key decision-makers’ were meant to reduce Pacific Northwest Cascadia Fault subduction zone risks to the Western North America coastal region.

Damage, injuries, and loss of life from the next great earthquake from the Cascadia Fault subduction zone will indeed be ‘great’ and ‘widespread’, and will significantly ‘impact national economies’ ( Canada and United States ) for years to decades into the future – which has seen a concerted global increase, in:

– International Cooperative Research; – International Information Exchanges; – International Disaster Preparedness; – International Damage Loss Mitigation Planning; – International Technology Applications; and, – More.

Tectonics Observatory

Under the CALTECH Advanced Rapid Imaging and Analysis ( ARIA ) Project – a collaboration between the NASA Jet Propulsion Laboratory ( JPL ) and the California Institute of Technology ( Pasadena ) Tectonics Observatory – CALTECH scientist Shengji Wei and Anthony Sladen ( of GEOAZUR ) modelled the Japan Tohoku earthquake fault zone sub-surface ( below surface ) ‘tectonic plate movement’, derived from:

– TeleSeismic Body Waves ( long-distance observations ); and,

– Global Positioning Satellites ( GPS ) ( near-source observations ).

A 3D image of the fault moving can be viewed in Google Earth ( an internet website webpage link to that KML file is found in the “References” at the bottom of this report ), which projects the fault rupture in three-dimensional images viewable from any point of reference. ‘That analysis’ depicts the rupture ( ground splitting open 100-feet ) resulting in the earthquake ( itself ) ‘triggered from 15-miles ( 24-kilometers ) beneath the ultra-deep sea of the Western Pacific Ocean’, with the ‘entire island of Japan being moved east’ by 16-feet ( 5 meters ) from its ‘before earthquake location’.
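
A minimal sketch of how a coseismic offset like that quoted shift is read out of a GPS position time series – my own simplified construction, not JPL’s GIPSY-OASIS pipeline – is to difference the station’s mean position before and after the event:

```python
def coseismic_offset_m(positions_m: list[float], event_index: int) -> float:
    """positions_m: daily east positions; event_index: first post-quake sample."""
    before = positions_m[:event_index]
    after = positions_m[event_index:]
    return sum(after) / len(after) - sum(before) / len(before)

# Synthetic station that jumps ~5 m east at the quake (the article's quoted
# shift, used here purely as an illustrative input):
east = [0.00, 0.01, -0.01, 0.00, 4.99, 5.01, 5.00, 5.02]
print(f"offset ~ {coseismic_offset_m(east, 4):.2f} m east")
```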

[ IMAGE ( above ): NASA JPL Project ARIA Tectonic Plate Seismic Wave Direction Map ( click image to enlarge and read ) ]

The National Aeronautics and Space Administration ( NASA ) Jet Propulsion Laboratory ( JPL ) at the California Institute of Technology ( Pasadena ), also known as CALTECH, Project Advanced Rapid Imaging and Analysis ( ARIA ) used GEONET RINEX data with JPL GIPSY-OASIS software to obtain kinematic “precise point positioning solutions” – from a bias fixing method of a ‘single station’ matched-up to JPL orbit and clock products – to produce their seismic displacement projection map details, which carry an inherent ’95% error-rating’ that is itself an ‘estimate’. This ‘proves’ that what these U.S. government organizations claim ‘all they supposedly know’ ( after spending billions of dollars ), and are ‘only willing to publicly provide’, may be ‘only 5% accurate’. So much for what these U.S. government organizations ‘publicly announce’ as their “precise point positioning solutions.”

Pay Any Price?

More ‘double-speak’ and ‘psycho-babble’ serves only to distract the public away from the ‘truth’ as to ‘precisely what’ U.S. taxpayer dollars are ‘actually producing’. ‘Now knowing this’, if ‘those same officials’ ever ‘worked for a small business’ they would either be ‘arrested’ for ‘fraud’ or ‘fired’ for ‘incompetence’; however, since ‘none of them’ will ever ‘admit to their own incompetence’, their ‘leadership’ needs to see ‘those responsible’ virtually ‘swing from’ the end of an ‘incredibly long U.S. Department of Justice rope’.

Unfortunately, the facts surrounding all this only get worse.

[ IMAGE ( above ): Tectonic Plates ( brown color ) Sinking and Sunk On Earth Core. ]

Earthquake Prediction Fallacy

Earthquake prediction will ‘never be an accomplished finite science for people to ever rely upon’, even though huge amounts of money are being wasted on ‘technology’ for ‘detection sensors’ reading “Seismic Waveforms” ( also known as ) “S Waves” that ‘can be detected and stored in computer databases’. The reason is a significant fact that will never be learned no matter how much money or time may be devoted to trying to solve the unsolvable problem: the Earth’s sub-crustal regions consist primarily of ‘molten lake regions’ filled with ‘floating tectonic plates’ that are ‘moving while sinking’, and these ‘cannot be tested’ for ‘rock density’ or ‘accumulated pressures’ existing ‘far beneath’ the ‘land surface tectonic plates’.

The very best – and all – that technology can ever perform for the public is to record ‘surface tectonic plates grinding against one another’, where ‘only that action’ ( alone ) does in fact emit upward ‘acoustic wave form patterns’ named ‘seismic waves’ or ‘s-waves’, which ‘do occur’ but ‘only when tectonic plates are moving’.
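
For the curious, the classic way monitoring networks turn recorded s-waves into an automatic event trigger is a short-term-average over long-term-average ( STA/LTA ) energy ratio. The sketch below is a generic textbook detector with made-up parameters, not any agency’s production code:

```python
import random

def sta_lta_trigger(samples, sta_len=40, lta_len=400, threshold=4.0):
    """Yield sample indices where the STA/LTA ratio of signal energy crosses threshold."""
    energy = [s * s for s in samples]
    for i in range(lta_len, len(samples)):
        sta = sum(energy[i - sta_len:i]) / sta_len
        lta = sum(energy[i - lta_len:i]) / lta_len
        if lta > 0 and sta / lta > threshold:
            yield i

# Synthetic trace: 500 samples of background noise, then a strong burst.
random.seed(1)
trace = [random.gauss(0, 1) for _ in range(500)] + \
        [random.gauss(0, 10) for _ in range(200)]
print(f"trigger at sample {next(sta_lta_trigger(trace), None)}")  # shortly after 500
```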

While a ‘public early warning’ might be helpful for curtailing ‘vehicular traffic’ crossing an ‘interstate bridge’ that might collapse, or for stopping ‘train traffic’ travel – saving thousands of people’s lives – it would fail to serve the millions more living in buildings that collapse.

Early Warning Exclusivity

Knowing governments, using publicly unfamiliar terms, have ‘statistically analyzed’ “international economics” related to “national infrastructure preparedness” ( ‘early warning systems’ ) – both “public” ( i.e. ‘utility companies’ via ‘government’ with ‘industrial leadership’ meetings ) and “private” ( i.e. ‘residents’ via ‘television’, ‘radio’, ‘newspaper’ and ‘internet’ only ‘commercial advertisements’ ) – “national disaster mitigation” sees its ‘primary designated provisions’ for “high density population centers” near “coastal or low-lying regions” ( ‘large bodies of ocean, lake and river water’ ) “early warning” go to only one ( 1 ) of those two ( 2 ): the “public” side ( i.e. ‘utility companies’ via ‘government’ with ‘industrial leadership’ meetings ), “in the interest of national security” limiting ‘national economic burdens’ from any significant Earth Event impact ‘aftermath’.

In short, and without all the governmentese ‘psycho-babble’ and ‘double-speak’: costs continue being spent on ‘high technology’ efforts to ‘perfect’ a “seismic early warning” for the “exclusive use” ( ‘national government control’ ) of that which “provides” ( ‘control over’ ) “all major utility company distribution points” ( facilities from where ‘electrical power is generated’ ), able to “interrupt power” ( ‘stop the flow of electricity nationwide’ from ‘distribution stations’ ), thus “saving additional lives” from “disastrous other problems” ( ‘aftermath loss of lives and injuries’ caused by ‘nuclear fallout radiation’, ‘exploding electrical transformers’, and ‘fires associated with overloaded electrical circuits’ ).

Logically, ‘much’ – but ‘not all’ – of the aforementioned ‘makes perfect sense’, except for “John Doe” or “Jane Doe”, ‘exemplified anonymously’ ( herein ) as individuals who, if ‘warned earlier’, could have ‘stopped their vehicle before crossing the bridge that collapsed’ or simply ‘stepped out of the way of a huge sign falling on them’ rather than being ‘killed’ or ‘maimed’; however, one might additionally consider how many more would ‘otherwise be killed or maimed’ after an ‘ensuing mass public mob panics’ upon ‘receiving’ an “early warning.” A tough call for many, decided by few.

Earth Data Publicly Minimized

Tohoku-oki earthquake ‘seismic wave form data’ showing the Japan eastcoast tectonic plate “bilaterally ruptured” ( split in half for a distance of over 310-miles ) was obtained from the USArray seismic stations ( United States ), then analyzed and later modelled by Caltech scientists Lingsen Meng and Jean-Paul Ampuero, who created preliminary data animation demonstrating a ‘super major’ Earth Event simultaneously occurring when the ‘major’ earthquake struck Japan.

U.S. National Security Stations Technology Systems Projects

Under the United States Seismic Array ( USArray ) Data Management Plan, EarthScope is composed of three ( 3 ) Projects:

1. The “United States Seismic Array ( USArray )” Project, whose data is ‘managed’ by the Incorporated Research Institutions for Seismology ( IRIS ) – a National Science Foundation ( NSF ) consortium of universities – through its Data Management Center ( DMC );

2. UNAVCO INC. ‘implemented’ “Plate-Boundary Observatory ( PBO )” Project; and,

3. U.S. Geological Survey ( USGS ) ‘operated’ “San Andreas Fault Observatory at Depth ( SAFOD )” Project at Stanford University ( California ).

Simultaneous Earth Data Management

The USArray component of “EarthScope” has its data management plan held by the USArray IRIS DMC.

USArray consists of four ( 4 ) data generating components:

Permanent Network

The Advanced National Seismic System ( ANSS ) BackBone ( BB ) is a joint effort – between IRIS, USArray and USGS – to establish a ‘Permanent Network’ of approximately one-hundred ( 100 ) Earth monitoring ‘receiving stations’ located in the Continental United States ( CONUS ), or lower 48 states of America, in addition to ‘other stations’ located in the State of Alaska.

Earth Data Multiple Other Monitors

USArray data contribution to the Advanced National Seismic System ( ANSS ) BackBone ( BB ) consists, of:

Nine ( 9 ) new ‘international Earth data accumulation receiving stations’ akin to the Global Seismic Network ( GSN );

Four ( 4 ) “cooperative other stations” from “Southern Methodist University” and “AFTAC”;

Twenty-six ( 26 ) ‘other receiving stations’ from the Advanced National Seismic System ( ANSS ) with ‘upgrade funding’ taken out-of the USArray Project “EarthScope;” plus,

Sixty ( 60 ) additional stations of the Advanced National Seismic System ( ANSS ) BackBone ( BB ) network that ‘currently exist’, ‘will be installed’, or ‘will be upgraded’ so that ‘data channel stream feeds’ can and ‘will be made seamlessly available’ through the IRIS DMC, where ‘data can be continuously recorded’ at forty ( 40 ) samples per second and where one ( 1 ) sample per second can and ‘will be continuously transmitted in real-time’ back into the IRIS DMC ( see the sketch below ); quality assurance is held at facilities located in ‘both’ Albuquerque, New Mexico and Golden, Colorado, with the U.S. Geological Survey ( USGS ) handling ‘some operational responsibilities’ thereof.
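
The bandwidth arithmetic behind “record at forty ( 40 ) samples per second, stream one ( 1 ) per second” amounts to a 40× decimation of the telemetry feed. A naive sketch follows; real digitizers low-pass filter before decimating to avoid aliasing, so this is an illustration, not station firmware:

```python
def decimate(samples: list[float], factor: int = 40) -> list[float]:
    """Keep every `factor`-th sample for the real-time feed."""
    return samples[::factor]

one_hour_recorded = 40 * 3600                   # 144,000 samples stored on site
one_hour_streamed = len(decimate([0.0] * one_hour_recorded))
print(one_hour_recorded, one_hour_streamed)     # 144000 3600 -- 40x less telemetry
```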

Albuquerque Seismological Laboratory ( ASL ) –

Albuquerque Seismological Laboratory ( ASL ) supports operation and maintenance of seismic networks for the U.S. Geological Survey ( USGS ) portion of the Global Seismographic Network ( GSN ) and Advanced National Seismic System ( ANSS ) Backbone network.

ASL runs the Advanced National Seismic System ( ANSS ) depot facility supporting the Advanced National Seismic System ( ANSS ) networks.

ASL also maintains the PASSCAL Instrument Center ( PIC ) facility at New Mexico Tech ( Socorro, New Mexico ), developing, testing, and evaluating seismology monitoring and recording equipment.

Albuquerque Seismological Laboratory ( ASL ) staff are based in ‘both’ Albuquerque, New Mexico and Golden, Colorado.

Top-Down Bottom-Up Data Building Slows Earthquake Notifications

Seismic waveform ( ‘seismic wave form frequency’ ) data is received by the Global Seismic Network ( GSN ) and Advanced National Seismic System ( ANSS ) BackBone ( BB ) network through electronic transmissions sent ‘slower than real-time’ – only ‘near-time data’ ( e.g. tape and compact disc recordings ) reaches the National Earthquake Information Center ( NEIC ) ‘station’ of the U.S. Geological Survey ( USGS ), ‘officially heralded’ for so-called “rapid earthquake response.”

Unbelievable is the fact that, in-addition to the aforementioned ‘slow Earth Event data delivery process’, an additional number of ‘data receiving stations’ have absolutely ‘no data streaming telemetry’ transmission capabilities whatsoever, so those station data recordings – on ‘tapes’ and ‘compact discs’ – are delivered by ‘other even more time consuming routes’ before that data can even reach the U.S. Geological Survey ( USGS ). In short, the huge amounts of money being spent go to ‘increasing computer technologies’, sensors, satellites, ‘data stream channel networks’ and ‘secure facility building stations’ from the ‘top, down’ instead of building ‘monitoring stations’ and ‘recording stations’ from the ‘bottom, up’ until the entire earthquake monitoring and notification system is finally built properly. As it currently stands, the ‘apple cart stations continue being built more and more’ while the ‘apple tree stations are not receiving the proper technological nutrients’ to ‘deliver apples’ ( ‘data’ ) to be ‘fed into notification markets’ ( ‘public’ ) where all this could do some good.

U.S. National Security Reviews Delay Already Slow Earthquake Notifications

IRIS Data Management Center ( DMC ) – after processing all incoming data streams from reporting stations around the world – then distributes seismic waveform data ‘back to’ both the Global Seismic Network ( GSN ) and Advanced National Seismic System ( ANSS ) BackBone ( BB ) network operations, but only ‘after seismic waveform data has been thoroughly screened’ by what U.S. national security government Project leadership has deemed its ‘need to control all data’ by “limiting requirements” ( ‘red tape’ ), because ‘all data must undergo’ a long, arduous ‘secure data clearing process’ before any data can be released. Amusingly to some, the U.S. government – in its race to create another ‘official acronym’ of ‘double-speak’ – ever so aptly named that national security requirement clearing process:

“Quality Assurance Framework” ( QUACK )

Enough said.

Let the public decide what to do with ‘those irresponsible officials’; after all, ‘only mass public lives’ are ‘swinging in the breeze’ at the very end of a now-seemingly endless ‘disinformation service rope’ being paid for by the tax-paying public.

In the meantime, while we are all ‘waiting for another Earth Event to take place’, what ( besides this report ) might ‘slap the official horse’, spurring it to move quickly?

How about us? What should we do? Perhaps brushing-up on a little basic knowledge might help.

Inner Earth Deeper Structure Deep Focus Earthquakes Rays And Related Anomalies

There is no substitute for knowledge – information technology ( IT ) sits at the focal point of many new discoveries aided by supercomputing, modelling and analytics – but common sense does pretty well too.

The following information, although an incredibly brief overview of such a wide variety of topics surrounding the ins and outs of planet Earth, scratches more than just the surface – touching deep structure and deep focus matters impacting a multitude of generations from as far back as 700 years before the birth of Christ ( B.C. ).

Clearly referenced “Encyclopaedia Britannica” general public access information consists of second-hand observations of records from other worldwide information collection sources, such as:

– Archives ( e.g. governments, institutions, public and private );

– Symposiums ( e.g. white papers );

– Journals ( professional and technical publications );

– Other information collection sources; and,

– Other information publications.

Encyclopaedias, available in a wide variety of styles and formats, are ’portable catalogs containing a large amount of basic information on a wide variety of topics’ available worldwide to billions of people for increasing their knowledge.

Encyclopedia information formats vary, and ‘volume reading’ is available within:

– Paper ‘books’ with either ’printed ink’ ( sighted ) or ’embossed dots’ ( Braille );

– Plastic ‘tape cartridges’ ( ‘electromagnetic’ media ) or ‘compact discs’ ( ‘optical’ media ) with ‘electronic device display’; or,

– Electronic ‘internet’ ( ‘signal computing’ via ‘satellite’ or ‘telecomputing’ via ’landline’ or ‘node’ networking ) with ‘electronic device display’.

After thoroughly reviewing the Encyclopedia Britannica ‘specific compilation’, independent review found it reasonable to reproduce a facsimile of the original, reformatted for easier public comprehension ( reproduced further below ).

Surprisingly, after that Encyclopedia Britannica ‘specific compilation’ information was reformatted for clearer reading comprehension, inner Earth ‘deep-structure’ geophysical studies formed an amazing correlation with additional factual activities within an equally amazing date chronology of man-made nuclear fracturing reformations of Earth geology – activities documented worldwide more than 1/2 century ago but somehow forgotten; either by chance or secret circumstance.

How could the Encyclopedia Britannica, or for that matter anyone else, have missed something on such a grand scale that is now so obvious?

… [ TEMPORARILY EDITED-OUT FOR REVISION PURPOSES ONLY –  ] …

For more details about the aforementioned, Click: Here!

Or,

To understand how all this relates, ‘begin with a better basic understanding’ by continuing to read the researched information ( below ):

====

Circa: March 21, 2012

Source:  Encyclopaedia Britannica

Earthquakes

Definition, Earthquake: Sudden shaking of the ground caused by the passage of seismic waves through Earth’s rocks.

Seismic waves are produced when some form of energy stored in the Earth’s crust is suddenly released, usually when masses of rock straining against one another suddenly fracture and “slip.” Earthquakes occur most often along geologic faults, narrow zones where rock masses move in relation to one another. Major fault lines of the world are located at the fringes of the huge tectonic plates that make up the Earth’s crust. ( see table of major earthquakes further below )

Little was understood about earthquakes until the emergence of seismology at the beginning of the 20th Century ( 1900s ); seismology, involving the scientific study of all aspects of earthquakes, has now yielded answers to long-standing questions as to why and how earthquakes occur.

About 50,000 earthquakes, large enough to be noticed without the aid of instruments, occur every year over the entire Earth, and of these approximately one-hundred ( 100 ) are of sufficient size to produce substantial damage if their centers are near human habitation.

Very great earthquakes occur on average about once a year; over the centuries, these earthquakes have been responsible for millions of human deaths and an incalculable amount of property damage.

Earthquakes A-Z

Earth’s major earthquakes occur primarily in belts coinciding with tectonic plate margins, apparent since early earthquake catalogs ( as far back as 700 B.C. ), and now more readily discernible on modern seismicity maps instrumentally depicting determined earthquake epicentres.

Most important, is the earthquake Circum-Pacific Belt affecting many populated coastal regions around the Pacific Ocean, namely:

– South America;

– North America & Alaska;

– Aleutian Islands;

– Japan;

– New Zealand; and,

– New Guinea.

An estimated 80% of the energy presently released in earthquakes comes from those whose epicentres are in the Circum-Pacific Belt.

Seismic activity is by no means uniform throughout the belt, and there are a number of branches at various points. Because at many places the Circum-Pacific Belt is associated with volcanic activity, it has been popularly dubbed the “Pacific Ring of Fire.”

A second ( 2nd ) belt, known as the Alpide Belt, passes through the Mediterranean region eastward through Asia and joins the Circum-Pacific Belt in the East Indies; energy released in earthquakes from the Alpide Belt is about 15% of the world total.

There are also ‘striking connected belts’ of seismic activity, primarily along oceanic ridges – including those in, the:

– Arctic Ocean;

– Atlantic Ocean;

– Indian Ocean ( western ); and along,

– East Africa rift valleys.

This global seismicity distribution is best understood in terms of its plate tectonic setting.

Forces

Earthquakes are caused by sudden releases of energy within a limited region of Earth rocks; the energy can be released by:

– Elastic strain;

– Gravity;

– Chemical reactions; and / or,

– Massive rock body motion.

Of all these, release of elastic rock strain is most important because this form of energy is the only kind that can be stored in sufficient quantities within the Earth to produce major ground disturbances.

Earthquakes, associated with this type of energy release, are called: Tectonic Earthquakes.

Tectonics

Tectonic plate earthquakes are explained by the so-called elastic rebound theory, formulated by the American geologist Harry Fielding Reid after the San Andreas Fault ruptured in 1906, generating the great San Francisco earthquake.

According to Reid’s theory of elastic rebound, a tectonic earthquake occurs when strain energy in rock masses has accumulated ( built-up ) to a point where the resulting stresses exceed the strength of the rocks, whereupon sudden fracturing results.

Fractures propagate ( travel ) rapidly ( see speeds further below ) through the rock, usually tending in the same direction and sometimes extending many kilometres along a local zone of weakness.

In 1906, for instance, the San Andreas Fault slipped along a plane 270-miles ( 430 kilometers ) long, a line along which the ground was displaced horizontally as much as 20-feet ( 6 meters ).

As a fault rupture progresses along or up the fault, rock masses are flung in opposite directions, and thus spring back to a position where there is less strain.

At any one point this movement may take place not at once but rather in irregular steps; these sudden slowings and restartings give rise to the vibrations that propagate as seismic waves.

Such irregular properties of fault rupture are now included in ‘physical modeling’ and ‘mathematical modeling’ of earthquake sources.

Earthquake Focus ( Foci )

Roughnesses along the fault are referred to as asperities, and places where the rupture slows or stops are said to be fault barriers. Fault rupture starts at the earthquake focus ( foci ), a spot that ( in many cases ) lies from 5 kilometers to 15 kilometers ‘under the surface’; from there the rupture propagates ( travels ) in one ( 1 ) or both directions over the fault plane until stopped ( or slowed ) at a barrier ( boundary ).

Sometimes, instead of being stopped at the barrier, the fault rupture recommences on the far side; at other times the stresses in the rocks break the barrier, and the rupture continues.

Earthquakes have different properties depending on the type of fault slip that causes them.

The usual ‘fault model’ has a “strike” ( i.e., direction, from north, taken by a horizontal line in the fault plane ) and a “dip” ( i.e. angle from the horizontal shown by the steepest slope in the fault ).

Movement parallel to the dip is called dip-slip faulting.

The lower wall ( of an inclined fault ) is the ‘footwall’; lying over the footwall is the hanging wall. In dip-slip faults, if the hanging-wall block moves downward relative to the footwall block, it is called “normal” faulting; the opposite motion, with the hanging wall moving upward relative to the footwall, produces reverse or thrust faulting.

When rock masses slip past each other ( parallel to the strike ), the movement is known as strike-slip faulting.

Strike-slip faults are right lateral or left lateral, depending on whether the block on the opposite side of the fault from an observer has moved to the right or left.

All known faults are assumed to have been the seat of one or more earthquakes in the past, though tectonic movements along faults are often slow, and most geologically ancient faults are now aseismic ( i.e., they no longer cause earthquakes ).

Actual faulting associated with an earthquake may be complex, and it is often unclear whether, in one ( 1 ) particular earthquake, the total energy issues from a single ( 1 ) fault plane.

Observed geologic faults sometimes show relative displacements on the order of hundreds of kilometres over geologic time, whereas the sudden slip offsets that produce seismic waves may range from only several centimetres to tens of metres.

During the 1976 Tangshan earthquake ( for example ), a surface strike-slip of about 1 meter was observed along the causative fault east of Beijing, China, and later ( as another example ) during the 1999 Taiwan earthquake the Chelung-pu fault slipped vertically up to 8 meters.

Volcanism & Earthquake Movement

A separate type of earthquake is associated with volcanic activity and is known as a volcanic earthquake.

Even in such cases, the disturbance is officially believed to result from a sudden slip of rock masses adjacent to the volcano and the consequent release of elastic rock strain energy. The stored energy, however, may be partially of hydrodynamic origin – due to heat provided by magma moving ( tidally ) through underground reservoirs beneath volcanoes, or to the release of gas under pressure. There is certainly a clear correspondence between the geographic distribution of volcanoes and major earthquakes, particularly within the Circum-Pacific Belt and along ocean ridges.

Volcano vents, however, are generally several hundred kilometres from epicentres of most ‘major shallow earthquakes’, and it is believed ’many earthquake sources’ occur ‘nowhere near active volcanoes’.

Even in cases where an earthquake’s focus occurs ‘directly below volcanic vents’, officially there is probably no immediate causal connection between the two ( 2 ) activities; more likely, both are resultant from the same tectonic processes.

Earth Fracturing

Artificially Created Inductions

Earthquakes are sometimes caused by human activities, including:

– Nuclear Explosion ( large megaton yield ) detonations underground;

– Oil & Gas wells ( deep Earth fluid injections );

– Mining ( deep Earth excavations ); and,

– Reservoirs ( deep Earth voids filled with incredibly heavy large bodies of water ).

In the case of deep mining, the removal of rock produces changes in the strain around the tunnels.

Slip on adjacent preexisting faults, or outward shattering of rock into the new cavities, may occur.

In fluid injection, the slip is thought to be induced by premature release of elastic rock strain, as in the case of tectonic earthquakes after fault surfaces are lubricated by the liquid.

Large underground nuclear explosions have been known to produce slip on already strained faults in the vicinity of test devices.

Reservoir Induction

Of the various earthquake-causing activities cited above, the filling of large reservoirs ( see China ) is most prominent.

More than 20 significant cases have been documented in which local seismicity has increased following the impounding of water behind high dams. Often, causality cannot be substantiated, because no data exists to allow comparison of earthquake occurrence before and after the reservoir was filled.

Reservoir-induction effects are most marked for reservoirs exceeding 100 metres ( 330 feet ) in depth and 1 cubic km ( 0.24 cubic mile ) in volume. Three ( 3 ) sites where such connections have very probably occurred, are the:

– Hoover Dam in the United States;

– Aswan High Dam in Egypt; and,

– Kariba Dam on the border between Zimbabwe and Zambia in Africa.

The most generally accepted explanation for earthquake occurrence in such cases assumes that rocks near the reservoir are already strained from regional tectonic forces to a point where nearby faults are almost ready to slip. Water in the reservoir adds a pressure perturbation that triggers the fault rupture. The pressure effect is perhaps enhanced by the fact that the rocks along the fault have lower strength because of increased water-pore pressure. These factors notwithstanding, the filling of most large reservoirs has not produced earthquakes large enough to be a hazard.

Specific seismic source mechanisms associated with reservoir induction have been established in a few cases. For the main shock at the Koyna Dam and Reservoir in India ( 1967 ), the evidence favours strike-slip faulting motion. At both the Kremasta Dam in Greece ( 1965 ) and the Kariba Dam in Zimbabwe-Zambia ( 1961 ), the generating mechanism was dip-slip on normal faults.

By contrast, thrust mechanisms have been determined for sources of earthquakes at the lake behind Nurek Dam in Tajikistan. More than 1,800 earthquakes occurred during the first 9-years after water was impounded in this 317-meter ( 1,040-foot ) deep reservoir in 1972, a rate amounting to four ( 4 ) times the average number of shocks in the region prior to filling.

Nuclear Explosion Measurement Seismology Instruments

By 1958, representatives from several countries, including the United States and the Soviet Union, met to discuss the technical basis for a nuclear test-ban treaty; amongst the matters considered was the feasibility of developing effective means to detect underground nuclear explosions and to distinguish them seismically from earthquakes.

After that conference, much special research was directed to seismology, leading to major advances in seismic signal detection and analysis.

Recent seismological work on treaty verification has involved using high-resolution seismographs in a worldwide network, estimating the yield of explosions, studying wave attenuation in the Earth, determining wave amplitude and frequency spectra discriminants, and applying seismic arrays. The findings of such research have shown that underground nuclear explosions, compared with natural earthquakes, usually generate seismic waves through the body of the Earth that are of much larger amplitude than the surface waves. This telltale difference, along with other types of seismic evidence, suggests that an international monitoring network of two-hundred and seventy ( 270 ) seismographic stations could detect and locate all seismic events over the globe of magnitude 4.0 and above ( corresponding to an explosive yield of about 100 tons of TNT ).
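
That body-wave-versus-surface-wave discriminant can be caricatured in a few lines. A toy Python sketch follows – the function name and the 0.5-magnitude-unit threshold are purely illustrative assumptions, not an operational screening rule:

```python
# Toy version of the mb:Ms screen described above -- explosions excite
# body waves strongly relative to surface waves, so an event whose
# body-wave magnitude (mb) much exceeds its surface-wave magnitude (Ms)
# is suspect. The threshold here is illustrative only.
def looks_like_explosion(mb: float, ms: float, threshold: float = 0.5) -> bool:
    return (mb - ms) > threshold

print(looks_like_explosion(mb=5.0, ms=4.0))  # True  -- body waves dominate
print(looks_like_explosion(mb=5.0, ms=5.2))  # False -- earthquake-like ratio
```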

Earthquake Effects

Earthquakes have varied effects, including changes in geologic features, damage to man-made structures, and impact on human and animal life. Most of these effects occur on solid ground, but, since most earthquake foci are actually located under the ocean bottom, severe effects are often observed along the margins of oceans.

Surface Phenomena

Earthquakes often cause dramatic geomorphological changes, including ground movements – either vertical or horizontal – along geologic fault traces; rising, dropping, and tilting of the ground surface; changes in the flow of groundwater; liquefaction of sandy ground; landslides; and mudflows. The investigation of topographic changes is aided by geodetic measurements, which are made systematically in a number of countries seriously affected by earthquakes.

Earthquakes can do significant damage to buildings, bridges, pipelines, railways, embankments, and other structures. The type and extent of damage inflicted are related to the strength of the ground motions and to the behaviour of the foundation soils. In the most intensely damaged region, called the meizoseismal area, the effects of a severe earthquake are usually complicated and depend on the topography and the nature of the surface materials. They are often more severe on soft alluvium and unconsolidated sediments than on hard rock. At distances of more than 100 km (60 miles) from the source, the main damage is caused by seismic waves traveling along the surface. In mines there is frequently little damage below depths of a few hundred metres even though the ground surface immediately above is considerably affected.

Earthquakes are frequently associated with reports of distinctive sounds and lights. The sounds are generally low-pitched and have been likened to the noise of an underground train passing through a station. The occurrence of such sounds is consistent with the passage of high-frequency seismic waves through the ground. Occasionally, luminous flashes, streamers, and bright balls have been reported in the night sky during earthquakes. These lights have been attributed to electric induction in the air along the earthquake source.

Tsunamis

Following certain earthquakes, very long-wavelength water waves in oceans or seas sweep inshore. More properly called seismic sea waves or tsunamis ( tsunami is a Japanese word for “harbour wave” ), they are commonly referred to as tidal waves, although the attractions of the Moon and Sun play no role in their formation. They sometimes come ashore to great heights—tens of metres above mean tide level—and may be extremely destructive.

The usual immediate cause of a tsunami is sudden displacement in a seabed sufficient to cause the sudden raising or lowering of a large body of water. This deformation may be the fault source of an earthquake, or it may be a submarine landslide arising from an earthquake.

Large volcanic eruptions along shorelines, such as those of Thera (c. 1580 bc) and Krakatoa (ad 1883), have also produced notable tsunamis. The most destructive tsunami ever recorded occurred on December 26, 2004, after an earthquake displaced the seabed off the coast of Sumatra, Indonesia. More than 200,000 people were killed by a series of waves that flooded coasts from Indonesia to Sri Lanka and even washed ashore on the Horn of Africa.

Following the initial disturbance to the sea surface, water waves spread in all directions. Their speed of travel in deep water is given by the formula v = √(gh), where h is the sea depth and g is the acceleration of gravity.

This speed may be considerable—100 metres per second ( 225 miles per hour ) when h is 1,000 metres ( 3,300 feet ). However, the amplitude ( i.e., the height of disturbance ) at the water surface does not exceed a few metres in deep water, and the principal wavelength may be on the order of hundreds of kilometres; correspondingly, the principal wave period—that is, the time interval between arrival of successive crests—may be on the order of tens of minutes. Because of these features, tsunami waves are not noticed by ships far out at sea.
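
A quick Python check of the √(gh) formula above – a minimal sketch, assuming only the standard library – reproduces the quoted figure of roughly 100 metres per second at a depth of 1,000 metres:

```python
import math

def tsunami_speed_m_per_s(depth_m: float, g: float = 9.8) -> float:
    """Deep-water tsunami speed from the shallow-water formula v = sqrt(g * h)."""
    return math.sqrt(g * depth_m)

for h in (10, 100, 1000, 4000):
    v = tsunami_speed_m_per_s(h)
    print(f"depth {h:>5,} m -> {v:5.1f} m/s ( {v * 2.23694:6.1f} mph )")
# depth 1,000 m gives ~99 m/s (~221 mph), matching the figure quoted above.
```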

When tsunamis approach shallow water, however, the wave amplitude increases. The waves may occasionally reach a height of 20 to 30 metres above mean sea level in U- and V-shaped harbours and inlets. They characteristically do a great deal of damage in low-lying ground around such inlets. Frequently, the wave front in the inlet is nearly vertical, as in a tidal bore, and the speed of onrush may be on the order of 10 metres per second. In some cases there are several great waves separated by intervals of several minutes or more. The first of these waves is often preceded by an extraordinary recession of water from the shore, which may commence several minutes or even half an hour beforehand.

Organizations, notably in Japan, Siberia, Alaska, and Hawaii, have been set up to provide tsunami warnings. A key development is the Seismic Sea Wave Warning System, an internationally supported system designed to reduce loss of life in the Pacific Ocean. Centred in Honolulu, it issues alerts based on reports of earthquakes from circum-Pacific seismographic stations.

Seiches

Seiches are rhythmic motions of water in nearly landlocked bays or lakes that are sometimes induced by earthquakes and tsunamis. Oscillations of this sort may last for hours or even for 1-day or 2-days.

The great Lisbon earthquake of 1755 caused the waters of canals and lakes in regions as far away as Scotland and Sweden to go into observable oscillations. Seiche surges in lakes in Texas, in the southwestern United States, commenced between 30 and 40 minutes after the 1964 Alaska earthquake, produced by seismic surface waves passing through the area.

A related effect is the result of seismic waves from an earthquake passing through the seawater following their refraction through the seafloor. The speed of these waves is about 1.5 km (0.9 mile) per second, the speed of sound in water. If such waves meet a ship with sufficient intensity, they give the impression that the ship has struck a submerged object. This phenomenon is called a seaquake.

Earthquake Intensity and Magnitude Scales

The violence of seismic shaking varies considerably over a single affected area. Because the entire range of observed effects is not capable of simple quantitative definition, the strength of the shaking is commonly estimated by reference to intensity scales that describe the effects in qualitative terms. Intensity scales date from the late 19th and early 20th centuries, before seismographs capable of accurate measurement of ground motion were developed. Since that time, the divisions in these scales have been associated with measurable accelerations of the local ground shaking. Intensity depends, however, in a complicated way not only on ground accelerations but also on the periods and other features of seismic waves, the distance of the measuring point from the source, and the local geologic structure. Furthermore, earthquake intensity, or strength, is distinct from earthquake magnitude, which is a measure of the amplitude, or size, of seismic waves as specified by a seismograph reading ( see below Earthquake magnitude ).

A number of different intensity scales have been set up during the past century and applied to both current and ancient destructive earthquakes. For many years the most widely used was a 10-point scale devised in 1878 by Michele Stefano de Rossi and François-Alphonse Forel. The scale now generally employed in North America is the Mercalli scale, as modified by Harry O. Wood and Frank Neumann in 1931, in which intensity is considered to be more suitably graded.

A 12-point abridged form of the modified Mercalli scale is provided below. Modified Mercalli intensity VIII is roughly correlated with peak accelerations of about one-quarter that of gravity ( g = 9.8 metres, or 32.2 feet, per second squared ) and ground velocities of 20 cm ( 8 inches ) per second. Alternative scales have been developed in both Japan and Europe for local conditions.

The European ( MSK ) scale of 12 grades is similar to the abridged version of the Mercalli.

Modified Mercalli scale of earthquake intensity

  I. Not felt. Marginal and long-period effects of large earthquakes.

  II. Felt by persons at rest, on upper floors, or otherwise favourably placed to sense tremors.

  III. Felt indoors. Hanging objects swing. Vibrations are similar to those caused by the passing of light trucks. Duration can be estimated.

  IV. Vibrations are similar to those caused by the passing of heavy trucks (or a jolt similar to that caused by a heavy ball striking the walls). Standing automobiles rock. Windows, dishes, doors rattle. Glasses clink, crockery clashes. In the upper range of grade IV, wooden walls and frames creak.

  V. Felt outdoors; direction may be estimated. Sleepers awaken. Liquids are disturbed, some spilled. Small objects are displaced or upset. Doors swing, open, close. Pendulum clocks stop, start, change rate.

  VI. Felt by all; many are frightened and run outdoors. Persons walk unsteadily. Pictures fall off walls. Furniture moves or overturns. Weak plaster and masonry cracks. Small bells ring (church, school). Trees, bushes shake.

  VII. Difficult to stand. Noticed by drivers of automobiles. Hanging objects quivering. Furniture broken. Damage to weak masonry. Weak chimneys broken at roof line. Fall of plaster, loose bricks, stones, tiles, cornices. Waves on ponds; water turbid with mud. Small slides and caving along sand or gravel banks. Large bells ringing. Concrete irrigation ditches damaged.

  VIII. Steering of automobiles affected. Damage to masonry; partial collapse. Some damage to reinforced masonry; none to reinforced masonry designed to resist lateral forces. Fall of stucco and some masonry walls. Twisting, fall of chimneys, factory stacks, monuments, towers, elevated tanks. Frame houses moved on foundations if not bolted down; loose panel walls thrown out. Decayed pilings broken off. Branches broken from trees. Changes in flow or temperature of springs and wells. Cracks in wet ground and on steep slopes.

  IX. General panic. Weak masonry destroyed; ordinary masonry heavily damaged, sometimes with complete collapse; reinforced masonry seriously damaged. Serious damage to reservoirs. Underground pipes broken. Conspicuous cracks in ground. In alluvial areas, sand and mud ejected; earthquake fountains, sand craters.

  X. Most masonry and frame structures destroyed with their foundations. Some well-built wooden structures and bridges destroyed. Serious damage to dams, dikes, embankments. Large landslides. Water thrown on banks of canals, rivers, lakes, and so on. Sand and mud shifted horizontally on beaches and flat land. Railway rails bent slightly.

  XI. Rails bent greatly. Underground pipelines completely out of service.

  XII. Damage nearly total. Large rock masses displaced. Lines of sight and level distorted. Objects thrown into air.

With the use of an intensity scale, it is possible to summarize such data for an earthquake by constructing isoseismal curves, which are lines that connect points of equal intensity. If there were complete symmetry about the vertical through the earthquake’s focus, isoseismals would be circles with the epicentre (the point at the surface of the Earth immediately above where the earthquake originated) as the centre. However, because of the many unsymmetrical geologic factors influencing intensity, the curves are often far from circular. The most probable position of the epicentre is often assumed to be at a point inside the area of highest intensity. In some cases, instrumental data verify this calculation, but not infrequently the true epicentre lies outside the area of greatest intensity.

Earthquake Magnitude

Earthquake magnitude is a measure of the “size” or amplitude of the seismic waves generated by an earthquake source and recorded by seismographs.

Types and nature of these waves are described in Seismic waves ( further below ).

Because the size of earthquakes varies enormously, it is necessary for purposes of comparison to compress the range of wave amplitudes measured on seismograms by means of a mathematical device.

In 1935, American seismologist Charles F. Richter set up a magnitude scale of earthquakes as the logarithm to base 10 of the maximum seismic wave amplitude ( in thousandths of a millimetre ) recorded on a standard seismograph ( the Wood-Anderson torsion pendulum seismograph ) at a distance of 60-miles ( 100 kilometers ) from the earthquake epicentre.

Reduction of amplitudes observed at various distances to the amplitudes expected at the standard distance of 100 kilometers ( 60-miles ) is made on the basis of empirical tables.

Richter magnitudes ML are computed on the assumption that the ratio of the maximum wave amplitudes at two ( 2 ) given distances is the same for all earthquakes and is independent of azimuth ( a sketch follows below ).
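
As a minimal Python sketch of the definition just given: ML = log10( A ) − log10( A0 ), where A is the maximum Wood-Anderson trace amplitude in millimetres and A0 is the amplitude of a “zero-magnitude” event at the same distance. The defining value at 100 km ( −log10 A0 = 3.0 ) is standard; the other two correction entries are illustrative stand-ins for Richter’s empirical tables:

```python
import math

# -log10(A0) distance corrections; only the 100 km entry is definitional,
# the 200 km and 300 km values are illustrative placeholders.
NEG_LOG_A0 = {100: 3.0, 200: 3.45, 300: 3.7}

def richter_ml(amplitude_mm: float, distance_km: int) -> float:
    return math.log10(amplitude_mm) + NEG_LOG_A0[distance_km]

print(richter_ml(1.0, 100))    # 3.0 -- a 1 mm trace at 100 km is ML 3 by definition
print(richter_ml(0.001, 100))  # 0.0 -- one-thousandth of a millimetre gives ML 0
```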

Richter first applied his magnitude scale to shallow-focus earthquakes recorded within 600 km of the epicentre in the southern California region. Later, additional empirical tables were set up, whereby observations made at distant stations and on seismographs other than the standard type could be used. Empirical tables were extended to cover earthquakes of all significant focal depths and to enable independent magnitude estimates to be made from body- and surface-wave observations.

A current form of the Richter scale is shown in the table.

Richter scale of earthquake magnitude

magnitude level | category | effects | earthquakes per year
less than 1.0 to 2.9 | micro | generally not felt by people, though recorded on local instruments | more than 100,000
3.0-3.9 | minor | felt by many people; no damage | 12,000-100,000
4.0-4.9 | light | felt by all; minor breakage of objects | 2,000-12,000
5.0-5.9 | moderate | some damage to weak structures | 200-2,000
6.0-6.9 | strong | moderate damage in populated areas | 20-200
7.0-7.9 | major | serious damage over large areas; loss of life | 3-20
8.0 and higher | great | severe destruction and loss of life over large areas | fewer than 3

At the present time a number of different magnitude scales are used by scientists and engineers as a measure of the relative size of an earthquake. The P-wave magnitude (Mb), for one, is defined in terms of the amplitude of the P wave recorded on a standard seismograph. Similarly, the surface-wave magnitude (Ms) is defined in terms of the logarithm of the maximum amplitude of ground motion for surface waves with a wave period of 20 seconds.

As defined, an earthquake magnitude scale has no lower or upper limit. Sensitive seismographs can record earthquakes with magnitudes of negative value and have ‘recorded magnitudes up to’ about ‘9.0’ ( 1906 San Francisco earthquake, for example, had a Richter magnitude of 8.25 ).

A scientific weakness is that there is no direct mechanical basis for magnitude as defined above. Rather, it is an empirical parameter analogous to stellar magnitude assessed by astronomers. In modern practice a more soundly based mechanical measure of earthquake size is used—namely, the seismic moment (M0). Such a parameter is related to the angular leverage of the forces that produce the slip on the causative fault. It can be calculated both from recorded seismic waves and from field measurements of the size of the fault rupture. Consequently, seismic moment provides a more uniform scale of earthquake size based on classical mechanics. This measure allows a more scientific magnitude to be used called moment magnitude (Mw). It is proportional to the logarithm of the seismic moment; values do not differ greatly from Ms values for moderate earthquakes. Given the above definitions, the great Alaska earthquake of 1964, with a Richter magnitude (ML) of 8.3, also had the values Ms = 8.4, M0 = 820 × 10²⁷ dyne centimetres, and Mw = 9.2.
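
The Mw figure above can be checked with the Hanks-Kanamori moment-magnitude relation, Mw = (2/3)·log10( M0 ) − 10.7, with M0 in dyne·centimetres – a standard formula, though one the text above does not spell out. A minimal Python sketch:

```python
import math

def moment_magnitude(m0_dyne_cm: float) -> float:
    # Hanks-Kanamori relation: Mw = (2/3) * log10(M0) - 10.7, M0 in dyne-cm.
    return (2.0 / 3.0) * math.log10(m0_dyne_cm) - 10.7

# The 1964 Alaska figure quoted above: M0 = 820 x 10^27 dyne-cm -> Mw ~ 9.2.
print(round(moment_magnitude(820e27), 1))
```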

Earthquake Energy

Energy in an earthquake passing a particular surface site can be calculated directly from the recordings of seismic ground motion, given, for example, as ground velocity. Such recordings indicate an energy rate of 10⁵ watts per square metre ( 9,300 watts per square foot ) near a moderate-size earthquake source. The total power output of a rupturing fault in a shallow earthquake is on the order of 10¹⁴ watts, compared with the 10⁵ watts generated in rocket motors.

The surface-wave magnitude Ms has also been connected with the surface energy Es of an earthquake by empirical formulas. These give Es = 6.3 × 10¹¹ and 1.4 × 10²⁵ ergs for earthquakes of Ms = 0 and 8.9, respectively. A unit increase in Ms corresponds to approximately a 32-fold increase in energy. Negative magnitudes Ms correspond to the smallest instrumentally recorded earthquakes, a magnitude of 1.5 to the smallest felt earthquakes, and one of 3.0 to any shock felt at a distance of up to 20 km ( 12 miles ). Earthquakes of magnitude 5.0 cause light damage near the epicentre; those of 6.0 are destructive over a restricted area; and those of 7.5 are at the lower limit of major earthquakes.
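
The empirical formula behind those two endpoints is the standard Gutenberg-Richter energy relation, log10 Es = 11.8 + 1.5·Ms ( Es in ergs ) – not spelled out in the text above, but it reproduces both quoted values and the roughly 32-fold step per magnitude unit. A minimal Python sketch:

```python
def surface_wave_energy_ergs(ms: float) -> float:
    # Gutenberg-Richter energy-magnitude relation: log10(Es) = 11.8 + 1.5 * Ms
    return 10 ** (11.8 + 1.5 * ms)

print(f"{surface_wave_energy_ergs(0.0):.1e}")   # ~6.3e+11 ergs at Ms = 0
print(f"{surface_wave_energy_ergs(8.9):.1e}")   # ~1.4e+25 ergs at Ms = 8.9
print(surface_wave_energy_ergs(6.0) / surface_wave_energy_ergs(5.0))  # ~31.6-fold
```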

The total annual energy released in all earthquakes is about 10²⁵ ergs, corresponding to a rate of work between 10,000,000 and 100,000,000 kilowatts. This is approximately 1/1,000th the ‘annual amount of heat escaping from the Earth’s interior’.

90% of the total seismic energy comes from earthquakes of ‘magnitude 7.0 and higher’ – that is, those whose energy is on the order of 10²³ ergs or more.

Frequency

There also are empirical relations for the frequencies of earthquakes of various magnitudes. Suppose N to be the average number of shocks per year for which the magnitude lies in a range about Ms. Then log10 N = a − bMs fits the data well both globally and for particular regions; for example, for shallow earthquakes worldwide, a = 6.7 and b = 0.9 when Ms > 6.0 ( a sketch follows below ). The frequency therefore increases by a factor of about 10 when the magnitude is diminished by one unit. The increase in frequency with reduction in Ms falls short, however, of matching the decrease in the energy E. Thus, larger earthquakes are overwhelmingly responsible for most of the total seismic energy release. The number of earthquakes per year with Mb > 4.0 reaches 50,000.
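
A minimal Python sketch of that frequency-magnitude relation, using the worldwide shallow-earthquake constants quoted above ( valid for Ms > 6.0 ):

```python
def shocks_per_year(ms: float, a: float = 6.7, b: float = 0.9) -> float:
    # Frequency-magnitude relation reconstructed above: log10(N) = a - b * Ms
    return 10 ** (a - b * ms)

for ms in (6.0, 7.0, 8.0):
    print(ms, round(shocks_per_year(ms), 1))  # ~20.0, ~2.5, ~0.3 shocks per year
```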

Earthquake Occurrences & Plate Tectonic associations

Global seismicity patterns had no strong theoretical explanation until the dynamic model called plate tectonics was developed during the late 1960s. This theory holds that the Earth’s upper shell, or lithosphere, consists of nearly a dozen large, quasi-stable slabs called plates. The thickness of each of these plates is roughly 50-miles ( 80 km ). Plates move horizontally relative to neighbouring plates at a rate of 0.4 to 4 inches ( 1-cm to 10-cm ) per year over a shell of lesser strength called the asthenosphere. At the plate edges where there is contact between adjoining plates, boundary tectonic forces operate on the rocks, causing physical and chemical changes in them. New lithosphere is created at oceanic ridges by the upwelling and cooling of magma from the Earth’s mantle. The horizontally moving plates are believed to be absorbed at the ocean trenches, where a subduction process carries the lithosphere downward into the Earth’s interior. The total amount of lithospheric material destroyed at these subduction zones equals that generated at the ridges. Seismological evidence ( e.g. location of major earthquake belts ) is everywhere in agreement with this tectonic model.

Earthquake Types:

– Shallow Earthquakes;

– Intermediate Earthquakes;

– Deep Focus ( Deep-Foci ) Earthquakes; and

– Deeper Focus ( Deeper-Foci ) Earthquakes.

Earthquake sources are concentrated along oceanic ridges, corresponding to divergent plate boundaries.

At subduction zones, associated with convergent plate boundaries, deep-focus earthquakes and intermediate focus earthquakes mark locations of the upper part of a dipping lithosphere slab.

Focal ( Foci ) mechanisms indicate stresses aligned with dip of the lithosphere underneath the adjacent continent or island arc.

IntraPlate Seismic Event Anomalies

Some earthquakes associated with oceanic ridges are confined to strike-slip faults, called transform faults, that offset the ridge crests. The majority of earthquakes occurring along such horizontal shear faults are characterized by strike-slip motions.

Also in agreement with plate tectonics theory is the high seismicity encountered along the edges of plates where they slide past each other. Plate boundaries of this kind, sometimes called fracture zones, include the:

– San Andreas Fault system in California; and,

– North Anatolian fault system in Turkey.

Such plate boundaries are the site of interplate earthquakes of shallow focus.

Low seismicity within plates is consistent with the plate tectonic description. Small to large earthquakes do occur in limited regions well within the boundaries of plates; however, such ‘intraplate seismic events’ can be explained by tectonic mechanisms other than plate boundary motions and their associated phenomena.

Most parts of the world experience at least occasional shallow earthquakes – those that originate within 60 km ( 40 miles ) of the Earth’s outer surface. In fact, the great ‘majority of earthquake foci are shallow’. It should be noted, however, that the geographic distribution of smaller earthquakes is less completely determined than that of more severe quakes, partly because the ‘availability of relevant data is dependent on the distribution of observatories’.

Of the total energy released in earthquakes, 12% comes from intermediate earthquakes—that is, quakes with a focal depth ranging from about 60 to 300 km. About 3% of total energy comes from deeper earthquakes. The frequency of occurrence falls off rapidly with increasing focal depth in the intermediate range. Below intermediate depth the distribution is fairly uniform until the greatest focal depths, of about 700 km ( 430 miles ), are approached.

Deeper-Focus Earthquakes

Deeper-focus earthquakes commonly occur in patterns called Benioff zones dipping into the Earth, indicating the presence of a subducting lithosphere slab; the dip angles of these slabs average about 45° – with some shallower and others nearly vertical.

Benioff zones coincide with tectonically active island arcs, such as:

– Aleutian islands;

– Japan islands;

– Vanuatu islands; and

– Tonga.

Island arcs are normally ( but not always ) associated with:

Ultra-Deep Sea Ocean Trenches, such as those alongside:

– South America ( Andes mountain system ).

Exceptions to this rule include, the:

– Romania ( East Europe ) mountain system; and,

– Hindu Kush mountain system.

In most Benioff zones, deep-earthquake and intermediate-earthquake foci are usually found within a narrow layer; however, recent more precise hypocentral locations – in Japan and elsewhere – indicate two ( 2 ) distinct parallel bands of foci only 12 miles ( 20 kilometers ) apart.

Aftershocks, Swarms and Foreshocks

A major or even moderate earthquake of shallow focus is followed by many lesser-size earthquakes close to the original source region. This is to be expected if the fault rupture producing a major earthquake does not relieve all the accumulated strain energy at once. In fact, this dislocation is liable to cause an increase in the stress and strain at a number of places in the vicinity of the focal region, bringing crustal rocks at certain points close to the stress at which fracture occurs. In some cases an earthquake may be followed by 1,000 or more aftershocks a day.

Sometimes a large earthquake is followed by a similar one along the same fault source within an hour or perhaps a day. An extreme case of this is multiple earthquakes. In most instances, however, the first principal earthquake of a series is much more severe than the aftershocks. In general, the number of aftershocks per day decreases with time.

Aftershock frequency is ( roughly ):

Inversely proportional to the time since occurrence of the largest earthquake in the series ( see the sketch below ).
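
That inverse proportionality is the classic Omori law, n(t) = K / ( c + t ). A minimal Python sketch – the constants K and c below are purely illustrative, not fitted to any real aftershock sequence:

```python
def aftershocks_per_day(t_days: float, k: float = 1000.0, c: float = 0.5) -> float:
    # Omori law: aftershock rate decays roughly as 1 / (c + t).
    return k / (c + t_days)

for day in (1, 10, 100):
    print(day, round(aftershocks_per_day(day)))  # decay: ~667, ~95, ~10 per day
```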

Most major earthquakes occur without detectable warning, but some principal earthquakes are preceded by foreshocks.

Japan 2-Years of Hundreds of Thousands Of Earthquakes

In another common pattern, large numbers of small earthquakes may occur in a region for months without a major earthquake.

In the Matsushiro region of Japan, for instance, there occurred ( between August 1965 and August 1967 ) a ‘series of earthquakes’ numbering in the hundreds of thousands – some sufficiently strong ( up to Richter magnitude 5.0 ) to cause property damage but no casualties.

Maximum frequency? 6,780 small earthquakes on just April 17, 1966.

Such series of earthquakes are called earthquake swarms.

Earthquakes associated with volcanic activity often occur in swarms – though swarms have also been observed in many nonvolcanic regions.

Study of Earthquakes

Seismic waves

Principal types of seismic waves

Seismic waves, generated by an earthquake source, are commonly classified into three ( 3 ) ‘leading types’.

The first two ( 2 ) leading types, which propagate ( travel ) within the body of the Earth, are known as:

P ( Primary ) Seismic Waves; and,

S ( Secondary ) Seismic Waves ( S Waves ).

The third ( 3rd ) leading type, which propagates ( travels ) along the surface of the Earth, comes in two ( 2 ) forms:

L ( Love ) Seismic Waves; and,

R ( Rayleigh ) Seismic Waves.

During the 19th Century, the existence of these types of seismic waves was mathematically predicted, and modern comparisons show close correspondence between such ‘theoretical calculations’ and ‘actual measurements’ of seismic waves.

P seismic waves travel as elastic motions at the highest speeds, and are longitudinal waves transmitted by both solid and liquid materials within inner Earth.

With P waves, particles of the medium vibrate in a manner ‘similar to sound waves’ – the transmitting media being alternately compressed and expanded.

The slower type of body wave, the S wave, travels only through solid material. With S waves, the particle motion is transverse to the direction of travel and involves a shearing of the transmitting rock.

Focus ( Foci )

Because of their greater speed, P waves are the first ( 1st ) to reach any point on the Earth’s surface. The first ( 1st ) P-wave onset ‘starts from the spot where an earthquake originates’. This point, usually at some depth within the Earth, is called the focus ( also known as the hypocentre ).

Epicenter

The point ‘at the surface’ ( immediately ‘above the Focus / Foci’ ) is known as the ‘epicenter’.

Love waves and Rayleigh waves, guided by the free surface of the Earth, trail after P and S waves have passed through the body of planet Earth.

Rayleigh waves ( R Waves ) and Love waves ( L Waves ) both involve ‘horizontal particle motion’; however, ‘only Rayleigh waves’ exhibit ‘vertical ground displacements’.

Rayleigh waves ( R waves ) and Love waves ( L waves ) travel ( propagate ) and disperse into long wave trains which, at substantial distances from ‘alluvial basin sources’, cause much of the Earth-surface ground shaking felt during earthquakes.

Seismic Wave Focus ( Foci ) Properties

At all distances from the focus ( foci ), mechanical properties of the rocks – such as incompressibility, rigidity and density – play roles in the:

– Speed of ‘wave travel’;

– Duration of ‘wave trains’; and,

– Shape of ‘wave trains’.

Layering of the rocks and the physical properties of surface soil also affect wave characteristics.

In most cases, ‘elastic’ behavior occurs in earthquakes; however, strong shaking ( of surface soils from the incident seismic waves ) sometimes results in ‘nonelastic’ behavior, including slumping ( i.e., downward and outward movement of unconsolidated material ) and liquefaction of sandy soil.

A seismic wave that encounters a boundary separating ‘rocks of different elastic properties’ undergoes reflection and refraction. A special complication exists because conversion between wave types usually also occurs at such a boundary: an incident P or S wave can yield reflected P and S waves and refracted P and S waves.

Between Earth structural layers, boundaries give rise to diffracted and scattered waves, and these additional waves are partially responsible for complications observed in ground motion during earthquakes.

Modern research is concerned with ‘computing synthetic records of ground motion’ for realistic comparison with observed actual ground shaking, using wave theory in complex structures.

Grave Duration Long-Periods, Audible Earthquake Frequencies, and Other Earth Anomalies

The frequency range of seismic waves is widely varied – from ‘High Frequency’ ( HF ) in the ‘audible range’ ( i.e. greater than 20 hertz ) to Low Frequency ( LF ) as subtle as the ‘free oscillations of planet Earth’, with grave Long-Periods of 54-minutes ( see below Long-Period oscillations of the globe ).

Seismic wave attenuation in rock imposes High-Frequency ( HF ) limits, and in small to moderate earthquakes the dominant frequencies in Surface Waves extend from about 1.0 Hz to 0.1 Hz.

Seismic wave amplitude range is also great in most earthquakes.

Displacement of the ground ranges, from: 10⁻¹⁰ to 10⁻¹ metre ( 4 × 10⁻⁹ to 4 inches ).

Great Earthquake Speed

Great Earthquake Ground Acceleration Can Exceed 32.2-Feet Per Second, Squared ( 9.8 Metres Per Second, Squared ) –

In the greatest earthquakes, the ground amplitude of predominant P waves may be several centimetres at periods of 2-seconds to 5-seconds; very close to the seismic sources of ‘great earthquakes’, investigators have measured ‘large wave amplitudes’ with ‘accelerations of the ground exceeding’ that of gravity ( 32.2 feet per second squared; 9.8 meters per second squared ) at High Frequencies ( HF ), and ground displacements of 1 metre at Low Frequencies ( LF ).

Seismic Wave Measurement

Seismographs and Accelerometers

Seismographs ‘measure ground motion’ in both ‘earthquakes’ and ‘microseisms’ ( small oscillations described below ).

Most of these instruments are of the pendulum type. Early mechanical seismographs had a pendulum of large mass ( up to several tons ) and produced seismograms by scratching a line on smoked paper on a rotating drum.

In later instruments, seismograms were recorded via ‘rays of light bounced off a mirror’ within a galvanometer, using electric current generated by electromagnetic induction ‘when the pendulum of the seismograph moved’.

Technological developments in electronics have given rise to ‘higher-precision pendulum seismometers’ and ‘sensors of ground motion’.

In these instruments, electric voltages produced by motions of the pendulum ( or the equivalent ) are passed through electronic circuitry to ‘amplify and digitize the ground motion’ for more exact readings.

Seismometer Nomenclature Meanings

Seismographs are divided into three ( 3 ) types of instruments commonly confused by the public because of their varied names, as:

– Short-Period;

– Intermediate-Period ( also known as Long-Period );

– Long-Period ( also known as Intermediate-Period );

– Ultra-Long-Period ( also known as Broadband or Broad-Band ); or,

– Broadband ( also known as Ultra Long-Period or UltraLong-Period ).

Short-Period instruments are used to record P and S body waves with high magnification of the ground motion. For this purpose, the seismograph response is shaped to peak at a period of about 1-second or less.

Intermediate-period instruments – the type used by the World-Wide Standardized Seismographic Network ( WWSSN ), described in the section Earthquake observatories ( further below ) – had about a 20-second ( maximum ) response.

Recently, in order to provide as much flexibility as possible for research work, the trend has been toward the operation of ‘very broadband seismographs’ with digitized representation of signals. This is usually accomplished with ‘very long-period pendulums’ and ‘electronic amplifiers’ passing signals in the band between 0.005 Hz and 50 Hz.

When seismic waves close to their source are to be recorded, special design criteria are needed. Instrument sensitivity must ensure that the largest ground movements can be recorded without exceeding the upper scale limit of the device. For most seismological and engineering purposes the wave frequencies that must be recorded are higher than 1 hertz, and so the pendulum or its equivalent can be small. For this reason accelerometers that measure the rate at which the ground velocity is changing have an advantage for strong-motion recording. Integration is then performed to estimate ground velocity and displacement. The ground accelerations to be registered range up to two times that of gravity. Recording such accelerations can be accomplished mechanically with short torsion suspensions or force-balance mass-spring systems.
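
The integration step described above – recovering ground velocity and displacement from an accelerogram – can be sketched in a few lines of Python with NumPy. The synthetic record below is illustrative only; real strong-motion processing also requires baseline correction and filtering:

```python
import numpy as np

dt = 0.01                                        # 100 samples per second
t = np.arange(0.0, 10.0, dt)
# Synthetic accelerogram: a decaying 1 Hz oscillation peaking near 0.25 g.
accel = 0.25 * 9.8 * np.sin(2 * np.pi * 1.0 * t) * np.exp(-t / 3.0)

# Trapezoidal-rule integration: acceleration -> velocity -> displacement.
velocity = np.concatenate(([0.0], np.cumsum((accel[1:] + accel[:-1]) / 2.0 * dt)))
displacement = np.concatenate(([0.0], np.cumsum((velocity[1:] + velocity[:-1]) / 2.0 * dt)))

print(f"peak velocity:     {np.abs(velocity).max():.3f} m/s")
print(f"peak displacement: {np.abs(displacement).max():.3f} m")
```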

Because many strong-motion instruments need to be placed at unattended sites in ordinary buildings for periods of months or years before a strong earthquake occurs, they usually record only when a trigger mechanism is actuated with the onset of ground motion. Solid-state memories are now used, particularly with digital recording instruments, making it possible to preserve the first few seconds before the trigger starts the permanent recording and to store digitized signals on magnetic cassette tape or on a memory chip. In past design absolute timing was not provided on strong-motion records but only accurate relative time marks; the present trend, however, is to provide Universal Time ( the local mean time of the prime meridian ) by means of special radio receivers, small crystal clocks, or GPS ( Global Positioning System ) receivers from satellite clocks.

Prediction of strong ground motion and response of engineered structures in earthquakes depends critically on measurements of the spatial variability of earthquake intensities near the seismic wave source. In an effort to secure such measurements, special arrays of strong-motion seismographs have been installed in areas of high seismicity around the world.

Large-aperture seismic arrays ( linear dimensions on the order of 0.6 to 6 miles, or 1 kilometer to 10 kilometers ) of strong-motion accelerometers are now used to improve estimations of the speed, direction of propagation, and types of seismic wave components.

Particularly important for a full understanding of seismic wave patterns at the ground surface is measurement of the variation of wave motion with depth; to aid in this effort, special digitally recording seismometers have been ‘installed in deep boreholes’.

Ocean-Bottom Measurements

70% of the Earth’s surface is covered by water, so ocean-bottom seismometers augment ( add to ) the global land-based system of recording stations.

Field tests have established the feasibility of extensive long-term recording by instruments on the seafloor.

Japan has a ‘semi-permanent seismograph’ system of this type, placed on the seafloor off the Pacific Ocean coast of central Honshu, Japan, in 1978 by means of a ‘cable’.

Because of mechanical difficulties maintaining ‘permanent ocean-bottom instrumentation’, different systems have been considered.

They ‘all involve placement of instruments on the ocean bottom’, though they employ various mechanisms for data transmission.

Signals may be transmitted to the ocean surface for retransmission by auxiliary apparatus or transmitted via cable to a shore-based station. Another system is designed to release its recording device automatically, allowing it to float to the surface for later recovery.

Ocean bottom seismograph use should yield much-improved global coverage of seismic waves and provide new information on the seismicity of oceanic regions.

Ocean-bottom seismographs will enable investigators to determine the details of the crustal structure of the seafloor and, because of the relative ‘thinness of the oceanic crust‘, should make possible collection of clear seismic information about Earth’s upper mantle.

Ocean bottom seismograph systems are also expected to provide new data, on Earth:

– Continental Shelf Plate Boundaries;

– MicroSeisms ( origins and propagations ); and,

– Ocean to Continent behavior margins.

MicroSeisms Measurements

MicroSeisms ( also known as ‘small ground motions’ ) are commonly recorded by seismographs. These small, weak seismic wave motions are ‘not generated by earthquakes’, but in some instances can complicate accurate earthquake measurement recording. MicroSeisms are of scientific interest because their form relates to Earth surface structure.

Some microseisms have a ‘local cause’, for example:

Microseisms due to traffic ( or machinery ), local wind effects, storms, and rough surf against an extended steep coastline.

Another class of microseisms exhibits features that are very similar on records traced at earthquake observatories that are widely separated, including approximately simultaneous occurrence of maximum amplitudes and similar wave frequencies. These microseisms may persist for many hours and have more or less regular periods of about five to eight seconds.

The largest amplitudes of such microseisms are on the order of 10^-3 cm ( 0.0004 inch ) and occur in coastal regions. Amplitudes also depend to some extent on local geologic structure.

Some microseisms are produced when large standing water waves are formed far out at sea. The period of this type of microseism is half that of the standing wave.

Observations of Earthquakes

Earthquake Observatories

During the late 1950s there were only about seven hundred ( 700 ) seismographic stations worldwide, equipped with seismographs of various types and frequency responses. Few of these instruments were calibrated; actual ground motions could not be measured, and ‘timing errors of several seconds’ were common.

The World-Wide Standardized Seismographic Network ( WWSSN ) became the first modern worldwide standardized system, established to remedy that situation.

Each WWSSN station had six ( 6 ) seismographs ( three short-period and three long-period ), with timing accuracy maintained by quartz crystal clocks and a calibration pulse placed daily on each record.

By 1967 the WWSSN consisted of about one hundred twenty ( 120 ) stations across sixty ( 60 ) countries, yielding data that provided the basis for significant advances in research on:

– Earthquakes ( mechanisms );

– Plate Tectonics ( global ); and,

– Deep-Structure Earth ( interior ).

By the 1980s a further upgrading of permanent seismographic stations began with the installation of digital equipment by a number of organizations.

Global digital seismograph station networks, now in operation, consist of:

– Seismic Research Observatories ( SRO ), with instruments in boreholes drilled about 100 metres ( 330 feet ) into the ground; and,

– Modified high-gain, long-period earthquake observatories located at the Earth’s surface.

The Global Digital Seismographic Network in particular has remarkable capability, recording all motions from Earth’s ocean tides to microscopic ground motions at the level of local ground noise.

At present there are about 128 sites. With this system, the long-term seismological goal of equipping global observatories with seismographs able to record every small earthquake anywhere, over a broad band of frequencies, will have been accomplished.

Locating Earthquake Epicentres

Many observatories make provisional estimates of the epicentres of important earthquakes. These estimates provide preliminary information locally about particular earthquakes and serve as first approximations for the calculations subsequently made by large coordinating centres.

If an earthquake’s epicentre is less than 105° away from an earthquake observatory, the epicentre position can often be estimated from the readings of three ( 3 ) seismograms recording perpendicular components of the ground motion.

For a shallow earthquake the epicentral distance is indicated by the interval between the arrival times of the P and S waves; the azimuth and angle of wave emergence at the surface are indicated by comparing the sizes and directions of the first ( 1st ) movements shown on the seismograms and the relative sizes of later waves, particularly surface waves.
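
For a sense of the arithmetic, the sketch below converts an S-minus-P arrival interval into an epicentral distance for a shallow local earthquake; the crustal velocities used ( 6.0 and 3.5 km per second for P and S ) are typical textbook assumptions, not values given in this article:

    # Epicentral distance from the S-P arrival interval (shallow local event).
    # Velocities are illustrative crustal averages, assumed for this sketch.
    VP_KM_S = 6.0   # assumed P-wave speed
    VS_KM_S = 3.5   # assumed S-wave speed

    def epicentral_distance_km(s_minus_p_seconds):
        """Distance implied by the delay of S behind P over a common path."""
        # t_S - t_P = d/Vs - d/Vp  =>  d = (t_S - t_P) / (1/Vs - 1/Vp)
        return s_minus_p_seconds / (1.0 / VS_KM_S - 1.0 / VP_KM_S)

    print(round(epicentral_distance_km(10.0), 1))  # ~84 km for a 10-second interval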

Anomaly

It should be noted, however, that in certain regions the first ( 1st ) wave movement at a station arrives from a direction differing from the azimuth toward the epicentre. This anomaly is usually explained by strong variations in geologic structure.

When data from more than one observatory are available, an earthquake’s epicentre may be estimated from the times of travel of the P and S waves from source to recorder. In many seismically active regions, networks of seismographs with telemetry transmission and centralized timing and recording are common. Whether analog or digital recording is used, such integrated systems greatly simplify observatory work: multichannel signal displays make identification and timing of phase onsets easier and more reliable. Moreover, online microprocessors can be programmed to pick automatically, with some degree of confidence, the onset of a significant common phase, such as P, by correlation of waveforms from parallel network channels. With the aid of specially designed computer programs, seismologists can then locate distant earthquakes to within about 10 km (6 miles) and the epicentre of a local earthquake to within a few kilometres.
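
A toy version of such a multi-station location calculation, assuming a uniform P velocity and flat geometry ( both simplifications that real observatory software does not make ), grid-searches for the epicentre and origin time that best fit the observed P arrivals:

    import math

    VP_KM_S = 6.0  # assumed uniform P velocity, km/s (a simplification)

    # Hypothetical stations: (x km, y km, observed P arrival time in seconds).
    stations = [
        (0.0, 0.0, 10.1),
        (50.0, 0.0, 12.8),
        (0.0, 60.0, 14.0),
    ]

    def misfit(x, y, t0):
        """Sum of squared residuals between observed and predicted P arrivals."""
        total = 0.0
        for sx, sy, t_obs in stations:
            t_pred = t0 + math.hypot(sx - x, sy - y) / VP_KM_S
            total += (t_obs - t_pred) ** 2
        return total

    # Coarse grid search over epicentre position and origin time.
    best = min(
        ((x, y, t0)
         for x in range(-100, 101, 2)
         for y in range(-100, 101, 2)
         for t0 in range(0, 20)),
        key=lambda trial: misfit(*trial),
    )
    print("epicentre ~ ( %d km, %d km ), origin time ~ %d s" % best)

Real observatory programs use layered velocity models, S as well as P arrivals, and iterative least-squares inversion rather than a coarse grid search.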

Catalogs of earthquakes felt by humans and of earthquake observations have appeared intermittently for many centuries. The earliest known list of instrumentally recorded earthquakes with computed times of origin and epicentres covers the period 1899 – 1903, after which the cataloging of earthquakes became more uniform and complete.

Especially valuable is the service provided by the International Seismological Centre ( ISC ) in Newbury, UK, which each month receives more than 1,000,000 seismic readings from more than 2,000 seismic monitoring stations worldwide, together with preliminary location estimates for approximately 1,600 earthquakes from national and regional agencies and observatories.

The ISC bulletin is issued monthly, approximately two ( 2 ) years in arrears; each issue provides all available information on each of more than 5,000 earthquakes.

Various national and regional centres control networks of stations and act as intermediaries between individual stations and the international organizations.

Examples of long-standing national centres include the:

Japan Meteorological Agency; and,

U.S. National Earthquake Information Center ( NEIC ), a subdivision of the U.S. Geological Survey ( USGS ).

Centres such as these normally make local earthquake estimates of:

– Magnitude;

– Epicentre;

– Origin time; and,

– Focal depth.

Global seismicity data are continually accessible via the Incorporated Research Institutions for Seismology ( IRIS ) website.

An important research technique is to infer the character of faulting in an earthquake from the recorded seismograms.

For example, observed distributions of the directions of the first onsets in waves arriving at the Earth’s surface have been effectively used.

Onsets are called “compressional” or “dilatational,” according to whether the direction is ‘away from’ or ‘toward’ the focus, respectively.

A polarity pattern becomes recognizable when the directions of the P-wave onsets are plotted on a map – there are broad areas in which the first onsets are predominantly compressions, separated from predominantly dilatational areas by nodal curves near which the P-wave amplitudes are abnormally small.

In 1926 the American geophysicist Perry E. Byerly used patterns of P onsets over the entire globe to infer the orientation of the fault plane in a large earthquake. The polarity method yields two P-nodal curves at the Earth’s surface; one curve is in the plane containing the assumed fault, and the other is in the plane ( called the auxiliary plane ) that passes through the focus perpendicular to the direction of fault slip.

The availability of worldwide broadband digital recordings has enabled computer programs to be written that estimate the fault mechanism and seismic moment from the complete pattern of seismic wave arrivals.

Given a well-determined pattern at a number of earthquake observatories, it is possible to locate two ( 2 ) planes, one ( 1 ) of which is the plane containing the fault.

Earthquake Prediction

Earthquake Observations & Interpretations

Statistical studies of earthquake occurrence, seeking periodic cycles in time and space for major and great earthquakes, remain theoretical and are not widely accepted. Catalogued records of such earthquakes reach back as far as 700 BCE, with China holding the ‘world’s most extensive catalog’: approximately one thousand ( 1,000 ) destructive earthquakes whose ‘magnitude ( size )’ was assessed from ‘damage reports’, and whose ‘intensity’ was determined from the experienced periods of ‘shaking’ and ‘other observations’.

Postulated Earthquake Triggers

Precursor-based prediction approaches involve what some believe is sheer postulation about the initial trigger mechanisms that force Earth ruptures; where this becomes bizarre is in what such forces have been attributed to:

– Weather severity;

– Volcano activity; and,

– Ocean tide force ( Moon ).

EXAMPLE: Correlations between such physical phenomena and earthquake repetition are assumed to reveal trigger mechanisms.

Professionals believe such correlations must always be tested to discover whether a causative link is actually present, and they further believe that, to date, no proposed trigger mechanism for ‘moderate earthquakes’ to ‘large earthquakes’ has unequivocally satisfied the various necessary criteria.

Statistical methods have also been tried with populations of regional earthquakes. It has been suggested, but never generally established, that the slope b of the regression line between the logarithm of the number of earthquakes and their magnitude for a region may change characteristically with time.

Specifically, the claim is that the b value for a population of ‘foreshocks of a major earthquake’ may be ‘significantly smaller’ than the mean b value for the region averaged ‘over a long interval of time’.
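
As an illustration of how such a b value is obtained, the sketch below fits the regression line described above to a made-up list of magnitudes ( the catalog, magnitude grid, and threshold are all assumptions for the example; real studies use declustered catalogs and a carefully chosen completeness magnitude ):

    import math

    # Hypothetical magnitudes from a regional catalog (illustrative only).
    mags = [3.1, 3.0, 3.4, 3.2, 3.7, 3.0, 4.1, 3.3, 3.5,
            3.0, 3.8, 4.4, 3.1, 3.2, 3.6]

    # Cumulative counts N(M) of events with magnitude >= M, on a 0.1 grid.
    grid = [round(3.0 + 0.1 * i, 1) for i in range(15)]
    points = []
    for m in grid:
        n = sum(1 for x in mags if x >= m)
        if n > 0:
            points.append((m, math.log10(n)))

    # Least-squares slope of log10 N(M) versus M; b is the negated slope.
    k = len(points)
    mean_m = sum(m for m, _ in points) / k
    mean_y = sum(y for _, y in points) / k
    slope = (sum((m - mean_m) * (y - mean_y) for m, y in points)
             / sum((m - mean_m) ** 2 for m, _ in points))
    print("b-value ~", round(-slope, 2))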

Elastic rebound theory of earthquake sources allows rough prediction of the occurrence of large shallow earthquakes. Harry F. Reid, for example, gave a crude forecast of the next great earthquake near San Francisco ( the theory also predicted, of course, that the place would be along the San Andreas Fault or an associated fault ). Geodetic data indicated that during an interval of 50 years, relative displacements of 3.2 metres ( 10.5 feet ) had occurred at distant points across the fault. The maximum elastic-rebound offset along the fault in the 1906 earthquake was 6.5 metres. Therefore, ( 6.5 ÷ 3.2 ) × 50, or about 100 years, would elapse before sufficient strain accumulated for the occurrence of an earthquake comparable to that of 1906. The premises are that regional strain will grow uniformly and that the various constraints have not been altered by the great 1906 rupture itself ( such as by the onset of slow fault slip ).
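
Reid’s forecast reduces to a one-line proportion. The sketch below simply restates the arithmetic of the paragraph above as a function ( a minimal sketch; the uniform-strain-rate premise is Reid’s, and any real forecast carries large uncertainties ):

    def recurrence_years(coseismic_slip_m, accumulated_slip_m, interval_years):
        """Years to re-accumulate the strain released by the last rupture,
        assuming the regional strain rate stays uniform (Reid's premise)."""
        slip_rate_m_per_year = accumulated_slip_m / interval_years
        return coseismic_slip_m / slip_rate_m_per_year

    # 6.5 m released in 1906; 3.2 m accumulated across the fault in 50 years.
    print(round(recurrence_years(6.5, 3.2, 50)))  # ~102 years, i.e. about a century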

Such ‘strain rates’ are now, however, being more adequately measured along a number of active faults ( e.g. the San Andreas Fault ) using networks of GPS sensors.

Earthquake Prediction Research

For many years prediction research has been influenced by the basic argument that ‘strain accumulates in rock masses in the vicinity of a fault, resulting in crustal deformation’.

Deformations have been measured in horizontal directions along active faults ( via ‘trilateration’ and ‘triangulation’ ) and in vertical directions ( via ‘precise levelling’ and ‘tiltmeters’ ).

Some investigators believe that ‘ground-water level changes occur prior to earthquakes’; reports of such variations have come mainly from China.

Ground-water levels respond to an array of complex factors ( e.g. ‘rainfall’ ); these effects would have to be removed before changes in water level could be studied in relation to earthquakes.

Premonitory ( Precursory ) Phenomena

Dilatancy theory ( i.e., the volume increase of rock prior to rupture ) once occupied a central position in discussions of premonitory phenomena of earthquakes, but it now receives less support, based on observations that many solids exhibit dilatancy during deformation. For earthquake prediction, the significance of dilatancy, if real, is its effect on various measurable quantities of the Earth’s crust, i.e. seismic velocities, electric resistivity, and ground and water levels. The consequences of dilatancy for earthquake prediction are summarized in the table ( below ):

The best-studied consequence is the effect on seismic velocities. The influence of internal cracks and pores on the elastic properties of rocks can be clearly demonstrated in laboratory measurements of those properties as a function of hydrostatic pressure. In the case of saturated rocks, experiments predict – for shallow earthquakes – that dilatancy occurs as a portion of the crust is stressed to failure, causing a decrease in the velocities of seismic waves. Recovery of velocity is brought about by subsequent rise of the pore pressure of water, which also has the effect of weakening the rock and enhancing fault slip.

Strain buildup in the focal region may have measurable effects on other observable properties, including electrical conductivity and gas concentration. Because the electrical conductivity of rocks depends largely on interconnected water channels within the rocks, resistivity may increase before the cracks become saturated. As pore fluid is expelled from the closing cracks, the local water table would rise and concentrations of gases such as radioactive radon would increase. No unequivocal confirming measurements have yet been published.

Geologic methods of extending the seismicity record back from the present are also being explored. Field studies indicate that the sequence of surface ruptures along major active faults associated with large earthquakes can sometimes be reconstructed.

An example is the series of large earthquakes in Turkey in the 20th century, which were caused mainly by successive westward ruptures of the North Anatolian Fault.

Liquefaction effects preserved in beds of sand and peat have provided evidence ( via radiometric dating methods ) for large paleoearthquakes going back more than 1,000 years in many seismically active zones, including the Pacific Northwest coastal region of the United States.

Less well-grounded precursory phenomena, particularly earthquake lights and animal behaviour, sometimes draw more public attention than the precursors discussed above.

Unusual lights in the sky and abnormal animal behaviour preceding earthquakes have been reported to seismologists, mostly in anecdotal form.

Both phenomena are usually explained away in terms of events occurring prior to earthquakes, such as:

– Gaseous emissions from the Earth’s ground;

– Electric stimuli ( various ), e.g. HAARP, etcetera, from the Earth’s ground; and,

– Acoustic stimuli ( various ), e.g. subsonic seismic wave emissions from the Earth’s ground.

At the present time, there is no definitive experimental evidence supporting reported claims of animals sometimes sensing an approaching earthquake.

… [ CENSORED-OUT ] …

Earthquake Hazard Reduction Methods

Considerable work has been done in seismology to explain the characteristics of the recorded ground motions in earthquakes. Such knowledge is needed to predict ground motions in future earthquakes so that earthquake-resistant structures can be designed.

Although earthquakes cause death and destruction through secondary effects ( i.e. landslides, tsunamis, fires, and fault rupture ), the greatest losses of human lives and property result from the ‘collapse of man-made structures’ amid violent ground shaking.

The most effective way to mitigate ( minimize ) damage from earthquakes, from an engineering standpoint, is to design and construct structures capable of withstanding ‘strong ground motions’.

Interpreting recorded ground motions

Most elastic waves recorded close to an extended fault source are complicated and difficult to interpret uniquely.

Understanding such near-source motion can be viewed as a three ( 3 ) part problem.

The first ( 1st ) part is the generation of elastic waves radiating from the slipping fault as the ‘moving rupture sweeps out an area of slip’ ( along the fault plane ) within a given time.

The wave pattern produced depends on several parameters, such as fault dimension and rupture velocity.

Elastic waves of various types radiate from the vicinity of the moving rupture in all directions.

The geometric and frictional properties of the fault critically affect the pattern of waves radiated from it.

The second ( 2nd ) part of the problem concerns the passage of the waves through the intervening rocks to the site and the effect of geologic conditions.

The third ( 3rd ) part involves the conditions at the recording site itself, such as topography and highly attenuating soils. All these questions must be considered when estimating likely earthquake effects at a site of any proposed structure.

Experience has shown that strong-motion recordings of the ground have a variable pattern in detail but predictable regular shapes in general ( except in the case of strong multiple earthquakes ).

EXAMPLE: Actual ground shaking ( acceleration, velocity and displacement ) recorded during an earthquake ( see figure below ).

In a strong horizontal shaking of the ground near the fault source, there is an initial segment of motion made up mainly of P waves, which frequently manifest themselves strongly in the vertical motion. This is followed by the onset of S waves, often associated with a longer-period pulse of ground velocity and displacement related to the near-site fault slip or fling. This pulse is often enhanced in the direction of the fault rupture and normal to it. After the S onset there is shaking that consists of a mixture of S and P waves, but the S motions become dominant as the duration increases. Later, in the horizontal component, surface waves dominate, mixed with some S body waves. Depending on the distance of the site from the fault and the structure of the intervening rocks and soils, surface waves are spread out into long trains.

Constructing Seismic Hazard Maps

In many regions, seismic expectancy maps or hazard maps are now available for planning purposes. The anticipated intensity of ground shaking is represented by a number called the peak acceleration or the peak velocity.

To avoid weaknesses found in earlier earthquake hazard maps, the following general principles are usually adopted today:

The map should take into account not only the size but also the frequency of earthquakes.

The broad regionalization pattern should use historical seismicity as a database, including the following factors: major tectonic trends, acceleration attenuation curves, and intensity reports.

Regionalization should be defined by means of contour lines with design parameters referred to ordered numbers on neighbouring contour lines ( this procedure minimizes sensitivity concerning the exact location of boundary lines between separate zones ).

The map should be simple and not attempt to microzone the region.

The mapped contoured surface should not contain discontinuities, so that the level of hazard progresses gradually and in order across any profile drawn on the map.

Developing resistant structures

Developing engineered structural designs that are able to resist the forces generated by seismic waves can be achieved either by following building codes based on hazard maps or by appropriate methods of analysis. Many countries reserve theoretical structural analyses for the larger, more costly, or critical buildings to be constructed in the most seismically active regions, while simply requiring that ordinary structures conform to local building codes. Economic realities usually determine the goal, not of preventing all damage in all earthquakes but of minimizing damage in moderate, more common earthquakes and ensuring no major collapse at the strongest intensities. An essential part of what goes into engineering decisions on design and into the development and revision of earthquake-resistant design codes is therefore seismological, involving measurement of strong seismic waves, field studies of intensity and damage, and the probability of earthquake occurrence.

Earthquake risk can also be reduced by rapid post-earthquake response. Strong-motion accelerographs have been connected in some urban areas, such as Los Angeles, Tokyo, and Mexico City, to interactive computers.

Recorded waves are correlated with seismic intensity scales and rapidly displayed graphically on regional maps via the World Wide Web.

Exploration of the Earth’s interior with seismic waves

Seismological Tomography

Seismological data on the Earth’s deep structure come from several sources, including:

– P waves and S waves from nuclear explosions;

– P waves and S waves from earthquakes;

– Surface-wave dispersion from ‘distant earthquakes’; and,

– Planetary vibrations ( free oscillations ) from ‘great earthquakes’.

One of the major aims of seismology has been to infer a minimum set of properties of the planet’s interior that can explain recorded seismic ‘wave trains’ in detail.

Exploration of the Earth’s deep structure made tremendous progress during the first half of the 20th century ( 1900s – 1950s ), but realization of this goal was severely limited until the 1960s by the laborious effort required just to evaluate theoretical models and to process the large amounts of recorded earthquake data.

Today, the application of high-speed computing to enormous quantities of stored data, together with modern information retrieval capabilities, has opened information technology ( IT ) passageways leading to major advances in data handling, advanced theoretical modeling, research analytics, and developmental prototyping.

Since the mid-1970s, researchers have been able to study realistic models of Earth structure, including continental and oceanic boundaries, mountains, and river valleys, rather than simple structures that vary only with depth, and various technical developments have benefited observational seismology.

EXAMPLE: Significant exploration of the Earth’s deep structure using three-dimensional ( 3D ) imaging, with equally impressive display ( monitor ) equipment, became possible thanks to advances in microprocessor architecture, new materials, and new concepts, while seismic exploration techniques developed by the petroleum industry ( e.g. seismic reflection ) became highly recognized, widely adopted procedures.

The major method for determining the deep structure of the planet’s interior is detailed analysis of seismograms of seismic waves; earthquake readings additionally provide estimates of the Earth’s internal:

– Wave velocities;

– Density; and,

– Parameters of elasticity and anelasticity ( attenuation ).

Earthquake Travel Time

The primary procedure is to measure the travel times of various wave types, such as P and S, from their source to the recording seismograph. First, however, each wave type must be identified with its ray path through the Earth.

Seismic rays for many paths of P and S waves leaving the earthquake focus F are shown in the figure.

Deep-Focus & Deep-Structure Earth Ray Paths

Rays corresponding to waves that have been reflected at the Earth’s outer surface (or possibly at one of the interior discontinuity surfaces) are denoted as PP, PS, SP, PSS, and so on. For example, PS corresponds to a wave that is of P type before surface reflection and of S type afterward. In addition, there are rays such as pPP, sPP, and sPS, the symbols p and s corresponding to an initial ascent to the outer surface as P or S waves, respectively, from a deep focus.

An especially important class of rays is associated with a discontinuity surface separating the central core of the Earth from the mantle at a depth of about 2,900 km (1,800 miles) below the outer surface. The symbol c is used to indicate an upward reflection at this discontinuity. Thus, if a P wave travels down from a focus to the discontinuity surface in question, the upward reflection into an S wave is recorded at an observing station as the ray PcS and similarly with PcP, ScS, and ScP. The symbol K is used to denote the part (of P type) of the path of a wave that passes through the liquid central core. Thus, the ray SKS corresponds to a wave that starts as an S wave, is refracted into the central core as a P wave, and is refracted back into the mantle, wherein it finally emerges as an S wave. Such rays as SKKS correspond to waves that have suffered an internal reflection at the boundary of the central core.

The discovery of the existence of an inner core in 1936 by the Danish seismologist Inge Lehmann made it necessary to introduce additional basic symbols. For paths of waves inside the central core, the symbols i and I are used analogously to c and K for the whole Earth; therefore, i indicates reflection upward at the boundary between the outer and inner portions of the central core, and I corresponds to the part (of P type) of the path of a wave that lies inside the inner portion. Thus, for instance, discrimination needs to be made between the rays PKP, PKiKP, and PKIKP. The first of these corresponds to a wave that has entered the outer part of the central core but has not reached the inner core, the second to one that has been reflected upward at the inner core boundary, and the third to one that has penetrated into the inner portion.

By combining the symbols p, s, P, S, c, K, i, and I in various ways, notation is developed for all the main rays associated with body earthquake waves.
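
As an illustration of how this notation composes, the following simplified validator checks whether a phase name is built only from the legs described here ( a sketch of the naming scheme, not the full seismological convention ):

    import re

    # Simplified pattern for body-wave phase names built from the symbols
    # p, s (upgoing legs from a deep focus), P, S (mantle legs),
    # c (reflection at the core boundary), K (outer-core leg),
    # i (reflection at the inner-core boundary), I (inner-core leg).
    PHASE = re.compile(r"^[ps]?(?:[PS]|c[PS]|K|i|I)+$")

    for name in ["PcS", "PcP", "SKS", "SKKS", "PKIKP", "PKiKP", "sPS", "pPP", "XKZ"]:
        print(name, bool(PHASE.match(name)))  # all valid here except "XKZ"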

Hidden Inner Earth Deep Structure Anomalies

The symbol J was introduced to correspond to S waves travelling within the Earth’s inner core, should evidence for such waves ever be found. The use of times of travel along rays to infer a hidden structure is analogous to the use of X-rays in medical tomography. The method involves reconstructing an image of internal anomalies from measurements made at the outer surface. Nowadays, hundreds of thousands of travel times of P and S waves are available in earthquake catalogs for the tomographic imaging of the Earth’s interior and the mapping of internal structure.

Inner Earth Deep Structure

Thinnest & Thickest Parts of Earth’s Crust

Based on earthquake records and imaging studies, the Earth’s interior is officially represented as:

A solid but slowly flowing mantle, about 1,800 miles ( 2,900 kilometres ) thick, overlain by a crust that is less than 6 miles ( 10 kilometres ) thick beneath the seafloor of the ultra-deep ocean.

The thin surface rock layer surrounding the mantle is the crust, whose lower boundary is called the Mohorovičić discontinuity. In normal continental regions the crust is about 30 to 40 km ( 20 to 25 miles ) thick; there is usually a superficial low-velocity sedimentary layer underlain by a zone in which seismic velocity increases with depth. Beneath this zone there is a layer in which P-wave velocities in some places fall from 6 to 5.6 km per second. The middle part of the crust is characterized by a heterogeneous zone with P velocities of nearly 6 to 6.3 km per second. The lowest layer of the crust ( about 10 km thick ) has significantly higher P velocities, ranging up to nearly 7 km per second.

In the deep ocean there is a sedimentary layer that is about 1 km thick. Underneath is the lower layer of the oceanic crust, which is about 4 km thick. This layer is inferred to consist of basalt that formed where extrusions of basaltic magma at oceanic ridges have been added to the upper part of lithospheric plates as they spread away from the ridge crests. This crustal layer cools as it moves away from the ridge crest, and its seismic velocities increase correspondingly.

Below the mantle lies a shell about 2,255 km ( 1,400 miles ) thick that, seismic waves indicate, has the properties of a liquid: the Earth’s outer core.

At the very centre of the planet is a separate solid core with a radius of 1,216 km. Recent work with observed seismic waves has revealed three-dimensional structural details inside the Earth, especially in the crust and lithosphere, under the subduction zones, at the base of the mantle, and in the inner core. These regional variations are important in explaining the dynamic history of the planet.

Long-Period Global Oscillations

Sometimes earthquakes are so great that the entire planet vibrates like a ringing bell. The deepest ( gravest ) tone of vibration yet recorded has a period ( the length of time between the arrival of successive crests in a wave train ) of 54 minutes.

Knowledge of these vibrations has come from a remarkable extension in the range of periods of ground movement that can now be recorded by modern digital long-period seismographs, which span the entire spectrum of earthquake wave periods: from ordinary P waves ( with periods of tenths of seconds ) to vibrations with periods on the order of 12 and 24 hours, i.e. those movements occurring within Earth’s ocean tides.

The measurements of vibrations of the whole Earth provide important information on the properties of the interior of the planet. It should be emphasized that these free vibrations are set up by the energy release of the earthquake source but continue for many hours and sometimes even days. For an elastic sphere such as the Earth, two types of vibrations are known to be possible. In one type, called S modes, or spheroidal vibrations, the motions of the elements of the sphere have components along the radius as well as along the tangent. In the second [ 2nd ] type, which are designated as T modes or torsional vibrations, there is shear but no radial displacements. The nomenclature is nSl and nTl, where the letters n and l are related to the surfaces in the vibration at which there is zero motion. Four ( 4 ) examples are illustrated in the figure. The subscript n gives a count of the number of internal zero-motion ( nodal ) surfaces, and l indicates the number of surface nodal lines.

Several hundred types of S and T vibrations have been identified and the associated periods measured. The amplitudes of the ground motion in the vibrations have been determined for particular earthquakes, and, more important, the attenuation of each component vibration has been measured. The dimensionless measure of this decay constant is called the quality factor Q. The greater the value of Q, the less the wave or vibration damping. Typically, for oS10 and oT10, the Q values are about 250.
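
The quality factor can be turned directly into an amplitude-decay estimate. The sketch below combines, purely for illustration, the 54-minute period and a Q of about 250 quoted in the passages above ( actual modes pair different periods with different Q values ):

    import math

    def amplitude_after(t_hours, period_minutes, q_factor, a0=1.0):
        """Free-oscillation amplitude after time t, for quality factor Q.
        Uses A(t) = A0 * exp(-pi * t / (Q * T)); higher Q means slower decay."""
        t_seconds = t_hours * 3600.0
        period_seconds = period_minutes * 60.0
        return a0 * math.exp(-math.pi * t_seconds / (q_factor * period_seconds))

    # A mode with a 54-minute period and Q ~ 250 retains roughly 70% of its
    # amplitude after a full day, which is why such vibrations can continue
    # to be observed for many hours and sometimes even days.
    print(round(amplitude_after(24.0, 54.0, 250), 3))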

The rate of decay of the vibrations of the whole Earth with the passage of time can be seen in the figure, where they appear superimposed for 20 hours of the 12-hour tidal deformations of the Earth. At the bottom of the figure these vibrations have been split up into a series of peaks, each with a definite frequency, similar to that of the spectrum of light.

Such a spectrum indicates the relative amplitude of each harmonic present in the free oscillations. If the physical properties of the Earth’s interior were known, all these individual peaks could be calculated directly. Instead, the internal structure must be estimated from the observed peaks.

Recent research has shown that observations of long-period oscillations of the Earth discriminate fairly finely between different Earth models. In applying the observations to improve the resolution and precision of such representations of the planet’s internal structure, a considerable number of Earth models are set up, and all the periods of their free oscillations are computed and checked against the observations. Models can then be successively eliminated until only a small range remains. In practice, the work starts with existing models; efforts are made to amend them by sequential steps until full compatibility with the observations is achieved, within the uncertainties of the observations. Even so, the resulting computed Earth structure is not a unique solution to the problem.

Extraterrestrial Seismic Phenomena

Space vehicles have carried seismographs to the surfaces of the Moon and Mars, and seismologists on Earth have received telemetered signals of seismic events from both.

By 1969, seismographs had been placed at six sites on the Moon during the U.S. Apollo missions. Recording of seismic data ceased in September 1977. The instruments detected between 600 and 3,000 moonquakes during each year of their operation, though most of these seismic events were very small. The ground noise on the lunar surface is low compared with that of the Earth, so that the seismographs could be operated at very high magnifications. Because there was more than one station on the Moon, it was possible to use the arrival times of P and S waves at the lunar stations from the moonquakes to determine foci in the same way as is done on the Earth.

Moonquakes are of three types. First, there are the events caused by the impact of lunar modules, booster rockets, and meteorites. The lunar seismograph stations were able to detect meteorites hitting the Moon’s surface more than 1,000 km (600 miles) away. The two other types of moonquakes had natural sources in the Moon’s interior: they presumably resulted from rock fracturing, as on Earth. The most common type of natural moonquake had deep foci, at depths of 600 to 1,000 km; the less common variety had shallow focal depths.

Seismological research on Mars has been less successful. Only one of the seismometers carried to the Martian surface by the U.S. Viking landers during the mid-1970s remained operational, and only one potential marsquake was detected in 546 Martian days.

Historical Major Earthquakes

Major historical earthquakes are listed chronologically in the table ( below ).

 

Major Earthquake History

Year | Region / Area Affected | Mag.* | Intensity | Deaths ( approx. ) | Remarks

c. 1500 BCE | Knossos, Crete ( Greece ) | – | X | – | One of several events that leveled the capital of Minoan civilization; this quake accompanied the explosion of the nearby volcanic island of Thera.
27 BCE | Thebes ( Egypt ) | – | – | – | This quake cracked one of the statues known as the Colossi of Memnon, and for almost two centuries the “singing Memnon” emitted musical tones on certain mornings as it was warmed by the Sun’s rays.
62 CE | Pompeii and Herculaneum ( Italy ) | – | X | – | These two prosperous Roman cities had not yet recovered from the quake of 62 when they were buried by the eruption of Mount Vesuvius in 79.
115 | Antioch ( Antakya, Turkey ) | – | XI | – | A centre of Hellenistic and early Christian culture, Antioch suffered many devastating quakes; this one almost killed the visiting Roman emperor Trajan.
1556 | Shaanxi ( province ), China | – | IX | 830,000 | Possibly the deadliest earthquake ever recorded.
1650 | Cuzco ( Peru ) | 8.1 | VIII | – | Many of Cuzco’s Baroque monuments date to the rebuilding of the city after this quake.
1692 | Port Royal ( Jamaica ) | – | – | 2,000 | Much of this British West Indies port, a notorious haven for buccaneers and slave traders, sank beneath the sea following the quake.
1693 | southeastern Sicily ( Italy ) | – | XI | 93,000 | Syracuse, Catania, and Ragusa were almost completely destroyed but were rebuilt with a Baroque splendour that still attracts tourists.
1755 | Lisbon ( Portugal ) | – | XI | 62,000 | The Lisbon earthquake of 1755 was felt as far away as Algiers and caused a tsunami that reached the Caribbean.
1780 | Tabriz ( Iran ) | 7.7 | – | 200,000 | This ancient highland city was destroyed and rebuilt, as it had been in 791, 858, 1041, and 1721 and would be again in 1927.
1811 – 1812 | New Madrid, Missouri ( USA ) | 7.5 – 7.7 | XII | – | A series of quakes at the New Madrid Fault caused few deaths, but the New Madrid earthquake of 1811 – 1812 rerouted portions of the Mississippi River and was felt from Canada to the Gulf of Mexico.
1812 | Caracas ( Venezuela ) | 9.6 | X | 26,000 | A provincial town in 1812, Caracas recovered and eventually became Venezuela’s capital.
1835 | Concepción ( Chile ) | 8.5 | – | 35 | British naturalist Charles Darwin, witnessing this quake, marveled at the power of the Earth to destroy cities and alter landscapes.
1886 | Charleston, South Carolina ( USA ) | – | IX | 60 | This was one of the largest quakes ever to hit the eastern United States.
1895 | Ljubljana ( Slovenia ) | 6.1 | VIII | – | Modern Ljubljana is said to have been born in the rebuilding after this quake.
1906 | San Francisco, California ( USA ) | 7.9 | XI | 700 | San Francisco still dates its modern development from the San Francisco earthquake of 1906 and the resulting fires.
1908 | Messina and Reggio di Calabria ( Italy ) | 7.5 | XII | 110,000 | These two cities on the Strait of Messina were almost completely destroyed in what is said to be Europe’s worst earthquake ever.
1920 | Gansu ( province ), China | 8.5 | – | 200,000 | Many of the deaths in this quake-prone province were caused by huge landslides.
1923 | Tokyo-Yokohama ( Japan ) | 7.9 | – | 142,800 | Japan’s capital and its principal port, located on soft alluvial ground, suffered severely from the Tokyo-Yokohama earthquake of 1923.
1931 | Hawke Bay ( New Zealand ) | 7.9 | – | 256 | The bayside towns of Napier and Hastings were rebuilt in an Art Deco style that is now a great tourist attraction.
1935 | Quetta ( Pakistan ) | 7.5 | X | 20,000 | The capital of Balochistan province was severely damaged in the most destructive quake to hit South Asia in the 20th century.
1948 | Ashgabat ( Turkmenistan ) | 7.3 | X | 176,000 | Every year, Turkmenistan commemorates the utter destruction of its capital in this quake.
1950 | Assam ( India ) | 8.7 | X | 574 | The largest quake ever recorded in South Asia killed relatively few people in a lightly populated region along the Indo-Chinese border.
1960 | Valdivia and Puerto Montt ( Chile ) | 9.5 | XI | 5,700 | The Chile earthquake of 1960, the largest quake ever recorded in the world, produced a tsunami that crossed the Pacific Ocean to Japan, where it killed more than 100 people.
1963 | Skopje ( Macedonia ) | 6.9 | X | 1,070 | The capital of Macedonia had to be rebuilt almost completely following this quake.
1964 | Prince William Sound, Alaska ( USA ) | 9.2 | – | 131 | Anchorage, Seward, and Valdez were damaged, but most deaths in the Alaska earthquake of 1964 were caused by tsunamis in Alaska and as far away as California.
1970 | Chimbote ( Peru ) | 7.9 | – | 70,000 | Most of the damage and loss of life resulting from the Ancash earthquake of 1970 was caused by landslides and the collapse of poorly constructed buildings.
1972 | Managua ( Nicaragua ) | 6.2 | – | 10,000 | The centre of the capital of Nicaragua was almost completely destroyed; the business section was later rebuilt some 6 miles ( 10 km ) away.
1976 | Guatemala City ( Guatemala ) | 7.5 | IX | 23,000 | Rebuilt following a series of devastating quakes in 1917 – 18, the capital of Guatemala again suffered great destruction.
1976 | Tangshan ( China ) | 7.5 | X | 242,000 | In the Tangshan earthquake of 1976, this industrial city was almost completely destroyed in the worst earthquake disaster in modern history.
1985 | Michoacán state and Mexico City ( Mexico ) | 8.1 | IX | 10,000 | The centre of Mexico City, built largely on the soft subsoil of an ancient lake, suffered great damage in the Mexico City earthquake of 1985.
1988 | Spitak and Gyumri ( Armenia ) | 6.8 | X | 25,000 | This quake destroyed nearly one-third of Armenia’s industrial capacity.
1989 | Loma Prieta, California ( USA ) | 7.1 | IX | 62 | The San Francisco–Oakland earthquake of 1989, the first sizable movement of the San Andreas Fault since 1906, collapsed a section of the San Francisco–Oakland Bay Bridge.
1994 | Northridge, California ( USA ) | 6.8 | IX | 60 | Centred in the urbanized San Fernando Valley, the Northridge earthquake of 1994 collapsed freeways and some buildings, but damage was limited by earthquake-resistant construction.
1995 | Kobe ( Japan ) | 6.9 | XI | 5,502 | The Great Hanshin Earthquake destroyed or damaged 200,000 buildings and left 300,000 people homeless.
1999 | Izmit ( Turkey ) | 7.4 | X | 17,000 | The Izmit earthquake of 1999 heavily damaged the industrial city of Izmit and the naval base at Golcuk.
1999 | Nan-t’ou county ( Taiwan ) | 7.7 | X | 2,400 | The Taiwan earthquake of 1999, the worst to hit Taiwan since 1935, provided a wealth of digitized data for seismic and engineering studies.
2001 | Bhuj, Gujarat ( state ), India | 8.0 | X | 20,000 | The Bhuj earthquake of 2001, possibly the deadliest ever to hit India, was felt across India and Pakistan.
2003 | Bam ( Iran ) | 6.6 | IX | 26,000 | This ancient Silk Road fortress city, built mostly of mud brick, was almost completely destroyed.
2004 | Aceh ( province ), Sumatra ( Indonesia ) | 9.1 | – | 200,000 | The deaths resulting from this offshore quake actually were caused by a tsunami originating in the Indian Ocean that, in addition to killing more than 150,000 in Indonesia, killed people as far away as Sri Lanka and Somalia.
2005 | Azad Kashmir ( Pakistan-administered Kashmir ) | 7.6 | VIII | 80,000 | The Kashmir earthquake of 2005, perhaps the deadliest shock ever to strike South Asia, left hundreds of thousands of people exposed to the coming winter weather.
2008 | Sichuan ( province ), China | 7.9 | IX | 69,000 | The Sichuan earthquake of 2008 left over 5 million people homeless across the region, and over half of Beichuan city was destroyed by the initial seismic event and the release of water from a lake formed by nearby landslides.
2009 | L’Aquila ( Italy ) | 6.3 | VIII | 300 | The L’Aquila earthquake of 2009 left more than 60,000 people homeless and damaged many of the city’s medieval buildings.
2010 | Port-au-Prince ( Haiti ) | 7.0 | IX | 316,000 | The Haiti earthquake of 2010 devastated the metropolitan area of Port-au-Prince and left an estimated 1.5 million survivors homeless.
2010 | Maule ( Chile ) | 8.8 | VIII | 521 | The Chile earthquake of 2010 produced widespread damage in Chile’s central region and triggered tsunami warnings throughout the Pacific basin.
2010 | Christchurch ( New Zealand ) | 7.0 | VIII | 180 | Most of the devastation associated with the Christchurch earthquakes of 2010 – 11 resulted from a magnitude-6.3 aftershock that struck on February 22, 2011.
2011 | Honshu ( Japan ) | 9.0 | VIII | 20,000 | The powerful Japan earthquake and tsunami of 2011, which sent tsunami waves across the Pacific basin, caused widespread damage throughout eastern Honshu.
2011 | Erciş and Van ( Turkey ) | 7.2 | IX | – | The Erciş-Van earthquake of 2011 destroyed several apartment complexes and shattered mud-brick homes throughout the region.

Data Sources: National Oceanic and Atmospheric Administration ( NOAA ), National Geophysical Data Center ( NGDC ), Significant Earthquake Database ( SED ), a searchable online database using the Catalog of Significant Earthquakes 2150 B.C. – 1991 A.D. ( with Addenda ), and U.S. Geological Survey ( USGS ), Earthquake Hazards Program.

* Measures of magnitude may differ from other sources.


Additional Reading

Earthquakes are covered mainly in books on seismology.

Recommended introductory texts are:

Bruce A. Bolt, Earthquakes, 4th ed. (1999), and Earthquakes and Geological Discovery (1993); and,

Jack Oliver, Shocks and Rocks: Seismology and the Plate Tectonics Revolution (1996).

Comprehensive books on key aspects of seismic hazards are:

Leon Reiter, Earthquake Hazard Analysis – Issues and Insights (1990); and,

Robert S. Yeats, Kerry Sieh, and Clarence R. Allen, The Geology of Earthquakes (1997).

A history of the discrimination between underground nuclear explosions and natural earthquakes is given by:

Bruce A. Bolt, Nuclear Explosions and Earthquakes: The Parted Veil (1976).

More advanced texts that treat the theory of earthquake waves in detail, are:

Agustín Udías, Principles of Seismology (1999);

Thorne Lay and Terry C. Wallace, Modern Global Seismology (1995);

Peter M. Shearer, Introduction to Seismology (1999); and,

K.E. Bullen and Bruce A. Bolt, An Introduction to the Theory of Seismology, 4th ed. (1985).


Citations


MLA Style: “earthquake.” Encyclopædia Britannica. Encyclopædia Britannica Online. Encyclopædia Britannica Inc., 2012. Web. 21 Mar. 2012. http://www.britannica.com/EBchecked/topic/176199/earthquake

Reference

http://www.britannica.com/EBchecked/topic/176199/earthquake/247989/Shallow-intermediate-and-deep-foci?anchor=ref105456

– – – –

Feeling ‘educated’? Think you’re out of the earthquake and tsunami waters on this subject?

March 23, 2012 news, however, contradicts decades of professional scientific knowledge and studies; so, if you were just feeling ‘overly educated’ about earthquakes and tsunamis, don’t be. You’re now lost at sea, in the same proverbial ‘boat’, with all those global government scientific and technical ( S&T ) professionals who thought they understood previous information surrounding earthquakes and tsunamis.

After comparing the Japan 9.0 ‘earthquake directional arrows’ depicted on the charts ( further above ) with ocean currents, tidal charts, and trade winds from the global jet stream, there is a problem that cannot be explained: on March 23, 2012, British Columbia, Canada reported that its northwest Pacific Ocean coastal sea waters held a 100-foot fishing boat ‘still afloat’, more than 1 year after the tsunami from Japan’s 9.0 earthquake of March 11, 2011.

[ IMAGE ( above ): 11MAR11 Japan 9.0 earthquake tsunami victim fishing boat ( 50-metre ) found more than 1-year later still adrift in the Pacific Ocean – but thousands of miles away – off the North American Pacific west coast territory of Haida Gwaii, British Columbia, Canada ( Click on image to enlarge ) ]

– – – –

Source: CBS News – British Columbia ( Canada )

Tsunami Linked Fishing Boat Adrift Off B.C.

Nobody Believed Aboard 50-Metre Vessel Swept Away In 2011 Japanese Disaster – CBC News

March 23, 2012 21:35 ( PST ) Updated from: 23MAR12 18:59 ( PST )

A Japanese fishing boat that was washed out to sea in the March 2011 Japanese tsunami has been located adrift off the coast of British Columbia ( B.C. ), according to the federal Transport Ministry.

The 50-metre vessel was spotted by the crew of an aircraft on routine patrol about 275 kilometres off Haida Gwaii, formerly known as the Queen Charlotte Islands, ministry spokeswoman Sau Sau Liu said Friday.

“Close visual aerial inspection and hails to the ship indicate there is no one on board,” Liu said. “The owner of the vessel has been contacted and made aware of its location.”

U.S. Senator Maria Cantwell, of Washington, said in a release that the boat was expected to drift slowly southeast.

“On its current trajectory and speed, the vessel would not [ yet ] make landfall for approximately 50-days,” Cantwell said. Cantwell did not specify where landfall was expected to be.

First large debris

The boat is the first large piece of debris found following the earthquake and tsunami that struck Japan one year ago.

Scientists at the University of Hawaii say a field of about 18,000,000 tonnes of debris is slowly being carried by ocean currents toward North America. The field is estimated to be about 3,200 kilometres long and 1,600 kilometres wide.

Scientists have estimated some of the debris would hit B.C. shores by 2014.

Some people on the west coast of Vancouver Island believe smaller pieces of debris have already washed ashore there.

The March 11, 2011, tsunami was generated after a magnitude 9.0 earthquake struck off the coast of northern Japan. The huge waves and swells of the tsunami moved inland and then retreated back into the Pacific Ocean, carrying human beings, wreckage of buildings, cars and boats.

Nearly 19,000 people were killed.

Reference

http://www.cbc.ca/news/canada/british-columbia/story/2012/03/23/bc-fishing-boat-tsunami-debris.html?cmp=rss

– – – –

Submitted for review and commentary by,

Kentron Intellect Research Vault

E-MAIL: KentronIntellectResearchVault@Gmail.Com

WWW: http://KentronIntellectResearchVault.WordPress.Com

References

http://earthquake.usgs.gov/earthquakes/world/japan/031111_M9.0prelim_geodetic_slip.php
http://en.wikipedia.org/wiki/Moment_magnitude_scale
http://www.gsi.go.jp/cais/topic110315.2-index-e.html
http://www.seismolab.caltech.edu
http://www.tectonics.caltech.edu/slip_history/2011_taiheiyo-oki
http://supersites.earthobservations.org/ARIA_japan_co_postseismic.pdf
ftp://sideshow.jpl.nasa.gov/pub/usrs/ARIA/README.txt
http://speclib.jpl.nasa.gov/documents/jhu_desc
http://earthquake.usgs.gov/regional/pacnw/paleo/greateq/conf.php
http://www.passcal.nmt.edu/content/array-arrays-elusive-ets-cascadia-subduction-zone
http://wcda.pgc.nrcan.gc.ca:8080/wcda/tams_e.php
http://www.pnsn.org/tremor
http://earthquake.usgs.gov/earthquakes/recenteqscanv/Quakes/quakes_all.html
http://nthmp.tsunami.gov
http://wcatwc.arh.noaa.gov
http://www.pnsn.org/NEWS/PRESS_RELEASES/CAFE/CAFE_intro.html
http://www.pnsn.org/WEBICORDER/DEEPTREM/summer2009.html
http://earthquake.usgs.gov/prepare
http://www.passcal.nmt.edu/content/usarray
http://www.iris.washington.edu/hq
http://www.iris.edu/dms/dmc
http://www.iris.edu/dhi/clients.htm
http://www.iris.edu/hq/middle_america/docs/presentations/1026/MORENO.pdf
http://www.unavco.org/aboutus/history.html
http://earthquake.usgs.gov/monitoring/anss
http://earthquake.usgs.gov/regional/asl
http://earthquake.usgs.gov/regional/asl/data
http://www.usarray.org/files/docs/pubs/US_Data_Plan_Final-V7.pdf
http://neic.usgs.gov/neis/gis/station_comma_list.asc
http://earthquake.usgs.gov/research/physics/lab
http://earthquake.usgs.gov/regional/asl/data
http://pubs.usgs.gov/gip/dynamic/Pangaea.html
http://coaps.fsu.edu/scatterometry/meeting/docs/2009_august/intro/shimoda.pdf
http://coaps.fsu.edu/scatterometry/meeting/past.php
http://eqinfo.ucsd.edu/dbrecenteqs/anza
http://www.ceri.memphis.edu/seismic
http://conceptactivityresearchvault.wordpress.com/2011/03/28/global-environmental-intelligence-gei
http://www.af.mil/information/factsheets/factsheet_print.asp?fsID=157&page=1
https://login.afwa.af.mil/register/
https://login.afwa.af.mil/amserver/UI/Login?goto=https%3A%2F%2Fweather.afwa.af.mil%3A443%2FHOST_HOME%2FDNXM%2FWRF%2Findex.html
http://www.globalsecurity.org/space/library/news/2011/space-110225-afns01.htm
http://www.wrf-model.org/plots/wrfrealtime.php

MSN Warns Disasters

MSN Warns Disasters by, Concept Activity Research Vault

March 7, 2012 18:22:42 ( PST ) Updated ( Originally Published: May 16, 2011 )

Los Angeles – May 16, 2011 – MSN Slate News reported ( read the article below ) that a host of disasters is coming, which ‘the public should not become overly worried about’, but it suggests throwing a celebration-like “18th Century Weekend” in advance so ‘people can experience what a solar flare disaster might be like to live through’.

While the suggested medieval celebratory affair ‘concept’ is ‘unique’, MSN suggesting the public ‘stock up on batteries’ was a bit off, because the journalist apparently did not realize that ‘batteries become drained’ when an environmental anomaly ‘overcharges’ them with ambient auroral current, as occurs during a Solar Energetic Particle Event ( SEPE ); ‘candles’ or ‘luminescent gel sticks’ ( shake lights ) would work amidst such an event. Mainstream news media broadcasts and print media, however, without thoroughly researching facts first, have a habit of passing inaccurate information on to the general public, mostly in a whimsical fashion so it can be easily swallowed. That type of reporting does ‘not’ help the public; it only serves to provide the illusion that what is being reported will probably never happen. Big mistake!

News reporting, as a public service, should take far more care when reporting on emergency disaster preparedness, on ‘what to do’ and ‘how to prepare’, especially when it comes to mentioning a ‘significant’ solar flare, also known as a Solar Energetic Particle Event ( SEPE ), that could quickly and very seriously disable the national electricity infrastructure without warning.

To let CARV readers review how MSN Slate News recently put it to the general public, we cordially invite ‘you’ ( our readers ) to review the MSN Slate News report ( below ) so you can be the judge of whom to rely on for your emergency disaster preparedness information.

– –

Source: MSN Slate News

Meltdowns. Floods. Tornadoes. Oil spills. Grid crashes. Why more and more things seem to be going wrong, and what we can do about it.

The Century of Disasters by, Joel Achenbach

May 13, 2011 5:56 PM ( EST )

This will be the century of disasters.

In the same way that the 20th century was the century of world wars, genocide, and grinding ideological conflict, the 21st century will be the century of natural disasters and technological crises and unholy combinations of the two.

It will be the century when the things we count on to go right will – for whatever reason – go wrong.

Late last month ( April 2011 ), as the Mississippi River rose in what is destined to be the worst flood in decades, residents of Alabama and other states rummaged through the debris of a historic tornado outbreak.

Physicists at a meeting in Anaheim, California had a discussion about the dangers posed by the Sun.

Solar flares, scientists believe, are a disaster waiting to happen. Thus one of the sessions at the American Physical Society annual meeting was devoted to discussing the hazard of electromagnetic pulses ( EMP ) caused by solar flares – or terrorist attacks. Such pulses ( EMP ) could fry transformers and knock out the electrical grid over much of the nation. Last year the Oak Ridge National Laboratory released a study saying the damage might take years to fix and cost trillions of dollars.

But maybe even that is not the disaster people should be worrying about.

Maybe they should worry instead about the “ARkStorm.” That is the name the U.S. Geological Survey ( USGS ) Multihazards Demonstration Project ( MDP ) gave to a hypothetical storm that would essentially turn much of the California Central Valley into a bathtub. It has happened before, in 1861 – 1862, when it rained for 45 days continuously. USGS explains, “The ARkStorm draws heat and moisture from the tropical Pacific, forming a series of ‘Atmospheric Rivers’ ( AR ) that approach the ferocity of hurricanes and then slam into the United States West Coast over several weeks.” The result, the USGS determined, could be a flood costing $725 billion in direct property losses and economic impact.

While pondering this, don’t forget the Cascadia subduction zone, the plate boundary off the coast of the Pacific Northwest that could generate a tsunami much like the one that devastated Japan in March 2011. The Cascadia subduction zone runs from Vancouver Island to northern California and last ruptured in a major tsunami-spawning earthquake on January 26, 1700. It could break at any moment, with catastrophic consequences.

All of these things have the common feature of low probability and high consequence.

They are known as “black swan” events.

They are unpredictable in any practical sense.

They are also things ordinary people probably should not worry about on a daily basis.

You can’t fear the Sun.

You cannot worry a rock will fall out of the sky and smash the Earth, or that the ground will open up and swallow you like a vitamin.

A key element of maintaining one’s sanity is ‘knowing how to ignore risks’ that are highly improbable at any given point in time.

And yet in the coming century, these or other ‘black swan events’ will seem to occur with surprising frequency.

There are several reasons for this.

We have chosen to engineer the planet.

We have built vast networks of technology.

We have created systems that, in general, work very well, but are still vulnerable to catastrophic failures.

It is harder and harder for any one person, institution, or agency to perceive all the interconnected elements of the technological society.

Failures can cascade.

There are unseen weak points in the network.

Small failures can have broad consequences.

Most importantly, we have more people and more stuff standing in the way of calamity.

We are not suddenly having more earthquakes, but there are now 7,000,000,000 ( 7 billion ) of us, a majority living in cities.

In 1800, only Beijing, China, could count 1,000,000 inhabitants, but at last count there were 381 cities with at least 1,000,000 people.

Many are megacities in seismically hazardous places – Mexico City; Caracas, Venezuela; Tehran, Iran; and Kathmandu, Nepal among them – with a lethal combination of weak infrastructure ( unreinforced masonry buildings ) and shaky foundations.

Natural disasters will increasingly be accompanied by technological crises, and the other way around.

In March 2011, the Japan earthquake triggered the Fukushima Dai-Ichi nuclear power plant meltdown.

Last year ( 2010 ), a technological failure on the Deepwater Horizon drilling rig – in the Gulf of Mexico – led to the environmental crisis of the oil spill. ( I chronicle the Deepwater Horizon blowout and the ensuing crisis management in a new book: A Hole at the Bottom of the Sea: The Race to Kill the BP Oil Gusher. )

In both the Deepwater Horizon and Fukushima disasters, the safety systems were not nearly as robust as the industries believed.

In these technological accidents, there are hidden pathways for the gremlins to infiltrate the operation.

In the case of Deepwater Horizon, a series of decisions by BP ( oil company ) and its contractors led to a loss of well control — the initial blowout. The massive blowout preventer on the sea floor was equipped with a pair of pinchers known as ‘blind shear rams’. They were supposed to cut the drillpipe and seal the well. The forensic investigation indicated the initial eruption of gas buckled the pipe and prevented the blind shear rams from getting a clean bite on it, so the “backup” plan — cutting the pipe — was effectively eliminated in the initial event: the loss of well control.

Fukushima also had a backup plan that was not far enough back. The nuclear power plant had backup generators – in case the grid went down – but the generators were on ‘low’ ground and were blasted by the tsunami.

Without electricity the power company had no way to cool the nuclear fuel rods.

In a sense, it was a very simple problem: a power outage.

Some modern reactors coming online have passive cooling systems for backups that rely on gravity and evaporation to circulate the cooling water.

Charles Perrow, author of Normal Accidents, told me that computer infrastructure is a disaster in the making.

“Watch out for failures in cloud computing,” he said by e-mail. “They will have consequences for medical monitoring systems and much else.”

Technology also mitigates disasters, of course.

Pandemics remain a threat, but modern medicine can help us stay a step ahead of evolving microbes.

Satellites and computer models helped meteorologists anticipate the deadly storms of April 27, 2011 and warn people to find cover in advance of the twisters.

Better building codes save lives in earthquakes. Chile, which has strict building codes, was hit with a powerful earthquake last year ( 2010 ) but suffered only a fraction of the fatalities and damage that impoverished Haiti endured just weeks earlier.

The current ( 2011 ) Mississippi flood is an example of technology at work for better and for worse.

As I write, the Army Corps of Engineers is poised to open the Morganza spillway and flood much of the Atchafalaya basin. That’s not a “disaster” but a solution of sorts, since the alternative is the flooding of cities downstream and possible levee failure. Of course, the levees might still fail. We’ll see. But this is how the system is ‘supposed’ to work.

On the other hand, the broader drainage system of the Mississippi River watershed is set up in a way that makes floods more likely. Corn fields, for example in parts of the upper Midwest, have been “tiled” with pipes that carry excess rainwater rapidly to riprap-lined ( small stone laden ) streams and onward down to rivers lined with levees. We gave up natural drainage decades ago.

The Mississippi is like a catheter at this point. Had nature remained in charge, the river would have mitigated much of its downstream flooding by spreading into natural floodplains farther upriver ( and the main channel would have long ago switched to the Atchafalaya river basin — see John McPhee’s “The Control of Nature” — and New Orleans would no longer be a riverfront city ).

One wild card for how disastrous this century will become is climate change.

There’s been a robust debate on the blogs about whether the recent weather events ( tornadoes and floods ) can be attributed to climate change.

It is a briar patch of an issue and I’ll exercise my right to skip past it for the most part.

But I think it’s clear that climate change will exacerbate natural disasters in general in coming years, and introduce a new element of risk and uncertainty into a future in which we have plenty of risks and uncertainties already. This, we don’t need.

And by the way, any discussion of “geoengineering” as a solution to climate change needs to be examined with the understanding that engineering systems can and will fail.

You don’t want to bet the future of the planet on an elaborate technological fix in which everything has to work perfectly. If failure is not an option, maybe you ‘should not’ try it to begin with.

So if we cannot engineer our way out of our ‘engineered disasters’, and if ‘natural disasters’ are going to keep pummeling us – as they have since the dawn of time — what is our strategy? Other than, you know, despair? Well, that has always worked for me, but here are a few more practical thoughts to throw in the mix:

First [ 1st ], we might want to try some regulation by people with no skin in the game. That might mean, for example, government regulators who make as much money as the people they’re regulating. Or it could even mean a ‘private-sector regulatory apparatus policing the industry’, cracking down on rogue operators. The point is, we don’t want every risky decision made by people with pecuniary interests.

Second [ 2nd ], we need to keep things in perspective. The apparent onslaught of disasters does not portend the end of the world. Beware of ‘disaster hysteria in the news media’. The serial disasters of the 21st century will be – to some extent – a matter of perception. It will feel like we are bouncing from disaster to disaster in part because of the shrinking of the world and the ubiquity of communications technology. Anderson Cooper and Sanjay Gupta are always in a disaster zone somewhere – demanding to know why the cavalry [ emergency first responders ] has not shown up.

Third [ 3rd ], we should think in terms of ‘how we can boost’ our “societal resilience,” the buzzword in the ‘disaster preparedness industry’.

Think of what you would do, and what your community would do, after a disaster.

You cannot always dodge the disaster, but perhaps you can still figure out how to recover quickly.

How would we ‘communicate’ if we got [ solar ] flared by the Sun and the [ electricity ] grid went down over 2/3rds of the country?

How would we even know what was going on?

Maybe we need to have the occasional “18th Century weekend” – to see how people might get through a couple of days without the [ electricity ] grid, cell [ telephone ] towers, cable TV [ television ], iTunes downloads – the full Hobbesian nightmare. And make an emergency plan: Buy some ‘batteries’ [ < ? > NOTE: solar flare effects, during a Solar Energetic Particle Event ( SEPE ), render ‘all batteries dead’. ] and jugs of water – just for starters.

Figure out how things around you work.

Learn about your community infrastructure.

Read about science, technology, engineering and ‘do not worry if you do not understand all the jargon’.

And then – having done that – go on about your lives, pursuing happiness on a planet that, though sometimes dangerous, is by far the best one we’ve got.

Reference

http://www.slate.com/id/2294013/pagenum/all/#p2

– –

Hopefully, people will take an opportunity to read the CARV report on Solar Energetic Particle Event Effects so they can ‘really know what to prepare for soon’, ‘before celebrating’ an “18th Century weekend” affair – complete with “batteries” – as suggested by the MSN Slate News article ( above ).

Although the aforementioned Slate News article indicates, “Solar flares, scientists believe, are a disaster waiting to happen. Thus one of the sessions at the American Physical Society annual meeting was devoted to discussing the hazard of electromagnetic pulses ( EMP ) caused by solar flares – or terrorist attacks. Such pulses ( EMP ) could fry transformers and knock out the electrical grid over much of the nation. Last year the Oak Ridge National Laboratory released a study saying the damage might take years to fix and cost trillions of dollars. But maybe even that is not the disaster people should be worrying about,” we actually ‘may’ have something “people should be worrying about,” as MSNBC puts it, or be “concerned about,” according to NASA, in light of the following MSNBC Space.Com report ( below ):

– – – –

 

Source: MSNBC.COM

 

Huge Solar Flare’s Magnetic Storm May Disrupt Satellites, Power Grids

 

March 7, 2012 1:19 p.m. Eastern Standard Time ( EST )

 

A massive solar flare that erupted from the Sun late Tuesday ( March 6, 2012 ) is unleashing one of the most powerful solar storms in more than 5-years, ‘a solar tempest that may potentially interfere with satellites in orbit and power grids when it reaches Earth’.

 

“Space weather has gotten very interesting over the last 24 hours,” Joseph Kunches, a space weather scientist at the National Oceanic and Atmospheric Administration ( NOAA ), told reporters today ( March 7, 2012 ). “This was quite the Super Tuesday — you bet.”

 

Several NASA spacecraft caught videos of the solar flare as it hurled a wave of solar plasma and charged particles, called a Coronal Mass Ejection ( CME ), into space. The CME is not expected to hit Earth directly, but the cloud of charged particles could deliver a glancing blow to the planet.

 

Early predictions estimate that the Coronal Mass Ejection ( CME ) will reach Earth tomorrow ( March 8, 2012 ) at 7:00 a.m. Eastern Standard Time ( EST ), with the ‘effects likely lasting for 24-hours and possibly lingering into Friday ( March 9, 2012 )’, Kunches said.
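
For a rough sense of what such an arrival estimate implies, the sketch below ( a back-of-the-envelope illustration by this report, not anything from the MSNBC article or NOAA ) computes the average speed a CME must sustain to cross the roughly 150,000,000-kilometer Sun-Earth distance in a given transit time; the ~36-hour figure is an assumption read off the article’s “late Tuesday night” eruption and “tomorrow ~7:00 a.m. EST” arrival estimate.

# Illustrative back-of-the-envelope sketch ( assumption: straight-line
# radial travel at constant speed over 1 AU; the transit times are examples ).
AU_KM = 149_597_870.7  # mean Sun-Earth distance in kilometers

def mean_cme_speed_km_s(transit_hours: float) -> float:
    """Average radial speed ( km/s ) needed to cover 1 AU in transit_hours."""
    return AU_KM / (transit_hours * 3600.0)

if __name__ == "__main__":
    for hours in (24, 36, 48, 72):
        print(f"{hours:>3}-hour transit -> ~{mean_cme_speed_km_s(hours):,.0f} km/s")

A ~36-hour transit works out to roughly 1,150 km/s, which is why forecasters describe such storms as fast-moving.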

 

The solar eruptions occurred late Tuesday night ( March 6, 2012 ) when the Sun let loose two ( 2 ) huge X-Class solar flares, the strongest type of Sun storm. The bigger of the two ( 2 ) flares registered as an X5.4-class solar flare on the space weather scale, making it ‘the strongest Sun eruption so far this year’.

 

Typically, Coronal Mass Ejections ( CMEs ) contain 10,000,000,000 ( 10 billion ) tons of solar plasma and material, and the CME triggered by last night’s ( March 6, 2012 ) X5.4-class solar flare is ‘the one’ that could disrupt satellite operations, Kunches said.

 

“When the shock arrives, the expectation is for heightened geomagnetic storm activity and the potential for heightened solar radiation,” Kunches said.

 

This heightened geomagnetic activity and increase in solar radiation could impact satellites in space and ‘power grids on the ground’.

 

Some high-precision GPS ( Global Positioning System ) users could also be affected, he said.

 

“There is the potential for ‘induced currents in power grids’,” Kunches said. “‘Power grid operators have all been alerted’. It could start to ’cause some unwanted induced currents’.”

 

Airplanes that fly over the polar caps could also experience communications issues during this time, and some commercial airliners have already taken precautionary actions, Kunches said.

 

Powerful solar storms can also be hazardous to astronauts in space, and NOAA is working closely with NASA’s Johnson Space Center to determine whether the six ( 6 ) residents of the International Space Station ( ISS ) need to take shelter in more protected areas of the orbiting laboratory, he added.

 

The flurry of recent space weather events could also supercharge aurora displays ( also known as the Northern Lights and Southern Lights ) for sky-watchers at high latitudes.

 

“Auroras are probably the treat that we get when the sun erupts,” Kunches said.

 

Over the next couple days, Kunches estimates that brightened auroras could potentially be seen as far south as the southern Great Lakes region, provided the skies are clear.

 

Yesterday’s ( March 6, 2012 ) solar flares erupted from the giant active sunspot AR1429, which spewed an earlier X1.1-class solar flare on Sunday ( March 4, 2012 ). The CME from that one ( 1 ) outburst mostly missed Earth, passing by last night ( March 6, 2012 ) at around 11:00 p.m. EST, according to the Space Weather Prediction Center ( SWPC ), which is jointly managed by NOAA and the National Weather Service ( NWS ).

 

This means that the planet ( Earth ) is ‘already experiencing heightened geomagnetic and radiation effects in-advance’ of the next oncoming ( March 8, 2012 thru March 9, 2012 ) Coronal Mass Ejection ( CME ).

 

“We’ve got ‘a whole series of things going off’, and ‘they take different times to arrive’, so they’re ‘all piling on top of each other’,” Harlan Spence, an astrophysicist at the University of New Hampshire, told SPACE.com. “It ‘complicates the forecasting and predicting’ because ‘there are always inherent uncertainties with any single event’ but now ‘with multiple events piling on top of one another’, that ‘uncertainty grows’.”

 

Scientists are closely monitoring the situation, particularly because ‘the AR1429 sunspot region remains potent’. “We think ‘there will be more coming’,” Kunches said. “The ‘potential for more activity’ still looms.”

 

As the Sun rotates, ‘the AR1429 region is shifting closer to the central meridian of the solar disk’, where flares and associated Coronal Mass Ejections ( CMEs ) may ‘pack more of a punch’ because ‘they are more directly pointed at Earth’.

 

“The Sun is waking up at a time in the month when ‘Earth is coming into harm’s way’,” Spence said. “Think of these ‘CMEs somewhat like a bullet that is shot from the Sun in more or less a straight line’. ‘When the sunspot is right in the middle of the Sun’, something ‘launched from there is more or less directed right at Earth’. It’s kind of like how getting sideswiped by a car is different than ‘a head-on collision’. Even still, being ‘sideswiped by a big CME can be quite dramatic’.” Spence estimates that ‘sunspot region AR1429 will rotate past the central meridian in about 1-week’.

 

The Sun’s activity ebbs and flows on an 11-year cycle. The Sun is in the midst of Solar Cycle 24, and activity is expected to ramp up toward the height of the Solar Maximum in 2013.

 

Reference

 

http://www.msnbc.msn.com/id/46655901/

 

– – – –

Do we need “Planetary Protection”? NASA maintains a specific website, referenced below, as do others, including The Guardians of the Millennium.

Submitted for review and commentary by,

Concept Activity Research Vault E-MAIL: ConceptActivityResearchVault@Gmail.Com WWW: http://ConceptActivityResearchVault.WordPress.Com

Reference

http://planetaryprotection.nasa.gov/about/ [ Planetary Protection ]
http://www.lpi.usra.edu/captem/ [ CAPTEM ]
http://www.nrl.navy.mil/pao/pressRelease.php?Y=2008&R=39-08r [ U.S. Naval Research Laboratory ]
http://hesperia.gsfc.nasa.gov/sftheory/imager.htm [ RHESSI ]

 

Global Environmental Intelligence ( GEI )
by, Kentron Intellect Research Vault ( KIRV )

TUCSON, Arizona – March 7, 2012 – Nebraska is part of America’s heartland, where fewer than fifty ( 50 ) people are ‘officially designated’ – by the U.S. Air Force Weather Agency ( AFWA ) – to analyze the pulsebeat of ‘doom and gloom’ in ‘scientific terms’, calculated within a new ‘strategic center’ using the latest ultra high-tech computer system, software application program modules and networking capabilities. They track what ‘global government national infrastructures’ expect, based on what the National Aeronautics and Space Administration ( NASA ) warned would hit Earth during the current “Solar Maximum” ( Cycle 24 ): a ‘significant’ “Solar Energetic Particle Event” ( SEPE ) that “we all need to be concerned about.”

What possibly could go wrong?

We’re all protected, aren’t we?

We are ‘not protected against natural events’ that we have ‘no control over’, but governments have been doing their best to monitor activities, which is one ( 1 ) reason ‘why’ the United States Air Force ( USAF ) ‘subordinate organization’ known as the Air Force Weather Agency ( AFWA ) analyzes significant scientific data from horizons above and below the Earth, searching for clues as to what ‘natural significant event’ – a Solar Energetic Particle Event ( SEPE ), a tectonic plate shift ( continental earthquakes ), a magnetic pole shift ( climate change ), or something else as dramatic – will occur next.

The sole “Mission” of the U.S. Air Force Weather Agency ( USAFWA ) makes it the only U.S. government agency ‘officially designated to analyze’ the incredible amounts of scientific doom-and-gloom information the public might otherwise think came from ‘doomsday prophets’ concerned about Earth calamities soon to happen.

Space Weather Earth Forecasting

Official information is received ( 24-hours a day, 365-days a year ) by the U.S. Air Force Weather Agency ( USAFWA ) and its host Directorate of Operational Weather Squadrons ( OWS ) from a group of ‘solar observatories’, ‘satellites’, ‘spacecraft’, ‘interstellar lightwave spectral and radio-frequency ( RF ) imaging sensors’, and ‘telescopes’ with a wide array of ‘cameras’, as well as from ‘human observers’ – all of which continue to report enough worrisome information to scare many of us; ‘officially’, however, the public will never know ‘what’ is about to befall them or ‘when’ it will occur.

After a considerable amount of research into this subject, the U.S. Air Force was caught ‘publicly admitting’ it is able to “know within minutes” ‘what significant Earth Event is coming’ and ‘when it will arrive’ – anywhere from a few hours up to days in advance – but the public will never be informed. Why?

Doom & Gloom or Fantasy Island?

Undoubtedly, classified ( in the ‘interest of national security’ ) would be the official reason given to the public. Some might ask, “But for ‘whose’ security?” People, for the most part, are ‘not secure’ – especially when facing a pending injurious consequence resulting in the ruination of life as they have come to know it – and most gravitate toward wanting to “be with family” ( or “loved ones” ) as any pending cataclysmic Earth Event draws nearer. The U.S. Air Force Weather Agency ( AFWA ), however, realizes that being occupants of planet Earth is ‘not the same thing as’ “Fantasy Island” or a trip to “McDonaldland,” which is how the rest of the public at-large has come to know ‘their version’ of what planet Earth is for ‘them’ and their children.

Preparedness

What may be even more interesting to the public is to review official U.S. data surrounding what U.S. Air Force commands ‘are already preparing for’, what ‘they already have set in-place’ for ‘everyone living in the Continental United States ( CONUS )’, and ‘how they are going to accomplish’ their secret-sensitive “Mission” on ‘people’ throughout the United States. This is ‘definitely not’ ‘science fiction’ or ‘Orwellian Theory’; it is strictly what the U.S. Air Force Weather Agency and another U.S. Air Force ‘subordinate organization’ – the Air and Space Operations ( ASO ), Director of Weather ( USAF Deputy Chief of Staff ) – has ‘recently laid-out publicly’. For the ‘public’ to decipher ‘all of what was so officially presented’, however, was another matter: many were lost in the government psychobabble, resulting in no impact on the public as to the ‘seriousness’ of what they need to ‘begin preparing for’.

Television Commercial ‘Official Advertisement’ Already Warned Public!

Recently ( 2010 and 2011 ), the U.S. Federal Emergency Management Agency ( FEMA ) began airing its own commercial on American television. The FEMA TV ad commercial ‘warns the public to prepare’ for an “unexpected Event” that “can suddenly turn your life – and those around you – ‘upside down’.”

FEMA’s TV commercial advertisement ( see video clip – below ) is easily recalled by those who watched it – depicting a ‘family inside a home’ where all of a sudden ‘everything inside the home’ begins floating up in the air, as though ‘gravity was somehow suddenly lost’. This ‘professionally produced television commercial advertisement’ from FEMA ‘stylishly warns people to prepare’ for an unexpected natural disaster Earth Event; however, FEMA did not describe ‘what type of event’ people should expect.

The saddest part about this is that the U.S. government actually considers this television commercial advertisement an “official public warning to prepare for a national disaster,” while simultaneously hoping the commercial will ‘not’ create ‘panic’ or ‘chaos’ in the streets that would disrupt the ‘skyrocketing profits’ received by the ‘global elite’ and ‘big business’ through their stock market portfolios. In short, this alluding to a national disaster is nothing short of the ‘Biggest Show On Earth’: continuing profits from the basic living expenses saddling the ‘little people’, ‘worthless eaters’, and consumers of “blue gold” ( water ) – which the global elite predict will rise, more so than precious metals, after such an Earth Disaster Event ( EDE ).

Survival By Chance or Circumstance

One of the most worrisome concerns, after reviewing this report ( filled with ‘official U.S. government information’ ), is ‘what’, ‘when’ and ‘how’ U.S. government ‘decision-makers’ will react to their ‘official alert’. Many will undoubtedly and instantly gather with ‘their families’ – but ‘what about the rest of us’ gathering with ‘our families’? We will not be provided the same ‘early alert’, and will not even know until after the significant Earth Event has already taken place! Is this ‘fair’? Is this ‘just’? Is this ‘humane’? Or is this pure and unadulterated ‘elitist selfishness’, by which government affords only a ‘select few’ lucky souls sufficient warning in-advance, additionally telling them where to go and what to do in order to escape the wrath of such a cataclysmic Earth Event? Few people realize that those ‘decision-makers’ have already been instructed in-advance as to ‘what’ they need to do and ‘when’ they need to do it.

AFWA Notifies 350 Military Installations About Earth Disaster Event

Even more frightening is ‘what’ the U.S. military has ‘already been commanded’ to do, and ‘how to accomplish’ its “Mission,” based on a highly-classified U.S. Presidential Executive Order ( EO ) turning the U.S. population over to U.S. military control. The U.S. Air Force Weather Agency ( AFWA ) was also ‘officially designated’ as the “Agency” ‘controlling internet ( cyberspace ) shutdown’, according to the report ( below ) and further research into the capabilities of its un-named ‘subordinate organization’ “Strategic Center.”

Current Reporting Revelations

This report contains ‘no old news’ but ‘new official U.S. government revelations’, which should cause the public to re-think where they are and what is coming. The only information this report does ‘not’ provide publicly are the ‘names’, ‘addresses’, ‘phone numbers’, and ‘photographs’ of those ‘key individuals’ who are ‘already designated’ as ‘ready’, ‘willing’ and ‘able’ to ‘perform as ordered on-command’ – even if that command is ‘unable to be given’ after a cataclysmic Earth Event!

Reviewers Need To Know

There are only two ( 2 ) ‘official U.S. government reports’ provided ( below ), each containing internet links to more ‘official information’ on this subject.

Knowing the public at-large, and how it predominantly enjoys ‘entertaining flowery written articles’ – which it never bothers to ‘research’ any further – this report was purposely kept ‘short and sweet’ ( no flowers – no frills ). It includes an ‘official government article’ that was converted into a ‘very simple outline’, followed by yet another ‘official government article’ left as-is. Hopefully, some may consume this report far easier than others at this website, so its point on this subject may be even better received publicly.

The first ( 1st ) report ( included immediately below ) was ‘thoroughly analyzed’ in its ‘original format’ and then – based on ‘even more detailed official information’ discovered – ‘re-formatted into an even-more proper perspective’; finally, for the sake of making this report far more easily comprehensible for public review, it was ‘converted into a simple outline format’, but only because of the fashion by which government complicated the original public information release – saying everything but not saying it very well for the public to easily understand.

The second ( 2nd ) report ( further below ) was left in its ‘original format’ because anyone reviewing it can quickly understand the nature of the ‘true problems’ surrounding what we all are going to be facing soon enough.

You be the judge as to ‘what’ you will do to ‘prepare’ for the ‘officially expected’ Earth Event. Enjoy the report ( below ) and be sure to click on the embedded links to learn more about ‘what effects’ are ‘coming’ and ‘what’ the U.S. Department of Homeland Security ( DHS ) is preparing for.

Tell a friend about what you review ( below ), and your reply may be, “You can’t trust what you read on the internet.” Anyone who does ‘not trust’ this report ( below ) – or anything else on this website – should click on the ‘official government website links’ ( provided at the bottom of this report, and other links found herein ); if they are equipped with an ‘attention-span’ ( longer than a bug’s ) and are ‘not easily distracted’ by something as simple as a ‘horn honk’, they can ‘research it all’ for themselves and try to prove this information incorrect.

Many will be amazed at how much these official government revelations will help them to prepare for more than ‘entertainment’.

– – – –

Source: United States Air Force ( USAF ) Weather Agency ( AFWA )

Air Force Weather Agency ( AFWA )
106 Peacekeeper Drive, Suite 2
Offutt Air Force Base, Nebraska 68113-4039 USA
TEL: +1 (402) 232-8166 ( Public Affairs )
TEL: 272-8166 ( DSN )

HISTORY –

The United States Air Force Weather Agency ( USAFWA ) – aka – Air Force Weather Agency ( AFWA ) traces its heritage back to World War I, when the U.S. Army ( USA ) ‘subordinate organization’ Signal Corps ( ASC ) and its ‘subordinate organization’ “Meteorological Service” provided ‘weather support’ for U.S. Army defense aircraft pilots.

Amidst the legacy of the U.S. Secretary of War ( known ‘today’ as the U.S. Department of Defense ( DoD ) ), the U.S. Army – its ‘military personnel numbers increasing’ along with an ‘increasing defense stockpile inventory’ of ‘militarized aircraft’ – saw the creation of a ‘then-new subordinate organization’ known as the U.S. Army ( USA ) Air Corps ( AAC ), which became well-known as the U.S. Army Air Corps ( USAAC ).

On July 1, 1937 the U.S. Secretary of War ‘reorganized’ the Meteorological Service ‘out-of’ the U.S. Army ‘subordinate organization’ Signal Corps ( ASC ) and ‘in-to’ the U.S. Army ( USA ) Air Corps as its ‘then-new subordinate organization’ Meteorological Service.

On April 14, 1943 the U.S. Army Air Corps ( USAAC ) ‘subordinate organization’ Meteorological Service ( MS ) was ‘renamed’ “Weather Wing” ( WW ) and ‘physically moved’ to an Asheville, North Carolina location – where ‘today’ the U.S. Air Force Weather Agency ( USAFWA ) has its 14th Operational Weather Squadron ( OWS ) located.

In 1945, the U.S. Army Air Corps ( USAAC ) Weather Wing ( ACWW ) was ‘renamed’ the U.S. Army Air Corps “Weather Service” ( ACWS ).

In early 1946, the U.S. Army Air Corps Weather Service ( ACWS ) was ‘physically moved’ from Asheville, North Carolina to the U.S. Army Air Corps Langley Field Station – where ‘today’ Langley Air Force Base ( LAFB ) is located.

On March 13, 1946 the U.S. Army Air Corps Weather Service ( ACWS ) was ‘reorganized’ under the U.S. Army Air Corps ( USAAC ) ‘new subordinate organization’ Air Transport Command ( ATC ) where its ‘then-new subordinate organization’ “Weather Service” ( ACWS ) was ‘renamed’ “Air Weather Service” ( AWS ).

Later-on, in 1946, the U.S. Army Air Corps ( USAAC ) Air Transport Command ( ATC ) Air Weather Service ( AWS ) was ‘physically moved’ to Gravelly Point, Virginia.

In 1947, the U.S. Army Air Corps ( USAAC ) was ‘renamed’ United States Air Force ( USAF ) whereupon the “Air Weather Service” ( AWS ) became ‘directly under it’ and assumed ‘sole responsibility’ over ‘all global weather reporting’ and ‘all global weather forecasting’ for ‘both’ the U.S. Air Force and U.S. Army.

In 1948, the USAF ‘newly activated’ Military Air Transport Service ( MATS ) – ( known ‘today’ as ) Military Airlift Command ( MAC ) – then-received its ‘newly-transferred subordinate organization’ Air Weather Service ( AWS ) that was ‘removed from being directly under’ USAF ‘headquarters’. USAF Military Air Transport Service ( MATS ) ‘subordinate organization’ Air Weather Service ( AWS ) was then ‘physically moved’ to Andrews Air Force Base ( AAFB ) in Maryland.

In 1958, the USAF Military Airlift Command ( MAC ) ‘subordinate organization’ Air Weather Service ( AWS ) was then ‘physically moved’ to Scott Air Force Base ( SAFB ) in Illinois where – for almost 40-years – it remained.

In 1991, the USAF Military Airlift Command ( MAC ) ‘subordinate organization’ Air Weather Service ( AWS ) was ‘redesignated’ as a “field operating agency” and then ‘reorganized’ directly back under USAF ‘headquarters’.

On October 15, 1997 the USAF Air Weather Service ( AWS ) was ‘redesignated’ as the “Air Force Weather Agency” ( AFWA ) and ‘physically moved’ to Offutt Air Force Base ( OAFB ) in Nebraska where it remains ‘today’ – at least for the moment.

– –

United States Air Force ( USAF )

USAF Weather Agency ( USAFWA )

BUDGET –

$183,000,000 ( $183 million ) U.S. dollars annually, including $98,000,000 ( $98 million ) for ‘Operations’ and ‘Maintenance’.

ESTABLISHED –

October 15, 1997.

LOCATION –

Offutt Air Force Base ( OAFB ) in Nebraska ( USA ).

REPORTING –

USAF Air and Space Operations ( ASO ), Director of Weather ( DOW ), Deputy Chief of Staff ( DCS ).

MISSION –

Enable U.S. decision-makers’ exploitation of all ‘resources’ based-on ‘relevant’ Global Environmental Intelligence ( GEI ) for a ‘global spectrum of military warfare’ by ‘maximizing U.S. power’ over:

– Land;
– Air;
– Space; and,
– Internet ( cyberspace ).

PERSONNEL –

1,400 + manned, by:

– Military ( active-duty and reserve );
– Contractors ( government contracts ); and,
– Civilians.

ORGANIZATION –

– ? ( # ) Agencies ( staff );
– Two ( 2 ) Weather Groups ( WXG – Coordinators );
– Fourteen + ( 14 + ) Weather Squadrons ( OWS – Directorates );
– Five ( 5 ) Observatories ( solar ); and,
– One ( 1 ) Strategic Operation Systems Center ( Unknown – ‘subordinate organization’ ).

– –

USAFWA

Global Environmental Intelligence ( GEI )

DATA –

Global Environmental Intelligence ( GEI ) data is ‘collected’ from a variety of ‘information monitoring sources’ ( Earth-based and space-based locations ) providing a variety of ‘data feed streams’ into a computer-operated system network run out-of a ‘Strategic Center’, which processes indications detecting almost any ‘significant natural negative impact’ that ‘could occur’ on ‘people’ and ‘national infrastructures’ throughout ‘global geographic regions’ and in ‘outerspace’.

SOURCES –

GEI data is provided by many sources, amongst-which its ( subordinate ) Strategic Center processes data through its Computer Operation Systems Network, receiving ‘channeled data streams’ of ‘encoded data telemetry’ sent from ‘ground-based sourced’ and ‘space-based sourced’ detection monitors comprised of:

– Ground Sensors ( infra-red, spectral and sonic );
– Submerged Sensors ( infra-red, spectral and sonic );
– Space-based Sensors ( lightwave, spectral and sonic );
– Long-Range Telescopes ( radio-frequency wave, optical lightwave and sonic );
– Imaging Cameras ( lightwave, spectral and sonic ); and,
– Human Observers ( various means ).
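
Purely as an illustration of how such a source taxonomy might be represented inside a monitoring system ( a sketch by this report, not AFWA software; every name and field below is a hypothetical assumption ), the list above maps naturally onto a small tagged data structure:

# Hypothetical sketch: the GEI source taxonomy above expressed as data.
# Names, fields and platforms are illustrative assumptions, not AFWA systems.
from dataclasses import dataclass

@dataclass(frozen=True)
class SensorSource:
    name: str         # source name from the outline above
    platform: str     # "ground", "submerged", "space", or "human"
    modalities: tuple # sensing modes listed in the outline above

GEI_SOURCES = (
    SensorSource("Ground Sensors",        "ground",    ("infra-red", "spectral", "sonic")),
    SensorSource("Submerged Sensors",     "submerged", ("infra-red", "spectral", "sonic")),
    SensorSource("Space-based Sensors",   "space",     ("lightwave", "spectral", "sonic")),
    SensorSource("Long-Range Telescopes", "space",     ("radio-frequency", "optical", "sonic")),
    SensorSource("Imaging Cameras",       "space",     ("lightwave", "spectral", "sonic")),
    SensorSource("Human Observers",       "human",     ("various",)),
)

# Example query: every source carrying a spectral channel.
spectral = [s.name for s in GEI_SOURCES if "spectral" in s.modalities]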

SPECIAL NOTE:

USAF ASO A6 –

The U.S. Air Force ( USAF ) Air and Space Operations ( ASO ) Communications Directorate ( ” A6 ” ) ‘provides’ the ‘policy oversight’ and ‘planning oversight’ of Command, Control, and Communication Intelligence ( C3I ) for the U.S. Air Force Weather Agency ( USAFWA ).

USAF Air and Space Operations ( ASO ) Communications Directorate ( A6 ) also ‘provides support’ over ‘daily operations’, ‘contingency actions’, and ‘general’ Command, Control, Communication and Computer Intelligence ( C4I ) for the U.S. Air Force Weather Agency ( USAFWA ).

– –

USAFWA

Weather Groups ( WXG ) –

Weather Groups ( WXG ) are specifically designated to coordinate:

– Global Environmental Intelligence ( GEI ); and,
– Operational Weather Squadrons ( OWS ).

– –

USAFWA

Operational Weather Squadrons ( OWS ) –

STATIONS –

A ‘minimum’ of fourteen ( 14 ) Operational Weather Squadrons ( OWS ) are on-station ( active ) 24-hours a day, 7-days a week, 365-days a year to mitigate ( handle ) Global Environmental Intelligence ( GEI ) issues.

Operational Weather Squadrons ( OWS ) – also known as – Weather Squadrons ( WS ) are ‘designated’ to:

– ‘Process’ Global Environmental Intelligence ( GEI ) data; and,
– ‘Provisionally Distribute’ Global Environmental Intelligence ( GEI ) data.

Operational Weather Squadron Global Environmental Intelligence ( OWS GEI )

– –

2nd Weather Group ( WXG )

2ND WXG

OVERSIGHT –

– 2nd ( Strategic Computing Network Center ) Systems Operations Squadron ( SOS ) – Offutt Air Force Base, Nebraska;
– 2nd Operational Weather Squadron ( OWS ) – Offutt Air Force Base, Nebraska; and,
– 14th Operational Weather Squadron ( OWS ) – Asheville, North Carolina.

2ND WXG Global Environmental Intelligence ( GEI )

DISTRIBUTION –

– U.S. Agency ‘decision-makers’;
– U.S. Department of Defense ( DoD ) ‘decision-makers’;
– U.S. Allied Foreign Nation ‘decision-makers’; and,
– U.S. Joint Operations ‘Warfighters’.

SPECIAL NOTE:

USAF ASO A3 and USAF ASO A5 –

USAF Air and Space Operations ( ASO ) Plans and Requirements Directorate ( PRD ) – also known as – ( ” A3 ” and ” A5 ” ) ‘develops’ and ‘maintains’ the ‘concepts of operations’ for how U.S. Air Force Weather Agency ( USAFWA ) – also known as – Air Force Weather ( AFW ) supports “most weather sensitive” areas of ‘joint capabilities’.

REPORTS –

– Timely;
– Relevant; and,
– Specialized.

SOURCES –

Solar Observatories ( Detachments – Det. ), four ( 4 ):

– Det. 1 ( 2ND WXG SO Detachment 1 – Learmonth, Australia );
– Det. 2 ( 2ND WXG SO Detachment 2 – Sagamore Hill, Massachusetts );
– Det. 4 ( 2ND WXG SO Detachment 4 – Holloman Air Force Base, New Mexico ); and,
– Det. 5 ( 2ND WXG SO Detachment 5 – Palehua, Hawaii ).

SPECIAL NOTE:

USAF ASO A3 and USAF ASO A5 –

USAF Air and Space Operations ( ASO ) Plans and Requirements Directorate ( PRD ) – also known as – ( ” A3 ” and ” A5 ” ) ‘works in conjunction with’ MAJCOM ‘functional counterpart users’ of products, data and services supplied from U.S. Air Force Weather Agency ( USAFWA ) – also known as – Air Force Weather ( AFW ).

USAF Air and Space Operations ( ASO ) Plans and Requirements Directorate ( PRD ) – also known as – ( ” A3 ” and ” A5 ” ) is ‘one ( 1 ) agent’ of the ‘lead Command’ for ‘gathering operational requirements’.

USAGES –

– Planning – Military Operation Missions ( global spectrum ); and,
– Executing – Military Operation Missions ( global spectrum ).

SPECIAL NOTE:

USAF ASO A3 and USAF ASO A5 –

USAF Air and Space Operations ( ASO ) Plans and Requirements Directorate ( PRD ) – also known as – ( ” A3 ” and ” A5 ” ) ‘coordinates’ U.S. Air Force Weather Agency ( USAFWA ) – also known as – Air Force Weather ( AFW ) policy issues.

TASKS –

Four ( 4 ) Operational Weather Squadrons ( OWS ) ‘specifically conduct’ Global Environmental Intelligence ( GEI ):

– Warnings ( 24-hours a day / 7-days a week / 365-days a year );
– Military Mission Briefings ( 24-hours a day / 7-days a week / 365-days a year );
– Forecasts ( 24-hours a day / 7-days a week / 365-days a year ); and,
– Analysis ( 24-hours a day / 7-days a week / 365-days a year ).

PROCESSING –

Strategic Center ( cost: $277,000,000 ( $277 million ) ) ‘subordinate organization’, provides:

– Computer System Complex;
– Computer System Network Productions;
– Computer System Applications;
– Computer System Operations;
– Computer System Sustainment; and,
– Computer System Maintenance.

The Air Force Weather Enterprise ( AFWE ) is a ‘computer system’ of the U.S. Department of Defense ( DoD ). AFWE computer system access is ‘restricted’ to members of the United States military ( Active Duty, National Guard, or Reserve Forces ), the U.S. government, or U.S. government contractors that do business with the U.S. government and require ‘weather information’.

SPECIAL NOTE(S):

USAF ASO A8 –

The U.S. Air Force ( USAF ) Air and Space Operations ( ASO ) Strategic Plans and Programs Directorate ( ” A8 ” ) ‘directs’ the ‘planning’, ‘programming’, ‘budgeting’, ‘development’, ‘acquisition’, ‘engineering’, ‘configuration management’, ‘modification’, ‘installation’, ‘integration’, ‘logistics’ and ‘life cycle maintenance support’ over ‘all’ of the ‘computer processing equipment’ and over ‘all’ of the ‘standard weather systems’.

– –

USAFWA

1st Weather Group ( 1ST WXG ) Directorate

1ST WXG

Global Environmental Intelligence ( GEI )

GEI DISTRIBUTION –

The USAF Air Force Weather Agency ( AFWA ) 1st Weather Group ( 1ST WXG ) Directorate uses four ( 4 ) Operational Weather Squadrons ( OWS ), ready 24-hours a day, 7-days a week, 365-days a year, to provide its Global Environmental Intelligence ( GEI ) notification distributions to U.S. ‘military forces’ at 350 ‘specific installations’ located in five ( 5 ) Continental United States ( CONUS ) Regions ( see the regional listing below, summarized in the sketch that follows it ).

OWS GEI NOTIFICATION REGIONS –

SOUTHEASTERN ( CONUS )

9th Operational Weather Squadron ( OWS ) Shaw Air Force Base, South Carolina

9TH OWS – Notifies:

– U.S. Air Force;
– U.S. Air Force Reserve;
– U.S. Air Force National Guard;
– U.S. Army;
– U.S. Army Reserve; and,
– U.S. Army National Guard.

NORTHERN ( CONUS ) AND, NORTHEASTERN ( CONUS )

15th Operational Weather Squadron ( OWS ) Scott Air Force Base, Illinois

15TH OWS – Notifies ( Both Regions ):

– U.S. Air Force;
– U.S. Air Force Reserve;
– U.S. Air Force National Guard;
– U.S. Army;
– U.S. Army Reserve; and,
– U.S. Army National Guard.

WESTERN ( CONUS )

25th Operational Weather Squadron ( OWS ) Davis-Monthan Air Force Base, Tucson, Arizona

25TH OWS – Notifies:

– U.S. Air Force;
– U.S. Air Force Reserve;
– U.S. Air Force National Guard;
– U.S. Army;
– U.S. Army Reserve; and,
– U.S. Army National Guard.

SOUTHERN ( CONUS )

26th Operational Weather Squadron ( OWS ) Barksdale Air Force Base, Louisiana

26TH OWS – Notifies:

– U.S. Air Force;
– U.S. Air Force Reserve;
– U.S. Air Force National Guard;
– U.S. Army;
– U.S. Army Reserve; and,
– U.S. Army National Guard.
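
As noted above, here is the regional listing condensed into a small lookup table – an illustrative summary assembled by this report ( the data structure and its layout are our assumption, not an AFWA data format ):

# Illustrative summary of the CONUS notification regions listed above.
# Dictionary layout is this report's assumption, not AFWA software.
OWS_NOTIFICATION_REGIONS = {
    "Southeastern CONUS":            ("9th OWS",  "Shaw Air Force Base, South Carolina"),
    "Northern / Northeastern CONUS": ("15th OWS", "Scott Air Force Base, Illinois"),
    "Western CONUS":                 ("25th OWS", "Davis-Monthan Air Force Base, Arizona"),
    "Southern CONUS":                ("26th OWS", "Barksdale Air Force Base, Louisiana"),
}

# Every squadron notifies the same six ( 6 ) components:
NOTIFIED_COMPONENTS = (
    "U.S. Air Force", "U.S. Air Force Reserve", "U.S. Air Force National Guard",
    "U.S. Army", "U.S. Army Reserve", "U.S. Army National Guard",
)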

SPECIAL NOTE:

USAF ASO A3 and USAF ASO A5 –

USAF Air and Space Operations ( ASO ) Plans and Requirements Directorate ( PRD ) – also known as – ( ” A3 ” and ” A5 ” ) ‘works with’ USAF Air and Space Operations ( ASO ) staff to integrate U.S. Air Force Weather Agency ( USAFWA ) – also known as – Air Force Weather ( AFW ) Continental Operations ( CONOPS ) with U.S. Air Force ‘plans’ and ‘Programs’.

USAF Air and Space Operations ( ASO ) Plans and Requirements Directorate ( PRD ) – also known as – ( ” A3 ” and ” A5 ” ) ‘assists’ in the ‘exploitation of weather information’ for ‘warfighting operations’.

– –

The aforementioned four ( 4 ) Operational Weather Squadrons ( OWS ) additionally provide:

– USAF Weather Agency officer ‘upgrade’ training; and,
– Apprentice forecaster ‘initial’ qualification.

– –

The USAFWA Manpower & Personnel Directorate ( A1 ) ‘provides’ the global spectrum of:

– Manpower;
– Organization;
– Personnel; and,
– Training.

SPECIAL NOTE:

USAF ASO A3 and USAF ASO A5 –

USAF Air and Space Operations ( ASO ) Plans and Requirements Directorate ( PRD ) – also known as – ( ” A3 ” and ” A5 ” ) ‘oversees’ and ‘executes’ U.S. Air Force Weather Agency ( USAFWA ) – also known as – Air Force Weather ( AFW ) Standardization and Evaluation Program for ‘Weather Operations’.

U.S. Air Force ( USAF ) Air and Space Operations ( ASO ) Plans and Requirements Directorate ( PRD ) – also known as – ( ” A3 ” and ” A5 ” ) ‘assists’ the USAF Air and Space Operations ( ASO ) staff with managing the U.S. Air Force Weather Agency ( USAFWA ) – also known as – Air Force Weather ( AFW ) ‘process’ of ‘career field-training’, by ‘obtaining training’ and ‘implementing training’ to meet ‘career field-training requirements’.

– –

USAF

Air Force Combat Weather Center ( USAFCWC – aka – AFCWC )

The USAF Combat Weather Center ( Hurlburt Field, Florida ) ‘develops’, ‘evaluates’, ‘exploits’ and ‘implements’ a variety of ‘new tactics’, ‘new techniques’, ‘new procedures’ and ‘new technologies’ across the U.S. Air Force Weather Agency ( USAFWA ) – also known as – Air Force Weather ( AFW ), ‘enhancing effectiveness’ for the following U.S. military force branches:

– U.S. Special Forces;
– U.S. Joint Operations;
– U.S. Combined Operations;
– U.S. Air Force; and,
– U.S. Army.

– – – –

Source: Global Security

Space

AFNS

Space Weather Team Readies For Upcoming Solar Max
by, Ryan Hansen ( 55th Wing Public Affairs )

February 25, 2011

NEBRASKA, Offutt Air Force Base – February 25, 2011 – Solar Maximum may sound like the name of a super hero, but it’s certainly no comic book or 3-D movie. Solar Maximum is actually the name for the Sun’s most active period in the solar cycle, consistently producing solar emissions, solar flares and sun spots.

For a little background on the Sun’s activities, the star goes through roughly 11-year cycles in which it is very active and then relatively calm. The Sun’s last Solar Maximum occurred in 2000, and it is expected to awaken from its current Solar Minimum and become more active this year.

According to the members of the 2nd Weather Squadron ( WS ), an active sun can cause all sorts of problems for us. “Solar weather plays a huge part in the warfighter’s mission,” said Staff Sgt. Matthew Money, a forecaster with the space weather flight. “Impacts from solar weather can cause radio blackouts, satellite communication failure, satellite orbit changes, satellite surface charging, or short circuits, and radar clutter.”

That is why the squadron’s worldwide space weather team of roughly 50 active-duty members, civilians and contractors continually analyze, forecast and provide alert notifications for the entire U.S. Department of Defense ( DoD ), as well as a slew of other government agencies.

“When ‘space weather’ causes impacts to Earth that meet or exceed warning thresholds, our end users are informed within minutes,” said Staff Sgt. Jonathan Lash, space weather flight forecaster. “We send out warning bulletins through a computerized distribution system, [ and ] we have other graphical products that show what happened in the past 6-hours around the globe, as well as what we expect to happen in the upcoming 6-hours,” he said.

Members of the 2nd Weather Squadron [ OWS ] rely on five ( 5 ) ground-based Solar Observatories, as well as a network of satellites orbiting the earth, to accomplish their mission.

“There aren’t too many opportunities to be the Air Force’s sole provider of something,” said Lt. Colonel Jim Jones, 2nd Weather Squadron [ OWS ] Commander. “In this case, the mission is unique to the entire DoD.”

Solar Observatories are ‘strategically placed’ around the globe in such places as:

– Australia;
– Hawaii;
– Italy;
– Massachusetts; and,
– New Mexico.

They include both optical telescopes and radio telescopes and ensure the Weather Squadron always has one eye, or ear, on the sun.

“The optical telescope network monitors solar surface features,” said Master Sgt. Shane McIntire, the space weather flight chief. “It automatically tracks the sun and directs light to the instruments, which collect data and are controlled by computers. It scans specific regions at a rate of at least twice per minute.”

Through filtered lenses space weather analysts are able to perform flare patrol and view sunspots to determine the magnetic complexity of the region.

“The telescope has special filters that isolate a single optical wavelength,” said Master Sgt. Shane Siebert, who leads the Detachment 4 solar observatory for the 2nd Weather Squadron at Holloman Air Force Base, New Mexico. “This wavelength ( 6563 angstroms ) is called ‘hydrogen alpha’ ( H-Alpha ), where the majority of solar activity occurs,” he said. “Analysts monitor this wavelength from sunrise to sunset, and are looking for specific signatures that may lead to solar flares and other adverse activity.”
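
For readers unfamiliar with angstroms, the small worked conversion below ( our illustration, not part of the Air Force article ) expresses the 6563-angstrom hydrogen-alpha line as a frequency using the standard relation c = λf:

# Worked conversion: 6563 angstroms ( H-alpha ) to frequency via c = wavelength * frequency.
C_M_PER_S = 299_792_458.0    # speed of light in vacuum, m/s
H_ALPHA_ANGSTROMS = 6563.0   # wavelength quoted in the article

wavelength_m = H_ALPHA_ANGSTROMS * 1e-10           # 1 angstrom = 1e-10 m
frequency_hz = C_M_PER_S / wavelength_m
print(f"H-alpha ~ {frequency_hz / 1e12:.0f} THz")  # roughly 457 THz, deep red light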

But not all of the sun’s activities can be captured using optical telescopes.

Some events have a unique radio-frequency signature that can also be measured.

Using a mixture of technology from the 1970s to the present day, radio observatories are able to monitor frequencies in the 25 megahertz ( MHz ) to 180 megahertz ( MHz ) range, as well as eight ( 8 ) other discrete frequencies. Their digitized output is collected by a computer and then processed and analyzed for solar activity.
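
The article does not say how that processing works; as a minimal sketch of one common approach ( entirely our assumption, not the squadron’s software ), band-integrated power can be compared against a running quiet-Sun baseline and flagged when it stays elevated:

# Minimal sketch, assuming threshold-over-baseline screening of digitized
# radio power. Window lengths and thresholds are illustrative assumptions.
import numpy as np

def flag_bursts(power_db, baseline_win=600, threshold_db=6.0, min_run=5):
    """Return onset indices where band power exceeds a running quiet-Sun baseline.

    power_db     : 1-D array of band-integrated power samples ( dB )
    baseline_win : trailing samples used to estimate the quiet-Sun baseline
    threshold_db : enhancement over baseline that counts as burst-like
    min_run      : consecutive flagged samples required to report a burst
    """
    flags = np.zeros(len(power_db), dtype=bool)
    for i in range(baseline_win, len(power_db)):
        baseline = np.median(power_db[i - baseline_win:i])
        flags[i] = power_db[i] - baseline >= threshold_db
    onsets, run = [], 0
    for i, f in enumerate(flags):
        run = run + 1 if f else 0
        if run == min_run:
            onsets.append(i - min_run + 1)  # burst onset index
    return onsets

# Example: one simulated hour of 1-second samples with a synthetic burst.
rng = np.random.default_rng(0)
power = 10.0 + rng.normal(0.0, 0.5, 3600)
power[1800:1900] += 12.0        # +12 dB enhancement for 100 seconds
print(flag_bursts(power))       # -> [1800]

Real burst classification goes further – distinguishing start time, duration, intensity and type, as Major Harbaugh describes below – but the threshold-over-baseline idea is the simplest version of “processed and analyzed for solar activity.”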

“We actually are able to detect the specific strength at a given radio frequency,” said Major Bradley Harbaugh, who commands the Detachment 5 solar observatory for the 2nd Weather Squadron [ OWS ] at Palehua, Hawaii. “What we detect are energetic solar emissions in [ specific ] frequency bands or ranges. When detected, we [ are able to describe ] the start time, duration, intensity and type of solar emission. This helps describe the potential impacts by identifying the characteristics of what may impact Earth,” he said.

Identifying these solar emissions is crucial to warfighter communication abilities.

“If there is solar energy that increases on your frequency, you can try to talk into your radio, but the noise from the Sun will be stronger than your transmission, therefore drowning out what you are saying,” Major Harbaugh said. “As an operator, you can increase your radio power to try and ‘out-broadcast’ the Sun, but you are also now broadcasting over a much larger area, making your transmission more susceptible to enemy detection. Therefore, the Sun’s impact must be a consideration when planning a mission.”

The Weather Squadron network of satellites includes those owned and operated by the U.S. Department of Defense ( DoD ), the National Aeronautics and Space Administration ( NASA ) and the National Oceanic and Atmospheric Administration ( NOAA ) – a combination of systems dedicated solely to space weather, as well as a few utilizing space weather sensors.

“We gather a significant amount of data from satellites,” Sergeant McIntire said. “Imagery from [ satellites ] can augment the ground-based network, providing real-time monitoring of solar features at wavelengths that can’t be seen from the ground.”

Data from all of these sources, combined, are continually pushed to the space weather operations center at the U.S. Air Force Weather Agency ( USAFWA ) here.

With this information in hand, the Weather Squadron can produce the most reliable space weather forecast possible; however, even with all of this data, producing a space weather forecast is still much more difficult than creating one for terrestrial weather.

“Space weather is a terribly difficult science and it takes a lot of training and experience,” Colonel Jones said.

“Space weather forecasting is very reactive,” Sergeant Money said. “The ‘knowledge and tools are not quite up to par in order to do accurate forecasting’ like we do here on Earth.”

It is also important to note that today the world is much more reliant on space-based assets than it was during the last Solar Maximum, officials said. With cellphones, portable navigation devices and satellite television receivers all part of our daily lives, a huge solar weather event could wreak havoc on quite a few different platforms.

“The impact of a solar storm in 2000 was probably not as great, due to the lower density of space technology and the limited number of consumers utilizing the data,” Major Harbaugh said. “However, the ripple from a ‘major solar event now will more likely be felt across a much broader consumer base’ [ the public ] since there are many more assets and many more users of space data.”

However, with improved technology and an increased knowledge of the sun’s activities, the Weather Squadron ( WS ) is more prepared than ever for the upcoming Solar Maximum, Colonel Jones said. “Since the last solar maximum, we’ve upgraded most of our numerical models in terms of both their basic science and the data they ingest,” he said. “That’s a direct result of the advances in sensors and the technology that enables rapid data transfer. We can react faster and see farther than ever before.”

“We already have members within the unit developing forecast techniques based on signatures we see on the sensors,” Sergeant Money said.

So it’s a safe bet that ‘the next 2-years’ will be ‘hectic’ for the 2nd Weather Squadron [ OWS ].

Their mission – to provide ‘situational awareness’ to key ‘decision-makers’ – will certainly keep ‘them’ busy.

“In the last 30-days alone ( February 2011 ), we’ve had [ more than 30 ] reportable [ solar ] energy events,” Major Harbaugh said. “The workload has ‘already increased’ and ‘will continue to do so’ for probably ‘the next 1-year’ or ’2-years’.”

“About 1-year ago, it was not uncommon for an analyst to have only one ( 1 ) very small Solar Region of the Sun to monitor,” Sergeant Siebert said. “Today, it is normal for analysts to keep fairly busy monitoring 4 Solar Regions to 6 Solar Regions. Studies of the last Solar Maximum show that a typical day included twenty-two ( 22 ) active Solar Regions – almost 4 times our current workload,” he added.

Regardless, Weather Squadron [ OWS ] ‘space weather’ analysts, forecasters and technicians globally are ready for the upcoming solar fury [ see, e.g. Solar Energetic Particle Event ], Colonel Jones said.

– – – –

The U.S. Air Force obviously needs far more:

– Personnel for U.S. Air Force Weather Agency ( USAFWA ) Weather Group ( WXG ) Operational Weather Squadron ( OWS ) efforts;

– An independent ‘governing board’ or ‘blue ribbon watchdog committee’ overseeing an ‘independent monitoring agency’ that places the U.S. Air Force Weather Agency ( AFWA ) and its USAF Air and Space Operations ( ASO ) under an independent ‘microscope’; and,

– ‘Public transparency’ notifications would also be nice, but then…

– – – –

Source: MSNBC.COM

Huge Solar Flare’s Magnetic Storm May Disrupt Satellites, Power Grids

March 7, 2012 13:19 p.m. Eastern Standard Time ( EST )

A massive solar flare that erupted from the Sun late Tuesday ( March 6, 2012 ) is unleashing one of the most powerful solar storms in more than 5-years, ‘a solar tempest that may potentially interfere with satellites in orbit and power grids when it reaches Earth’.

“Space weather has gotten very interesting over the last 24 hours,” Joseph Kunches, a space weather scientist at the National Oceanic and Atmospheric Administration ( NOAA ), told reporters today ( March 7, 2012 ). “This was quite the Super Tuesday — you bet.”

Several NASA spacecraft caught videos of the solar flare as it hurled a wave of solar plasma and charged particles, called a Coronal mass Ejection ( CME ), into space. The CME is not expected to hit Earth directly, but the cloud of charged particles could deliver a glancing blow to the planet.

Early predictions estimate that the Coronal Mass Ejections ( CMEs ) will reach Earth tomorrow ( March 8, 2012 ) at 07:00 a.m. Eastern Standard Time ( EST ), with the ‘effects likely lasting for 24-hours and possibly lingering into Friday ( March 9, 2012 )’, Kunches said.

The solar eruptions occurred late Tuesday night ( March 6, 2012 ) when the sun let loose two ( 2 ) huge X-Class solar flares that ‘ranked among the strongest type’ of sun storms. The biggest of those 2 flares registered as an X Class Category 5.4 solar flare geomagnetic storm on the space weather scale, making it ‘the strongest sun eruption so far this year’.

Typically, Coronal Mass Ejections ( CMEs ) contain 10,000,000,000 billion tons of solar plasma and material, and the CME triggered by last night’s ( March 6, 2012 ) X-Class Category 5.4 solar flare is ‘the one’ that could disrupt satellite operations, Kunches said.

“When the shock arrives, the expectation is for heightened geomagnetic storm activity and the potential for heightened solar radiation,” Kunches said.

This heightened geomagnetic activity and increase in solar radiation could impact satellites in space and ‘power grids on the ground’.

Some high-precision GPS ( Global Positioning Satellite ) users could also be affected, he said.

“There is the potential for ‘induced currents in power grids’,” Kunches said. “‘Power grid operators have all been alerted’. It could start to ’cause some unwanted induced currents’.”

Airplanes that fly over the polar caps could also experience communications issues during this time, and some commercial airliners have already taken precautionary actions, Kunches said.

Powerful solar storms can also be hazardous to astronauts in space, and NOAA is working close with NASA’s Johnson Space Center to determine if the six ( 6 ) spacecraft residents of the International Space Station ( ISS ) need to take shelter in more protected areas of the orbiting laboratory, he added.

The flurry of recent space weather events could also supercharge aurora displays ( also known as the Northern Lights and Southern Lights ) for sky-watchers at high latitudes.

“Auroras are probably the treat that we get when the sun erupts,” Kunches said.

Over the next couple days, Kunches estimates that brightened auroras could potentially be seen as far south as the southern Great Lakes region, provided the skies are clear.

Yesterday’s ( March 6, 2012 ) solar flares erupted from the giant active sunspot AR1429, which spewed an earlier X Class Category 1.1 solar flare on Sunday ( March 4, 2012 ). The CME from that one ( 1 ) outburst mostly missed Earth, passing Earth by last night ( March 6, 2012 ) at around 11 p.m. EST, according to the Space Weather Prediction Center ( SWPC ), which is jointly managed by NOAA and the National Weather Service ( NWS ).

This means that the planet ( Earth ) is ‘already experiencing heightened geomagnetic and radiation effects in advance’ of the next oncoming ( March 8, 2012 through March 9, 2012 ) Coronal Mass Ejection ( CME ).

“We’ve got ‘a whole series of things going off’, and ‘they take different times to arrive’, so they’re ‘all piling on top of each other’,” Harlan Spence, an astrophysicist at the University of New Hampshire, told SPACE.com. “It ‘complicates the forecasting and predicting’ because ‘there are always inherent uncertainties with any single event’ but now ‘with multiple events piling on top of one another’, that ‘uncertainty grows’.”

Scientists are closely monitoring the situation, particularly because ‘the AR1429 sunspot region remains potent’. “We think ‘there will be more coming’,” Kunches said. “The ‘potential for more activity’ still looms.”

As the Sun rotates, the AR1429 region is shifting closer to the central meridian of the solar disk, where flares and associated Coronal Mass Ejections ( CMEs ) may ‘pack more of a punch’ because ‘they are more directly pointed at Earth’.

“The Sun is waking up at a time in the month when ‘Earth is coming into harm’s way’,” Spence said. “Think of these ‘CMEs somewhat like a bullet that is shot from the sun in more or less a straight line’. ‘When the sunspot is right in the middle of the sun’, something ‘launched from there is more or less directed right at Earth’. It’s kind of like how getting sideswiped by a car is different than ‘a head-on collision’. Even still, being ‘sideswiped by a big CME can be quite dramatic’.” Spence estimates that ‘sunspot region AR1429 will rotate past the central meridian in about 1-week’.

The sun’s activity ebbs and flows on an 11-year cycle. The sun is in the midst of Solar Cycle 24, and activity is expected to ramp up toward the height of the solar maximum in 2013.

Reference

http://www.msnbc.msn.com/id/46655901/

– – – –

Cordially submitted for review and commentary by,

 

Kentron Intellect Research Vault ( KIRV )
E-MAIL: KentronIntellectResearchVault@Gmail.Com
WWW: http://KentronIntellectResearchVault.WordPress.Com



NonReferenceable Objects ( NRO )

[ PHOTO #1 ( above ): U.S. Navy missile launched from Pacific Ocean undersea channel dugout cliff holes near Santa Catalina island November 8, 2010, or as officials claimed it was only a passenger airline jetstream from Hawaii? ( click to enlarge ) ]

[ PHOTO #2 ( above ): U.S. Navy missile launched from Pacific Ocean undersea channel dugout cliff holes near Santa Catalina island November 8, 2010, or as officials claimed it was only a passenger airline jetstream from Hawaii? ( click to enlarge ) ]


NonReferenceable Objects ( NRO )
Harvesting Extra-Superconductive Magnetic Element Properties
by, Concept Activity Research Vault ( CARV / Paul Collin )

December 7, 2011 16:00:42 ( PST ) Updated ( Originally Published: November 22, 2010 )

CALIFORNIA, Los Angeles, San Nicholas Island – December 7, 2011 – Depends on how one looks at it, but amongst southern California islands, Santa Catalina rests about 23-miles offshore where the deadly Pacific Ocean channel is deeper than Mount Everest is high.

This Pacific Ocean channel is also home to the world’s largest Great White shark species, and for decades – according to some southern California residents – Santa Catalina island has been suspected of harboring offshore resident extraterrestrial aliens ( also known as ‘non-referenceable aliens’ ) seen with Unidentified Submersible Objects ( USO ), Unidentified Flying Objects ( UFO ), and / or Non-Referenceable Objects ( NRO ) diving in and out of this sea passageway to their suspected deep sea base.

While there has been no mention of what these extraterrestrial aliens might be doing underwater, a few believe there is a distinct – although remote – possibility they [ non-referenceable biological entities ] may be utilizing their Non-Referenceable Objects ( NRO ), inter alia Unidentified Submersible Objects ( USO ), to harvest a planet Earth natural resource: a ‘super-magnetic fluid’ easily removed by them from ultra-deep sea volcano molten minerals, from which they extract only the super-magnetic properties, after which they jettison the remaining molten lava.

While that may seem too unbelievable, you are urged to pause before making any final determinations about this report, because the U.S. National Oceanic and Atmospheric Administration ( NOAA ) Ocean Sciences and Technology ( OST ) department, in the ultra-deep sea below the Pacific Ocean far west of the California island of Santa Catalina, is right on target with what some believe to be aliens from outer space reaches beyond the Earth.

The 2007 New Zealand American Submarine Ring of Fire ( NZASRoF or NZASRF ) expedition used the AUV ( Autonomous Underwater Vehicle ) ABE ( Autonomous Benthic Explorer ) to map in detail the ultra-deep sea Brothers volcano, a rhyodacite ( siliceous ) explosive volcano rising more than 2,600 feet ( 800 meters ) above its surrounding deep-sea floor, its caldera rim located about 1-mile ( 1,500 meters ) beneath the surface of the Pacific Ocean. Brothers erupted with so much explosive force – carrying volatile energy and volume – that it partially emptied its ‘magma chamber’, and its collapsed caldera area now measures 1.6 nautical miles ( 3 kilometers ) by ( x ) 2.2 nautical miles ( 4 kilometers ) to a depth of 300 meters below the volcano rim.
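
As a quick sanity check, the unit conversions in that description can be verified with simple arithmetic – a minimal Python sketch ( nothing here comes from the expedition itself ):

```python
# Spot-checking the Brothers volcano unit conversions quoted above.
FT_TO_M = 0.3048         # feet to meters
NMI_TO_KM = 1.852        # nautical miles to kilometers
MI_TO_M = 1609.344       # statute miles to meters

print(f"2,600 feet         = {2600 * FT_TO_M:7.0f} m")    # ~792 m ( text: 800 meters )
print(f"1 statute mile     = {MI_TO_M:7.0f} m")           # ~1,609 m ( text pairs it with 1,500 meters )
print(f"1.6 nautical miles = {1.6 * NMI_TO_KM:7.1f} km")  # ~3.0 km
print(f"2.2 nautical miles = {2.2 * NMI_TO_KM:7.1f} km")  # ~4.1 km
```

The nautical-mile and feet-to-meter figures check out; the ‘1-mile ( 1,500 meters )’ pairing is only approximate, since 1,500 meters is about 0.93 statute miles.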

Brothers volcano, however, has more to offer out of the ultra-deep Pacific Ocean: its ‘large magnetic anomalies’ register many hundreds of nanoteslas ( a unit of magnetic field strength ) from its volcanic lava, and just southeast sit ‘several volcanic domes’ ( ‘slightly less explosive’ than the major Brothers caldera ) that are ‘younger’ and ‘larger volcanic lava bodies’ associated with ‘greater magnetic anomalies’, expected to hold a ‘strong magnetic anomaly signature’.

During this expedition, as on many traditional marine geophysical surveys, the research vessel ( R/V ) Sonne ‘towed’ a magnetometer from the sea surface – dipped into, but not far below, the surface of the sea – to roughly measure ‘magnetic field amplitudes’ far above the buried volcanic lava domes, a ‘magnetic anomaly survey’ meant to ‘identify recent lava flows’. The Brothers volcano nearby domes in the southeast Pacific Ocean, however, lay 1,500 to 2,500 meters beneath any surface-towed magnetometer, which can only provide a ‘very broad kilometer scale’ reading of ‘magnetic anomaly variations’ – nothing like readings taken near the ‘actual volcanic structure’. For a much higher resolution of magnetic anomaly variables, a more precise ‘magnetic anomaly information’ survey can be performed with the Autonomous Benthic Explorer ( ABE ), an Autonomous Underwater Vehicle ( AUV ) that performs a series of ultra-deep sea dives and flights ( 50 meters above the volcano domes ) along parallel lines while collecting ‘magnetic anomaly measurements’ from an onboard ‘fluxgate magnetometer’ ( the size of a hand palm ). These swath fly-overs also collect ‘bathymetry’, ‘conductivity’, ‘temperature’, and ‘chemical’ instrument readings used to estimate the ‘magnetization of the volcano’, correlated with what is seen in bathymetry and ‘volcanic flow unit identification’ images.
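
The resolution argument is easy to illustrate: a compact magnetized dome behaves roughly like a dipole, whose anomaly amplitude falls off as the cube of distance from the source. A minimal Python sketch, assuming an illustrative 500 nanotesla anomaly at the AUV’s 50-meter altitude ( neither number is an expedition measurement ):

```python
# Why a near-bottom AUV magnetometer out-resolves a sea-surface tow:
# a dipole-like source's anomaly amplitude falls off as 1/r^3.

def dipole_amplitude(amp_ref_nt: float, ref_m: float, r_m: float) -> float:
    """Scale a dipole anomaly from reference distance ref_m out to r_m."""
    return amp_ref_nt * (ref_m / r_m) ** 3

ASSUMED_ANOMALY_AT_50M_NT = 500.0   # illustrative value, not measured

for r_m in (50, 1_500, 2_500):      # AUV altitude vs. surface-tow distances
    amp = dipole_amplitude(ASSUMED_ANOMALY_AT_50M_NT, 50, r_m)
    print(f"{r_m:>5} m above the dome: {amp:9.4f} nT")
# 50 m -> 500 nT ; 1,500 m -> ~0.019 nT ; 2,500 m -> ~0.004 nT
```

A surface tow 1,500 to 2,500 meters above the domes therefore sees only a few hundred-thousandths of the signal available 50 meters off the bottom, which is why the ABE fly-overs yield such higher resolution.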

The Brothers volcano caldera area sees ‘geothermal vents’ of lava spewing a residual ‘high temperature demagnetization fluid’ that ‘alters the magnetic chemistry’ of ‘titanomagnetite’, a naturally super-magnetic mineral; after this ‘geothermal alteration’, even titanomagnetite is rendered far less magnetic. This ‘high-temperature demagnetizing fluid alteration’ can be found even within ‘magnetic dead zones’ where both ‘active’ and ‘inactive’ as well as ‘young’ and ‘old’ lava vent sites exist.

On land, studies of similar collapsed volcano calderas ( depressions ) suggest the calderas filled with deposits of low-density, vesicular pyroclastic material.

References provided, click on: http://upintelligence.wordpress.com/2010/12/09/secret-hfse-properties-part-1/ and http://oceanexplorer.noaa.gov/explorations/07fire/logs/aug1/aug1.html to begin learning more.

Who says that all intelligent biological entities unknown to us are not of this world but only from outer space? We already know that the ultra-deep sea holds biological entities foreign to us; what we ‘do not know’ is the extent of ultra-deep sea biological entity species. Does the simple fact that ‘we do not know about all of them’ render them ‘aliens’ on Earth? No. What if there are ‘advanced biological entities’ of the ultra-deep sea? What if these creatures were on Earth before us? If that proves to be the case, then all human beings could very well be the new ‘aliens’ on Earth. Keyword being ‘new’ – so, just how new makes a ‘new alien’ on Earth?

In 1989, a large Unidentified Submersible Object ( USO ) was discovered floating on the surface of the Pacific Ocean, according to eyewitnesses aboard an ocean-going vessel that tracked the UFO / USO underwater on sonar.

This same USO was witnessed releasing what appeared to be several smaller Unidentified Submersible Objects ( USO ), then submerged on a heading southwest beyond Santa Catalina island, where it disappeared off sonar tracking. Did this USO travel underwater to the Mariana Trench or to some as-yet-unidentified sea base location?

On June 14, 1992 more than two-hundred ( 200 ) Unidentified Flying Objects ( UFO ) interalia Unidentified Submersible Objects ( USO ) were witnessed by at least twenty-seven ( 27 ) residents of Los Angeles County in southern California, according to MUFON ( Los Angeles, California ) chapter UFO / USO researcher Preston Dennett.

These UFO / USO Non-Referenceable Objects ( NRO ), however, were not ‘hovering in the sky’ but ‘rising up out of the Pacific Ocean channel near Santa Catalina island’ west of Santa Monica, California, according to resident witnesses positioned less than 10-miles east in the foothills of Los Angeles.

These two-hundred ( 200 ) UFO / USO craft, which came out of the Santa Catalina island channel of the Pacific Ocean for several seconds, gathered into several cluster formations ( small squadrons ) before each set shot off upward like missiles into the southwest sky beyond southern California’s Santa Catalina island.

Were these two-hundred ( 200 ) Non-Referenceable Objects ( NRO ) – all headed southwest by air in the same direction as the Mariana Trench in the Pacific Ocean – bound for that trench, or were they travelling far beyond it and the atmosphere of Earth?

Worried local residents – including those from Santa Monica, California to as far south as Malibu, California – filed UFO / USO eyewitness reports by telephone with switchboard operators of the Los Angeles County Sheriff’s Department ( LASD ), the Santa Monica Police Department ( SMPD ), as well as with other officials.

One ( 1 ) caller, who was recorded by a law enforcement operator, sounded hesitant, perhaps embarrassed, but nevertheless concerned enough to report the unidentifiable objects he had just witnessed.

The wealth of these local resident eyewitness report descriptions was fed by local officials to officials of the federal United States government, amongst which was the U.S. Coast Guard ( USCG ), which reportedly refused to honor a request to search the Pacific Ocean area near Catalina island for any trace remnant discharges any of the two-hundred ( 200 ) UFOs / USOs may have left or jettisoned.

On December 25, 2004 in southern California adjacent to Santa Catalina island the Long Beach Police Department ( LBPD ) had their own mid-air encounter with a UFO / USO officially recorded on their police helicopter Forward Looking Infra-Red ( FLIR ) camera ( see video below ):

During July 2009 in Santa Monica, California another UFO / USO – resembling the UFO / USO from the Long Beach Police Department helicopter encounter ( video above ) – was filmed in color from the coastal City of Santa Monica, California just off the west coast Pacific Ocean channel near Santa Catalina island ( see video below ):

On November 8, 2010 another Unidentified Flying Object ( UFO ) and / or Unidentified Submersible Object ( USO ) was sighted in southern California near Santa Catalina island, witnessed and filmed from aboard a southern California City of Los Angeles television ( KCBS TV ) news station helicopter ( see video below ):

The UFO / USO filmed ( above ) by the news station helicopter was denied by ‘all’ official United States federal government agencies, which claimed they had no knowledge of it and had nothing to do with the Unidentified Flying Object ( UFO ) sighted.

Interestingly, after U.S. federal officials interviewed the KCBS TV news station helicopter reporter, the same reporter then disavowed what he had previously reported, agreeing with U.S. federal officials that what he had seen filmed was just ice crystals glistening off a ‘small airplane’ travelling into the horizon during sunset.

Did this KCBS-TV news reporter undergo U.S. federal government official embedding? Why would the KCBS-TV news reporter change what he previously witnessed and then reported to the greater southern California Los Angeles news area public?

What about the KCBS-TV news helicopter video ( above )?

Listen very carefully to precisely what the news helicopter reporter describes ( below ) in the most ‘general of terms’:

Was the KCBS-TV Sky news helicopter video ‘interrupted’ or ‘earlier replaced’ by ‘decoy film footage’ to ‘disguise what was really being witnessed’ and to ‘match the general descriptions’ reported by the KCBS-TV Sky 2 helicopter news reporter?

Watch what former U.S. federal government Department of Defense ( DoD ) military officials, who have decades of experience, say about what was sighted off the coast of southern California near Catalina island ( below ):

Now compare what civilian reporters said – agreeing with ‘official U.S. government cover story lines’ they later reported ( below ):

If the UFO / USO video was implanted or switched, might it have appeared more like the films ( further above )? One may no longer wonder what was actually seen coming out-of the Pacific Ocean near Santa Catalina island. Then again, the UFO / USO may have originated near another southern California island, San Nicholas island ( see video below ).

The U.S. National Reconnaissance Office ( NRO ) may have a purposeful misnomer embedded within its initials. Historical research shows that this U.S. federal government entity – charged with ‘owning the night’ – refers to Unidentified Flying Objects ( UFO ) as being ‘objects’ of “Non-Reference”, or Non-Referenceable Objects ( NRO ).

What if the U.S. National Reconnaissance Office ( NRO ) actually stands for the U.S. Non-Referenceables Office ( NRO ) charged with reconnaissance over Non-Referenceable Objects ( NRO ) “Who Rule The Night?” Learn a little something more about U.S. federal government clearance levels and what surrounds the NRO ( video below ):

Now, if those following this report are ‘thoroughly convinced’ and ‘absolutely sure’ that what is being discussed here has been cleverly twisted around into something less than factual, the following ( below ) serious speech given by the recently deceased former Minister of Defense for Canada may come to some as a rather shocking surprise:

The following five ( 5 ) color video mini-series ( below ) introduces Unidentified Submersible Objects ( USO ), also known as Non-Referenceable Objects ( NRO ):

The Truth Is Out There Amidst The Abyss Below

Far west of the southern California islands, in the Pacific Ocean, lies the Mariana Trench, where on May 24, 2009 an ultra-deep sea discovery project [ http://geology.com/press-release/deepest-part-of-the-ocean/ ] documented videos of ‘self-illuminating ultra-deep sea creatures’ using no ‘artificial light’ but a ‘natural bioluminescent process’ known as ‘counterillumination’, which is really something one needs to ‘see to believe’ ( below ):

While the 2009 video ( above ) provides documented factual evidence of what truly exists ( i.e. bioluminescent sea creatures ) in the ultra-deep sea, another video clip ( below ), filmed twenty ( 20 ) years earlier in 1988, displayed bioluminescent sea creatures in the ultra-deep sea science-fiction motion picture film “The Abyss.” Perhaps, in another 20-years ( more or less ), this insight may allow the public to even better fathom and understand what lies in the ultra-deep sea ( below ):

Now equipped with a new set of eyes, one wishing to go even deeper might do their own research before consulting the government on NonReferenceable Objects ( NRO ) it claims are nonexistent for the public.

Submitted for review and commentary by,

Concept Activity Research Vault ( CARV ), Host
E-MAIL: ConceptActivityResearchVault@Gmail.Com
WWW: http://ConceptActivityResearchVault.WordPress.Com


NASA Elenin Coming

NASA Elenin Coming

 

[ PHOTO ( above ): April 15, 2011 Comet Elenin position and trajectory Earth dates ( click to enlarge ) ]

NASA Elenin Coming by, Concept Activity Research Vault

May 17, 2011 16:42:08 ( PST ) Updated ( May 16, 2011 11:20 )

CALIFORNIA, Los Angeles – May 17, 2011 – After much controversy over a celestial body referred to as “Elenin,” said to be entering our solar system, NASA admits there is a ‘comet’ it calls “Elenin” that ‘is’ in fact ‘coming very soon’ toward Earth and that will simply go around our Sun and slingshot off into outer space; however, there seem to be a few missing pieces to the NASA public report.

The first ( 1st ) problem is the ‘planetary or solar body slingshot fly-by effect’, which NASA spacecraft have used for decades to ‘increase speed’ and consequently propel satellites toward destinations – ‘faster than we can engineer a rocket motor to do’ – using various forms of ‘staring plane mosaic’ technology incorporated into ‘electronic telemetry guidance systems’. Comet Elenin will also be experiencing the ‘slingshot fly-by effect’, but with ‘no electronic telemetry guidance systems’ to guide it in any particular direction.
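
For what it is worth, the ‘slingshot fly-by effect’ is ordinary gravity-assist physics: in the planet’s own rest frame the encounter merely turns the spacecraft’s velocity around while conserving its speed, so transforming back to the Sun’s frame can add up to twice the planet’s orbital speed. A minimal one-dimensional Python sketch, with illustrative numbers rather than mission data:

```python
# Idealized 1-D gravity-assist ( "slingshot fly-by" ) sketch.

def head_on_assist_speed(v_craft_km_s: float, v_planet_km_s: float) -> float:
    """Sun-frame speed after an idealized 180-degree flyby."""
    v_rel_in = v_craft_km_s + v_planet_km_s  # approach speed, planet frame
    v_rel_out = v_rel_in                     # speed conserved in planet frame
    return v_rel_out + v_planet_km_s         # transform back to the Sun frame

# e.g. a 10 km/s craft meeting a planet orbiting at 13 km/s head-on:
print(head_on_assist_speed(10.0, 13.0))      # -> 36.0 km/s ( = 10 + 2 * 13 )
```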

The second ( 2nd ) problem is NASA’s Jet Propulsion Laboratory ( JPL ) Near Earth Object ( NEO ) Program Office solar system plotting system ( see below ), which pictures Elenin ‘on a direct collision course’ with “Mercury” around May 21, 2011, yet NASA reports ‘nothing publicly’ about Elenin colliding with, hitting, or glancing off of “Mercury” – so, you be the judge:

[ IMAGE ( above ): May 16, 2011 Elenin solar system slingshot pathway  ( NOTE: distance indicated from Earth ) – View #1 ( click to enlarge ) ]

[ IMAGE ( above ): May 16, 2011 Elenin solar system slingshot path to Mercury ( NOTE: distance indicated from Earth ) – View #2 ( click to enlarge ) ]

[ IMAGE ( above ): May 21, 2011 – Elenin solar system slingshot Mercury collision path ( NOTE: distance indicated from Earth ) – View #3 ( click to enlarge ) ]

NASA news ( below ) and NASA plots for May 16, 2011 do not appear to match NASA actual plots for May 21, 2011 ( above ) when looking at “Mercury.”

– –

Source: NASA Space.Com

Comet Elenin Trajectory: 22,000,000 Miles Close To Earth

Don’t Fear Comet Headed Our Way — It’s A Wimp

May 10, 2011 6:37:23 PM ET Updated ( May 10, 2011 22:37:23 )

A comet first discovered just 6-months ago will be making a visit to the inner solar system soon, but don’t expect to be completely dazzled. This comet is a bit of a wimp, NASA says.

Comet Elenin ( also known by its astronomical name: C/2010 X1 ) was first detected on December 10, 2010 in Lyubertsy, Russia by observer Leonid Elenin, who found the comet while using the remotely controlled ISON Observatory near Mayhill, New Mexico, USA.

At the time of its discovery, the comet was about 401,000,000 ( 401 million ) miles from Earth.

Over the past 4-1/2 months, the comet has closed the distance to Earth’s vicinity as it makes its way closer to perihelion ( its closest point to the Sun ).

As of May 4, 2011 the comet Elenin distance is about 170,000,000 ( 170 million ) miles.

“That is what happens with these long-period comets that come in from way outside our planetary system,” said NASA Jet Propulsion Laboratory ( JPL, located in Pasadena, California ) Near-Earth Object ( NEO ) Program Office manager Don Yeomans in a statement. “They make these long majestic speedy arcs through our solar system and sometimes put on a great show, but not Elenin; right now that comet looks kind of wimpy.”

The comet doesn’t offer much of a view and is quite dim to behold.

“We’re talking about how a comet looks, as it safely flies past us,” said Yeomans. “Some cometary visitors, arriving from beyond the planetary region like the Hale-Bopp comet in 1997, have really lit-up the night sky – where you can see them easily with the naked eye – as they safely transit the inner solar system. But Elenin is trending toward the other end of the spectrum, you’ll probably need a good pair of binoculars, clear skies and a dark secluded location to see it – even on its brightest night.”

Comet Elenin, An Icy Run

Comet Elenin should be at its brightest shortly before the time of its closest approach to Earth on October 16, 2011, when its closest point will be 22,000,000 ( 22 million ) miles from Earth.

Even at such a distance, the Elenin comet will ‘not’ be able to shift tides or tectonic plates here on Earth – as some Internet rumors have suggested.

Some have even wondered if the comet could possibly be pushed closer to Earth than usual.

“Comet Elenin will not encounter any ‘dark bodies’ that could perturb its orbit, nor will it influence Earth in any way,” said Yeomans. “It will get no closer to Earth than 35,000,000 ( 35 million ) kilometers.”

Not only is the Elenin comet far away but it is also on the ‘small side’ for comets, said Yeomans.

“So you’ve got a modest sized icy dirtball getting no closer than 35,000,000 ( 35 million ) kilometers,” said Yeomans.

“It [ Elenin, the comet ] will have an immeasurably minuscule influence on our planet. By comparison, my sub-compact automobile exerts a greater influence on the ocean tide than comet Elenin ever will.”
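
Yeomans’s comparison is easy to sanity-check, since tidal acceleration scales as mass divided by the cube of distance. In the minimal Python sketch below, the comet mass is a rough assumption ( a roughly 2-kilometer icy nucleus ), not a measured value, while the lunar figures are standard; the result puts Elenin’s tidal influence at roughly 10^-16 of the Moon’s:

```python
# Tidal acceleration scales as M / d^3, so the gravitational constant
# cancels out of the ratio. The comet mass below is an assumed,
# illustrative value. Note 35,000,000 km is about 21,700,000 miles,
# consistent with the 22,000,000-mile closest approach quoted above.

M_MOON_KG   = 7.35e22
D_MOON_M    = 3.84e8     # mean Earth-Moon distance, meters
M_ELENIN_KG = 1.7e13     # assumed: ~2 km radius at icy-comet density
D_ELENIN_M  = 3.5e10     # 35,000,000 km closest approach, in meters

ratio = (M_ELENIN_KG / M_MOON_KG) * (D_MOON_M / D_ELENIN_M) ** 3
print(f"Elenin tide / lunar tide ~ {ratio:.1e}")   # -> ~3.1e-16
```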

This Fall – Cosmic Comet Show

But just because the Elenin comet ‘will not change much’ here on Earth ‘does not mean skywatchers should not pay attention’.

“This comet [ Elenin ] may not put on a great show, just as certainly it will not cause any disruptions here on Earth, but there is a cause to marvel,” said Yeomans. “This intrepid little traveler [ Elenin, the comet ] will offer astronomers a chance to study a ‘relatively young comet’ that ‘came here from well beyond our solar system planetary region’. After a short while, it [ Elenin ] will be headed back out again, and we will not see or hear from Elenin for thousands of years. That’s pretty cool.”

NASA detects, tracks and characterizes asteroids and comets – passing relatively close to Earth – using both ground-based and space-based telescopes, and its Near Earth Object ( NEO ) Observations Program ( called: SpaceGuard ) discovers these objects, characterizes a subset of them and predicts their paths to determine if any could be potentially hazardous to Earth.

References

http://ssd.jpl.nasa.gov/sbdb.cgi?sstr=Elenin;orb=1;cov=1;log=0;cad=1#cad
http://ssd.jpl.nasa.gov/sbdb.cgi#top
http://ssd.jpl.nasa.gov
http://www.jpl.nasa.gov/news/news.cfm?release=2011-138
http://www.jpl.nasa.gov/asteroidwatch/newsfeatures.cfm?release=2011-129
http://www.jpl.nasa.gov/news/news.cfm?release=2010-144
http://www.space.com/11617-comet-elenin-wimpy-solar-system.html

– –

How big is comet Elenin? What does comet Elenin look like up close? Well, those just seem to be the third ( 3rd ) and fourth ( 4th ) problems, because NASA is ‘not reporting anything about those two ( 2 ) additional items’ either.

The only public cross-section comparative analysis on comet Elenin was found in a privately produced video by a ‘good ole boy’ whose factual references appear fairly represented ( below ) – except for having ‘overlooked his own mis-typed input’ of “1700000000000” ( entered ‘without commas’, i.e. 1,700,000,000,000 miles ) rather than the “170,000,000” ( 170 million ) miles he ‘thought’ he was typing in during his diligent attempt at a calculation demonstration:

Submitted for review and commentary by,

Concept Activity Research Vault
E-MAIL: ConceptActivityResearchVault@Gmail.Com
WWW: http://ConceptActivityResearchVault.WordPress.Com

 

Secret Locations


[ IMAGE ( above ): Special motion picture film and photographic imaging equipment caught this particular near still image, from which Paul Bennewitz obtained certain information about ExtraTerrestrial Vehicles ( ETV ) passing into and out of this particular solid rock face mountain at Wirt Canyon near Dulce, New Mexico, USA ( click on image to enlarge ) ]

Secret Locations
by, Kentron Intellect Research Vault ( KIRV )

March 17, 2011 11:08:42 ( PST )

NEW MEXICO, Santa Fe – March 17, 2011 – Paul Philip Schneider, a mechanical engineer, was no slouch but a patriotic ex-military American who was found garroted to death shortly after publicly exposing a work-site area near the Archuleta Mesa ( between Colorado and New Mexico ). An event occurred there while he worked under ‘contract’ for MORRISON-KNUTSON INC., which – upon instructions from the Los Alamos Scientific Laboratory ( LASL ) Geosciences Division, now known as the Los Alamos National Laboratory ( LANL ) – sent him down an ‘old nuclear bomb borehole’ ( see further below about “Nuclear Fracturing” ) he claimed was located in a ‘remote area of northern New Mexico’. His mission was to discover why huge plumes of ‘black soot’ were being forcefully shot up into the atmosphere at an active drill site where his team was located.


Schneider, by his own accounts, claimed the black soot atmosphere smelled like a sulphuric gas lingering in the air all around the drill-site, but who ever heard of a ‘carbon atmosphere’ of ’black sulphur gas’?

It is still seriously doubted if Phil Schneider knew what was actually happening down in the cavern he was ordered into through the borehole.

Did Schneider know:

– What did those plumes of ‘black soot’ actually consist of, and ‘why’ was it being ‘forcefully shot-out of the borehole’ with such ‘pressure’?

– How a ‘huge cavern came to be made’ at the bottom of the borehole?

– What the ‘cavern had been filled with and why’?

– Why boreholes were ‘specifically directed to be drilled precisely where they were’?

From what has been gathered through in-depth research, Schneider was apparently sent by KNUTSON to perform his role as a ‘mechanical engineering consultant’, and totally by accident discovered something beyond his wildest imagination – and many others’ too.

Schneider was lowered down one of the boreholes with an ‘armed black beret military team’ that surrounded him, though he could not understand ‘why’ he had such escorts – except for some unknown military security reason he was not cleared to know. Being ex-military, Schneider knew better than to ask any questions, but to ‘do what was asked of him by his superior’.

Now comes the point within this report where the information becomes incredibly bizarre; however, through in-depth research, Schneider’s claims ( below ) have been pulled somewhat back into ‘more detailed perspective’ by ‘incredibly related facts’ that existed long ago.

Schneider and the other armed U.S. federal military employees discovered a group of what Schneider claims were ExtraTerrestrial Biological Entities ( EBEs ) that he referred to as alien “Grey talls” ( 7-foot extraterrestrial aliens with a bluish purple bioskin color ).

Once lowered through the borehole, a deep underground cavern appeared, and this is when Schneider claims he instantly got off several rounds of his own ammunition ( from his sidearm pistol ) before being shot by some ‘blue beam of light’ – leaving him cut wide open in the chest and with several of his fingers cut off – from whatever the first ( 1st ) alien he encountered down there apparently aimed at him ( Schneider ).


Schneider further claimed that nearly sixty ( 60 ) U.S. government personnel and contractors lost their lives during this August 1979 ‘unpublicized event’, which Schneider described as having involved an underground cavern filled with alien beings.

Hard to believe, I know, because at first I totally scoffed at Schneider’s claims as utterly ridiculous – until some unrelated research studied earlier portions of what Schneider claimed.

Years earlier ( 2002 ), I was exchanging communications with someone related to Paul Philip Schneider on an entirely ‘different subject’ involving work his ‘father’, Oscar Schneider, was involved in with the U.S. Navy. Current research now dovetails back to what I was researching years ago: a U.S. National Security Agency ( NSA ) ‘precious metals’ scientific recycling process that involved baking ore, resulting in low-yield nuclear energy transmutation of material into high-yield elements.

It appeared from further studies that Oscar Schneider’s son Paul Philip Schneider’s claims required further detailed clarification, which he either could not provide earlier due to secrecy agreements with the U.S. government, or he may have felt he should ‘not’ provide too many specifics for fear of breaching a “business agreement” he made with the U.S. government. Based upon current research, however, incredible further facts go far beyond Schneider’s claims – sufficient to at least warrant a suitable public release of the information contained herein.

In 2002, Schneider’s intimately close friend sent me “Rhyolyte 47″ ( classified ) ‘official U.S. Navy documents’ pertaining to Paul Philip Schneider’s ‘father’, U.S. Navy Captain Oscar Schneider, who from his ‘military professional medical position’ became directly involved with “Project Blue Ship” ( also known as ) “The Philadelphia Experiment.”

Phil Schneider mentions that upon drilling “boreholes” the drill-site teams were experiencing ‘huge plumes of black soot’ or “black dust” ( that stank ) being ‘rapidly ejected under pressure’ into the atmosphere – where it also lingered near the ground – from within these underground hole locations they were being told to drill at.

None apparently knew ‘why’ or ‘how’ these odd occurrences were taking place, or why they were drilling where they were.

Initial research indicates that “Nuclear Fracturing” ( 1962 – 2005 ) had been used earlier in ‘underground nuclear blasts’ for what was said to be “oil experimentations,” which was a U.S. government cover for underground testing of nuclear warheads subsequent to Russia having exploded what it called the “Tsar Bomba,” a 100-megaton design nuclear bomb that the U.S. later claimed only yielded a 50-megaton blast, with a Circular Error Probable ( CEP ) range for death from irradiation up to 60-miles away from ground-zero.

Did ‘large U.S. oil reserves once exist’ until subsequently being nuclear blasted underground, creating both a ‘high pressure black oil carbon soot atmosphere’ and, simultaneously, these ‘huge underground caverns’?

Were “Grey” alien “talls” ( 7-foot ) provided with a ‘government secret atmospheric environment’ consisting of large underground cavern living quarters, where they were the only ones able to breathe a high-carbon pressurized atmosphere consisting of ‘gamma irradiated particle elements’ derived from earlier U.S. secret military underground nuclear projects publicly called “Nuclear Fracturing”? If so, could an extraterrestrial alien agreement have been reached secretly with the U.S. government?

Was the LANL / LASL Geosciences Division ‘borehole drillings’ “Hot Rock” Project – publicly revealed as only seeking ‘new geothermal energy alternatives’ – merely a ruse?

By ‘drilling boreholes’ ( in highly ‘specific remote locations’ where contractors were ‘told to drill’ ), did that ‘actually and instantly deplete’ all of the carbon-dense atmosphere used by these Grey alien “talls”, who were thereby being killed down inside these caverns? If so, why would the U.S. government destroy what it must have worked so hard to build for these aliens?

Might another “agenda” have entered, determining a takeover of those Grey tall ( 7-foot + ) aliens’ existence underground? If so, why were these “Grey talls” suddenly considered a “threat”?

There are only rumored mentions of a percentage reduction in the number of Grey talls ( aliens ) being sighted on Earth, whereas “Grey shorts” ( 4-foot to 5-foot tall ) are now only rarely sighted; so, were nearly 80% of the “Grey talls” actually “wiped-out” by ‘another alien species’ – as rumored?

Is there any truth to any of this, and are there any other correlations worth researching into such deep underground military bases and caverns, albeit for aliens or humans?

Why have ‘sophisticated landing strips’ on “Church of Scientology” ( COS ) ‘remote properties’ been purchased by its “Church of Spiritual Technology” ( CST ) entity, which has also built ‘sophisticated underground vaults’ and ‘sophisticated long tunnel systems’ that are ‘guaranteed to last 1,000 years’, equipped with ‘sophisticated nuclear blast-proof doors’ built to withstand any direct nuclear hit? All of these have been substantiated through my research and documented by satellite photos, facility diagrams, and land parcel details and grant information on many of these COS / CST ‘remote installation properties’.

Related points to initially consider:

– Schneider claimed his work was on a Los Alamos National Laboratory ( LANL ) Project, which my research indicates was actually performed under the name “Los Alamos Scientific Laboratory” ( LASL ) Geosciences Division, within its geothermal ( volcanic magma ) “Hot Rock” Project during August 1979 or ‘likely earlier’;

[ IMAGE ( above ): Archuleta Mesa ( top ) where ( due south ) is Dulce, New Mexico ( click to read, enlarge image ) ]

– Schneider of KNUTSON worked at this Archuleta Mesa / Kit Carson National Forest (aka) Carson National Park work-site during the August 1979 alien event;

– Schneider was possibly a ‘sub-contracted engineer’ ( like Keith Millheim of CER GEONUCLEAR CORP. ) working for KNUTSON ( Tulsa, OK ), performing what that industry refers to as “high risk construction contracts;”

– New Mexico site on the Archuleta Mesa / Kit Carson National Forest – aka – Carson National Park ( New Mexico near the Colorado border ); research land grants near Edith, Colorado and Blue Lake, New Mexico ( Indian land );

– KNUTSON, which Schneider claimed to have been working for, may have been a ‘specially named adjunct’ working out of Area 51 / S4 ( Groom Lake, Nevada ) while Schneider travelled to work at the Archuleta Mesa ( Edith, Colorado ) site. Could Schneider have been covering up the exact location actually being in the Kit Carson National Forest Indian land of Blue Lake, New Mexico near Taos, New Mexico – where the phenomenon of the “Taos hum” is – and all the UFO sightings northeast of Taos, which would place the UFO sightings directly over the Indian land of Blue Lake, New Mexico;

– Check both work and worksite connections between EG & G CHANDLER ENGINEERING ( Tulsa, OK ) and KNUTSON ( Tulsa, Oklahoma ) on work-site contracts that took place during 1979 ( or ‘earlier’ ), in the area of the Kit Carson National Forest ( New Mexico ) and /or Archuleta Mesa ( New Mexico or southern Colorado ), performing an industry referenced “high risk construction contract;” and,

– Google Earth shows a ‘gigantic cave-in’ or ‘explosion hole from something’, atop which is scattered an unusual ‘broken and caved-in lattice pattern’ of ‘huge, very long beams / tubes’ [ cavern / cave / tunnel support beam cylinders ] that appear ‘snapped like twigs’ and sunken into a remote cave-in area nearest Edith, Colorado in the Archuleta Mesa area.

====

Part 2

COS CST & IGSS Brief Note –

Church of Scientology ( COS ) and Church of Spiritual Technology ( CST ) New Mexico vault tunneling construction performed by INTERNATIONAL GROUND SUPPORT SYSTEMS ( Santa Fe, New Mexico ).

Note all ‘sub-contracts’, ‘consulting engineers’, ‘consulting architects’, ‘insurance companies’, and any other ‘related construction support firms’ for connections, between:

– INTERNATIONAL GROUND SUPPORT SYSTEMS ( Santa Fe, New Mexico );
– CER GEONUCLEAR CORPORATION of EG&G ( Las Vegas, Nevada 89114 );
– MORRISON-KNUDSEN INC. – aka – KNUTSON ( Las Vegas, Nevada );
– MORRISON KNUDSEN ENGINEERS – aka – KNUTSON ( Tulsa, Oklahoma );
– CHANDLER ENGINEERING of EG&G ( Tulsa, Oklahoma );
– Others ( e.g. AUSTRAL OIL, et al. ).

====

PART 3 –

Nuclear Fracturing Projects –

Atoms for Peace program fails oilpatch testing. Atoms for Peace was a grand idea aimed at benefiting the world, and it was tested in the oilpatch. It never reached its potential in ‘industrial use’; however, it did set off an active alternative: accelerating the nuclear arms race between the U.S. and Russia.

– –

UNITED NATIONS ( UN ) – UNITED STATES President Dwight D. Eisenhower –

It kicked off on November 28, 1953 when the United Nations ( UN ) General Assembly asked the Disarmament Commission for suggestions to halt nuclear proliferation. Ten ( 10 ) days later, U.S. President Dwight D. Eisenhower addressed the UN General Assembly, saying:

“The United States would seek more than the mere reduction or elimination of atomic materials for military purposes. It is not enough to take this weapon out of the hands of soldiers. It must be put into the hands of those who will know how to strip its military casing and adapt it to the arts of peace.”

– –

OPERATION PLOWSHARE ( 1972 ) –

In the United States, that effort kicked-off OPERATION PLOWSHARE – named for the Biblical passage that referred to turning swords into plowshares. The United States had high hopes for Operation Plowshare. It anticipated twenty-seven ( 27 ) nuclear programs using atomic devices for peace.

– –

– PANAMA – – NICARAGUA – – COSTA RICA –

Among those plans were proposals to use more than one-hundred ( 100 ) nuclear explosions to blast a 37 mile ( 60 kilometer ) long canal across the isthmus of Panama at San Blas.

Another plan would use at least 250 blasts for a 140 mile ( 225 kilometer ) canal on the Nicaragua – Costa Rica border.

– –

– ALASKA –

PROJECT CHARIOT Cape Thompson, Alaska

Another, PROJECT CHARIOT, almost went into action that would have used several hydrogen bombs to blast out a harbor at Cape Thompson, Alaska in the Chukchi Sea about 75 miles ( 121 kilometers ) from the Russia border in the Bering Sea Strait.

– –

– NEVADA –

PROJECT SEDAN ( 06JUL62 @ 10:00 a.m. ) Yucca Flats, Nevada

Concerns over the native population reduced that to a 104 kiloton proof-of-concept blast called SEDAN at Yucca Flats, Nevada on July 6, 1962. The blast displaced 12,000,000 ( 12 million ) short tons of dirt and released a 12,000 foot high ( 3,660 meter ) radioactive cloud.

– –

– NEW MEXICO –

PROJECT GASBUGGY ( 10DEC67 )
Kit Carson National Forest ( New Mexico ) / Archuleta Mesa adjacencies: Dulce, NM; Blue Lake, NM; and, Edith, CO.
LAWRENCE LIVERMORE NATIONAL LABORATORY ( Livermore, California )
U.S. ATOMIC ENERGY COMMISSION ( AEC )
EL PASO NATURAL GAS

Native American Indians of the Tiwa tribe continue cultural traditions to this day. Around 1903, by Executive Order, U.S. President Theodore Roosevelt took over 48,000 acres of Tiwa tribal land ( in the “Sangre de Cristo Mountains” of New Mexico ), making it the “Kit Carson National Forest” (aka) “Carson National Forest” in New Mexico. In 1970, by Executive Order, U.S. President Richard Nixon returned part of that land to the Tiwa tribe.

A ‘tribal disaster’ ( elsewhere ) was mentioned in an official U.S. Administration letter. Pull letter from existing files under Knights of Malta member Thomas Sawyer.

Kit Carson National Forest, Sangre de Cristo Mountains, Archuleta Mesa, and other names narrow the area of ‘work-site location(s)’ requiring ‘further research’:

– Jemez Plateau;
– Jemez Mountains;
– Fenton Hill, New Mexico <?>
– UNION OIL ( Baca location ) or UNOCAL Geothermal Division ( Los Angeles, CA );
– Mayrsville <?> could be “Marysville” < sp ? >;
– Coso;
– LASL Geosciences Division;
– ERDA ( Energy Research and Development Administration ) National Laboratories Division of Geothermal Energy ( DGE );
– Drilling Third ( 3rd ) Borehole ( code name: EE-1 )

Most of the real testing took place in the oilpatch, starting with PROJECT GASBUGGY – the first test aimed at releasing natural gas from tight sands and the first use of a nuclear device for industrial purposes.

It took place on December 10, 1967 in the Carson National Forest of New Mexico – about 90 miles northwest of Santa Fe, New Mexico and 25 miles southwest of the town of Dulce, New Mexico.

The U.S. ATOMIC ENERGY COMMISSION ( AEC ) was the oversight agency as LAWRENCE LIVERMORE NATIONAL LABORATORY ( LLNL ) and EL PASO NATURAL GAS ( EPNG ) conducted the test. They set off a 29 kiloton blast 4,240 ft ( 1,293 m ) underground in a tight shale. According to official figures, the blast created a cavity 80 feet ( 24 m ) wide and 335 feet ( 102 m ) high filled with gas. Unfortunately, the ‘gas was too radioactive’ to use. They didn’t stop there.

– –

– COLORADO –

PROJECT RULISON ( 10SEP69 ) Rifle, Colorado
CER GEONUCLEAR CORP. ( Las Vegas, Nevada );
Keith K. Millheim ( president of Strategic Worldwide LLC, The Woodlands, Texas ) under CER GEONUCLEAR CORP. contract;
AUSTRAL OIL ( Denver, Colorado );
TELEDYNE ISOTOPES ( Palo Alto, California );
ATOMIC ENERGY COMMISSION ( AEC ).

The industry moved on to Project Rulison, probably the best known of the nuclear blasts for peace. This blast, also under the control of the ATOMIC ENERGY COMMISSION ( AEC ), was conducted by CER GEONUCLEAR CORP. ( Las Vegas, Nevada ) and AUSTRAL OIL ( Denver, Colorado ).

In between international assignments Keith K. Millheim ( president of Strategic Worldwide LLC The Woodlands, Texas ) was seconded by CER GEONUCLEAR CORP. to help design the drilling and testing of the Project Rulison Nuclear Gas Stimulation Experiment.

This 43 kiloton blast – 2.6 times the size of the bomb dropped on Hiroshima, Japan – was lowered 8,426 ft ( 2,568 m ) underground from a drill site on the southwest flank of Doghead Mountain in Battlement Creek Valley in Garfield County, Colorado approximately 40 miles northeast of Grand Junction, Colorado and 11 miles southwest of Rifle, Colorado. During hearings in the area, testimony revealed that the government believed the blast would fracture the tight MesaVerde gas sands and release more gas than conventional fracturing. If they could find a better way of fracturing, they might have the key to releasing more of the 317 Tcf of gas in place in tight sands, according to an article by Chester McQueary ( Parachute, Colorado resident ) writing for the High Country News. Also during the hearings, David Evans from Colorado School of Mines testified the industry would have to set off 13,000 similar blasts to get the kind of recoveries the government wanted. According to Chester McQueary, 1-week before the blast the AEC set up a 5 mile ( 8 kilometer ) quarantine zone around the well site as workers lowered the bomb into the hole. It even paid some homeowners to leave for the day. Unfortunately, winds that could have carried radiation north to Rifle, Colorado or west of Grand Junction, Colorado delayed the test until September 10, 1969. Protesters entered the quarantine area in 2s and 3s so they couldn’t be easily removed. KWSR radio station in Rifle carried a countdown for the blast. Finally, security forces just left protestors, McQueary among them, in the quarantine area. He felt a “mighty thump” that lifted him 8 inches into the air and measured 5.5 on the Richter scale in Golden, Colorado. Again, about 30-days later, the test team found the ‘gas was too radioactive to use’, but that wasn’t the end of the ‘nuclear program in natural gas’.

– –

– COLORADO –

PROJECT RIO BLANCO ( 1971 – 17MAY73 )

EG&G CER GEONUCLEAR CORP. ( Las Vegas, Nevada – 1971 – 1973 );
EQUITY OIL CO. ( Salt Lake City, Utah );
C.F. KNUTSON ( 1973 – 1975 );
D.L. Coffin ( 1968 – 1971 );
F.A. Welder ( 1968 – 1971 );
R.K. Glanzman ( 1968 – 1971 );
X.W. Dutton ( 1968 ).

<?> Virginia Glanzman ( USGS – Denver, Colorado ) <?>

The next test also took place in the Piceance Basin of Colorado, this one on May 17, 1973. It was called Project Rio Blanco and was scheduled in Rio Blanco County, about 75 miles northwest of Grand Junction, Colorado and 30 miles ( 48.3 kilometers ) southwest of Meeker.

EG&G CER GEONUCLEAR CORP. ( Las Vegas, Nevada ) and EQUITY OIL CO. ( Salt Lake City, Utah ) conducted the blast, where the operators set three ( 3 ) 33 kiloton devices between 5,838 feet and 6,689 feet ( 1,781 meters and 2,040 meters ) deep to blast out a huge cavern in the tight Mesaverde below the Green River oil shale. The blast went off, but the caverns didn’t connect.

– –

– NEVADA –

Nevada Test Site ( NTS ) ( 1993 – 1995 )
Nellis Air Force Base, Test Range Site, S-1 ( Nevada ) Area 51

LAWRENCE LIVERMORE NATIONAL LABORATORY ( LLNL – Livermore, California )
DESERT RESEARCH INSTITUTE ( P.O. Box 60220, Reno, Nevada 89506-0220 )
D.K. Smith; B.K. Esser; J.L. Thompson; J.I. Daniels; R. Andricevic;
– 1994 – L.R. Anspaugh; R.L. Jacobson; I.Y. Borg;
– 1976 – R. Stone; H.B. Levy; L.D. Ramspott.

– –

– WYOMING –

PROJECT WAGON WHEEL

AUSTRAL OIL
PRESCO INC. ( The Woodlands, Texas – V-P: Kim R.W. Bennetts )
EL PASO NATURAL GAS
U.S. ENVIRONMENTAL PROTECTION AGENCY ( EPA )

The federal agency had planned to move next to PROJECT WAGON WHEEL, but that test never took place. Wagon Wheel was scheduled to loosen up the tight formations on the Pinedale Anticline in Sublette County, Wyoming – an area operators have only recently started developing with large numbers of wells.

Unlike PROJECT GASBUGGY, which was a cost-is-no-object technical test, PROJECT WAGON WHEEL was supposed to ‘test the profitability of atomic fracturing’ and was the biggest test to date, with plans for five ( 5 ) 100-kiloton nuclear explosions set off 5-minutes apart at depths from 9,220 feet to 11,570 feet ( 2,812 meters to 3,529 meters ) in a field discovered by EL PASO NATURAL GAS.

The operators planned to wait 4-months to 6-months before testing the well, and thought radiation released in testing would be lower than normal background radiation. But the project gathered a lot of opposition. Among the statements that killed it was testimony by Dr. Ken Perry, a geologist with the University of Wyoming.

Looking at plans for 40 to 50 nuclear explosions per year for full area development, he said southwestern Wyoming would become ‘the earthquake center of the world’, according to Adam Mark Lederer in a thesis entitled, “Using Public Policy Models to Evaluate Nuclear Stimulation Programs: Wagon Wheel in Wyoming.”

After the PROJECT RIO BLANCO blast, officials plugged and abandoned three ( 3 ) wells in the area, but left three ( 3 ) wells open on the RB-E-01 drill pad so they could monitor the wells. When they finally abandoned the surface facilities, the radiation was no different than the background radiation in the area except for tritium readings, which exceeded government criteria in several samples.

In 2002, officials decided to lift all restrictions on surface activity. They did mandate that there should be no penetration of the surface down to 1,500 feet ( 458 meters ) within 100 feet ( 30 meters ) of the well bore, and no intrusion from 1,500 feet to 7,500 feet ( 458 meters to 2,288 meters ) within a 600 feet ( 183 meters ) radius of the well bore. The situation was similar at the Rulison well site, but with a recent twist.

Following the $6,500,000 ( USD ) Rulison well, also called the AUSTRAL OIL Hayward #25-95 in Section 25-7s-95w, the government continued monitoring the subsurface as it had at Rio Blanco. In both cases it wanted to monitor the possible migration of radiation. The Environmental Protection Agency ( EPA ) also conducts annual sampling of deep monitoring wells and water sources in the area. Except for deep radiation at both sites, the areas are clean.

Recently, however PRESCO INC. ( The Woodlands, Texas ) sought permission for 40-acre spacing over an area that would include the Rulison well site.

It already had drilled a well in prolific Rulison field 1.5 miles ( 2.4 kilometers ) from the test site with no sign of radioactivity. Approval would have allowed the company to drill multiple wells near the Hayward well bore. Kim R.W. Bennetts, vice president of exploration and production for the company, said even if it drilled into the cavity, which it didn’t plan to do, very little radiation remained. Bennetts also said the company couldn’t, and wouldn’t, sell radioactive gas. Its planned wells in the area would be 1,200 feet ( 366 meters ) to the northeast and 1,600 feet ( 488 meters ) to the southeast of the Rulison test well.

After a February 10, 2004 meeting of the Colorado Oil & Gas Conservation Commission, the company then received approval to drill only one well on 40-acre spacing within a half-mile radius of the Rulison test well.

The company even had support from the county until it raised its plans from one ( 1 ) well to four ( 4 ) wells. The Colorado commission said it would approve the wells on a case-by-case basis. Currently, with no plans by any agency to continue work with ‘nuclear devices in the oilpatch’, it looks as if that research has come to an end.

While it lasted, however, it was a daring period of research into ‘advanced methods of releasing huge volumes of natural gas’, and that research continues today.

====

PART 4

EG&G –

EG & G INC. (aka) EG and G INCORPORATED
45 William Street
Wellesley, Massachusetts 02181 U.S.A.
Telephone: (781) 237-5100
Fax: (781) 431-4255
WWW: http://www.egginc.com

Statistics:

Public Company

Incorporated: 1947 as Edgerton, Germehausen & Grier, Inc.
Employees: 13,000
Sales: $1.41 billion ( 1998 )
Stock Exchanges: New York
Ticker Symbol: EGG

NAIC: 334413 Semiconductor & Related Device Manufacturing; 334419 Other Electronic Component Manufacturing; 334513 Instruments & Related Product Manufacturing for Measuring, Displaying & Controlling Industrial Process Variables; 334519 Other Measuring & Controlling Device Manufacturing; 54133 Engineering Services; 54199 All Other Professional, Scientific, & Technical Services; 54171 Research & Development in the Physical, Engineering & Life Sciences

Company Perspectives:

Our vision is that we can create value in an environment of ever-accelerating change. Value creation is our singular aim and ultimate measure of success. We believe that the increasing drive to create value represents the surest and most consistent avenue for us to benefit our customers, employees, stockholders and constituent communities. Our value creation model focuses on growth primarily derived from internal development.

Company History:

EG & G Incorporated is a diversified technology company that develops and provides products for public and private customers in the medical, aerospace, telecommunications, semiconductor, photographic, and other industries. The company’s operations are broken into five business units: Instruments; Life Sciences; Engineered Products; Optoelectronics; and Technical Services. Its Instruments operation is based on x-ray imaging systems and provides screening and inspection systems for use in airport and industrial security, and environmental, food, and nuclear industry monitoring. The Life Sciences unit develops systems for biochemical research and medical diagnostics. The Engineered Products unit designs and produces pneumatic systems, seals, and bellows for aerospace, semiconductor, and power generation markets. EG & G’s Optoelectronics division specializes in optical sensing devices for industrial and medical applications. The company’s final unit, Technical Services, provides engineering, research, management, and support services to governmental and industrial clients.

Nuclear Management and Monitoring: 1940s – 1950s

EG & G was established by three nuclear engineers from the Massachusetts Institute of Technology shortly after the end of World War II. These engineers, Harold E. Edgerton, Kenneth J. Germehausen, and Herbert E. Grier, had been involved in the American effort to construct an atomic bomb during the war. So valued were their contributions that after the war the government asked them to establish a company to manage further development of the country’s nuclear weapons. The three established a small partnership called Edgerton, Germehausen & Grier on November 13, 1947, and quickly began collecting contracts to advise the government on nuclear tests in Nevada and on South Pacific islands.

One of the first employees of the new company was Bernard J. O’Keefe, another MIT graduate who had worked for Dr. Grier during the war. O’Keefe served with the 21st Bomber Command in the Mariana Islands during the war, and is said to have personally wired the bomb that later destroyed the Japanese city of Nagasaki. O’Keefe was sent to Japan after its surrender to investigate that country’s progress with nuclear technology and recruit promising Japanese scientists for other atomic projects. A specialist in the design and development of electronic instrumentation and controls, O’Keefe quickly gained an important position in the growing firm.

Inconvenienced by the length of the company’s name, employees soon began to rely on the simple acronym EG & G, which later became its official name. In order to maintain close contact with MIT and its excellent nuclear and electronic engineering programs, EG & G set up its headquarters in Bedford, Massachusetts, in northwest suburban Boston.

EG & G was involved in the U.S. effort to build a more powerful nuclear weapon, the hydrogen bomb. During that effort, Grier and O’Keefe were present at a Nevada test site to personally witness an H-bomb detonation. After the weapon failed to explode, Grier and O’Keefe flipped a coin to determine who should scale the 300-foot test tower and disarm the bomb. Although O’Keefe lost, he won the special distinction of being the first man to disarm a live H-bomb.

O’Keefe had a second brush with disaster in 1958 when he witnessed an H-bomb detonation at Bikini Atoll in the South Pacific. There, shifting winds in the upper atmosphere caused a radioactive cloud of fallout to shower his bunker.

These experiences taught O’Keefe the awesome destructive power of nuclear weaponry and the dangers of radioactive fallout. As an engineer and manager he was bound to perform his company’s contracts, but grew personally opposed to the use of nuclear weapons. This sharpened his sense of responsibility toward the emerging form of warfare, a quality that was not lost upon the government’s Atomic Energy Commission.

As a result of EG & G’s experience with detonations, and O’Keefe’s concern for nuclear non-proliferation, the company became increasingly involved in distant monitoring projects, particularly as they related to Soviet nuclear tests. By observing changes in the atmosphere, EG & G was able to determine the incidence and strength of Soviet tests and provide important data on the progress of Moscow’s weapons program. In the process, EG & G gained highly specialized knowledge in environmental sciences. These skills had numerous applications outside the weapons industry, in such areas as pollution control and environmental management.

Exploring Commercial Markets: 1960s

As early as 1960, O’Keefe and the company’s three founders had considered establishing a new environmental analysis business, which would lessen EG & G’s dependence on low-margin government contracts and permit the company to enter new commercial markets. But at the time, neither public concern nor legislation placed a high value on such endeavors.

Three years later, the United States, the Soviet Union, and the United Kingdom signed a protocol that banned nuclear tests in the atmosphere, underwater, and in outer space.

With this document, EG & G appeared to lose a major portion of its business. However, the protocol did not prevent underground tests, which were far more complicated.

EG & G remained the only company with the proper supervisory credentials to manage this type of nuclear testing.

The company was forced to develop geologic analytical capabilities and to become a ‘tunneling’ and ‘mining’ operation as well.

Furthermore, the government had also laid plans to establish a kind of NASA equivalent for oceanography ( NOAA ).

Eager to take a place in this organization, EG & G invested heavily in oceanographic research.

While the underwater NASA never materialized, the efforts enabled O’Keefe to further cultivate new commercial markets for EG & G, including excavation and water transmission.

During this time the company’s three ( 3 ) founders moved further into retirement, taking ceremonial “executive chairman emeritus” positions.

As a result, O’Keefe became the de facto head of the company.

EG & G also pursued a strong acquisition campaign, taking over thirteen ( 13 ) companies between 1964 and 1967, a period when a strong environmental movement began to form in the United States.

With legislation still years away, EG & G began laying plans to play an important role in the environmental projects it was sure would result.

EG & G was divided into four ( 4 ) main operating divisions.

EG & G INTERNATIONAL, primarily concerned with oceanography, was the smallest.

EG & G standard products and equipment division, which produced a variety of machines and electronic devices, grew fastest during the 1960s.

EG & G nuclear detonation and monitoring business segment remained its largest.

EG & G nuclear technology group, its most innovative and interesting, involved the design of nuclear rocket engines ( ion engines ) for interplanetary propulsion.

EG & G CER GEONUCLEAR CORPORATION projects included “nuclear landscaping” – controlled nuclear explosion blasts – carving out:

Harbors; Canals; and, Passageways ( Tunnels, Pipelines, etc. ).

The EG & G CER GEONUCLEAR CORPORATION unit also participated in tests of nuclear explosions ‘used to fracture layers of rock’ for access to otherwise inaccessible reserve locations of oil and gas for exploitation. Although feasible, these EG & G ‘public works’ projects failed to gain public support.

In fact, opposition to nuclear technology in-general ‘increased’ as people grew wary of the safety of nuclear energy.

In addition, nuclear excavation would have required an unlikely waiver of the 1963 Nuclear Test Ban Treaty.

With the evaporation of good commercial prospects for its nuclear engineering expertise, EG & G was forced to rely again on military projects. Despite efforts to step up mechanical and electrical engineering work (partly by acquiring a spate of small research companies), EG & G mustered only four percent annual growth during the late 1960s.

Failed Initiatives: 1970 – 1975

Interest in nuclear power increased dramatically during the 1973 – 1974 Arab oil embargo, in which Americans sought to reduce their costly dependence on imported oil.

Realizing that the world’s oil exporting nations stood to permanently lose their largest customer, the United States, King Faisal of Saudi Arabia promptly called for an end to the embargo. Nonetheless, while Americans regained access to Arab oil, the end of the embargo was disastrous for the U.S. nuclear energy industry and for EG & G.

The end of the embargo removed one of the great justifications for nuclear power, and gave anti-nuclear activists time to properly organize legislative battles.

While EG & G was being locked out of yet another promising commercial application of its technologies, it attempted projects in other fields.

Some years earlier, in an effort to develop a new process for purifying nuclear isotopes, EG & G developed a flash tube that was ideal for photocopiers, but by the time an application could be developed, XEROX had already saturated the market with conventional designs.

In another ill-timed move, the company bet that environmental laws would cause demand for the unconventional Wankel engine to rise.

EG & G purchased a Texas automobile testing agency in hopes of winning large emission monitoring contracts, but the oil embargo destroyed the market for the clean but gas-eating Wankel engine as automobile environmental legislation was abandoned.

During this time, with the encouragement of the government, EG & G established a minority-dominated subsidiary, EG & G Roxbury, in a neighborhood of Boston, hoping to help strengthen the economic structure of the community. The project floundered, however, when bureaucrats failed to properly support the program, and the subsidiary made only a few sales. After a few years of disastrous results, the entire program was wound up.

On the Upswing: 1976 – 1980s

In 1976, the EG & G environmental division – which had languished after the oil embargo – evolved into a comprehensive resource efficiency operation providing complete oceanographic, atmospheric, and geophysical analysis rather than concentrating on environmental compliance; by conserving resources, operations could more easily achieve pollution and waste reduction targets.

EG & G port development, although unable to use nuclear devices to carve custom designed harbors, became a world leader in oceanographic studies and channel engineering by designing numerous tanker ports in the Persian Gulf and bauxite harbors in South America.

In 1979, U.S. President Jimmy Carter asked Bernard O’Keefe to serve as Chairman for the Synthetic Fuels Corporation of the U.S. government. Having already been asked to serve on a transition team for then presidential candidate Ronald Reagan, O’Keefe refused President Carter’s offer.

With the election of U.S. President Ronald Reagan in 1980, the United States took a sudden turn toward military armament programs.

EG & G experienced a resurgence in its flagging nuclear testing business and was tapped to develop a number of new nuclear weapons systems, including the:

MX nuclear underground mobile railroad missile; and, Strategic Defense Initiative ( SDI ) – Star Wars Program.

A self-described “card carrying member of the military-industrial complex,” Bernard O’Keefe wrote in his book “Nuclear Hostages” that the United States and the Soviet Union were deadlocked in a nuclear arms race neither could control. Ironically, EG & G remained deeply involved in a number of Reagan administration projects O’Keefe opposed, including:

MX nuclear underground mobile railroad missile; Neutron bomb; and, Europe nuclear missile weapon stationings.

Nevertheless, EG & G pre-tax operating profit doubled from the new business.

EG & G also became involved in the space shuttle program, checking the spacecraft electrical components, loading its fuel, and managing the Cape Canaveral, Florida space center during shuttle missions.

EG & G site management abilities won it a position with the U.S. Department of Energy ( DOE ) elite Nuclear Emergency Search Team ( NEST ), which investigated nuclear extortion threats.

The company also won a contract to manage the U.S. government’s troubled Rocky Flats installation outside Denver, Colorado – a nuclear weapon trigger manufacturing facility widely criticized for mismanagement under ROCKWELL INTERNATIONAL.

EG & G maintained its momentum throughout the 1980s, winning contracts from diverse governmental agencies, including the:

U.S. Department of Energy ( DOE ); U.S. Army; U.S. Air Force; U.S. Department of Defense; and, U.S. Customs Service ( now part of the U.S. Department of Homeland Security ).

In 1988 the company hit a record high for both sales and earnings.

O’Keefe retired from EG & G during this period of strong growth, and was succeeded by John M. Kucharski.

Rapid Diversification: 1990s

Under U.S. President George Bush, and with the subsequent collapse of the Soviet military threat, the number of EG & G nuclear test projects decreased significantly.

As such, EG & G was under pressure to cultivate profitable new commercial ventures to offset the loss of revenue from military contracts.

The company responded rapidly, entering new commercial markets via a series of acquisitions.

One of the first acquisitions of the 1990s was ELECTRO-OPTICS, the optoelectronics business of GENERAL ELECTRIC ( GE ) Canada.

GENERAL ELECTRIC ELECTRO-OPTICS designed and produced advanced semiconductor emitters and detectors for defense, space, telecommunications, and industrial applications.

Other new ventures followed quickly:

WALLAC Group ( Finland-based ), which produced analytical and diagnostic systems;

IC SENSORS, a maker of sensing devices for industrial, automotive, medical, and aerospace uses; and,

NoVOCs Inc., an environmental remediation specialist.

In 1994, facing legal pressure from activist groups, EG & G announced that it would discontinue its nuclear-related endeavors as its various existing contracts expired. That same year, the company undertook a major reorganization to accommodate its newly acquired interests and the discontinuation of its nuclear business.

One important area of focus for the company was its Instruments division, which was rapidly becoming a leader in the field of weapons and explosives screening systems. After providing x-ray machines and metal detectors for the Democratic and Republican national conventions in 1992, the company won a contract to supply state-of-the-art explosives detection systems for U.S. federal courthouses across the nation. A subsequent contract with the Federal Aviation Administration ( FAA ) called for ten ( 10 ) of EG & G’s most advanced explosives detection systems for screening checked baggage in airports.

In 1998, EG & G president and CEO John M. Kucharski was replaced by Gregory Summe ( former president of ALLIEDSIGNAL INC. Automotive Products Group ), known for his ability to streamline and consolidate technology businesses.

He assumed his new position with two ( 2 ) goals:

1. improving operational efficiency; and,

2. restructuring the EG & G portfolio to sharpen its focus on identified high-growth markets.

One of his first efforts toward better operational efficiency was to consolidate all EG & G’s business into five ( 5 ) independent strategic business units:

Life Sciences; Instruments; Engineered Products; Optoelectronics; and, Technical Services.

The company also began repositioning its portfolio by liquidating assets that fell outside these growth areas and making acquisitions that strengthened the EG & G position within its identified markets.

This strategy led to the largest acquisition in the company’s history:

LUMEN TECHNOLOGIES.

Lumen, purchased for $250,000,000 in December 1998, was known globally as a producer of ‘specialty lighting’.

The Lumen acquisition served to strengthen the company’s existing position in the medical lighting market, while at the same time allowing it entry into the areas of video and entertainment lighting.

Looking to the Future –

The consolidation efforts that Summe’s management team initiated in 1998 were expected to continue, with the goal of streamlining sites, functions, and processes so as to reduce operating costs and improve quality, consistency, and response time.

The company intended to continue its focused acquisition strategy.

It also planned to continue aggressively developing and marketing new products in its various divisions. Some of the products expected to be introduced were high-volume, cost-effective systems for drug screening, and a Point of Care system that allowed diagnosticians to determine whether or not a patient had suffered a heart attack in just 15-minutes.

In addition to introducing new products, the company also anticipated an increased emphasis on product line extensions and renewals.

EG & G Principal ( Primary List – Public ) Subsidiaries:

EG & G Alabama Inc.; EG & G ASTROPHYSICS ( England ); EG & G ATP GmbH ( Germany ); EG & G ATP GmbH & Co. Automotive Testing Papenburg KG ( Germany ); EG & G Automotive Research Inc.; EG & G CALIFORNIA INC.; EG & G Benelux BV ( Netherlands ); EG & G Canada Investments Inc.; EG & G Canada Limited; EG & G DEFENSE MATERIALS INC.; EG & G do Brasil Ltda.; EG & G E.C. ( UK ); EG & G Emissions Testing Services Inc.; EG & G ENERGY MEASUREMENTS INC.; EG & G Exporters Ltd. ( U.S. Virgin Islands ); EG & G Florida Inc.; EG & G GmbH. ( Germany ); EG & G HOLDINGS INC.; EG & G Hong Kong Ltd.; EG & G IC Sensors Inc.; EG & G Idaho Inc.; EG & G Information Technologies Inc.; EG & G Instruments GmbH. ( Germany ); EG & G Instruments International Ltd.; EG & G Instruments Inc.; EG & G International Ltd.; EG & G Japan Inc. ( USA ); EG & G Judson InfraRed Inc.; EG & G KT AEROFAB INC.; EG & G Langley Inc.; EG & G Ltd. ( UK ); EG & G Management Services of San Antonio Inc.; EG & G Management Systems Inc.; EG & G Missouri Metals Shaping Company Inc.; EG & G Mound Applied Technologies Inc.; EG & G Omni Inc.; EG & G Pressure Science Inc.; EG & G Singapore Pte Ltd.; EG & G SPECIAL PROJECTS INC.; EG & G Star City Inc.; EG & G S.A. ( France ); EG & G SpA ( Italy ); EG & G Technical Services of West Virginia Inc.; EG & G Vactec Philippines Ltd.; EG & G Ventures Inc.; EG & G Watertown Inc.; ANTARCTIC SUPPORT ASSOCIATES ( Columbia ); B.A.I. GmbH. ( Germany ); Benelux Analytical Instruments S.A. ( Belgium; 92.3% ); Berthold A.G. ( Switzerland ); Berthold Analytical Instruments Inc.; Berthold France S.A. ( 80% ); Berthold GmbH & Co. KG ( Germany ); Biozone Oy ( Finland ); EC III Inc. ( Mexico; 49% ); Eagle EG & G Inc.; Eagle EG & G Aerospace Co. Ltd.; Heimann Optoelectronics GmbH ( Germany ); Heimann Shenzhen Optoelectronics Co. Ltd. ( China ); NOK EG & G Optoelectronics Corp. ( Japan; 49% ); PRIBORI Oy ( Russia ); PT EG & G Heimann Optoelectronics ( Singapore ); RETICON CORP.; Reynolds Electrical & Engineering Inc.; Science Support Corporation; SEIKO EG & G CO. LTD. ( Japan; 49% ); SHANGHAI EG & G RETICON OPTOELECTRONICS CO. LTD.; Societe Civile Immobiliere ( France; 82.5% ); THE LAUNCH SUPPORT CO. L.C.; VACTEC INC.; WALLAC ADL AG ( Germany ); WALLAC ADL GmbH ( Germany ); WALLAC A/S; WALLAC Holding GmbH ( Germany ); WALLAC Norge AS ( Norway ); WALLAC Oy ( Finland ); WALLAC SVERIGE AB ( Sweden ); WALLAC INC.; WELLESLEY B.V. ( Netherlands ); WRIGHT COMPONENTS INC.; ZAO PRIBORI.

====

PART 5

Research –

Keywords:

BaneBerry; Buggy; Cannikin; Dribble; Gnome; HardTack II; Hood; Pile Driver; PlumbBob; Roller Coaster; Rulison; Sedan; Shoal; Strategic Petroleum Reserve Operation ( SPRO ); Vela Uniform; Nuclear Rocket Development Station ( NRDS ); Kit Carson National Forest ( 10DEC67 ) – GasBuggy; Kit Carson National Forest ( AUG79 ) – Schneider’s Borehole Geophysics; Four-Dimensional Process Monitoring.

– –

– Pittman Station ( Henderson, Nevada ) USAF personnel support for Area 51. – PAX = Pittman Air Station ( USAF ) – PAD = Area 51 Test Site – Personnel Response Phrase For Area 51 ( Nevada ) Access Entry: “I work for ‘EG And G’ at the ‘Site’.”

– –

DESERT RESEARCH INSTITUTE ( DRI ), New Mexico.

– –

Places secretly hidden underground – and, even more-so, underwater – are not likely to be visited by many of us, though they have existed since at least the 1930s.

Superpower nations have reigned supreme based on financial support for programs and projects worked on by citizens holding ‘deep underground secrets’ – secrets hidden for decades by governments reasoning to such workers that their work has been done to protect ‘people’ and ‘property’, all “in the interest of national security.”

Rarely do ‘government contract worker ants’ ( citizens ) ever realize ‘what it is’ we need to be protected from; all anyone can think is that it is something bad we could imagine, while never being able to comprehend anything ‘unimaginable’ we need protection from. Do any of us really want to know what ‘that is’, which we’re supposedly being protected against?

This report covers ‘unexplainable encounters’ governments decline to publicly explain – events buried over time and matters long forgotten from most memories, that is, until recent events began unearthing a few, putting old stories thought-of as ‘myths’ back into perspective today.

Some noteworthy facts still remain ‘classified’ and ‘locked away’ by high-level authorities. Some, never revealed on television or in motion picture films, books, or newspapers, may now be reviewed through selected intelligence bits ‘n pieces ( below ) – items that may begin to dawn on some of us who recall having once heard something about one or more of them but could not recall the source or details.

Legacy reports and more ( below ):

====

1975 – California ( Southern and Northern ) and Oregon ( Southern )

California Floats On Ocean?

John J. Williams of CONSUMERTRONICS CO. ( Alamogordo, New Mexico, USA ), said:

“Some time ago, I heard ( on a television interview show ) a man briefly mention that parts of California ( and neighboring states ) are floating on the Pacific Ocean.  He was a high ranking U.S. Navy officer aboard a top secret nuclear submarine that has been ( and is ) ‘exploring’ and ‘mapping’ enormous caverns and passageways underneath the Western U.S. for over 10-years now. A friend of mine finally tracked down the man who is now quietly living in retirement and asked that no details pointing to him be revealed as he does not want publicity and government attention. After writing this article, I destroyed my files on him.  This is his story.”

Williams explained that ‘not all’ areas in-question are actually ‘resting’ or “floating” on the ocean.

Many subterranean cavities are located beneath the western United States – not limited to just “California” – and consist of very large water-filled aquasystem passageways that have been explored, by nuclear-powered submarine, up to several hundred miles inland, particularly in regions of southern California, northern California and southern Oregon.

Williams continues, “…When this U.S. Navy officer retired ( several years ago ), in spite of about 10-years of intensive U.S. Naval Oceanographic study, the U.S. Navy had still not gotten even a handle on their [ subterranean waterway passageway cavities ] exacts [ oceanographic coordinates ] and dimensions. Today, the story may be different. He [ U.S. Navy officer, retired ] made the following statements from his observations:

1. These passageways are labyrinths with widths from a few [ feet ] up to thousands of feet wide, averaging roughly 100-feet across [ wet caverns ];

2. Much like dry caverns, heights and depths vary a great deal, and in some cases two [ 2 ] or more ‘caverns or passageways pass over or under each other’ at different depths;

3. Most of the underwater entrances lie just off the Continental Shelf;

4. Most of the underwater entrances are ‘too small for submarine investigation’, but many that are large enough lie in waters that are too deep [ ultra deep sea ];

5. Some of the caverns ( in southern California ) are topped by oil while others are filled with gases believed approximate to our atmosphere [ in very ancient times ];

6. The San Joaquin Valley [ California ] is essentially a portion of the original cavernous area that collapsed eons ago due to its sheer weight;

7. What is being passed-off as the “San Andreas Fault” are large unsupported chambers [ caverns ] that are in the process of collapsing.  When the ‘big one’ [ earthquake ] finally hits, many scientists in the know believe that most of California will break off like a cold Hershey bar [ chocolate candy bar ] and slide into the ocean.

8. [ this item was later deleted due to the individual’s fear that disclosure may in-part – due to recent ( 1985 ) international events – disturb a resolution to the feared problem, a similar scenario to that portrayed in a James Bond motion picture film depicting underground caverns, silicon valley technologies, nuclear weapons, and the San Andreas fault. ];

9. A well known U.S. nuclear submarine lost its way, within these passageways, resulting in its disappearance, but publicly reported as lost amidst open sea elsewhere using a recovery effort cover [ GLOMAR ( Global Marine ) Explorer ];

Williams continued, “I have no reason to doubt the man.  I can’t tell for sure whether or not these caverns and passageways exist or to their extent.  The story does sound a bit fantastic but I have no reason to doubt the man [ retired U.S. Navy officer ].  I have seen copies of documentation at least proving he was a high ranking U.S. Navy officer with nuclear-powered submarine duty, and a distinguished scientist. His scientific background and reputation are impeccable.  He definitely cannot be labeled as a crackpot, lunatic or publicity seeker.

I would very much like more information on this topic …”

Upon further inquiries, by ‘inner earth’ researchers, John J. Williams responded with the following when asked whether or not he [ Williams ] had received any replies to his request for more information about the alleged subterranean aquasystem passageways below California:

“Since publishing our article on the vast cavern network under much of California, we have received many responses and inquiries.  Some of these responses appear to have been from knowledgeable sources.  Note that the material sent to us for this article was written by someone of very high repute whose credentials I personally checked out.  Due to an agreement with him, I cannot reveal his identity.”

John J. Williams continued, “One response was from a retired submarine U.S. Navy Commander claiming to have spent many years in the waters off California and that such caverns do ‘not’ exist. Another response was from an anonymous person who cited unpublished oil company seismographic data and stated, ‘Although most of the caverns you depict in your drawing are smaller, larger or located somewhat differently than the actual caverns, you are essentially correct … My information is more up-to-date than what you apparently relied upon.’  He ( or she ) did not supply any maps to pin down our differences, just some written descriptions, however some knowledgeable person could probably deduce his ( or her ) overall ‘map’ from the voluminous seismographic data sent.  I am in the process of looking for this input; it’s been several years now and it may have all been thrown out … Incidentally, the oil company seismic data had much data around the City of Fresno in Central California area if that helps any.”

1970s – California, County of Los Angeles, City of Long Beach

“One incident, which may lend credence to California floating on the ocean, was a newspaper story that made headlines ( in recent years ) in The Press Telegram newspaper of Long Beach, California, involving an oil discovery beneath Long Beach, California: as oil companies pumped oil out-of the ground, the entire City of Long Beach began to sink – up to 26-feet – into the Pacific Ocean, and dikes had to be built to keep the seawater out.  The problem was being ( temporarily ) resolved by ‘water injection’ ( i.e. pumping an equivalent amount of water into the ground caverns to replace the amount of oil removed ), in order to keep the City of Long Beach, California afloat.”
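The ‘water injection’ remedy described above is, at bottom, a simple volume balance: whatever volume of oil is withdrawn from beneath the city must be replaced by an equal volume of water to keep the ground supported. A minimal sketch of that arithmetic, in Python ( all figures are hypothetical illustrations, not actual Long Beach data ):

# Volume-balance sketch for subsidence control by water injection.
# All figures are hypothetical illustrations, not actual Long Beach data.

BARREL_M3 = 0.158987  # one oil barrel expressed in cubic meters

def water_injection_needed(oil_barrels_per_day: float,
                           sweep_efficiency: float = 1.0) -> float:
    """Water volume ( cubic meters per day ) needed to replace extracted oil.

    sweep_efficiency < 1.0 models injected water that bypasses the depleted
    zone, so more water must be injected to replace the same oil volume.
    """
    oil_m3_per_day = oil_barrels_per_day * BARREL_M3
    return oil_m3_per_day / sweep_efficiency

# Hypothetical field producing 100,000 barrels per day at 80% sweep efficiency:
print(f"{water_injection_needed(100_000, 0.8):,.0f} cubic meters of water per day")

On those hypothetical numbers, roughly 19,900 cubic meters of water would have to go back underground every day – a scale that helps explain why the article above calls the fix only ( temporary ).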

1963 April 23 – SubOceania

One note of interest, in connection with the account of John J. Williams, was a statement made by Virginia Louise Swanson, a prominent investigator of California’s elusive creature known as “Bigfoot,” a huge hairy creature that walks upright. Virginia L. Swanson has performed considerable studies on cave connections in relationship to the Bigfoot phenomena and its ability to hide so well, and refers to these California dry cavernous openings saying, “Somewhere I got the idea that a big portion of Death Valley [ California ] is located on a shelf of ‘false bedrock’.  A certain type of earthquake would collapse all of it down to an enormous series of caverns that would open-up into another Grand Canyon [ Arizona ].”

According to our knowledge, the only nuclear-powered submarines to ever disappear – under mysterious circumstances – were the USS SCORPION and the USS THRESHER.

It is uncertain whether the retired U.S. Navy officer, of whom John J. Williams spoke, was referring to the USS SCORPION or the USS THRESHER, although the USS THRESHER disappearance caused more publicity. The flagship of the world’s most advanced class of nuclear attack submarine, the USS THRESHER was designed to operate deeper in sea depths and run more silently ( noise reduction propeller screws ) than any of its predecessor submarine vessels. The USS THRESHER was also endowed with highly significant advanced sonar equipment and fire-control systems, and was the most advanced submarine in the world at the time of its disappearance; it could easily have been an ideal choice as a top secret U.S. Naval Oceanographic underwater global exploratory vessel missioned for the caverns mentioned by the U.S. Navy officer earlier interviewed by John Williams.

On April 10, 1963 the USS THRESHER – under the command of U.S. Navy Lieutenant Commander John W. Harvey, with a total of 129 men comprising the ‘crew’, ‘civilian technicians’, and ‘observers’ – disappeared, according to official government reports, without any explanation, trace or clue as to the fate of the vessel or its occupants. Nothing was ever recovered, and no indications of any oil slicks, radiation, floating debris, or similar signs of wreckage were ever seen.

It is interesting to note that, at the time almost all reports stated the USS THRESHER “disappeared” or was “lost” but no reports indicated it was “sunk” [ or “buried” or “captured” ].

One woman, whose husband was on the ill-fated USS THRESHER, reported she believed her husband was still alive.

Theologically speaking, the possibility of a long distance connection or “communion” on a deep emotional level between a husband and a wife may not always be consigned to the realm of the occult or psychic phenomena.  Many religions believe the very spiritual natures of a husband and a wife are united upon consummation of a marriage and thus become as Christian teachings indicate, “one flesh”.

The actual words of this woman, interviewed by William Carson and Jeannie Joy – two [ 2 ] writers [ columnists for Search Magazine ] devoted to pursuing strange events – shortly after the USS THRESHER disappearance, were as follows:

“My husband was on the submarine THRESHER when it disappeared.  I don’t consider myself a widow.  I don’t believe my husband is dead.  No, it’s not a matter of just not being able to believe it, to accept reality, I just can’t get over the conviction that he’s still alive somewhere.  I love my husband very much.  I know he loved – loves me.  We were very close.  We could always tell when something was wrong with each other.  Intuition, I guess.  I should have felt something the instant there was trouble, if he was really in serious trouble and knew it – a matter of life and death – but I didn’t.”

“What do you believe really happened?” Carson and Joy asked the attractive young woman.

“Most people think I’m crazy when I say this, but I believe the THRESHER was captured.”

“By whom?”

“I can’t say for sure, but there ‘was’ a Russian submarine spotted near there that day ( near where it reportedly vanished, 220-miles off Boston harbor ), only I can’t imagine how even the Russians could ‘capture’ a vessel like the THRESHER without leaving the slightest evidence!”

1989 – California, County of Inyo, City of Deep Springs ( east of Owens Valley and Bishop, CA ) and Nevada, Las Vegas

The following account, concerning an area just east of Owens Valley in Bishop, California, was related by Val Valerian in his ‘Leading Edge’ newsletter ( December 1989 – January 1990 issue ) article entitled “Deep Springs, California,” which stated:

“Deep Springs, California is an area that is becoming known as the site for very strange events.  According to the information released both on the air of KVEG AM radio and from other sources, the area is full of strange people wandering around in black suits. There have also been rumors that there is an underground facility in the area.

Checking with gravity anomaly maps proved that there are large cavities under the ground in that area. The wildest claims relative to the area have stated that alien life-forms are being released there … Deep Springs Lake has been probed and it appears bottomless.

Divers have traveled along an underground river 27-miles toward the Las Vegas, Nevada area before having to turn around.”
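For context on the ‘gravity anomaly maps’ mentioned in the quoted passage: a subsurface cavity is a mass deficit relative to the surrounding rock, and that deficit shows up at the surface as a small negative gravity anomaly. A minimal sketch of the textbook buried-sphere approximation, in Python ( all parameters hypothetical, for illustration only ):

# Peak surface gravity anomaly above a spherical cavity (buried-sphere model).
# A cavity is a mass deficit, so the anomaly is negative. Figures hypothetical.
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
RHO_ROCK = 2670.0  # typical crustal rock density, kg/m^3

def cavity_anomaly_mgal(radius_m: float, depth_m: float) -> float:
    """Peak anomaly ( milligals ) directly above an air-filled spherical cavity."""
    mass_deficit = (4.0 / 3.0) * math.pi * radius_m**3 * RHO_ROCK
    anomaly_m_s2 = -G * mass_deficit / depth_m**2  # g = G*m/z^2 at the apex
    return anomaly_m_s2 * 1e5                      # 1 mGal = 1e-5 m/s^2

# Hypothetical 100-meter-radius cavity at 500 meters depth:
print(f"{cavity_anomaly_mgal(100.0, 500.0):.2f} mGal")

On those hypothetical numbers the anomaly comes out near -0.30 milligal – small, but within reach of modern microgravity surveys, which is why such maps can in principle flag large shallow voids.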

1963 – California, County of Inyo-Kern, Bishop & Casa Diablo

In the April 1963 issue of Search Magazine, investigative reporters William Carson and Jeannie Joy, in their regular column ( “Prying Into The Unknown” ) relayed the following information:

“It has always been a mystery to us, in the first place, how Mr. and Mrs. P.E. [ names excised for privacy ] can find and afford the time to do the sort of things most of us only dream of doing.  After knowing them for more than 15-years, it is inconceivable to suspect their integrity or sanity – and yet they impose the following excise upon our credulity.

While exploring for petroglyphs in the Casa Diablo vicinity of Bishop, California, Mr. & Mrs. P.E. came upon a circular hole in the ground, about 9-feet in diameter, which exuded a sulfurous steam and seemed recently to have been filled with hot water.  A few feet from the surface the shaft took a tangent course which looked easily accessible and, upon an impulse with which we cannot sympathize, the dauntless E.’s – armed only with a flashlight – forthwith crawled down into that hole. At a depth we’ve failed to record the oblique tunnel opened into a horizontal corridor whose dripping walls, now encrusted with minerals, could only have been carved by human hands, countless ages ago; of this the E.’s felt certain.

The end of the short passage was blocked by what seemed to be a huge doorway of solid rock which, however, wouldn’t yield.  Their flashlight was turned to a corner where water dripped from a protuberance, which proved to be a delicately carved face, distorted now by the crystallized minerals, and from whose gaping mouth water issued.

As Mr. and Mrs. E. stood there in silent awe – wondering what lay behind that immovable door – the strangest thing of all happened … but our chronology will not be incorrect if we wait until they return to the surface before revealing this, for now the water began gushing from the carved mouth and from other unseen ducts elsewhere in that cave and rising at an alarming rate. They hurried to the surface, and in less than 30-minutes there was only a quite ordinary appearing pool of warm mineral water on the desert floor.

‘Do you know,’ Mrs. E. said to her husband, ‘while I stood down there I heard music – the strangest, most weird music I’d ever heard.  But it seemed to come from everywhere at once or from inside my own head.  I guess it was just my imagination.’

Mr. E. turned pale. ‘My God,’ he said, ‘I thought it was my imagination but I heard it too – like music from some other world!’

Why do they call that rock formation – near where the E’s had their strange experience in Bishop, California – “Casa Diablo” [ known in English as ] “Devil House”? And why was that area [ County of Inyo ] named by the Indians “Inyo” [ known in English as ] “dwelling place of the great spirit”?”

1991 – New Madrid Fault Area ( Central United States & Northeast Mexico )

– Typhlichthys Fish – Interwoven Immense Cave Systems ( Central United States & Southern United States – New Madrid Fault Area ) – Sotano de las Golondrinas Cave Hole ( Aquismo, S.L.P., MEXICO )

Erich A. Aggen Jr., in his article “Top Secret: Alien UFO Bases” ( Search Magazine issue Summer 1991 ), presented the following revelations concerning the UFO subterranean connection:

“A great deal of UFO research has also led to the conclusion that various … species of aliens have set up secret underground bases in the United States and other countries.  It is logical to assume that such bases have also been established elsewhere in the solar system.  If such bases exist, where would we find them?  Existing information allows us to make a few educated guesses.

Earth bases, underground: the dark, cavernous world beneath our feet is the source of many baffling mysteries. Clandestine UFO bases may be hidden deep within the earth in natural and/or artificial caverns.

As a former member of the National Speleological Society ( NSS ), I am well aware of the vast extent of cave systems within the United States.

In my own native state of Missouri, for example, there are over 2,500 known caves and dozens of new ones being discovered every year.  Many of these caves are intricately linked together by numerous passageways and interconnecting chambers.

One particular species of blind white ( albino ) cave fish, known as the TYPHLICHTHYS, has been found in many widely separated cave systems over several states.

Typhlichthys fish have been found in caves that make a great arc through the states of:

Kentucky; Indiana; Illinois; Missouri ( ‘under’ the Mississippi River ); Arkansas; and, Oklahoma.

These states rest above one [ 1 ] immense cavern system that comprises a large area of ‘both’ the Central United States and the Southern United States. Many caves there possess rooms hundreds of feet in length, width and height – huge natural caverns reached and explored only with the utmost skill and perseverance.

There are only a few thousand National Speleological Society ( NSS ) members in the U.S., and only a few hundred of that number are active spelunkers; with so few spread over such a large area, only a very small fraction of the tens of thousands of known caves in the U.S. have been carefully mapped and explored, while thousands of other caves remain undiscovered and unexplored.

Extensive evidence indicates caves in the U.S. may be connected with caves in other parts of the world.

In the Municipio de Aquismo, S.L.P. of Mexico, the cave known as “Sotano de las Golondrinas” [ known in English as ] “Basement of the Swallows” reaches a depth of 1,100 feet ( about 335 meters ).

The Sotano de las Golondrinas cave is actually a giant ‘sink hole’ or ‘hole in the ground’. Atop the Sotano de las Golondrinas hole is a near-circular opening hundreds of feet in diameter that is impossible to climb down the sides of because the walls of the opening are too smooth and “belled-out,” so the only way to reach the bottom is to secure – at its top – a special rope over 1,100 feet long dropped into the sinkhole.

Underground explorers [ spelunkers ] must descend – into the Sotano de las Golondrinas yawning hole – one [ 1 ] person at a time, using special cave rappelling gear and climbing techniques.  At the bottom of the Sotano de las Golondrinas hole are numerous ‘leads’ ( openings ) that feed into multiple different crevices, passageways, crawlways and rooms never mapped or investigated.

Sotano de las Golondrinas cave entrance is located in one of the most primitive and uncivilized areas of Mexico, and local inhabitants are afraid to approach the cave because they believe it is full of ‘evil spirits’ luring people to their deaths.  They tell stories of people mysteriously disappearing never to be heard from again while passing near the cave entrance.

These stories may be based more on fact than fiction as they are similar in some respects to UFO [ unidentified flying objects ] abduction reports.

Because of the Sotano de las Golondrinas huge hole size, remote location and unique geological structure it would be an ideal UFO [ unidentified flying objects ] base.  Naturally camouflaged caves, in other parts of the world, may serve as excellent natural bases, way-stations or depots for UFOs.

1968 December – Nevada and Canada

In December 1968, an underground nuclear test known as the “SCHOONER EXPERIMENT” was conducted, which substantiates the theory that caves in North America and South America are intimately linked.

The Schooner Experiment was a 35-kiloton nuclear bomb exploded under the Nevada desert; however, 5-days later – and 1,000 miles further away, in Canada – test radiation levels rose up to 20 times greater than at the Nevada test site.  The only way the radioactive dust could have traveled that far is through an interconnected system of caves extending all the way from Nevada to Canada!”

1932 – California, Death Valley, Panamint Mountains

Bourke Lee authored the book ‘Death Valley Men’ ( MacMillan Co., N.Y. 1932 ), wherein the chapter “Old Gold” described a conversation Bourke Lee had with a small group of Death Valley, California residents speaking of Native American Paihute Indian legends. Two ( 2 ) men ( Jack and Bill ) began describing their experience of accidentally discovering an ‘underground city’ after one of them fell through the bottom of an old mine shaft near Wingate Pass. They began following a natural quay-like underground tunnel system ( apparently formerly lit by subterranean gas light ), traveling 20-miles northward on an incline that took them to a higher level and an exit out onto a ledge about halfway down the slope of the eastern face of the Panamint Mountains. Back down within the tunnel system, they came across a huge ancient underground cavern city that contained many sights, including: several perfectly preserved mummified bodies ( mummies ) still wearing thick arm bands; gold spears; a ‘large polished round table’ within another huge ancient chamber; giant statues comprised of gold; stone vaults where ‘drawers’ were filled with a variety of precious gem stones and gold bars; stone doors perfectly counterweighted and easy to open; and ‘extremely heavy stone wheelbarrows’ designed with scientific counterweight construction for perfect balance so they could easily be manipulated.

From the ledge, they could see Furnace Creek Ranch and its arroyo ( a water wash ) below them in Death Valley, and they realized the valley was formerly filled with water; they concluded that the huge archway openings they had previously seen within the underground city system could have been ancient waterway docks for large boats.

Bourke Lee was further informed that they brought some of the treasure out of the caverns and tried to set up a deal with certain people, including scientists associated with the SMITHSONIAN INSTITUTION ( Washington, D.C. ), to garner assistance in further exploring and publicizing this ancient underground city as one of the wonders of the world. Their efforts ended in disappointment, however, when a friend of theirs stole the treasure ( their ‘evidence’ ), and they were consequently scoffed at and rejected by scientists; worse, when the discoverers went to show the ‘mine’ entrance, an apparent cloud-burst had brought such severe rains that entire hillsides had washed down, rearranging the countryside landscape and obscuring the entrance location.

The last time Bourke Lee heard from his friends, Bill and Jack, they were preparing to climb the eastern face of the Panamint Mountains to locate the ancient tunnelled-out ledge opening – located halfway up the side of that steep slope. Bourke Lee never saw or heard from his discoverer friends ( Jack and Bill ) again.

During their initial lengthy conversation with Bourke Lee, the two ( 2 ) discoverers ( Jack and Bill ) – who had previously ‘revealed secrets of the underground city’ to others – discussed many things, including:

An alleged ‘subterranean race’ living in deep underground caverns beneath the ‘former seabed floor area’ of what is now the desert area of Death Valley, California.

There was another conversation about a remarkable Native American ‘Paihute tribe legend’ similar to an ancient myth of Greece.

The Paihute legend surrounds the death of the wife of a tribal Chief who, according to Native American tradition, took a ‘spiritual journey to the underworld’ to locate her; amazingly, upon returning with her, he forbiddenly ‘looked back’ and was then prevented from bringing his wife the remainder of the way back from the dead.

This would not be the same as a more tangible earlier report from the Native American Navaho Indian Oga-Make, who conveyed that a Native American Paihute Indian tribal Chief was alleged to have been ‘physically’ taken into the Native American “Hav-musuv” tribe subterranean cities beneath the Panamint Mountains.

Paihute Indian legends of the Hav-musuvs indicate these ancient Panamint Mountain dwellers abandoned their ancient city by migrating deeper into larger caverns below.

Could these reports coincide with the Paihute legends of the Hav-musuvs?

Bourke Lee’s discourse ( below ):

“… The ‘professor’, ‘Jack’ and ‘Bill’ sat in the little canvas house in ‘Emigrant Canyon’, and heard the legend all the way through.

The professor said, ‘That story, in its essentials, is the story of Orpheus and Eurydice.’ ( Greek mythology )

‘Yes,’ I said, ‘it’s also a Paiute legend. Some indians told that legend to ‘John Wesley Powell’ in the sixties.’ ( 1860s )

‘That’s very interesting,’ said the professor.  ‘It’s so close a parallel to Orpheus and Eurydice that the story might well have been lifted bodily from the Greeks.’

Jack said, ‘I wouldn’t be surprised.  I knew a Greek.  I forgot his name, but he ran a restaurant in almost every mining town I ever was in.  He was an extensive wanderer.  The Greeks are great travelers.’

Bill said, ‘They don’t mean restaurant Greeks.  The Greeks they’ve talked about have been dead for thousands of years.’

‘What of it?’ asked Jack, ‘maybe the early Greeks were great travelers, too.’

The professor said, ‘It’s very interesting.’

‘Now! About that tunnel,’ said Bill, with his forehead wrapped in a frown, ‘You said this indian went through a tunnel into a strange country, didn’t you?’

‘Yes,’ I said, ‘I think I called it a cave or a cavern, but I suppose a miner would call it a tunnel. Why?’

‘Here’s a funny thing,’ said Bill, ‘This Indian trapper living right across the canyon has a story about a tunnel, and it’s not 1,000 years old either. Tom Wilson told me that his grandfather went through this tunnel and disappeared. He was gone 3-years, and when he came back he said he’d been in a strange country living among strange people. That tunnel is supposed to be somewhere in the Panamints ( Panamint Mountains ) not awful far from where we’re sittin. Now! What do you make of that?’

Jack said, ‘I think Tom’s grandfather was an awful liar.’

I said, ‘Tom’s grandfather lived when the Paiutes ( Native American indian tribe ) were keeping their tribal lore alive. He probably knew the old legend.  Powell ( John Wesley Powell ) heard it in Nevada only 65-years ago.’

‘It’s very interesting,’ said the professor.

‘I got an idea about it,’ said Bill ( thoughtfully ), ‘Tom’s grandfather might have wandered into some tunnel all goofy from chewin’ jimson weed and then come out an found some early whites ( pioneer caucasian settlers ) and stayed with them. Tom told me that the people spoke a queer language and ate food that was new to his grandfather and wore leather clothes. They had horses and they had gold. It might have been a party in Panamint Valley, or even early explorers or early settlers in Owens Valley ( California ). How about that?’

Jack said, ‘Yeah. The Spaniards ( Spain ) were in here, too. So it might have been Spaniards ( Spanish ) or the early Greeks ( Greece ). And, where is this tunnel? And why did Tom’s grandfather have trouble speaking the language? This is an entirely different story than the one Buck told.  We are arriving at no place at all with these Indians and Greeks. To return – for a moment – to our discussion of geology, professor, ‘Have you been in Nevada much?’

From that point forward the conversation went on to another subject.

1970′s ( early ) – California, County of Kern, Garlock, Goler, and Mojave

An area in the [ southern California ] Mojave Desert region that may connect to the U.S. Western Region subterranean ( subsurface ) drainage network involves “Red Mountain” ( also known as the “Iron Mountain Range” ) [ 1-mile northwest from the old ghost town of Garlock, California ], one [ 1 ] of the [ southeastern ] peaks of the “El Paso Mountains” [ about 20-miles ] northeast of Mojave, California. There are many bizarre accounts connected with this mountain, which apparently got its name in-part from the many old mines found there, along with numerous natural cavities which open out to the surface in many different areas.

The area has allegedly been the site of certain activity concerning Native American Indian ritual and occult practices, as well as the site of alleged secret government activity, some of which reportedly involves the observation and monitoring of strange [ biological ] creatures and ‘automatons’ [ half and half, “Man-Machines” (aka) “Manchines” ] said to stealthily emerge ( night cloakers ) from seemingly out-of nowhere ( with a faint sound of electronic whirring ) and travel up and down and into the [ canyon ] areas on occasion.

Just exactly what these ‘bionic creatures’ are is uncertain, but some accounts indicate that they are dangerous!

Could it also be a ‘magnetic’ zone due to the high iron content [ within the “Iron Mountain” range ]?

During the early 1970′s, on ‘no less than’ two ( 2 ) separate occasions, U.S. federal government employees mysteriously disappeared from “Red Mountain” range areas and did not report back for work the following day.

The second [ 2nd ] occasion was when a U.S. government employee went missing while investigating the disappearance of the U.S. government employee from the first [ 1st ] occasion.

While all of the facts surrounding the missing United States federal employees were not released to the public, portions of information discovered early-on in the case were documented, in limited form, by a few additional facts:

… [ EDITED-OUT ] …

Some of this particular report is only held ‘in-part’ as ‘proprietary information’ by Kentron Intellect Research, and ‘officially’ in-full by the U.S. Geological Survey ( USGS ), the U.S. Department of Justice Federal Bureau of Investigation ( FBI ), and other U.S. government authorities governing release of sensitive information.

– West Virginia, County of Webster ( Northern )

Some years ago, a woman by the name of Joan Howard – at the time living in eastern Canada although originally from Britain ( UK ) – wrote a manuscript in which she described her own paranormal experiences with small “alien” entities.

Joan had experienced several UFO-type abductions / encounters at a very young age, when she still lived in Britain ( UK ), and claimed to have had ‘psychic contact’ with [ biological ] beings that claimed to be of extraterrestrial origin.

These experiences were accompanied by a great deal of occult manifestations – such as poltergeist phenomena, psychic dreams, encounters with invisible entities, etc.

Joan even admitted that she often doubted the claims of these [ biological ] ‘beings’ – their actions were manipulative and just didn’t seem to coincide with their claims of being here as some kind-of ‘group of cosmic saviors’ to ‘lead humanity’ into a ‘New Age’ of ‘enlightenment’.

She also warned other researchers to retain a “keen analytical mind” – when dealing with alien entities – so as not to fall under possible deception or manipulation.

Perhaps, as she suggested to others, they [ alien beings ] ‘might actually be here’ to ‘prepare for a future invasion of this planet’ and were merely ‘using her for various purposes to help prepare the way’, and that all of their ‘benevolence’ talk was just that – talk!

She ‘did’ describe vivid “dreams” in which she saw ‘alien craft hovering over major cities blasting frightened and terrified people in the streets’ with powerful ‘beam weapons’ – ‘dreams’, which she suggested, might be somewhat ‘prophetic’ in nature.

She described the [ biological ] entities as being small or dwarfs, yet was unsure whether they were human or not – although they ‘did’ attempt to pass themselves off as some type of ‘evolved human species’ – something which the ‘Grays’ [ biological alien beings ] have apparently done in order to break down any natural enmity which might prevent their ‘contactees’ or ‘abductees’ from receiving the lies which they intentionally fed them as part of their program of conquest and control.

Joan Howard, incidentally, wrote a privately published book, “The Space – Or Something – Connection,” which is referred to because it dealt with some experiences her husband had – shortly after she came to America.

In fact she devoted an entire chapter of her book ( “The Space – Or Something – Connection” ) to her husband’s account, which involved some incidents that took place while he was doing field work for a certain company requiring a great deal of outdoor activity. Her husband and his co-workers travelled through some relatively unpopulated terrain in the West Virginia regional areas between Newville, West Virginia ( Braxton County ) and Helvetia, West Virginia ( Randolph County ), generally around northern Webster County, West Virginia, where – through mountains of rolling hill forests and wilderness – he encountered some very strange things and heard accounts of strange cave-related incidents from the locals.

At one point, her husband claimed, their group ran across what appeared to be a pipe sticking up from the ground – far away from the nearest town – where there was no other sign of civilization or anything man-made for miles on either side; yet here was this large pipe or tube sticking straight up from the ground.

The most remarkable thing about the pipe was that a flame of fire was shooting straight up out-of the pipe as if it were burning-off some type of gas – they never found out just what it was – but it was ‘within this same general area’ they explored ‘caverns’ containing unexplained issues.

One ( 1 ) of the caves displayed strange hieroglyphic writings on its walls, according to some men, while others claimed to also hear faint voices – behind the walls of the cave – in-addition to faint sounds – coming from beneath the cave floor – as though machines were moving around within ‘underground’ depths.

Her husband claimed that, after a long work day in the field, two ( 2 ) men fell asleep one evening at the mouth of one particular cave, which inside ( a great distance away ) contained an unexplored but apparently very deep chasm ( hole ). The following morning, one ( 1 ) of the two ( 2 ) men awoke ( in front of that cave ) to find his partner had disappeared – no trace of the missing employee was ever found.

That particular cave had been known as a place of unusual occurrences, and a place to stay away from. Some even went so far as to call it “Satan’s Lair.” Whatever the case circumstances may actually be, all the aforementioned information may provide some additional insight into what may have surrounded that employee’s disappearance.

One of the most remarkable accounts that Joan Howard’s husband heard involved a man claiming that, while exploring the labyrinth depths of a particular cavern in the same area of north Webster County, West Virginia ( USA ), he suddenly came face to face with an attractive woman completely devoid of any hair on her head; the woman spoke in a language completely foreign to the man, whereupon – after unsuccessfully trying at great lengths to communicate with each other – they departed and went their separate ways.

1989 – California ( Southern )

On November 3rd, 1989 Ken Hudnell, a well-known Los Angeles, California radio talk show host, announced – over broadcast airwaves – his intention to take a group to visit ‘one of the ancient underground cities’ that had an ‘entrance’ located 60-miles from Anaheim, California. ( The Leading Edge Magazine )

1962 – California ( City of Mojave ), Nevada ( Carson City ), Utah ( Zion Canyon ) and Arizona ( Page )

During the 1940′s, one of the few specialized publications – that grew out-of the Palmer – Shaver ( Richard S. Shaver ) controversy – was The Hidden World ( issue A-8 ), reporting about a letter released from Charles Edwards ( aka ) Chuck Edwards, a researcher, surrounding what many people ( especially those throughout southern California ) have believed for decades, based on having been officially told that the United States Western Regional area subterranean drainage ultimately all flows into the Pacific Ocean.

Later, in 1962, Chuck Edwards released some of his own discoveries surrounding the “Western Subsurface Drainage Network” ( i.e. southern California, Nevada, and Utah ) that ‘does not’ “ultimately flow into the Pacific Ocean,” but actually flows ‘underground’ through a ‘vast subterranean network drainage system’ dumping elsewhere.

Addressed to Richard S. Shaver, the Chuck Edwards’ letter reply is quoted ( below ):

“This letter is in reply to your January 31 letter.  Please forgive me for not answering sooner.  Enclosed is some material I hope that you can glean something of value [ from ].  Please be as candid as you have been in the past and if I am far off base don’t hesitate to tell me. …

Our foundation has located a vast system of underground passages in the Mother Lode country of California. They were first discovered in 1936, but were ignored by all even with our best efforts to reveal them.

Recently a road crew blasted out an opening verifying our claims. One [ of the chambers is ] 200-feet long, 70-feet wide and 50-feet high.

We have disclosed what we believe to be a vast subterranean drainage system ( probably traversing the Great American Desert country for a distance of more than 600-miles ).

We believe this system extends out like five [ 5 ] fingers of your hand to such landmarks as Zion Canyon in Utah, the Grand Canyon [ Arizona ], another runs south from the Carson Sink in Nevada, and yet another follows [ below ] the western slope of the same range – joining its counterpart and ending somewhere in the Mojave Desert [ southern California ].

We believe – contrary to orthodox geologists – that the existence of this underground system drains all surface waters running into Nevada ( none, with the exception of the Amargosa, runs out ) and accounts for the fact that it is a Great American Desert. The hairy creatures, that you have written about, have been seen in several of these areas. Certainly there has been much ‘saucer’ [ UFO ] activity in these parts. For 2-years, I have collected material pertinent to these creatures and if you have any opinions along these lines I would appreciate hearing them.

So much for now.  I hope that I am still your friend.

Much of my time has been devoted [ to ] helping a farmer near Portland [ Oregon ] who has made a fantastic discovery of incredible stone artifacts. He has several tons of them. They predate anything yet found ( or accepted ) let us say that for now.

We are making slow but steady progress in getting through the wall of orthodoxy.

– Chuck Edwards”

1946 – California ( Northern ), Mt. Lassen, Cascade Mountains, Cascadia Fault, Oregon, Washington and Canada ( British Columbia )

Following the Sierra Nevada [ mountain ] range from here [ California ] into the northern territories, one arrives at the Cascade Range [ mountains ], consisting mostly of dormant or extinct volcanic mountains rising at intervals through northern California, Oregon, Washington, and into southwestern Canada.

The Cascade Range [ Cascadia Fault zone ] is not without its own peculiar accounts of subterranean recesses occupied by unknown beings – both human and non-human – apparently rediscovering what are portions of ancient antediluvian underground networks, which some say were inhabited by a race of intelligent [ biological ] but war-like hybrid reptiles genetically resembling humanoid shapes.

There are many unanswered questions as to just how the subsurface world was used, or exactly what role it may have played in relation to ancient legends of subterranean inhabitant races, but the following account may explain some mysteries by offering a clearer and broader perspective.

Around September 1946, Ralph B. Fields submitted his account to Amazing Stories Magazine ( December 1946 issue, pp. 155-157 ), with the assurance that it actually happened and his facts were true, as follows:

“In beginning this narrative and the unexplained events that befell my friend and myself, I offer no explanation, nor do I even profess to offer any reason.  In fact I have yet to find a clue that will even in part offer any explanation whatever. Yet as it did happen, there must be some rhyme or reason to the whole thing. It may be that someone can offer some helpful information to a problem that just should not exist in these times of enlightenment.

To begin with, if we had not been reading an article in a magazine telling us about the great value of guano ( i.e. old cave bat excrement / dung droppings as being highly valued fertilizer ) that had accumulated over a great number of years, we would have continued to wend our merry way through life without ever having a thing to worry about. But having read the article, and as we were at the time living near a small town called Manten in Tehama County, California, we thought that would be a good country to explore for a possible find of this kind.

After talking it over for some time, and as we had plenty of time just then, we decided to take a little trip up the country just back of us.

As we were almost at the foot of Mount Lassen, that seemed the best place to conduct our little prospecting tour.

Collecting a light camping outfit, together with a couple of tents to sleep in, we started out on what we expected to be a 3 or 4-day jaunt up the mountain … I guess we covered about 10 or 12-miles on the 3rd day and it was fast approaching time to begin to look for a place to spend the night and the thought was not very amusing as it had turned a little colder and we were well over 7,000 feet above sea level.

We soon found a sheltered place, beneath a large outcrop of rock, and set about making a camp.

As I was always the cook, and Joe the chore boy, I began getting things ready to fix us some grub. Joe began digging around for some dead scrub brush to burn.

I had things all ready and looked around for Joe and his firewood, but I could see no signs of him.

I began calling for him, and he soon came into sight from around the very rock where we were making our camp.

And I knew he was laboring under some great excitement and his face was lit up like a Christmas tree.

He had found a cave.

The entrance was on the other side, of that very rock.

He was all for exploration right away.

But I argued that we had better wait till morning.

But he argued that, in a cave it was always night and we would have to use flashlights anyway, so what would be the difference?

Well, we finally decided that we would give it at least a once-over after we had a bite to eat.

It wasn’t much to call a ‘cave’ – at first – as it had a very small entrance, but back about 20-feet it widened out to about 10-feet wide and around 8-feet high. And it did reach back a considerable distance as we would see at least 100-yards and it appeared to bend off to the left. The floor sloped slightly down. We followed to the bend and again we could see a long way ahead and down … At this point we became a little afraid as we were some way into the mountain …

I don’t know how far we went, but it must have been 1-mile or 2-miles, as we kept on walking and the cave never changed its contour or size. Noticing this, I mentioned it to Joe.

And we discovered an amazing thing. The floor seemed to be worn smooth as though it had been used for a long time as a path or road. The walls and ceiling of the cave seemed to be cut like a tunnel. It was solid rock and we knew that no one would cut a tunnel there out of rock as there had been no sign of mining operations ( tailings ). And the rock in the walls and ceiling was run together like it had been melted. Or fused from a great heat.

[ UPI NOTE INSERT ( here ): It is believed the U.S. government possesses PlasMole ( aka ) “Terron Drive” ( ref.: Paul Philip Schneider ) tunneling machines that travel at 5-mph and can melt a 50-foot hole through solid rock. ]

While we were busy examining the cave in general, Joe swore he saw a light way down in the cave.

We started down the cave once more and found a light. Or should I say the light found us as it was suddenly flashed into our faces. We stood there blinded by it for a minute until I flashed my light at it’s source and saw we were confronted by three [ 3 ] men. These men looked to be about 50 or a little younger. They were dressed in ordinary clothing such as is worn by most working men in the locality. Levi type pants and flannel shirts and wool coats. They wore no hats. But their shoes looked strange as their soles were so thick that they gave the impression of being made of wood.

There stood three [ 3 ] men looking at us in a cave, 1-mile or so in the depths of old Mount Lassen … One of them spoke to us. He asked what we were looking for … we came to the conclusion that we had better retreat. Turning to go we were confronted by two [ 2 ] more of them. One of the strangers told us, ‘I think maybe you had better come with us.’ … So we permitted the five [ 5 ] to escort us deeper into the depths of old Mt. Lassen … They had led us farther down and I guess we had gone a couple more miles when we came to the first thing that really amazed us.

We came to a place where the cavern widened out a little and we saw some kind of machine, if it can be called that. Though I had no chance to examine it closely at the time, I did later and it was a very strange contrivance. It had a very flat bottom, but the front was curved upward something like a toboggan. The bottom plate was about 8-inches thick and it was the color of pure copper. But it was very hard tempered. Although I have had a lot of experience in metals and alloys, I had no opportunity to examine it closely enough to determine just what it was. I doubt very much if I could.

It had a seat in the front directly behind a heavy dashboard affair and there was a dial shaped in a semi-circle with figures or markings on it. I had not the slightest idea what they stood for, but they were very simple to remember. If there was a motor, it was in the rear.

All I could see was two [ 2 ] horseshoe or magnet-shaped objects that faced each other with the round parts to the outside. When this thing was in operation, a ‘brilliant green arc’ seemed to leap between the two [ 2 ] and to continue to glow as it was in operation. The only sound it gave off was a hum or buzz that sounded like a battery charger in operation.

The seat in the front was very wide.

The only method of operation was a ‘black tear-shaped object’, which hung from the panel by a chain.

One [ 1 ] of these men – sitting in the middle – took this thing and touched the sharp end to the first [ 1st ] figure on the ‘left side’ of the dial.

When he touched the first [ 1st ] figure, the contraption seemed to move almost out from under us, but it was the smoothest and quietest take-off I ever experienced. We seemed to float. Not the slightest sound or vibration.

And after we had traveled for 1-minute he touched the ‘next [ 2nd ] figure’ on the dial and our speed increased at an alarming rate.

But when he had advanced the black object over past the center of the dial, our speed increased until I could hardly breathe.

I can’t begin to estimate the distance we had traveled or our speed, but it was terrific.

The two [ 2 ] horseshoe objects in the rear created a green light that somehow shone far ahead of us, lighting up the cavern for a long way.

I soon noticed a black line running down the center of the cavern and our ‘inner-mountain taxi’ seemed to follow that.

I don’t know how long we continued our mad ride, but it was long enough for us to become used to the terrific speed and we had just about overcome our fear of some kind of wreck when we were thrown into another spasm of fear. Another machine of the same type was approaching us head on. I could see that our captors were very nervous, but our speed continued.

As the other machine came closer, our speed slowed very fast and we came to a smooth stop about 2-feet from the front of the other machine.

Our machine had no sooner stopped than our captors leaped from the machine and started to dash away.

A ‘fine blue light’ leaped from the other machine in a ‘fine pencil beam’ and its sweep caught them, and they fell to the cavern floor and lay still.

The figures dismounted from the other machine and came close to us.

Then I noticed they carried a strange object in their hands. It resembled a ‘fountain pen flashlight with a large round bulb-like affair on the back end and a grip’ – something like a German luger pistol.

They pointed them at us. After seeing what had happened to our erstwhile captors I thought that our turn was next, whatever it was.

But one [ 1 ] spoke to us.

“Are you surface people?”

I guess we are, as this is where we came from very recently.

“Where did the horlocks find you?”

If you mean those guys, I pointed to the five [ 5 ] motionless figures, back there a few hundred miles – I pointed toward the way we had come in our wild ride.

“You are very fortunate that we came this way,” ‘he’ told us, “You would have also become horlocks and then we would have had to kill you also.”

That was the first time I had realized that the others were dead. They put their strange weapons away and seemed friendly enough, so I ventured to ask them the who and why and everything we had run into.

I told them of our search for guano and how we had encountered the five [ 5 ] horlocks, as he called them, and asked ‘him’ about the machines, their operation and could we get out again?

He smiled and told us, “I could not tell you too much as you would not understand. There are so many things to explain and you could not grasp enough of what I could myself tell you. The ‘people on the surface’ are ‘not ready to have the things’ that ‘the ancients’ have left. Neither I nor any one in any of the caverns know why these things work, but we do know how to operate some of them. However, ‘there are a great many evil people here who create many unpleasant things for both us and the surface people’. They are safe because ‘no one on the surface believes us or them’. That is why I am telling you this. No one would believe that we exist. We would not care, but there are many things here that ‘the outer world must not have until they are ready to receive them’, as ‘they would completely destroy themselves, so ‘we must be sure that they do not find them’. As for the machine, I don’t know how it works, but I know some of the principles of it. It works simply by gravity. And it is capable of reverse. The bottom plate of it always is raised about 4-inches from the surface of the floor. That is why there is no friction and has such a smooth operation. This ‘object suspended from this chain is pure carbon’. It is the key to the entire operation. As I told you before, I cannot explain why it runs, but it does. We want you two [ 2 ] to ‘return to where you came’ and ‘forget about us’. We will show you ‘how to operate the sled’ and ‘we want you never again to enter the cave’. If you do – and you do not encounter the horlocks – we will have to do something about you ourselves so, ‘it would not be advisable to try to return at all events’. One thing I can tell you. ‘We never could permit you to leave another time’.”

He explained to us the operation of the machine and in some way reversed its direction.  So thanking them, we seated ourselves in the sled, as he had called it, and were soon on our way back.

Our return trip was really something we enjoyed, as I was sure not to advance the carbon far enough on the dial to give us such terrific speed, but we soon found ourselves where we started from.  The sled slid to a smooth stop and we jumped out and started up the cave afoot.

We must have walked a long way coming in, for we thought we never would come to the surface.  But at last we did.  And it was late afternoon when we emerged.

We lost no time in making our way down the mountain, and Joe tells me that he isn’t even curious about what is in that cave. But I am.

What is the answer to the whole thing?  I would like to know.

We had been told enough for me to believe that down there – somewhere – there are things that might baffle the greatest minds of this Earth. Sometimes I’m tempted to go back into that cave if I could find it again, which I doubt, but, then I know the warning I heard in there might be too true, so I guess I had better be of the same mind as Joe.  He says: ‘What we don’t know don’t hurt us’.

Regardless of Joe’s opinion, however, there is reason to believe that influences from these nether regions can and do affect “us” in a profound way, and even the men whom Ralph and Joe encountered, whoever they were, admitted this fact.

====

Submitted for review and commentary by,

Kentron Intellect Research Vault ( KIRV )
E-MAIL: KentronIntellectResearchVault@gmail.com
WWW: http://KentronIntellectResearchVault.WordPress.Com

UFO Flight Training

[ PHOTO ( above ): Nellis AFB Range Test Site S4 UFO Flight Simulators ( click to enlarge ) ]

UFO Flight Training
by, Concept Activity Research Vault ( CARV )

November 10, 2011 13:22:08 ( PST ) Updated ( Originally Published: January 25, 2011 )

NELLIS AIR FORCE BASE – November 10, 2011 – Had one not been familiar with the official background on such subjects, one might have quickly nominated a certain gentleman ( reported about below ) to receive a U.S. Academy Award – with a ‘secondary prize’ of ‘free mental health treatment’ – but after carefully considering the subject reported here, some may see where ‘this particular information’ certainly ‘pushes the envelope of reality’ beyond where most professionals ever venture.

– – –

IMPORTANT NOTICE:

All information – contained within this report ( below ) – was obtained from ‘official U.S. federal government sources’ ( applicable ‘quoted texts’, ‘images’ and ‘videos’ – below ), ‘worldwide mainstream news media public broadcasts’ ( ‘text’ and ‘videos’ – below ), and ‘multiple other officially recognized publicly reliable information sources’, all of which are additionally provided in a long list of “References” ( found at the bottom of this report ). The only exception is a ‘usual journalistic speculative digest of remarks ( text )’ aligning ‘official information’ with ‘enhanced clarity’ plus ‘information continuity formatting’ for ‘easier public reading comprehension’.

This website report ‘does not’ subscribe to any version of ‘religious radicalism’, ‘doomsday prophetics’, ‘conspiratorial theories’ or any ‘other imaginings’ that may or may not be circulating on the internet.

This website publicly reports ‘only factually proven information’ through ‘arduous research’ in ‘adherence to strict principles’, only ‘obtaining official information’ from ‘official information sources’ confirmed by ‘no less than’ two ( 2 ) ‘additional independent sources’. While scientific and technological information may vary amongst professionals, only those held in ‘good standing’ by a ‘worldwide list of prominent universities’ and ‘other prominent organizations’ provided additional factual support documentation.

This website maintains a ‘private database’ of ‘official documents’ and ‘other credible source information support material’ that ‘officially serves to substantiate all information’ contained in ‘all reports published’ here.

– – –

History may someday reveal more about what the now-late William Uhouse, a mechanical engineer ( 1966 – 1979 ), outlined surrounding ‘official U.S. government contract “business agreements”’ relating to ‘private-sector’ high-risk work involving, amongst a myriad of others, engineering companies, contractors, sub-contractors, engineering consultants and engineers hired to perform extremely dangerous work involving secret hazards.

Disclosure, for the most-part, is predominantly not a favorite intelligence pastime, for the less private-sector individuals know about what surrounds their contracted task assignments, the better government can maintain its secrets – until something goes terribly wrong on a U.S. government contractor job site.

A classic example of what could go wrong was presented by a patriotic American, the now-late ( deceased ) Paul Philip Schneider (aka) Phil Schneider, an engineering consultant hired by MORRISON-KNUDSEN INC. ( Tulsa, Oklahoma ) – a sub-contractor company for CER GEONUCLEAR CORP. ( Tulsa, Oklahoma ) that was a subsidiary of the U.S. government contractor company EG&G that maintained a super secret facility building on the ground inside a U.S. government highly restricted Range Test Site ( RTS ) known as the infamously spooky Area 51 near Groom Lake, Nevada.

Much like Paul P. Schneider, William Uhouse – another engineer – at a very late stage in his lifetime began revealing ( and consequently became more comfortable, to ‘some degree’, about revealing ) his first-hand knowledge that at least ‘four’ ( 4 ) ‘Extraterrestrial Biological Entities’ ( EBE ) – commonly referred to as “Extra-Terrestrials” (aka) ETs (aka) aliens – were held under U.S. government command directives issued to work alongside private-sector engineers cleared to work on secret-sensitive classified government Programs and Projects at specifically selected locations.

See what I mean? Sounds just like “government conspiracy” material. Well, that too was my initial knee-jerk reaction while I was simultaneously pondering the ‘free mental health treatment’ prize for ( were he still alive ) Uhouse; however, in-lieu of that curious situation, I additionally considered handing myself a ‘similar prize’ for even having considered reporting about this – until looking further into the Uhouse case, where eventually all doubt came to evaporate. Doing the research was ‘no simple task’, especially when the claimant ( Uhouse ) turned-up dead.

According to Bill Uhouse ( see, I immediately start pointing my finger at him! ), these extraterrestrial biological entities ( EBE ) were extracted from an ‘underground bunker’ where supposedly thousands more extraterrestrial biological entities ( EBE ) – controlled by the U.S. government – are held in Deep Underground Military Bases ( D.U.M.B. ) through an elaborate network of tunnels leading to various and sundry equally deep underground pockets of caves / caverns.

I know, by now, those so-far reviewing this will ensure I receive the ‘crazy prize’ instead of ( posthumously ) Uhouse, but not to worry, as it only gets worse before getting better when viewing the ‘official government films’ ( further below ); ‘reading this report further will assist skeptics’ in understanding what can be identified within the videos. Jumping the gun by advancing to view the videos will net ‘little to no information’ so, I’m counting-on ‘official government speed readers’ ( reviewing this report ) to jump-the-gun and do exactly that so, hopefully, I won’t be pestered ( by them ) later-on for having publicly reported something too sensitive. I just love doing reports like this.

Bill Uhouse goes on to describe – in ‘very specific words’ ( as though he were sending a coded message to someone knowing more about these EBEs ) – the four ( 4 ) beings as what he calls “Feeling Good Guys” ( “FGG” ) especially selected – by U.S. government officials – for transport ‘away from’ one Nevada ( USA ) facility and ‘sent to’ another facility in New Mexico ( USA ), but then transported ‘back to’ Nevada ( USA ) at a facility believed to be on the Nellis Air Force Base ( NAFB ) Range Test Site ( RTS ) known as “S4″ ( ‘near’ – but ‘not at’ – the infamously spooky Area 51 ) where Uhouse claims these particular four ( 4 ) EBEs or ‘aliens’ were assigned – by an official U.S. government directive – to ‘work alongside private-sector members’ of his ( Uhouse’s ) engineering team assigned likewise to the same U.S. government Program Project.

William Uhouse specifically and intimately mentions but ‘one ( 1 ) alien’ ( a “Grey” – a bluish-purple skin color type of ‘extraterrestrial biological entity’ – EBE ) that he was told was nicknamed “JAYROD” ( also known as ) “JROD” (aka) “JRD” (aka) “JERROD”, who was presented to him ( Uhouse ) as a ‘personally assigned’ “scientific translator.”

This is ‘not’ the first time the name “Jerrod” has surfaced surrounding Groom Lake, Nevada NAFB S4 and Area 51 extraterrestrial reverse-engineering Program Projects: Bob Lazar referenced someone he claimed was his “friend”, whom he cleverly gave the name “Jerrod”, and this “Jerrod” was claimed by Bob Lazar to have revealed ‘highly classified historical background information’ surrounding certain government Program Project activities in the same location areas.

Bill Uhouse claims JAYROD ( the alien Grey ) – also known as “Jarod” – used ‘mental telepathic communication skills’ to receive questions he ( Uhouse ) presented to it ( the alien ); before he ( Uhouse ) could even purse his lips to begin asking audible questions about ‘engineering configurations’, JayRod ( the alien ) instantly and telepathically transmitted replies, as ‘engineering instructions’, back to Uhouse.

Uhouse was believed to have been dealing with what had been presented elsewhere – during the 1980s – and what was leaked to the public – during the early 21st Century – as ‘alien symbolic construct science technologies’: studying translations of a ‘materials science language’ from a strange looking ‘map’ containing ‘patterned symbols and designs’, which Uhouse ( alone ) could not possibly comprehend in order to properly lead engineering teams into formulating ‘standard basic application designs’ enabling the U.S. government military to develop an actual UFO ‘flight simulator’ – or ‘lenticular craft simulator’ ( “flying disc simulator” ) – specifically designed so ‘only human astronautic experienced pilots’ – assigned to the National Reconnaissance Office ( NRO ) and NASA – could learn how to fly UFO type spacecraft equipped with secret extraterrestrial ( ET ) technology capabilities.

Official U.S. Air Force ( USAF ) ‘motion picture film footage’ shows men near one type of “flying disc demonstrator” – wrapped and resting on its ‘container base’ with its ‘container cover’ suspended above it – and film footage of ‘teams of men’ guiding ‘suspended sub-sections’ of a ‘disc’ being ‘assembled’, and ‘partially assembled sub-sections’ of a ‘disc’, plus ‘more’ ( below ):

After carefully studying the video contents thoroughly, and comparing all aspects thereof to what surrounded the AVRO ‘manned aerial vehicle’ ( circa: late 1950s – early 1960s ) – test-flown under a BELL LABORATORIES ( Canada ) “business agreement” contract under U.S. Central Intelligence Agency ( CIA ) supervision ( from the late 1950s through early 1960s ) – these video contents are ‘definitely not’ films of ‘any early developmental stages’ of the AVRO aerial vehicle, nor do the video contents of any of the discs depicted therein even remotely represent ‘test engineering’ or ‘manufacturing’ facilities where even pieces of the AVRO were ever housed – including BELL LABS ( Canada ) and the MITRE CORP. ( USA ).

Should those reviewing this report decide I should still be awarded the ‘free mental health treatment prize’ – alluded-to at the beginning of this report – for revealing what is contained herein, please notify me immediately; otherwise, try not to appear so shocked at what you eventually realize from this report – more-so than initially believed.

As for Extraterrestrial Biological Entities (aka) aliens (aka) ETs (aka) Ancient Earth Underground Dwellers (aka) Feeling Good Guys ( FGG ) – and films thereof – one may only be reminded to ask William Uhouse on their final travel.

As for documentation? Read the “IMPORTANT NOTICE:” ( above ), for without such this report would never have been written.

Submitted for review and commentary by,

Concept Activity Research Vault ( CARV ), Host
E-MAIL: ConceptActivityResearchVault@Gmail.Com
WWW: http://ConceptActivityResearchVault.WordPress.Com

Robot Combat Intelligence

[ PHOTO ( above ): W-88 miniature nuclear bomb property of USA ( click to enlarge ) ]

Robot Combat Intelligence
by, Concept Activity Research Vault ( CARV )

January 18, 2011 21:08:42 ( PST ) ( Originally Published: February 1, 2002 )

DISTRICT OF COLUMBIA, Washington – January 18, 2011 – Over 12-years ago, the United States realized too late that its ‘miniature nuclear weapons technology delivery system’ ( W-88 ) secrets had already been stolen ( from the vault of its insurance carrier ) after the People’s Republic of China ( PRC ) rapidly produced its own version. ‘Only a select few’ realized a secret U.S. decision then took futuristic concepts into development for U.S. global military applications, deploying technologies that seemed conceived only from science-fiction motion picture films ( e.g. STAR TREK, STAR WARS, MATRIX, and more ) shocking audiences worldwide. In 1999, secret U.S. defense endeavors – forged with several universities and U.S. government contract private-sector organizations – were led by the U.S. Department of Defense ( DoD ) Defense Advanced Research Projects Agency ( DARPA ), which created even newer, more advanced multiple Program stratagems employing various forms of ‘combinatoric’ technologies developed for globally deploying U.S. military dominance with various and sundry secret-sensitive devices and systems far beyond many imaginations.

DARPA SIMBIOSYS Program –

The DARPA SIMBIOSYS Program entails, amongst other things, multi-functional microbiological nanotechnology robot android devices primarily for military applications, where such technology remained until just a few years ago, when it began being applied in some medical arenas.

To understand what is ‘current’, one must first look briefly at DARPA Programs ‘past’ ( 1999 – 2002 ), which ( alone ) is enough to ‘still send chills down many people’s spines today’. Once one realizes what DARPA was doing 12-years ago, it is not all that unfathomable to comprehend where DARPA has taken – and will continue taking – many.

SIMBIOSYS ( 1999 – 2002 ) –

In 1999, DARPA SIMBIOSYS developed a combined quantitative understanding of various biological phenomena characteristics, opening the DARPA door to MicroElectroMechanical Systems ( MEMS ) integrating microphotonics in, amongst many things, electro-optic spatial light modulators ( SLM ) combined with very short pulse solid-state lasers. These provide powerful new capabilities: secure communication up-links ( multi-gigabits per second ); ‘aberration free’ 3-D imaging and targeting performed at very long ranges ( greater than 1,000 kilometers away ); innovative design system integration of MEMS spatial light modulators ( SLM ) providing quantum wavefront control leaps in photonics and high-speed electronics; and even ‘flexible cloth-like smart materials’. DARPA wants such hardware placed into production devices and systems applications, optimizing both the U.S. and its specially selected few foreign nation U.S. friendlies ( Israel ) to hold future warfaring battlespace management superiority over other foreign nation threats.

DARPA SIMBIOSYS includes classes of biological molecules ( i.e. antigens, antibodies, DNA, cytokines, enzymes, etc. ) for analyses and diagnoses studies, from:

1. Biochemical sensors, sensing ‘details from environments’; and,

2. Biochemical sensors, sensing ‘details from human body fluids’.

Specific examples, under each of those two ( 2 ) groups, are left up to the discretion of the Principal Investigator ( PI ).

Bio-molecule importance selection criteria include:

1. Microsystem sensors, for automated sampling and analyses, extendibility;

2. Bio-molecules simulant, to which it represents U.S. Department of Defense ( DoD ) relevant extents; and,

3. Bio-detection high degree of sensitivity and specificity processing, etc.

DARPA SIMBIOSYS emphasis is at the ‘molecular level’ for ‘sensing’ and ‘detection’.

The SIMBIOSYS Program precludes human cell and human tissue based sensing because other DARPA programs currently address those issues.

SIMBIOSYS Goals –

SIMBIOSYS Program ‘stimulates multi-disciplinary research’ – bringing together biologists, chemists, engineers, physicists, computer scientists and others to address difficult and pressing challenges in advancing micro and nano-biotechnology.

The SIMBIOSYS Program goal is to ‘utilize phenomena’ in ‘bio-fluidic transport’, ‘molecular recognition’ and ‘signal transduction’ from joint studies in modeling and experiments.

The SIMBIOSYS Program joint effort expects results in new hardware device, hardware process and hardware production communities that will begin utilizing new models, new rules, new methods and new processes, together enabling design and development of enhanced-performance next-generation bio-microdevices.

DARPA Advanced Projects –

DARPA is focusing on, amongst many, these advanced projects:

1. Bioengineering artificial intelligence ( AI ) systems sized from nanometers and meters up to large-scale robotic systems deployed globally;

2. Biological hybrid devices and systems, inspired from computational algorithms and models;

3. Biosynthesized composite materials incorporating synthetic enzymes and pathways from biochemical cellular engineered concepts for application productions;

4. Neural phenomena control over system science computation measurement application interfaces addressing humans;

5. Micro-scale reagents biochemically engineered;

6. Biosynthesis signal processing control platform studies;

7. Molecular biological population level behavior dynamic simulation modeling complexes; and,

8. Subcellular device physics effects and cellular device physics effects within biological component systems using real-time non-destructive observation study techniques.

[ PHOTO ( above ): legacy MicroFlyer, only a Microelectronic Aerial Vehicle – MAV ( click to enlarge ) ]

Bioengineered MicroBots Developed & Deployed –

Battlefields now require ‘unmanned combat aerial vehicles’ ( UCAV ) and ‘advanced weapons’ that self-navigate and self-reconfigure with autonomous communication systems accomplishing time-critical commands; however, while many use Commercial Off The Shelf ( COTS ) products, such is not the case for developed and deployed bioengineered microrobots.

MicroBot AMR Control By MARS –

The DARPA mobile autonomous robot software ( MARS ) Project is designed to develop and transition ‘currently unavailable software technologies’ programming operations of autonomous mobile robots ( AMR ) in partially known, changing and unpredictable environments.
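No MARS source code was ever published, so the following is only a minimal illustrative sketch – in Python – of the kind of behavior that description implies: a robot holding a partial map of an unpredictable environment re-plans its route whenever its sensors reveal a new obstacle. Every name here ( the grid map, the sense() callback, the breadth-first planner ) is a hypothetical stand-in, not DARPA’s software.

from collections import deque

def plan(grid, start, goal):
    # Breadth-first search over a 2-D occupancy grid; returns a path or None.
    rows, cols = len(grid), len(grid[0])
    prev, frontier = {start: None}, deque([start])
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] == 0 and nxt not in prev):
                prev[nxt] = cell
                frontier.append(nxt)
    return None

def run_mission(grid, start, goal, sense):
    # Sense-plan-act loop: replan whenever a new obstacle is discovered.
    pos = start
    while pos != goal:
        path = plan(grid, pos, goal)
        if path is None:
            return pos                       # goal unreachable on current map
        for step in path[1:]:
            if sense(step):                  # sensor reveals a new obstacle
                grid[step[0]][step[1]] = 1   # update the partial world model
                break                        # drop the stale plan and replan
            pos = step                       # otherwise take the step
    return pos

grid = [[0, 0, 0],
        [0, 1, 0],
        [0, 0, 0]]
hidden = {(2, 1)}                            # obstacle missing from the map
print(run_mission(grid, (0, 0), (2, 2), sense=lambda c: c in hidden))  # (2, 2)

Re-planning from the robot’s current position, rather than aborting the mission, is what lets one loop cope with a map that turns out to be wrong – the essence of operating in a “partially known” environment.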

The DARPA SIMBIOSYS Program aims to provide new software removing humans from combat, conveyance, reconnaissance and surveillance processes by:

1. Extending military hardware range;

2. Lowering manpower costs;

3. Removing human physiology for swifter concepts, designs, engineering, development, and deployment successes; and,

4. Letting researchers demonstrate autonomous navigation of humanoid robots, unmanned military vehicles and autonomous vehicles, plus interactions between humans and robots.

DARPA indicates that robots – to be meaningful – must be fully integrated into human lives in military, commercial, educational and domestic usages, and must be capable of interacting in more natural human ways.

DARPA funded research and development of robots given human-like bodies with human-like intelligence, providing new ways of humanoid interaction with the human world.

COG Robot –

DARPA funded Massachusetts Institute of Technology ( MIT ) researchers who – employing a set of sensors and actuators ( from small microcontrollers for joint-level control, up to larger audio-visual digital signal network pre-processors controlling different levels of its heterogeneous hierarchy network ) approximating human body sensory and motor dynamics – created the robot named COG, which eventually allowed DARPA further development of deployable, modular, reconfigurable and autonomous robots.
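That ‘heterogeneous hierarchy’ – fast joint-level loops running beneath slower, larger coordinating processors – can be pictured with a toy two-rate control sketch. This is only an illustration of the general layered-control idea, not COG’s actual software; all class names, gains and update rates below are invented.

class JointServo:
    # Low level: one PD loop per joint -- the microcontroller's job.
    def __init__(self, kp=8.0, kd=0.5):
        self.kp, self.kd = kp, kd
        self.angle, self.velocity, self.target = 0.0, 0.0, 0.0

    def tick(self, dt=0.001):                # runs fast, ~1 kHz
        torque = self.kp * (self.target - self.angle) - self.kd * self.velocity
        self.velocity += torque * dt         # toy dynamics, unit inertia
        self.angle += self.velocity * dt

class Coordinator:
    # High level: sets joint targets from slower sensory processing.
    def __init__(self, joints):
        self.joints = joints

    def tick(self, gaze_error):              # runs slow, ~10 Hz
        # e.g. split a visual tracking error across two neck joints
        self.joints[0].target += 0.5 * gaze_error
        self.joints[1].target += 0.5 * gaze_error

neck = [JointServo(), JointServo()]
brain = Coordinator(neck)
for step in range(1000):                     # one second of simulated time
    if step % 100 == 0:                      # coordinator runs 100x slower
        brain.tick(gaze_error=0.02)
    for joint in neck:
        joint.tick()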

[ PHOTO ( above ): legacy Biomorphic Explorers – Snakes and Bats ( click to enlarge ) ]

CONRO Robots –

CONRO robots, developed through DARPA, employed autonomous capabilities, of:

1. Self-repair; and, 2. Morphogenesis ( changing shapes ).

Examples, amongst many, included design styled:

Snake robots, able to move ‘in-to’ and ‘out-of’ tight spaces; and,

Insect robots, able to move faster ( covering more ground meeting military mission swifter needs ).

[ PHOTO ( above ): legacy Spider, and Payload biochemical delivery simulation ( click to enlarge ) ]

CONRO robots were designed and equipped to perform two ( 2 ) missions:

1. Reconnaissance ( activity detection, monitorization, and reporting – surveillance ); and,

2. Deliver small ‘military payloads’ ( bio-chemical weapons, etc. ) into ‘enemy occupied remote territory locations’ ( away-from friendly warfighters ).

CONRO robots are comprised of multiple SPIDERLINK modules.

In 1999, DARPA built both ‘snakes’ and ‘hexapods’ as ‘initially tethered’ prototypes termed 1-DOF, equipped with the ability to both ‘dock’ and ‘gait ambulate’ based on applied computational algorithms.

In 2000, DARPA had twenty ( 20 ) autonomous self-sufficient ‘modules’ built – not mentioning what those resembled – designated as 2-DOF, after:

1. Hormone-based control theory was developed and tested ( see the sketch following this list );

2. Hormone-controlled hexapods and snakes implemented motions ( for 2-DOF );

3. Quadrupeds, hexapods and snakes implemented locomotion with centralized control ( for 2-DOF );

4. Morphing self-repair ‘modules’ delivering small payloads, using ‘miniature cameras’, were designed and tested; and,

5. Snake head-to-tail docking capabilities were implemented in laboratory two-dimensional ( 2-D ) testing.
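The ‘hormone-based control’ named above can be pictured with a toy Python sketch: a ‘hormone’ message hops from module to module down a docked chain, and each module bends as the pulse passes, so a body wave travels along the snake with no central controller. This is only an illustration of the general idea – the names, phase step and bend angle are invented, not actual CONRO code.

import math

class Module:
    def __init__(self):
        self.next = None       # neighbor link, discovered at docking time
        self.phase = None      # None = relaxed; otherwise a wave phase

    def receive(self, phase):
        self.phase = phase                     # start bending here
        if self.next:                          # relay the hormone onward,
            self.next.receive(phase - 0.6)     # delayed by one phase step

    def joint_angle(self):
        return 0.0 if self.phase is None else 30.0 * math.sin(self.phase)

snake = [Module() for _ in range(6)]           # build a 6-module snake
for a, b in zip(snake, snake[1:]):
    a.next = b

snake[0].receive(phase=0.0)                    # head emits one hormone pulse
print([round(m.joint_angle(), 1) for m in snake])

Because each module only talks to its docked neighbor, the same rule keeps working after modules are added, removed or re-docked – which is presumably why a hormone-style scheme suits self-reconfigurable robots.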

CONRO DARPA Near-Term Milestones:

1. Modules’ reconfigurability ( morphogenesis ) robust automation designed and demonstrated ( for 2-DOF );

2. Topology ‘discovery’ ( automatic topography recognition ) demonstration;

3. Gait reconfiguration ( morphogenesis ) automation for ambulating a ‘given’ ( programmed instruction ) topology designed and demonstrated;

4. Gait reconfiguration ( morphogenesis ) automation for ambulating a ‘discovery’ ( automatic topography recognition ) designed and demonstrated;

5. Wireless ( radio frequency, infra-red, etc. ) control of miniature cameras demonstrated;

6. Pointing ( waving, mousepad, etc. ) control of miniature cameras demonstrated; and,

7. Large scale deployment of CONRO robots demonstrated.

[ PHOTO ( above ): DARPA BioBot named Blaberus ( click to enlarge ) ]

Deployer Robot ( DR ) –

Deployer Robots ( DR ) ‘support’ and ‘deploy’ distributed ‘teams of other smaller robots’ termed “Joeys” ( singular, “Joey” ) that perform either ‘hazardous tasks’ or ‘tedious tasks’.

Deployer Robots ( DR ) have two ( 2 ) roles:

1. Carry and launch given numbers of smaller Joey robots ( Joeys ); and,

2. Command and control ( C2 ) – after launching – Joey robots ( Joeys ).
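Those two ( 2 ) roles suggest a simple star-topology control loop: the Deployer Robot ejects Joeys from a magazine, then polls each one over its radio link. The Python sketch below is purely illustrative – the message vocabulary ( "go_to", "report" ) and class names are invented, not drawn from any DARPA software.

class Joey:
    def __init__(self, ident):
        self.ident, self.pos = ident, (0, 0)

    def command(self, msg):
        if msg[0] == "go_to":                    # retask the Joey
            self.pos = msg[1]
            return (self.ident, "ack")
        if msg[0] == "report":                   # status poll
            return (self.ident, self.pos)

class DeployerRobot:
    def __init__(self, magazine_size):
        self.magazine = [Joey(i) for i in range(magazine_size)]
        self.deployed = []

    def launch(self, positions):
        # Role 1: eject Joeys from the magazine toward initial positions.
        for pos in positions:
            joey = self.magazine.pop()
            joey.pos = pos
            self.deployed.append(joey)

    def sweep(self):
        # Role 2: C2 -- poll every deployed Joey for a status report.
        return [j.command(("report",)) for j in self.deployed]

dr = DeployerRobot(magazine_size=4)
dr.launch([(5, 0), (5, 2)])                  # role 1: emplace two Joeys
dr.deployed[0].command(("go_to", (6, 0)))    # role 2: retask one Joey
print(dr.sweep())                            # [(3, (6, 0)), (2, (5, 2))]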

[ PHOTO ( above ): legacy CyberLink HID test USAF personnel with DARPA robots ( click to enlarge ) ]

Robot Loop Pyramid –

The Robot-in-the-Loop ( RIL ) concept augments Human-in-the-Loop ( HIL ), building a ‘pyramid of robots’ supervised by one ( 1 ) person.

‘Launch’ and ‘Command and Control’ ( C2 ) of different Joey robots ( multiple, i.e. Joeys ) are two ( 2 ) goals handled independently, as:

1. ‘Launch’ of robots, via grenade sized Joey robot clusters ( multiple ), developed under DARPA Deployer Robot ( DR ) Program availability of smaller Joeys; and,

2. ‘Command and Control’ ( C2 ) is investigated using ‘larger robots’ developed for the DARPA ITO sister Software for Distributed Robotics ( SDR ) Program, enabling full leverage of both the Deployer Robot ( DR ) Program and the Software for Distributed Robotics ( SDR ) Program development of algorithms leveraging heterogeneous interaction between a ‘smart’ highly mobile ‘Deployer Robot’ ( DR ) and a ‘team’ of Joey robots that are more powerful, less computational and less mobile.

[ PHOTO ( above ): legacy Virtual Combiman digital glove waving battlespace management ( click to enlarge ) ]

DARPA examined these key universal elements of robot deployment:

1. Emplacement – Launching and dynamically situating the Joeys for mission goals;

2. Operations – Maintaining the infrastructure to support the distributed front, including communications and error detection and recovery ( e.g., getting back on course after positional drift ); and,

3. Recovery – Collecting Joey robots data to analyze after delivery into a format useful for the human operator.
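Those three ( 3 ) elements read like phases of a single mission lifecycle, which can be pictured as a small state machine. The Python sketch below is hypothetical – the event names are invented, though the drift-recovery rule mirrors the example given in item 2.

def run_lifecycle(events):
    # Emplacement -> Operations -> Recovery, with drift sending the
    # mission back to emplacement ( "getting back on course" ).
    phase, log = "EMPLACEMENT", []
    for event in events:
        if phase == "EMPLACEMENT" and event == "joeys_situated":
            phase = "OPERATIONS"
        elif phase == "OPERATIONS" and event == "positional_drift":
            phase = "EMPLACEMENT"            # error detection and recovery
        elif phase == "OPERATIONS" and event == "mission_complete":
            phase = "RECOVERY"               # collect data for the operator
        log.append((event, phase))
    return log

print(run_lifecycle(["joeys_situated", "positional_drift",
                     "joeys_situated", "mission_complete"]))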

DARPA Deployer Robot ( DR ) Program development acquired and refitted two ( 2 ) Urban Robot Upgrades ( URU ) into new Deployer Robot ( DR ) types.

DARPA investigated five ( 5 ) alternate launch strategies, but selected only one ( 1 ): grenade barrel launch, delivering robots into a three ( 3 ) story building.

The grenade barrel launcher was designed, equipped and developed, with:

1. Grenade Magazine – contains ‘multiple Joey robots’ for ejection, and supports full mobility integrity of the Deployer Robot ( DR );

2. Sensor mast ( collapsible ) – for Deployer Robot ( DR ) interaction with Joey robots launched on arrival at destination location; and,

3. Communication ( 916 MHz ) link between Deployer Robot ( DR ) and Joey robots.

DARPA SDR Program –

DARPA Software for Distributed Robotics ( SDR ) Program development designed and built Joey robot prototypes ( approximately 3-1/2 inch cube ) for ultimate fabrication in a production lot quantity of 120 Joey robot units.

The DARPA Software for Distributed Robotics ( SDR ) Program leverages and adapts software controlling swarms of Joey robots.
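One classic behavior any such swarm-control software must provide is dispersion: each Joey repeatedly steps away from its nearest neighbor until a minimum spacing is reached. The Python sketch below illustrates that single behavior under invented parameters – it is not SDR Program code.

import math, random

def disperse(positions, min_spacing=3.0, step=0.5, rounds=200):
    pts = [list(p) for p in positions]
    for _ in range(rounds):
        moved = False
        for i, p in enumerate(pts):
            # find Joey i's nearest neighbor
            j = min((k for k in range(len(pts)) if k != i),
                    key=lambda k: math.dist(p, pts[k]))
            d = math.dist(p, pts[j])
            if d < min_spacing:              # too crowded: step directly away
                dx, dy = p[0] - pts[j][0], p[1] - pts[j][1]
                norm = d or 1.0              # guard against exact overlap
                p[0] += step * dx / norm
                p[1] += step * dy / norm
                moved = True
        if not moved:
            break                            # spacing satisfied everywhere
    return pts

random.seed(1)
cluster = [(random.random(), random.random()) for _ in range(8)]
print(disperse(cluster)[:2])

Each Joey acts only on what it can measure locally, so the same rule scales from the 120-unit production lot mentioned below to any swarm size without central coordination.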

DARPA Near-Term Milestones:

1. Launch propulsion mechanisms ( CO2 cartridge, .22 caliber shell, or other ) deployment testing of Joey robots into battlefield areas;

2. Launcher ( of multiple Joey robot deployment ) mechanism built on-board first ( 1st ) Deployer Robot ( DR ) named Bandicoot;

3. Sensor mast ( collapsible ) built and installed on-board second ( 2nd ) Deployer Robot ( DR ) named Wombat;

4. Radio Frequency ( RF ) development protocols for interaction between Deployer Robot ( DR ) and Joey robots;

5. Infra-Red ( IR ) deployment protocols for interaction mechanisms between Deployer Robot ( DR ) and Joey robots using IR ( Infra-Red );

6. Human Interface Device ( HID ) operator remote control unit ( ORCU ) development for Deployer Robot ( DR ).

DARPA SIMBIOSYS began over 12-years ago. All the photographs ( above ) are almost one decade ( 10-years ) old.

Current careful research on this subject further provides more information about where the U.S. stands today.

Submitted for review and commentary by,

Concept Activity Research Vault ( CARV ), Host
E-MAIL: ConceptActivityResearchVault@Gmail.Com
WWW: http://ConceptActivityResearchVault.WordPress.Com

Reference

http://web.archive.org/web/20021214110038/http://groups.msn.com/AnExCIA/rampdintell.msnw

Secret IT Directorate


[ PHOTO ( above ): Former U.S. Central Intelligence Agency Headquarters ( click on image to greatly enlarge ) ]

Secret IT Directorate
by, Concept Activity Research Vault ( CARV )

November 22, 2011 11:45:08 ( PST ) Updated ( Originally Published: October 25, 2010 )

USA, Menlo Park, California – November 22, 2010 – Some may not recall the ‘first public announcement ( 2000 )’ of the United States Central Intelligence Agency ( CIA ) ‘private business corporation’ referred to as the IN-Q-TEL CORPORATION INTERFACE CENTER (aka) QIC; it was formerly known as the IN-Q-TEL CORPORATION, and ‘that’ company name was even earlier the IN-Q-IT CORPORATION ( ‘not’ to be confused with the INTUIT CORPORATION, the business software maker of QuickBooks and TurboTax ); however, the CIA ‘reversed’ its previous private business name change decisions, so it is now known as the IN-Q-IT CORPORATION ( IN-Q-IT ) today. Clear as mud, right?

Some wonder whether “IN-Q-IT” is even ‘really’ the ‘true name’ of this CIA ‘private business’ company today, or whether – within the intelligence community pea ‘n shell game of names – other company subsidiary names may have developed; but for now the IN-Q-IT CORPORATION is ‘currently known’ as being the U.S. Central Intelligence Agency ( CIA ) ‘venture capital’ private business corporation.

It is important to understand precisely ‘what this CIA private business was supposed to be accomplishing’ versus ‘what the CIA actually did’ with its private business and, more recently, ‘what it has become’ and ‘what it is supposed to be accomplishing’ today and for the future.

The curious state of affairs sees no one knowing anything more about the CIA QIC ( IN-Q-TEL Interface Center ) private business corporation than a few did when it began, but ‘now’ no one is even required to inform the public with an accounting to justify anything surrounding it. Why? Because it was meant to be a ‘private business’ company, ‘not’ a U.S. government entity, and ‘that’ was ‘how’ the CIA created it to remain – outside anyone’s purview – for a rather ‘complex’ reason.

The only method by which an ‘even more complete’ and ‘even more accurate assessment’ may be formulated – for an ‘even more thorough understanding’ – is quite involved and may at times be highly complex. One must not only review the ‘multiple facet areas’ this CIA private business company was originally designed to tackle, but more-so what it was supposed to accomplish; and from within ‘both’ of those areas, go on to ‘realize precisely’ what ‘were’ and ‘still are’ today’s “problem sets” facing the CIA, and just ‘how’ it is juggling it all.

Some believe ‘members’ of U.S. Congressional ‘oversight’ committees and subcommittees had to attend ‘special educational lessons’ designed by the CIA. Did key members of Congress attend what basically amounted to a CIA ‘school’?

Many believed the CIA Congressional school to be non-existent, and were left to watch other less palatable theories develop: that the U.S. Congress had simply tired of too numerous complex CIA oversight reviews – so much so that Congress relegated its own authority over to the CIA, in what some believed tantamount to the CIA ‘fox’ guarding its own global-sized intelligence ‘chicken coop’.

Some may now be enlightened to understand what the United States Central Intelligence Agency ( CIA ) decidedly phrased as its own “radical departure” away-from what it perceived as ‘inefficient economic budget support’ for solving its own ‘quantum complexities’ within ‘highly specific areas’ – still ‘classified’ in the interest of ‘national security’ – burdensomely producing an exponential growth of new “problem sets.” The CIA would only publicly explain these – in the most general of terms – as arising from within un-named areas of ‘science and technology research and development applications’. The CIA decided to establish ‘limits upon’ those areas and ‘simultaneously’ establish ‘marketable derivatives’ it called “products,” from which CIA Office of Science and Technology ( S&T ) oversight could ‘manage distribution of information knowledge’ – but on an ‘in-exchange’ contractual agreement basis with ‘cooperative’ “private sector” ‘individuals’, ‘businesses’, ‘institutions’ and ‘organizations’ – and thereby ‘establish who held proprietary keys’ to ‘special skill sets’ of already-protected ‘intellectual property rights’ of “existing” technology, plus CIA global establishment over all “emerging” ( new upcoming future ) technology proprietary rights, by sole marketeering of ‘special talents’ and ‘special services’ from which incredible amounts of ‘profit’ could also be harnessed ( absorbed ) by the CIA.

To many of entrepreneurial independent spirit, this CIA QIC private business corporation appeared, in-essence, out of nowhere – like a new Borg structure infringing on the private freedoms a few once experienced in the global marketplace of the past – and to others the CIA QIC tenor was too Godfather-like, making people and entities an offer they couldn’t refuse. While the CIA foresaw such rumors and speculation coming, in reality, what ‘was’ the ‘CIA’ doing by opening-up ‘its own private business corporation’?

Visionary dreams may be able to see the United States Central Intelligence Agency ( CIA ) ‘shed’ its ‘government skin’ to become the ‘world’s largest multi-national corporation’, holding the ‘world’s largest monopoly’ on information technology ( IT ) research and development direction of much of the world’s finest talent resources – i.e. private ‘individuals’, ‘businesses’, ‘institutions’ and ‘organizations’ – independently operating outside U.S. Congress ‘oversight, budget justification and related constraints’. With such clever restructuring in place, the CIA would cease to exist as the public knows it today, technically – by legal definition – becoming a wholly-owned ‘non-profit organization’ no longer requiring U.S. Department of the Treasury tax dollar funding. Set free, a new type of CIA would exist with ‘self-determined financing’ and stock market trading profits derived from a host of private sector corporate ‘mergers and acquisitions’ ( M&A ).

While surface dreams of such visionaries might at-first appear ingenious, ‘how’ was ‘all’ this ‘actually assembled’?

Before 1999, it took the CIA Office of Legal Counsel less than 1-year of researching various United States laws to locate legacy technicality provisions – already approved by the U.S. Congress – allowing the CIA to exercise its own “radical departure” plan.

By 2000, ‘reality’ saw the fetal stages of this CIA private business venture plan developing, leaving the public without hearing anything further about its progress.

Some believed an ‘initial public offering’ ( IPO ) – paying dividends to private individual investors, with corporate trading of shares of ‘stock’ – could have been misconstrued as potentially the world’s largest ‘insider trading scheme’ headache for the United States Securities and Exchange Commission ( SEC ), which could only imagine managing the envelopment of multiple new technology area companies trading on ‘stock exchange’ floors. Such companies could potentially carry forward ‘mutual profit secrets’ paying more funds than anticipated into the CIA private business plan – a “radical departure” away from what otherwise had long been understood as the status quo of world trading – where embarrassing implications might turn ‘terrorist fund reduction measures’ into ‘profits derived from CIA-led secret private business developments in high technology products’. Could such a “radical departure” plan backfire, or morph otherwise ‘unsophisticated terrorists’ – utilizing improvised munition missions – into a new more powerful community of ‘uncooperative competitive business terrorists’? Perhaps.

Implications of a CIA private business group of subsidiary businesses trading stock on ‘open stock market exchanges’ around the world could create an entirely ’new form of intelligence blowback’ of staggering global socio-economic business proportions for future generations.

Today, no overall clear picture exists of what still remains cloaked in secrets – whether ‘government’ or ‘private’ – where outside both domains this CIA private business enterprise continues growing. But, in which directions?

Prior to 2001, the new CIA plan became a ‘high-directional multi-tiered simultaneous growth-oriented economic support expansion’ for and of ‘key-critical secret-sensitive advanced “information technology” ( IT ) derivatives ( “products” )’ that could ‘only be implemented’ by the CIA “identifying” ( targeting ) and “partnering” ( obtaining ) ‘existing information’ and ‘information tasking ability quotients’ from “global” ( worldwide ) “private sector” masses – an “infinite” ( unlimited ) supply of ‘private individuals’, ‘private businesses’ and ‘institutions’ to become ‘dedicated taskers’ of controlled CIA “problem sets.”

The CIA private business futures would depend on successfully exercising better economic sense to its maximum potential, immediately alongside highly specific advanced technological enhancements the United States ‘intelligence community’ would grow under a new ‘broad term secrecy’ – commonly known but hidden within what the CIA termed only as “information technology” ( IT ) – that would necessarily require CIA-controlled ‘targeting’, ‘shaping’ and ‘acquisition’ of a plethora of private business sector information technology ( IT ) application research and development.

Truly a “radical departure,” as the CIA publicly alluded to when describing its ‘new mission focus’ – solving CIA “problem sets.”

Although CIA-controlled ‘special technology’ research and development ( R&D ) was on ‘applications’ that later became known as “Commercial Off The Shelf” ( C.O.T.S. / COTS ) ‘products’, these were in-essence – during early stage informational development – the crux of what the CIA wanted presented on its ‘table’, whereupon the CIA would legitimatize and manage ‘mass information exchanges’ the CIA would ‘trade’ for ‘other valuable considerations’, but to only a select few ‘private companies’ ( e.g. LOCKHEED, LUCENT, PHILIPS, AT&T, et al. ) that would in-return be ‘capable of offering’ – through only ‘United States government qualified’ contractual agreement exchanges – whatever the CIA deemed these companies ‘could place of further interest’, or could ‘further the duration of continuing to provide’ what these select private ‘individuals’, ‘companies’, ‘institutions’ and ‘organizations’ were ‘already providing under U.S. government contract agreement harvests’.

The public, however, only understood ‘press reports’ that kept all of the aforementioned very ‘simple’ – indicating, in the vaguest of terms, that ‘products would eventually be sold’ “through the private sector.”

Secret-sensitive ‘products’ – in all actuality ‘technological breakthroughs’ – were to be traded between CIA-selected and controlled business stock holdings, and the CIA IN-Q-TEL INTERFACE CENTER ( QIC ) would privately, and thereby secretly, manipulate all technology funds derived from what the CIA QIC publicly referred to as its ‘partners’ and ‘other vendors’ – all remaining ‘outside the purview of U.S. Congress government budget oversight’, where all private companies remain to enjoy unfettered privileges of privacy.

By utilizing U.S. Department of the Treasury government tax funds – for U.S. government contract agreement funding to ‘private business partners’ – the United States Federal Reserve System follows in CIA footpath lock-step by ‘mirroring’ private bank wire transfer monies directed, and then redirected, through a long chain series of foreign corporation named offshore bank accounts secretly routed back into the U.S. Central Intelligence Agency ( CIA ) IN-Q-IT CORPORATION (aka) QIC private business enterprise handling ‘venture capital’. There, new project funding amounts may be decidedly broken down into smaller amounts, or pooled into much larger amounts, for dispersals to other clandestine secret-sensitive intelligence programs, projects and/or operations that gain the strength to easily remain ‘outside U.S. Congress intelligence oversight board committee and subcommittee scrutinization’.

Directorate of Central Intelligence ( DCI ) orchestral management arrangements within its IN-Q-IT INTERFACE CENTER ( QIC ) realize, from foreign historical perspectives, that when a private business exercises ‘en masse privatized mind teams’, it must understand today’s fallibilities in-keeping with human frailties encountering the CIA inherent procedural compartmentalization-of-secrets rule ( no “talking around” ), which requires those outside each task to be unable to quickly assemble an ‘overall picture’ of what overall CIA plans consist of. At least, that is how it is supposed to work – but rarely does – so that alone ( in and of itself ) became a “problem set” to solve, one where drastic measures needed taking. Hence, the CIA “radical departure” plan has another design serving to counteract intelligence information leaks.

Ingenious is a very small word to describe even one ( 1 ) facet of this CIA private business plan, where the public has its limited understanding confusingly ‘shifted from what it perceives to be government secrets’ moved rapidly back and forth between ‘private sector secrets’, in what only but a few perceive to be a ‘new wave intelligence form’ or ‘combinatoric intelligence structuring’ producing a shield ( shell ) protecting even more secrets actually beneath what has become a new globally flexible CIA layered support group strengthening.

As an extremely large ‘black budget’ intelligence missions funding source, the CIA IN-Q-TEL INTERFACE CENTER would only be the recipient of limited and toned recommendations supplied by the QIC Board of Advisors ( CIA headquarters ), as measures to be reviewed for eventual implementation by the QIC Board of Trustees, for the essential discretionary manipulation of sophisticated:

1. Technologies [ i.e. XEROX PARC RESEARCH CENTER, et al. ];

2. Vendors [ i.e. LOCKHEED, et al. ];

3. Debt [ i.e. TELECREDIT INC., et al. ];

4. Capital [ i.e. MARSH & MCLENNAN CAPITAL INC. ];

5. Stock Market Trading [ i.e. GOLDMAN SACHS & CO. ]; plus,

6. More.

The report ( below ) shows ‘who’ was initially placed in ‘experienced authoritative positions’ and ‘who’ was chosen as ‘senior level executive advisors’ – all selectively chosen by the CIA, pulled from ‘key critical private businesses’ to ‘guide’ the private U.S. Central Intelligence Agency business venture.

Such should really come as no surprise, at least to those understanding the mechanics of international business, trading and finance, where all domestic and foreign bank account transactions are mirrored under oversight by the United States Federal Reserve System ( FED ) and U.S. Securities And Exchange Commission ( SEC ), the latter two ( 2 ) of which are ‘overseen’ by the U.S. Central Intelligence Agency ( CIA ).

This information was derived – outside ‘market sensitive’ ( stock market trading ) material – from a Critical Sensitive National Security report [ August 13, 2001 ] of the U.S. Congress, House of Representatives Subcommittee on Oversight and Investigations ( Subcommittee ) of the Committee on Financial Services, relying on information supplied by, amongst others, the SEC Divisions of Enforcement and Corporation Finance, Offices of the Chief Accountant, General Counsel, Compliance, Inspections and Examinations, Office of the Comptroller, Office of Economic Analysis, and the U.S. Central Intelligence Agency ( CIA ).

But, is all this ‘really going on’? See full report ( below ).

– – – –

Courtesy: Unwanted Publicity Information Group

Source: X-CIA Files [ now defunct MSN Group website ]

CIA Sends Hi-Tech Tsunami Warnings
by, X-CIA Files – Staff Writer [ AnExCIA@bluewin.ch ]

March 12, 1999

USA, California, Menlo Park – The first ‘publicly open contract’ between a so-called ‘private firm’ and the U.S. Central Intelligence Agency ( CIA ) involves the private corporation – conceived and funded by CIA members – known as the IN-Q-TEL CORPORATION, which was formerly known as the IN-Q-IT CORPORATION.

In-Q-Tel is actually one of many CIA private-sector business partners, which is not all that uncommon a partnership – like those the CIA has had for decades with the MITRE CORPORATION ( USA ), BELL LABORATORIES ( Canada ), and the TRW Power Thrusting Division ( Hawthorne, California, USA ), remote control center for CIA maneuvering of Tracking Data and Relay Satellites ( TDRS ) and the Killer HUGHES ( KH-11 ) anti-satellite satellite ( space-borne destructive laser platform ) series.

The publicly revealed partnership between the CIA and the IN-Q-TEL CORPORATION has sent another tsunami warning to Japan – at its NIPPON ELECTRIC CORPORATION ( NEC ) high-technology monopoly – which has been right on the heels of advanced high-technology advancements seen within the RESEARCH TRIANGLE PARK ( North Carolina, USA ).

There is some doubt and controversy, though, between what the CIA says In-Q-Tel is versus what In-Q-Tel says it is.

The CIA claims ( on its website ) that In-Q-Tel is a “‘non-profit’ organization.”

IN-Q-TEL states it has had ’profitability in-mind for quite some time’.

Let’s see what the facts reveal (below), allowing casual observers to make up their own minds about just what IN-Q-TEL ‘really is’:

The In-Q-Tel Hierarchy

CIA IN-Q-TEL INTERFACE CENTER ( QIC ) players are stacked-up on a list that reads like something out of a Robert Ludlum novel – filled with international intrigue and high-tech corporatarchy.

CIA IN-Q-TEL INTERFACE CENTER ( QIC ) Board of Trustees is a Who’s-Who of ‘big corporate’ America:

– Gilman Louie, CEO of IN-Q-TEL CORPORATION, was most recently HASBRO INTERACTIVE Chief Creative Officer and General Manager of the GAMES.COM group ( responsible for creating the HASBRO internet game site ), previously serving as Chairman of the Board of MICROPROSE, CEO and Chairman of SPECTRUM HOLOBYTE, and CEO of SPHERE INC., and is on the Boards of Directors of numerous software firms;

– Lee Ault, Chairman, former Chairman and CEO of TELECREDIT INC.;

– Norman Augustine, former Chairman and CEO of LOCKHEED MARTIN CORPORATION;

– John Seely Brown, Chief Scientist, XEROX CORPORATION and President, XEROX PARC RESEARCH CENTER;

– Michael Crow, Executive Vice Provost of Columbia University;

– Stephen Friedman, Senior Principal of MARSH & MCLENNAN CAPITAL INC., and former Chairman of GOLDMAN SACHS AND CO.;

– Paul Kaminski, former U.S. Department Of Defense ( DoD ) Undersecretary for the U.S. Defense Acquisition and Technology Office, President and CEO of TECHNOVATIONS INC., and Senior Partner in GLOBAL TECHNOLOGY PARTNERS;

– Jeong Kim, President of CARRIER NETWORK, part of the LUCENT TECHNOLOGIES GROUP, and founder of YURIE SYSTEMS;

– John McMahon, former Deputy Director of U.S. Central Intelligence Agency ( CIA ), former President and CEO of LOCKHEED MISSILE & SPACE COMPANY, and consultant to LOCKHEED-MARTIN CORPORATION;

– Alex Mandl, Chairman and CEO of TELIGENT, and former President and CEO of AT&T; and,

– William Perry, former U.S. Department Of Defense Secretary and currently Berberian Professor at Stanford University.

New CIA Use Of In-Q-Tel Interface Center ( QIC )

In-Q-Tel is a new non-profit corporation funded by the CIA to seek Information Technology ( IT ) solutions to the Agency’s most critical needs. This unique venture was formed to enable the Agency ( CIA ) to have access to ‘emerging and developing information technology’ in a timely manner.

QIC ( IN-Q-TEL INTERFACE CENTER ) is the ‘interface center’ linking the IN-Q-TEL CORPORATION to the Agency ( CIA ).

QIC – CIA Function

QIC develops a problem set for In-Q-Tel, partners with In-Q-Tel in the solution acceptance process and manages the Agency’s relationship with In-Q-Tel.

QIC plans and evaluates the partnership program, protects CIA security and CIA counter-intelligence interests and communicates the QIC / In-Q-Tel venture to the World.

CIA – QIC Function

The CIA, working in partnership with IN-Q-TEL, created the Agency’s ( CIA ) newfound organization, QIC.

QIC goals now are to be the leading source for commercial, high-impact IT solutions for the Agency ( CIA ), and to be heralded as the single most important contributor to the Intelligence Community by the year 2001. QIC will create and use the full range of corporate processes needed to manage QIC (aka) the ” CIA-In-Q-Tel Partnership ” by delivering CIA-accepted IT solutions.

CIA Goals With QIC

Eventually, IN-Q-TEL will take on a life funded by the high-technology consumer public. QIC ( IN-Q-TEL Interface Center ) however, works comprehensively and collaboratively with Agency ( CIA ) IT specialists, customers, IN-Q-TEL experts, Agency ( CIA ) managers, the Chief Information Officer, the Chief Technology Officer, Chief Financial Officer, Agency directorates, and Executive Board to develop an annual coordinated and approved critical ‘problem set’ for IN-Q-TEL.

QIC leads Agency participation in the partnership’s solution transfer planning, including resources, technology demonstration, and prototype testing and evaluation.

At the same time, QIC works with In-Q-Tel to assure that it addresses issues regarding the transfer of IT solutions into the Agency. QIC also works with Agency customers and their managers to create an environment conducive to the implementation and acceptance of partnership solutions and follow-on initiative.

In-Q-Tel Background

On September 29, 1999 the Central Intelligence Agency (CIA) was treated to something different. In many of the nation’s leading newspapers and television news programs a story line had appeared that complimented the Agency for its creativity and openness.

The media was drawn to a small corporation in Washington, D.C. that had just unveiled its existence and the hiring of its first CEO, Gilman Louie who described the Corporation called, the “IN-Q-IT CORPORATION“, as having been formed “…to ensure that the CIA remains at the cutting edge of information technology advances and capabilities.”

With that statement the Agency ( CIA ) launched a new era in ‘how it obtains cutting-edge technologies’.

In early January 2000, the name of the corporation ( IN-Q-IT CORPORATION ) was changed to IN-Q-TEL CORPORATION.

The ‘origins of the concept’ that has become IN-Q-TEL are traceable to Dr. Ruth David, former CIA Deputy Director for Science and Technology.

She and CIA Science And Technology Deputy Director, Joanne Isham, were the first senior Agency ( CIA ) officials to understand that the information revolution required the CIA to forge ‘new partnerships’ with the ‘private sector’ and ‘design a proposal for radical change’.

The timing of the proposal was fortuitous.

CIA Director of Central Intelligence ( DCI ), Mr. George Tenet, had just launched his own Strategic Direction Initiative ( SDI ), which included technology as one of its areas for review.

The study made a direct link between Agency ( CIA ) ‘future technology investments’ and ‘improving’ its ‘information gathering’ and ‘analysis capabilities’.

By the summer of 1998, the Agency ( CIA ) had assembled a few senior Agency ( CIA ) ‘staff employees with an entrepreneurial bent’ and ‘empowered them’ to take the Dr. Ruth David original concept and flesh it out.

Aided by a ‘consulting group’ and a ‘law firm’, they ( CIA ) devoted the next 4-months to making the rounds in Silicon Valley ( California ) – and elsewhere – putting the concept through the wringer. Much of the time was ‘spent listening’.

Many they met with were often critical of one aspect or another of the concept.

But, whether they were ‘venture capitalists’, Chief Executive Officers ( CEO ), Chief Technical Officers ( CTO ) or members of Congress and staffers, all eagerly immersed themselves in spirited debates that enriched the Agency ( CIA ) team and ‘drove the concept in new directions’.

By the end of 1998, the Agency ( CIA ) team reached a point at which the concept seemed about right.

Though it had ‘changed considerably’ from that which had been proposed initially by Dr. Ruth David, it remained true to its core principles.

It was time to hand the ‘product’ of the Agency ( CIA ) work over to someone in the ‘private sector’ with the ‘experience’ and passion necessary ‘to start the Corporation’.

To the delight of the DCI and Agency ( CIA ) team, Norman Augustine, a former CEO of LOCKHEED-MARTIN and 4-time recipient of the Department of Defense highest civilian award, the Distinguished Service Medal, accepted the challenge.

By February 1999, the Corporation was established as a legal entity, and in March [ 1999 ] it [ IN-Q-TEL CORPORATION ] received its first [ 1st ] contract from the Agency ( CIA ). In-Q-Tel was in business, charged with ‘accessing information technology ( IT ) expertise and technology wherever it existed’ and bringing it to bear on the ‘information management’ challenges facing the Agency ( CIA ).

In-Q-Tel Creation

As an information based agency, the CIA must be at the cutting edge of information technology in order to maintain its competitive edge and provide its customers with intelligence that is both timely and relevant.

Many times the Agency and the federal government have been the catalysts for technological innovations. Examples of Agency ( CIA ) inspired breakthroughs include the LOCKHEED AEROSPACE designed U-2 ( Dragon Lady ) and SR-71 ( Black Bird ) reconnaissance aircraft and the CORONA ‘surveillance’ satellites, while the ‘parent of the Internet’ [ ARPANET ] was led forward by the Advanced Research Projects Agency ( ARPA ), today the Defense Advanced Research Projects Agency ( DARPA ).

By the 1990s, however – especially with the advent of the World Wide Web – the ‘commercial market’ was setting the pace in IT innovation.

And, as is the nature of a market-based economy, the ‘flow of capital’ and ‘talent’ has irresistibly ‘moved to the commercial sector’ where the prospect of huge profits from ‘initial public offerings‘ ( IPO ) and ‘equity-based compensation‘ has become ‘the norm’.

In contrast to the remarkable transformations taking place in Silicon Valley ( California ) and elsewhere, the Agency ( CIA ) – like many large Cold War era ‘private sector corporations’ – felt itself being ‘left behind’. It ( CIA ) was not connected to the creative forces that underpin the digital economy.

And, of equal importance, many in Silicon Valley ( California ) knew little about the Agency ( CIA ) IT ( information technology ) needs.

The opportunities and challenges posed by the information revolution to the Agency ( CIA ) core mission areas of ‘clandestine collection’ and ‘all-source analysis’ were growing daily.

Moreover, the [ CIA ] challenges are not merely from foreign countries, but also ‘transnational threats’.

Faced with these realities [ by 1997 ], the leadership of the CIA made a critical and strategic decision in early 1998.

The Agency’s leadership recognized that the CIA did not, and could not, compete for IT ( information technology ) innovation and talent with the same speed and agility as those in the ‘commercial marketplace’, whose businesses are driven by “Internet time” and ‘profit’.

The CIA mission ‘was’ intelligence collection and analysis, not IT innovation.

The leadership also understood that, in order to extend its reach and access a broad network of IT innovators, the Agency had to step outside of itself and appear not just as a buyer of IT but also as a seller.

The CIA had to offer Silicon Valley ( County of Santa Clara, California ) something of value, a business model that the Valley [ Silicon Valley ] understood; a model that ‘provides’ – for those who joined hands ( became partner affiliates ) with IN-Q-TEL – the ‘opportunity to commercialize’ their ‘innovations’. In addition, IN-Q-TEL ‘partner companies’ would also ‘gain another valuable asset’, access to very difficult CIA ‘problem sets’ that could become ‘market drivers’.

Once the Agency ( CIA ) leadership crossed these critical decision points, the path leading to IN-Q-TEL formation was clear.

In-Q-Tel – Close-Up

In-Q-Tel founder, Norm Augustine, established it as an independent non-profit corporation.

Its Board of Trustees, which now has 10 members, functions as any other board, initially guiding and overseeing the Corporation’s startup activities and setting its strategic direction and policies.

The CEO, who was ‘recruited’ by the Board [ Board of Trustees for IN-Q-TEL CORPORATION ], reports to them [ Board of Trustees for IN-Q-TEL CORPORATION ], and ‘manages’ IN-Q-TEL.

The Corporation [ IN-Q-TEL ] has offices in two ( 2 ) locations:

1. Washington, D.C.; and,

2. Menlo Park, California [ Silicon Valley ].

It [ IN-Q-TEL ] employs a ‘small professional staff’ and a ‘smaller group’ of ‘business consultants’ and ‘technology consultants’.

In-Q-Tel’s mission is to foster the development of new and ‘emerging information technologies’ and pursue ‘research and development’ ( R&D ) that produces solutions to some of the most difficult IT [ information technology ] problems facing the CIA.

To accomplish this, the Corporation [ IN-Q-TEL ] will network extensively with those in ‘industry’, the ‘venture capital’ community, academia, and any ‘others’ who are at the ‘forefront of IT [ information technology ] innovation’.

Through the business relationships that it establishes, In-Q-Tel will create environments for collaboration, product demonstration, prototyping, and evaluation.

From these activities will flow the IT solutions that the Agency ( CIA ) seeks and, ‘most importantly’, the ‘commercial opportunities’ for ‘product development’ by its ‘partners’.

To fulfill its mission, In-Q-Tel has designed itself to be:

– Agile, to respond rapidly to Agency needs and commercial imperatives;

– Problem driven, to link its work to Agency program managers;

– Solutions focused, to improve the Agency’s capabilities;

– Team oriented, to bring diverse participation and synergy to projects;

– Technology aware, to identify, leverage, and integrate existing products and solutions;

– Output measured, to produce quantifiable results;

– Innovative, to reach beyond the existing state-of-the-art in IT; and,

– Over time, self-sustaining, to reduce its reliance on CIA funding.

At its core, In-Q-Tel is designed to operate in the market place on an equal footing with its commercial peers and with the speed and agility that the IT world demands.

As an example, it [ IN-Q-TEL ] can ‘effect the full range of business transactions‘ common to the industry – it is ‘venture [ venture capital ] enabled’, can ‘establish joint ventures‘, ‘fund grants [ grant funding ]‘, sponsor open competitions, ‘award sole source contracts‘, etc. And, ‘because of the many degrees of freedom granted to it‘ [ IN-Q-TEL ] by the Agency ( CIA ), IN-Q-TEL ‘does not require Agency ( CIA ) approval for business deals it negotiates‘.

As such, In-Q-Tel represents a different approach to government R&D.

It [ IN-Q-TEL ] ‘moves away from the more traditional government project’ office model in which the program is managed by the government.

Instead, the Agency ( CIA ) has invested much of the decision-making in the Corporation [ IN-Q-TEL ].

Hence, In-Q-Tel will be judged on the outcomes produced, i.e. the solutions generated, and not by the many decisions it makes along the way.

In-Q-Tel – IT Space

As with many aspects of the In-Q-Tel venture, the Agency took a different approach in presenting its IT needs to the Corporation. It bounded the types of work that In-Q-Tel would perform – its IT operating “space” – by two ( 2 ) criteria:

In the first [ 1st ] instance, it made the decision that In-Q-Tel would initially conduct only unclassified IT work for the Agency ( CIA ).

Second [ 2nd ], to attract the interests of the private sector, it recognized that IN-Q-TEL would ‘principally invest in areas’ where there was both an Agency ( CIA ) need and ‘private sector interest’.

Whereas in the past, much of the commercial computing world did not focus on those technologies useful to the CIA, the intersection zone between intelligence and private sector IT needs has grown tremendously in recent years.

Many of the underlying technologies that are driving the information revolution are now directly applicable to the intelligence business. Examples of commercial applications that also support intelligence functions are:

1. Data warehousing and data mining;

2. Knowledge management;

3. Profiling search agents [ Search Engines and User search requests ];

4. Geographic information systems [ Satellite Communication Information Systems ];

5. Imagery analysis and pattern recognition;

6. Statistical data analysis tools;

7. Language translation;

8. Targeted information systems;

9. Mobile computing; and,

10. Secure computing.

Information Security or INFOSEC, a critical enabling technology for all intelligence information systems, is now a mainstream area of research and innovation in the commercial world, due in no small part to the exponential growth in Internet e-commerce.

Thus, there are a number of commercially available security technologies:

1. Strong encryption;

2. Secure community of interests;

3. Authentication and access control;

4. Auditing and reporting;

5. Data integrity;

6. Digital signatures;

7. Centralized security administration;

8. Remote users or traveling users; and,

9. Unitary log-in.
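To make two ( 2 ) of those categories concrete – ‘strong encryption’ and ‘digital signatures’ – here is a minimal, purely illustrative Python sketch; it assumes the open-source ‘cryptography’ package and has no connection to any In-Q-Tel or CIA system:

# Illustrative only: "strong encryption" and "digital signatures" with the
# open-source Python "cryptography" package; none of this is Agency code.
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Strong (symmetric, authenticated) encryption.
key = Fernet.generate_key()                      # secret key shared by both parties
token = Fernet(key).encrypt(b"field report")     # ciphertext, tamper-evident
assert Fernet(key).decrypt(token) == b"field report"

# Digital signature: prove the origin and integrity of a document.
signer = Ed25519PrivateKey.generate()
signature = signer.sign(b"field report")
signer.public_key().verify(signature, b"field report")  # raises if forged

The same pair of primitives – a shared-key cipher for confidentiality and a public-key signature for authenticity – underlies most of the commercial INFOSEC products in the list above.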

It is, no doubt, the case that the commercial investments flowing into information security outpace the spending made by the Intelligence Community.

Thus, In-Q-Tel will be poised to ‘leverage the investments of others to the benefit of the Agency ( CIA )‘.

Having bounded In-Q-Tel’s IT space with these two ( 2 ) criteria – ‘unclassified work’ with ‘commercial potential’ – the Agency defined a set of strategic problem areas for the Corporation.

These four ( 4 ) areas have the added and obvious benefit of spanning the needs of all the Agency’s directorates and, hence, its core business functions of collection and analysis:

1. Information Security: Hardening, intrusion detection, monitoring and profiling of information use and misuse, and network and data protection ( a minimal profiling sketch follows this list );

2. Use of the Internet: Secure receipt of information, non-observable surfing, authentication, content verification, and hacker resistance;

3. Distributed Architectures: Methods to interface with custom / legacy systems, mechanisms to allow dissimilar applications to interact, automatic handling of archived data, and connectivity across a wide range of environments; and,

4. Knowledge Generation: Geospatial data fusion, multimedia data fusion or integration, and computer forensics.
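As a purely illustrative example of the ‘monitoring and profiling’ idea in area 1 ( above ), the following minimal Python sketch flags accounts whose failed-login counts jump well above an assumed per-user baseline; the log format, user names and thresholds are all invented for this sketch:

# Hypothetical sketch of "monitoring and profiling of information use and
# misuse": flag accounts whose failed-login count far exceeds their own
# historical baseline. All names, rates and thresholds are invented.
from collections import defaultdict

BASELINE = {"analyst7": 1.0, "clerk2": 0.5}   # assumed avg failed logins/day

def flag_anomalies(events, factor=5.0):
    """events: iterable of (user, outcome) tuples from an auth log."""
    failures = defaultdict(int)
    for user, outcome in events:
        if outcome == "FAIL":
            failures[user] += 1
    return [u for u, n in failures.items()
            if n > factor * BASELINE.get(u, 1.0)]

log = [("analyst7", "FAIL")] * 12 + [("clerk2", "OK")]
print(flag_anomalies(log))   # -> ['analyst7']

Real intrusion-detection products refine the same idea with richer event schemas and statistical baselines, but the profiling loop – count, compare against a norm, escalate outliers – is the core of it.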

Information Technology ( IT ) In-Q-Tel – CIA Occupancy

In-Q-Tel will no doubt raise questions with some who will believe that it or the Agency ( CIA ) has other motives.

It is, therefore, important to highlight ‘what In-Q-Tel is not’ and what it [ IN-Q-TEL ] will not do.

First, it is not a front company for the Agency ( CIA ) to conduct any activities other than those spelled out in its Articles of Incorporation and its Charter Agreement.

As a non-profit – 501(c)(3) – corporation, it will operate in full compliance with Internal Revenue Service ( IRS ) regulations and, as with all similar non-profits, its IRS filing will be a matter of public record.

In-Q-Tel is ‘openly affiliated with the Agency ( CIA )’, as was made obvious to the world during its press rollout on September 29, 1999.

Of equal importance, it will not initiate work in areas that lead to solutions that are put into so-called “black boxes” – that is, innovations that the government subsequently classifies. To do so would undercut In-Q-Tel credibility with its business partners to the detriment of the Agency.

Finally, IN-Q-TEL is ‘a solutions company‘, ‘not a product company‘.

Working through its business partners, it will demonstrate solutions to Agency problems but will not generate products for use by Agency components.

In-Q-Tel ‘inspired products‘ will be ‘developed through separate contractual arrangements‘ involving Agency ( CIA ) ‘components‘ and ‘other vendors‘.

In-Q-Tel – Structure & Staffing

Central to the In-Q-Tel business model are speed, agility, market positioning, and leveraging.

These attributes, taken together, have helped shape the evolving structure of the Corporation. It is one that intends to emphasize the “virtual” nature of the Corporation while minimizing “brick and mortar” costs; i.e., it will operate by facilitating data sharing and decision-making via seamless communications – using a private network with broadband connectivity to the Agency and its partners – while limiting direct infrastructure investments in laboratories and related facilities by leveraging the facilities of others.

To facilitate this intent, the In-Q-Tel Board and CEO decided to hire a small staff composed of people with strong technical and business skills.

At present, the Corporation has about ten ( 10 ) staff employees and, it is expected that, by the end of the year 2000, the total will number about thirty ( 30 ).

The CEO is currently designing In-Q-Tel management structure, but the parameters he has set for it indicate that it will be very flat and aligned for rapid decision-making.

How In-Q-Tel Works

One of the great leaps of faith the Agency took in this venture was to recognize, early on, that private sector businessmen were better equipped than it was to design the Corporation and create its work program.

The Agency’s critical role was to develop the initial concept, help form the best Board possible, give IN-Q-TEL a challenging problem set, and then design a ‘contractual vehicle‘ that gave the Corporation [ IN-Q-TEL ] the ‘necessary degrees of freedom to design itself and operate in the market place‘.

All of this was accomplished in less than 1-year, to include the design of In-Q-Tel’s initial work program. In-Q-Tel’s current work program is built on a process of discrete, yet overlapping, elements – IT roadmapping, IT baselining, and R&D projects.

The underlying philosophy now driving the In-Q-Tel program is to gain an understanding of the many players occupying In-Q-Tel’s IT space – by roadmap analysis – and, concurrently, test and validate the performance and utility of existing products and technologies – by baseline testing – against current Agency needs.

If the test results are successful, the Agency ( CIA ) has the ‘option’ of quickly ‘purchasing’ the ‘products’ directly ‘from the vendor’.

However, in those ‘cases where there are no existing products or technologies‘, or where a gap exists between the baseline test results and the Agency ( CIA ) needs, IN-Q-TEL will launch R&D projects.

In this way, the Agency ( CIA ) obtains near-term solutions through the evaluation of those products considered “best-in-class” and can target its R&D projects more precisely – that is, to where ‘commercial‘ or ‘other government [ contract ] IT investments [ $ ] are small‘ or nonexistent.
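Restated as toy pseudo-logic ( a paraphrase for clarity, not anything drawn from Agency documents ), the decision rule reads:

# Toy restatement (names invented) of the buy-versus-R&D rule above.
def next_step(product_exists: bool, baseline_test_passed: bool) -> str:
    if product_exists and baseline_test_passed:
        return "purchase the product directly from the vendor"
    return "launch an In-Q-Tel R&D project"   # no product, or a capability gap

print(next_step(True, True))    # -> purchase the product directly from the vendor
print(next_step(False, False))  # -> launch an In-Q-Tel R&D project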

With its first [ 1st ] year budget of about $28,000,000 ( USD ), In-Q-Tel has focused its initial efforts on the IT roadmap and baseline elements of the program.

The roadmap project seeks, first, to ‘identify those in industry, government, and academia who occupy the same IT space as In-Q-Tel’ and, secondarily, to ‘spot existing technologies of potential interest’.

The results will also help In-Q-Tel leverage the technical advances made by others, assess the overall direction and pace of research, avoid duplicating work done by other government entities, and highlight [ identify and target ] potential business partners. The roadmap will be updated and refined by In-Q-Tel throughout the life of its work program.

Two ( 2 ) Team Incubators & Twenty ( 20 ) Hi-Tech Firms [ Businesses ]

These twenty ( 20 ) firms are executing the baseline-testing element of the In-Q-Tel work program. They were selected by an independent review panel of national IT experts convened by In-Q-Tel to evaluate multiple proposals.

Each of the two ( 2 ) teams is working on one ( 1 ) or more ‘incubator concepts’ derived by In-Q-Tel from the Agency ( CIA ) ‘problem set‘ enumerated above. The incubator teams will operate for over one ( 1 ) year. As the In-Q-Tel work program grows, it is possible that other baseline incubator teams will be established.

The R&D part of the program, which In-Q-Tel manages, will soon become the core of its activities, with a growing percentage of its funds directed towards a portfolio of research projects. In-Q-Tel is formulating its research thrusts based on the information and test results gathered under the roadmap and baseline work, aided by extensive interactions with the private sector and the Agency.

The design of the research projects will be set by In-Q-Tel and will vary to meet the mutual interests of the Agency ( CIA ), In-Q-Tel, and its prospective business partners.

As mentioned earlier, In-Q-Tel will ‘draw from a broad range of R&D competition‘ models to attract the business partners it seeks.

In some cases, it may assemble teams of companies, each of which has a necessary part, but not the whole, of the solution In-Q-Tel seeks.

In ‘other projects’ IN-Q-TEL might be a co-investor in a fledgling company with another business partner such as a venture capital firm.

Or, it could take a more traditional route, using a request for proposal.

In essence, In-Q-Tel will use whatever model most efficiently and effectively meets the needs of all parties to a transaction, with a constant eye towards leveraging its resources and solving the Agency’s IT needs.

Common to most or all of the R&D agreements that In-Q-Tel intends to use will be the subject of intellectual property ( IP ) – more precisely, the ownership of IP and the allocation of IP generated revenues.

In the area of IT R&D, a deal is typically not struck until all of the parties’ IP rights are clearly established.

In-Q-Tel’s acceptance within the IT market place depends heavily on its ability to negotiate industry standard IP terms.

Recognizing this, the Agency ( CIA ) agreement with In-Q-Tel allows it and/or its partners to retain title to the innovations created and freely negotiate the allocation of IP derived revenues.

The only major stipulation is that the Agency ( CIA ) retain traditional “government purpose rights” to the ‘innovations‘.

Contract Model – In-Q-Tel

Before the partnership between In-Q-Tel and the Agency became a reality, the Agency ( CIA ) had to develop a new contract vehicle that granted the Corporation [ IN-Q-TEL ] the degrees of freedom it needed to operate in the market place.

Most Agency ( CIA ) contracts, including those in R&D, are based on the Federal Acquisition Regulations ( FAR ); however, the FAR is often viewed by industry as overly burdensome and inflexible. And, it has been the U.S. Department of Defense ( DoD ) experience that smaller companies often will not contract with the government because of the extra costs they would incur to be FAR compliant.

Because the Agency ( CIA ) wanted to encourage such companies to work with In-Q-Tel, it took a different approach and designed a non-FAR agreement with the IN-Q-TEL CORPORATION.

It [ CIA ] also adopted elements from the old internet Godfather, i.e. the Advanced Research Projects Agency ( ARPA ), and its model based on “Other Transactions ( OT )” authority granted to the DoD [ U.S. Department of Defense ] by Congress [ U.S. Congress ].

OT [ Other Transaction ] agreements ‘permit authorized government agencies’ [ e.g. CIA ] to design R&D agreements outside the FAR.

The hoped for result is to spur greater flexibility and innovation for the government. In addition, it permits well-managed businesses, large and small, to perform R&D for the government, using their existing business practices and procedures.

Using an ARPA model OT agreement as a guide, the Agency ( CIA ) designed a 5-year Charter Agreement that describes the broad framework for its relationship with IN-Q-TEL, sets forth general policies, and establishes the terms and conditions that will apply to future contracts. In addition, a short-term funding contract was negotiated that includes In-Q-Tel’s “Description of Work”.

Together these documents define the metes and bounds of the Agency ( CIA ) relationship with In-Q-Tel and permit IN-Q-TEL to negotiate agreements with its partners, absent [ without ] most government flow down requirements.

In-Q-Tel – Advancements

The In-Q-Tel venture is one that has challenged the Agency to think creatively and quickly to address the fundamental changes that the information revolution is having on its core business.

It responded by setting aside traditional policies and practices in many areas and established a new partnership with industry and academia, based on shared interest and mutual benefit.

But, one cannot ignore that this venture involves risk, both for the Agency and In-Q-Tel. From the Agency’s perspective there are three ( 3 ) major areas that will require constant attention:

1. Managing its relationship with IN-Q-TEL;

2. Solution transfer; and,

3. Security.

Perhaps the most important of the three is the first, managing the relationship without stifling In-Q-Tel’s competitive edge.

IN-Q-TEL is a small independent corporation ‘established to improve the mission performance of a much larger government Agency‘ [ the CIA ].

The imperatives that led to In-Q-Tel have many parallels in industry. In fact, the IT sector is replete with examples of a large corporation seeking to improve its competitiveness by either purchasing a small start-up company or forming a subsidiary.

The ‘parent corporation‘ sees in ‘its offspring‘ traits that it no longer possesses – speed, agility, and expertise. But, for these traits to be realized, ‘the start-up‘ must operate unencumbered by the ‘parent corporation‘, whose natural tendency is to rein in and control it.

Similarly, the Agency ( CIA ) will have to restrain its natural inclination to micromanage IN-Q-TEL and, instead, allow the Corporation [ IN-Q-TEL ] the freedom to prosper. It must have continuous insight into In-Q-Tel’s activities, but must understand that In-Q-Tel is responsible for its own operations, including the design and management of the work program.

Acceptance by Agency ( CIA ) components of In-Q-Tel inspired solutions will be the most important measure of success in this venture. It is also likely to be the hardest.

While there is every expectation that In-Q-Tel will become commercially successful and seed innovative solutions, if they are not accepted and used by Agency line managers, then the overall venture will be judged a failure.

Although In-Q-Tel has a critical role in the solution transfer process, the burden rests with the Agency, since the challenges are as much managerial and cultural as they are technical.

The Agency ( CIA ) Chief Information Officer ( CIO ), directorate heads, and component directors will all have to work closely with IN-Q-TEL to overcome bureaucratic inertia and identify eager recipients of the innovations that the Corporation develops.

Agency ( CIA ) “product champions” for each IN-Q-TEL project should be identified early and should participate fully in its formulation, testing, and evaluation. Incentives should be considered for those Agency ( CIA ) components that commit to projects with unique risks or that require extensive personnel commitments.

These and other strategies will be employed to ensure that the return on the Agency’s investment in In-Q-Tel translates into measurable improvements in its mission performance.

The open affiliation between the CIA and In-Q-Tel is yet another unique aspect and challenge for this venture. Although the Corporation [ IN-Q-TEL ] will be doing only unclassified work for the Agency ( CIA ), the nature of its IT research and its association with a U.S. intelligence agency will undoubtedly attract the interests of foreign persons, some with questionable motives.

The obvious security ramifications of this scenario were well considered in the decision-making process that led to In-Q-Tel’s formation. It was ultimately decided that the risks are manageable and, in many ways, similar to those faced by any high-tech company trying to protect its IP and trade secrets.

IN-Q-TEL and the Agency ( CIA ) will be working closely to ensure that the Corporation [ IN-Q-TEL ] operates with a high degree of security awareness and support.

In-Q-Tel has a critical role in meeting these three ( 3 ) challenges. However, its most persistent challenge will be developing and sustaining a reputation as a business that:

1. sponsors leading edge research;

2. produces discoveries; and,

3. profitably commercializes those discoveries.

Once it has established a record of accomplishment in these areas, the high caliber IT talent the Agency hopes to reach through In-Q-Tel will be drawn to the Corporation.

In-Q-Tel Future

Those of us at the Agency who helped to create In-Q-Tel are endlessly optimistic about its prospects for success. The early indicators are all positive. Among them is the caliber of the people who stand behind and lead the Corporation and the initial reaction from industry and the trade press to its formation.

The IN-Q-TEL Board of Trustees is at least the equal of any large corporation’s board. Its members are committed to the Agency ( CIA ) mission and the new R&D model that IN-Q-TEL represents, and have invested much of their time in its formation.

The Agency and the nation are in their debt.

The Board [ IN-Q-TEL Board of Trustees ] also recruited an outstanding CEO who brings with him the ‘experiences’ and ‘contacts’ of his Silicon Valley [ California ] base and an established reputation for starting and growing new IT companies.

The favorable press coverage of In-Q-Tel combined with the industry “buzz” engendered by the Board and CEO have brought a flood of inquiries by those interested in doing business with the Corporation. And, most importantly, its work program is already beginning to achieve results that the Agency ( CIA ) can use and that its ( CIA ) partners can commercialize.

Judging by the record to date, the road ahead appears promising. But, In-Q-Tel’s fate also rests in part on those institutions charged with oversight of the Agency and its budget.

Congress has supported the Agency as it launched this new venture. The U.S. Congress “seeded the venture with start-up funding” when it was still in its conceptual phase, but asked hard questions of the Agency throughout the design and formation of In-Q-Tel.

Members understood that starting an enterprise such as IN-Q-TEL is ‘not risk free‘. As with all R&D efforts in government and industry, there will be some home run successes but also some failures. That is the price the Agency must be prepared to pay if it wants to stay on the leading edge of the IT revolution.

With In-Q-Tel’s help plus the continued support of Congress [ U.S. Congress ] and Office of Management and Budget ( OMB ), as well as from the traditional Agency ( CIA ) ‘contractor community‘ and ‘others‘, an “e-CIA” of the next century [ 21st Century ] will evolve quickly, to the benefit of the President and the national security community.

Notes Of Interest

For the next one ( 1 ) or two ( 2 ) years [ 1999 – 2000 ], IN-Q-TEL will accept work ‘only from the CIA‘.

All solutions that it provides to the CIA will be made available to the entire Intelligence Community.

The relationship is codified in a 5-year Charter Agreement with the CIA and a 1-year funding contract that is renewable annually. As stipulated in the Charter Agreement, “…the Federal Government shall have a nonexclusive, nontransferable, irrevocable, paid-up license to practice or have practiced for or on behalf of the United States the subject invention throughout the world for Government purposes”.

The Agency ( CIA ) component that has day-to-day responsibility for guiding the CIA relationship with IN-Q-TEL, including the ‘design and implementation of the contract’ and the ‘problem set’, is the IN-Q-TEL INTERFACE CENTER ( QIC ) which resides inside the CIA Directorate of Science and Technology.

Circa: 2002 – 2008

IN-Q-TEL INCORPORATED (aka) IN-Q-IT CORPORATION 2500 Sand Hill Road – Suite 113 Menlo Park, California 94025 – 7061 USA TEL: +1 (650) 234-8999 TEL: +1 (650) 234-8983 FAX: +1 (650) 234-8997 WWW: http://www.inqtel.com WWW: http://www.in-q-tel.org WWW: http://www.in-q-tel.com

IN-Q-TEL INCORPORATED (aka) IN-Q-IT CORPORATION P.O. Box 12407 Arlington, Virginia 22219 USA TEL: +1 (703) 248-3000 FAX: +1 (703) 248-3001

– –

IN-Q-TEL focus areas surround:

– Physical Technologies;

– Biological Technologies;

– Security; and,

– Software Infrastructure.

– –

IN-Q-TEL

Investments –

Strategic Investments, Targeted Returns

In-Q-Tel is ‘building’ a ‘portfolio of companies’ that are ‘developing innovative solutions’ in ‘key technology areas’.

Similar to many ‘corporate strategic venture’ firms, In-Q-Tel seeks to ‘optimize potential returns’ for our clients – the CIA and the broader Intelligence Community – by investing in companies of strategic interest.

In-Q-Tel engages ‘start-ups’, ‘emerging’ and ‘established’ companies, universities and research labs.

In-Q-Tel structures attractive win-win relationships through ‘equity investments’, as well as ‘strategic product development funding’, ‘innovative intellectual property arrangements’ and ‘government business development guidance’.

An Enterprising Partner –

In-Q-Tel ‘portfolio companies’ value a ‘strategic relationship’ with a ‘proactive partner’.

Companies that work through the In-Q-Tel due diligence process know their technologies have the potential to address the needs of one of the most discriminating enterprise customers in the world.

In-Q-Tel takes a hands-on approach, working closely with our ‘portfolio companies’ to help ‘drive their success’ in the ‘marketplace’ and to ‘mature [ ‘grow’ ] their technologies’.

In-Q-Tel ‘investment goals’ are focused on ‘return’ on technology – a ‘blend of factors’ that will ‘deliver strategic impact’ on the Agency [ CIA ] mission:

– Effective ‘deployments’ of innovative technologies to the CIA;

– Commercially successful ‘companies that can continue’ to ‘deliver’ and ‘support’ innovative technologies; and,

– Financial ‘returns to fund further technology investments’ to ‘benefit the Intelligence Community’.

Investing In Our National Security –

In just a few short years, In-Q-Tel has ‘evaluated’ nearly two thousand [ 2,000 ] ‘proposals’:

75% [ 1,500 ] of which have come from companies that had never previously considered working with the government.

To date, In-Q-Tel has ‘established strategic relationships’ with more than twenty ( 20 ) of these ‘companies’.

Read more about our ‘portfolio companies’ and ‘technology partners’, or learn how to submit a business plan to In-Q-Tel.

Areas Of Focus –

IN-Q-TEL focuses on next generation technologies for gathering, analyzing, managing and disseminating data. Learn more about our areas of focus:

Knowledge Management: [ http://web.archive.org/web/20020630223724/http://www.inqtel.com/tech/km.html ];

Security and Privacy: [ http://web.archive.org/web/20020630223724/http://www.inqtel.com/tech/sp.html ];

Search and Discovery: [ http://web.archive.org/web/20020630223724/http://www.inqtel.com/tech/sd.html ];

Distributed Data Collection: [ http://web.archive.org/web/20020630223724/http://www.inqtel.com/tech/dd.html ]; and,

Geospatial Information Services: [ http://web.archive.org/web/20020630223724/http://www.inqtel.com/tech/gi.html ].

Submit A Business Plan –

“In-Q-Tel also has garnered a reputation in the tech and VC [ Venture Capital ] worlds for being hard-nosed during due diligence. Unlike some venture firms, In-Q-Tel is staffed with hard-core techies who know how to put a program through the ringer. They’ve also got one of the roughest testing domains: the computer systems of the CIA.” – Washington Business Journal ( November 19, 2001 )

– View our criteria [ http://web.archive.org/web/20020630223724/http://www.inqtel.com/submit/index.html ] for submission, and apply for consideration online.

Media Resources –

– Investment Portfolio: [ http://web.archive.org/web/20020630223724/http://www.inqtel.com/news/attachments/InQTelInvestmentPortfolio.pdf ].

Reference

http://web.archive.org/web/20020630223724/www.inqtel.com/invest/index.html

– – – –

Circa: 2002

IN-Q-TEL

Investments –

Technology Partners ( 2002 ) –

INKTOMI [ http://www.inktomi.com ] ( Leading Edge Search and Retrieval Technology )

INKTOMI, based in Foster City, California ( USA ), has offices elsewhere in North America, Asia and Europe.

INKTOMI division business, involves:

Network Products – comprised of industry leading solutions for network caching, content distribution, media broadcasting, and wireless technologies; and,

Search Solutions – comprised of general Web search and related services, and ‘enterprise’ search.

Inktomi ‘develops’ and ‘markets’ network infrastructure software essential for ‘service providers’ and ‘global enterprises’.

Inktomi ‘customer’ and ‘strategic partner’ base of leading companies, include:

MERRILL LYNCH; INTEL; AT&T; MICROSOFT; SUN MICROSYSTEMS; HEWLETT-PACKARD; COMPAQ; DELL; NOKIA; AMERICA ONLINE ( AOL ); and, YAHOO.

SCIENCE APPLICATIONS INTERNATIONAL CORPORATION ( SAIC ) Lead System Integrator ( LSI ) – SAIC LSI [ http://www.saic.com/contractcenter/ites-2s/clients.html  ]

SCIENCE APPLICATIONS INTERNATIONAL CORPORATION ( SAIC ), founded in 1969 by Dr. J. R. Beyster – who remained with SAIC for 30-years, until at least November 3, 2003 – has had as part of its management, and on its Board of Directors, many well known former U.S. government personnel, including:

– Melvin Laird, Secretary of Defense in the Richard Milhous Nixon Presidential Administration;

– William Perry, Secretary of Defense in the William Jefferson Clinton Presidential Administration;

– John M. Deutch, U.S. Central Intelligence Agency ( CIA ) Director of Central Intelligence ( DCI ) in the William Jefferson Clinton Presidential Administration;

– U.S. Navy Admiral Bobby Ray Inman, U.S. National Security Agency ( NSA ) and U.S. Central Intelligence Agency ( CIA ) – various employed capacities in ‘both’ Agencies – in the Gerald Ford Presidential Administration, Jimmy Carter Presidential Administration and Ronald Reagan Presidential Administration;

– David Kay, who led the search for Weapons of Mass Destruction ( WMD ) for the United Nations ( UN ) following the 1991 U.S. Persian Gulf War, and for the George W. Bush Presidential Administration following the 2003 U.S. invasion of Iraq.

In 2009, SCIENCE APPLICATIONS INTERNATIONAL CORPORATION ( SAIC ) moved corporate headquarters to Tysons Corner at 1710 SAIC Drive, McLean, Virginia ( USA ).

SCIENCE APPLICATIONS INTERNATIONAL CORPORATION ( SAIC ) is a scientific, engineering and technology ‘applications company’ with numerous ‘state government clients’, ‘federal government clients’, and ‘private sector clients’.

SCIENCE APPLICATIONS INTERNATIONAL CORPORATION ( SAIC ) works extensively, with:

U.S. Department of Defense ( DOD ); U.S. Department of Homeland Security ( DHS ); U.S. National Security Agency ( NSA ); U.S. intelligence community ( others ); U.S. government civil agencies; and, Selected commercial markets.

SCIENCE APPLICATIONS INTERNATIONAL CORPORATION ( SAIC ) Subsidiaries –

SAIC VENTURE CAPITAL CORPORATION; SCICOM TECHNOLOGIES NOIDA ( INDIA ); BD SYSTEMS ( BDS ); BECHTEL SAIC COMPANY LLC; BECK DISASTER RECOVERY ( BDR ); R.W. BECK; BENHAM; CLOUDSHIELD; DANET; EAGAN MCALLISTER ASSOCIATES INC.; HICKS & ASSOCIATES; MEDPROTECT LLC; REVEAL; SAIC-FREDERICK INC. [ NATIONAL CANCER INSTITUTE ( NCI ) ]; SAIC INTERNATIONAL SUBSIDIARIES; SAIC LIMITED ( UK ); CALANAIS ( SCOTLAND ); VAREC; APPLIED MARINE TECHNOLOGY CORPORATION; EAI CORPORATION; and, Others.

In 1991, SCIENCE APPLICATIONS INTERNATIONAL CORPORATION ( SAIC ) received transference of the U.S. Department of Defense ( DOD ), U.S. Army ( USA ), Defense Intelligence Agency ( DIA ) ‘Remote Viewing Program’, renamed the STARGATE Project.

In January 1999, SCIENCE APPLICATIONS INTERNATIONAL CORPORATION ( SAIC ) consultant Steven Hatfill saw SAIC vice president Joseph Soukup internally commission ( with no outside client ) William C. Patrick – a retired leading figure in the legacy U.S. bioweapons program – to produce a report ( 28 pages, February 1999 ) on terrorist anthrax attack possibilities via United States postal mailings, prior to the 2001 anthrax attacks in the United States.

In March 2001, the U.S. National Security Agency ( NSA ) had SCIENCE APPLICATIONS INTERNATIONAL CORPORATION ( SAIC ) in ‘concept definition’ phase for what later became known as the NSA TRAILBLAZER Project, a “Digital Network Intelligence” system intended to ‘analyze data’ carried across computer ‘networks’.

In 2002, the U.S. National Security Agency ( NSA ) chose SCIENCE APPLICATIONS INTERNATIONAL CORPORATION ( SAIC ) to produce a ‘technology demonstration platform’ for the NSA TRAILBLAZER Project, a contract worth $280,000,000 ( USD ).

TRAILBLAZER Project participants, included:

BOEING; COMPUTER SCIENCES CORPORATION ( CSC ); and, BOOZ ALLEN HAMILTON.

In 2005, TRAILBLAZER – believed by some ( http://www.PhysOrg.Com, et al. ) to be a continuation of the earlier THINTHREAD data mining program – saw U.S. National Security Agency ( NSA ) Director Michael Hayden inform a U.S. Senate hearing that the TRAILBLAZER program was several hundred million dollars over budget and trailing years behind schedule while waiting for approvals.

From 2001 through 2005, SCIENCE APPLICATIONS INTERNATIONAL CORPORATION ( SAIC ) was primary contractor for the $600,000,000 ( USD ) TRILOGY Program, a three ( 3 ) part program intended to replace obsolete FBI computers with a then-new state-of-the-art ‘secure high-speed computer network system’ installing 500 computer network servers, 1,600 scanners and thousands of desktop computers in FBI field offices. In December 2003 the program delivered to the U.S. Department of Justice ( DOJ ) Federal Bureau of Investigation ( FBI ) its SAIC “Virtual Case File” ( VCF ), a $170,000,000 ( USD ) software system designed to speed the tracking of terrorists and to improve communications amongst agents fighting criminals with this FBI ‘critical case management system’. However, nineteen ( 19 ) different government managers were involved in 36 contract modifications – averaging 1.3 FBI changes every day, totaling 399 changes during 15-months – after which the FBI continued arguing changes ( through its own intermediary, AEROSPACE CORPORATION ) until the U.S. Department of Justice ( DOJ ) Inspector General ( IG ) criticized the ‘FBI handling’ of the SAIC software, whereupon in February 2005 SAIC ‘recommended’ the FBI at least ‘begin using’ the SAIC TRILOGY VCF ‘case management system’.

On September 27, 2006, during a special meeting of SCIENCE APPLICATIONS INTERNATIONAL CORPORATION ( SAIC ) stockholders, employee-owners voted by a margin of 86% to proceed with the initial public offering ( IPO ), upon completion of which SCIENCE APPLICATIONS INTERNATIONAL CORPORATION ( SAIC ) also paid – to existing stockholders – a ‘special dividend’ of $1,600,000,000 to $2,400,000,000 ( USD ).

On October 17, 2006 SCIENCE APPLICATIONS INTERNATIONAL CORPORATION ( SAIC ) conducted an initial public offering ( IPO ) of common stock offering 86,250,000 shares priced at $15.00 per share. Underwriters – BEAR STEARNS and MORGAN STANLEY – exercised over-allotment options resulting in an additional 11,025,000 shares, seeing the IPO raise $1,245,000,000 ( USD ).

SCIENCE APPLICATIONS INTERNATIONAL CORPORATION ( SAIC ) had approximately 46,000 total employees, of which 16,000 were in McLean, Virginia ( USA ) and another 5,000 were in San Diego, California ( USA ).

SRA INTERNATIONAL INC. ( SRA ) [ http://www.sra.com/about-us/profile.php ]

SRA INTERNATIONAL INC., founded in 1978, is headquartered in Fairfax, Virginia and has additional U.S. offices.

SRA INTERNATIONAL INC. is a leading provider of information technology services and solutions to clients in national security, health care and public health, and civil government markets, requiring:

– Strategic Consulting; – Systems Design, Development, and Integration; – OutSourcing; and, – Operations Management.

SRA INTERNATIONAL INC. also delivers business solutions, for:

– Text mining; – Data mining; – Disaster and Contingency Response Planning; – Information Assurance; – Environmental Strategies – Environmental Technology; – Enterprise Systems Management; and, – Wireless Integration.

SRA INTERNATIONAL INC. ORIONMagic ®

– –

Circa: 2002

IN-Q-TEL

Investments –

Portfolio Of Companies ( 2002 ) – Partial List

ARCSIGHT [ http://www.arcsight.com ] ( Security Management Software for The Enterprise )

ArcSight, founded in May 2000, is located in the heart of Silicon Valley, California ( USA ).

ArcSight is a leading supplier of enterprise software that provides the security “air traffic control system” for large, geographically dispersed organizations. These organizations are augmenting their network infrastructure with a wide variety of security devices such as firewalls, intrusion detection and identity management systems that produce a barrage of uncoordinated alarms and alerts that overwhelm the security staff.

With its ‘centralized view’ of ‘all security activity’ combined with ‘real time analysis’ of ‘events’, by both ‘operating at the perimeter and inside’ the organization, ArcSight provides a ‘single solution’, for:

Event capture; Log aggregation; Real time correlation; Incident investigation; and, Reporting.

ArcSight ‘separates’ the ‘true threats and attacks’ from the ‘millions of false alarms and non-threatening activities’ that occur each day, focusing attention and resources on high-priority problems.
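A minimal, purely illustrative Python sketch of that kind of aggregation and correlation – invented event records, not ArcSight code – might look like this:

# Illustrative only (invented event format): collapse duplicate alarms from
# many devices and surface sources seen by more than one device class.
from collections import defaultdict

events = [
    {"src": "10.0.0.9", "device": "firewall", "alarm": "port-scan"},
    {"src": "10.0.0.9", "device": "ids",      "alarm": "port-scan"},
    {"src": "10.0.0.9", "device": "ids",      "alarm": "port-scan"},
    {"src": "10.0.4.2", "device": "firewall", "alarm": "ping"},
]

by_src = defaultdict(set)
for e in events:
    by_src[e["src"]].add(e["device"])

# A source that trips two or more independent device classes is a
# higher-confidence ("true threat") candidate; the rest stay low priority.
high_priority = [s for s, devices in by_src.items() if len(devices) >= 2]
print(high_priority)   # -> ['10.0.0.9']

Cross-device corroboration of this sort is what lets a correlation engine demote the ‘millions of false alarms’ while escalating the handful of coordinated events.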

The company has delivered enterprise, ‘security management solutions’ to leading ‘financial services’, ‘government’ and ‘manufacturing’ organizations while ‘attracting capital’ from ‘leading investors’, such as:

IN-Q-TEL; KLEINER PERKINS CAUFIELD & BYERS ( KPCB ); and, SUMITOMO CORPORATION.

ATTENSITY CORPORATION [ http://www.attensity.com ] ( Text Extraction for Threat Detection )

Attensity Corp., founded in 2000, is a privately held company with dual headquarters in Mountain View, California ( USA ) and Salt Lake City, Utah ( USA ).

Attensity Corp. provides enterprise, ‘analytic software’ and ‘services’, to:

Government agencies; and,

Fortune 500 companies.

Attensity has developed breakthrough text extraction technology that transforms information captured in free form text into structured, relational data.

Attensity enables government agencies to dramatically expand their analytical capabilities in the area of ‘threat detection’ by, powering:

Link analysis; Trending; Exception reporting; Other advanced analytics; and, Knowledge management applications.
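As a purely illustrative Python sketch ( regular expressions standing in for Attensity’s far more sophisticated computational linguistics ), ‘free form text’ can be turned into relational rows like so:

# Illustrative only: a toy version of "transforming free-form text into
# structured, relational data" using a regular expression.
import re

PATTERN = re.compile(
    r"(?P<person>[A-Z][a-z]+ [A-Z][a-z]+) traveled to (?P<place>[A-Z][a-z]+)")

def extract(text):
    """Return (subject, relation, object) rows ready for a relational table."""
    return [(m["person"], "traveled_to", m["place"])
            for m in PATTERN.finditer(text)]

print(extract("John Smith traveled to Cairo. Jane Doe traveled to Lagos."))
# -> [('John Smith', 'traveled_to', 'Cairo'), ('Jane Doe', 'traveled_to', 'Lagos')]

Once free text has been reduced to rows like these, conventional link analysis and trending tools can operate on it just as they would on any relational data.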

Attensity technology is the culmination of nearly a decade [ 10-years ] of research in computational linguistics.

Attensity Corporation customers include:

IN-Q-TEL, a strategic venture group funded by the CIA; WHIRLPOOL; and, JOHN DEERE.

ATTENSITY CORPORATION investor, is:

IN-Q-TEL

BROWSE3D [ http://www.browse3d.com ] ( Advanced Web Navigation )

BROWSE3D, founded in 2000, is located in the Dulles Technology Corridor of northern Virginia.

The company’s first Knowledge Management product, the Browse3D Browser, enables Internet users to browse Web sites using a dynamic, interactive, 3 dimensional ( 3-D ) display environment.

One year later [ 2001 ] the Browse3D Browser was recognized as the Best Internet Software of 2001 at the COMDEX Fall Technology Show ( Las Vegas, Nevada, USA ).

Browse3D launched its ‘consumer product’ in January 2002.

For the past 2-years [ since 2000 ], Browse3D has been working to re-invent the online researcher’s tool set. A researcher’s ability to ‘harvest relevant online data’ is often limited by the tools available to view that data.

Future products and technologies promise additional improvements in the way users ‘find’, ‘organize’, ‘save’ and ‘exchange’ web-based ‘content’.

BROWSE3D early-stage venture funding provided, by:

IN-Q-TEL; and, angel investors.

CANDERA INC. [ http://www.candera.com ] ( Enterprise Storage )

Candera Incorporated, founded in 2000, is a development stage stealth mode company headquartered in Milpitas, California ( USA ).

Candera Inc. is developing a new generation, purpose built, network based storage management platform that gives businesses unprecedented ‘control over’ and ‘visibility into’ their networked storage environments.

With the Candera Confluence solution, businesses can dramatically improve the utilization of their existing heterogeneous storage assets by consolidating them into a centrally managed storage pool. These can then be quickly and dynamically allocated to meet the needs of current and future network based applications, giving large enterprises a strategic advantage.

Candera is building the first [ 1st ] system, of a new generation of systems, that will enable customers to unleash the ultimate value of networked information storage.

CONVERA [ http://www.convera.com ] ( Mission Critical Enterprise Search and Categorization Software )

Convera RetrievalWare is a high-performance intelligent search system that allows broad flexibility and scalability for implementation across corporate intranets and extranets, enabling users to index and search a wide range of distributed information resources, including text files, HTML, XML, over 200 proprietary document formats, relational database tables, document management systems and groupware repositories. Convera RetrievalWare excels in distributed client environments and server environments with hundreds or thousands of users, documents, images and / or multiple media assets.

Advanced search capabilities include concept and keyword searching, pattern searching and query by example.
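For illustration only, the inverted index that underlies plain ‘keyword searching’ – the foundation on which RetrievalWare builds its concept, pattern and query-by-example searching – can be sketched in a few lines of Python ( sample documents invented ):

# Illustrative only: a miniature inverted index for AND-style keyword search.
from collections import defaultdict

docs = {1: "antimatter research at cern",
        2: "enterprise search and retrieval",
        3: "retrieval of geospatial research data"}

index = defaultdict(set)
for doc_id, text in docs.items():
    for word in text.split():
        index[word].add(doc_id)

def search(*words):
    """AND-query: return documents containing every query word."""
    sets = [index[w] for w in words]
    return sorted(set.intersection(*sets)) if sets else []

print(search("research"))               # -> [1, 3]
print(search("retrieval", "research"))  # -> [3]

Production systems add stemming, concept expansion and relevance ranking on top, but the word-to-document map remains the core data structure.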

Convera is a leading provider of enterprise mission-critical ‘search’, ‘retrieval’ and ‘categorizing’ solutions.

More than 800 customers – in 33 countries – rely on Convera search solutions to power a broad range of mission critical applications, including enterprise:

Portals; Knowledge management; Intelligence gathering; Profiling; Corporate policy compliance; Regulatory compliance; Customer service; and, More.

DECRU [ http://www.decru.com ] ( Secure Networked Storage )

Decru, founded in April 2001, is headquartered in Redwood City, California ( USA ).

Decru solves the problem of secure data storage with a robust, wire-speed encryption appliance that fits transparently into any SAN or NAS storage environment, protecting data from both internal and external threats.

Markets include essentially any organization with a need to protect proprietary or confidential information ( e.g. government, technology, financial services, health care ).

Investors, include:

IN-Q-TEL; NEA; GREYLOCK; and, BENCHMARK.

GRAVITON [ http://www.graviton.com ] ( Early Warning Detection and Notification System for Homeland Security Over Wireless Mesh Networks )

GRAVITON, founded in 1999, is located in La Jolla, California, USA.

Solomon Trujillo, former head of U.S. WEST ( baby bell telephone company ), leads GRAVITON.

GRAVITON is on the leading edge of a fledgling ( small ) industry, known as:

Machine to Machine Communications ( M2M ).

GRAVITON is developing an advanced integrated wireless sensor platform uniquely optimized for large-scale distributed sensor network applications, working with Micro Electro Mechanical Systems ( MEMS ) sensor and spread spectrum wireless technologies licensed exclusively to GRAVITON from Oak Ridge National Laboratory ( Tennessee, USA ), managed by the U.S. Department of Energy ( DOE ).

GRAVITON products and solutions integrate wireless, sensor and data management technology enabling enterprises to efficiently and transparently monitor, control, send, receive, and update system information from devices anywhere in the world.

GRAVITON is supported and funded by a number of corporate partners and investors, including:

IN-Q-TEL; GLOBAL CROSSING; ROYAL DUTCH SHELL ( oil / petroleum ); MITSUI; SIEMENS; QUALCOMM; OMRON; MOTOROLA; and, SUN MICROSYSTEMS.

GRAVITON ‘primary’ financial investors, include:

MERRILL LYNCH.

GRAVITON ‘venture capital’ firms, include:

KLEINER PERKINS CAUFIELD & BYERS ( KPCB ); and, EARLYBIRD.

INTELLISEEK [ http://www.intelliseek.com ] ( Enterprise Intelligence Solutions )

INTELLISEEK, founded in 1997, has since 1998 been changing the way organizations ‘understand’, ‘gather’ and ‘use’ enterprise ‘intelligence’.

INTELLISEEK ‘knowledge discovery tools’ [ as of: 2002 ] equip the nation’s largest enterprises with up-to-the-minute consumer and industry information and ‘competitive intelligence’.

INTELLISEEK ‘Enterprise Search Server’™ ( ESS ) search platform provides a suite of intelligent applications that automate ‘knowledge discovery’ and ‘knowledge aggregation’ from hundreds of disparate, and often hard-to-locate data sources.

INTELLISEEK ‘Knowledge Management’ and ‘Search and Discovery’ solutions solve the fundamental problem of “information overload” by identifying and searching relevant, targeted and personalized content from the internet, intranets and extranets.

INTELLISEEK clients, include:

FORD MOTOR COMPANY ( FOMOCO ); NOKIA; and, PROCTOR AND GAMBLE.

Investors include:

IN-Q-TEL; FORD VENTURES; RIVER CITIES CAPITAL; GENERAL ATLANTIC PARTNERS LLC; FLAT IRON PARTNERS; BLUE CHIP VENTURE COMPANY; NOKIA VENTURES; and, Other private investors.

METACARTA [ http://www.metacarta.com ] ( Geospatial Data Fusion )

MetaCarta, established in 1999, was launched on more than $1,000,000 in funding from the U.S. Department Of Defense ( DOD ) Defense Advanced Research Projects Agency ( DARPA ) and private investors.

MetaCarta CEO John Frank holds a doctorate from the Massachusetts Institute Of Technology ( MIT ), where during 1999 – as a Hertz Fellow in physics working on a PhD – he conceived a new way to view ‘collections of text’ geographically, an idea that later saw MetaCarta combine his interests in algorithms, information design, and scientific models of real world phenomena.

Metacarta provides a new knowledge management platform that integrates ‘text data with geography’ providing a ‘cohesive system’ for ‘problem solving’.

The METACARTA Geographic Text Search ( GTS ) appliance, the software solution, redefines how people interact with information, enabling analysts to view text reports and geographic information in one ( 1 ) logical view; through the integration of text and geography it delivers new information not obtainable from any other source.
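A minimal, purely illustrative Python sketch of the ‘text plus geography’ idea – a tiny invented gazetteer, nothing like MetaCarta’s actual GTS appliance – follows:

# Illustrative only: spot place names from a small gazetteer and attach
# coordinates, so free-text reports can be plotted on a map.
GAZETTEER = {"cairo": (30.04, 31.24), "lagos": (6.45, 3.39)}  # invented sample

def geotag(report):
    """Return (place, (lat, lon)) pairs found in a free-text report."""
    words = report.lower().replace(".", "").split()
    return [(w, GAZETTEER[w]) for w in words if w in GAZETTEER]

print(geotag("Shipment observed leaving Lagos toward Cairo."))
# -> [('lagos', (6.45, 3.39)), ('cairo', (30.04, 31.24))]

Real geographic text search must also disambiguate place names ( which ‘Cairo’? ) and rank by geographic relevance, but the gazetteer lookup is the essential bridge between text and map.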

MetaCarta CEO John Frank graduated from Yale University.
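
The core of geographic text search can be illustrated with a toy sketch: recognize gazetteer place names in free text, attach coordinates, and let an analyst filter reports by a geographic bounding box. This is a minimal Python illustration of the general technique – not MetaCarta’s actual implementation – with an invented three-entry gazetteer.

```python
# Toy geographic text search: find gazetteer place names in free text and
# let analysts filter reports by a geographic bounding box.
GAZETTEER = {  # place -> (latitude, longitude); tiny illustrative sample
    "boston":  (42.36, -71.06),
    "baghdad": (33.31, 44.37),
    "manila":  (14.60, 120.98),
}

def geotag(report: str):
    """Return (place, lat, lon) for every gazetteer name found in the text."""
    words = report.lower().replace(",", " ").split()
    return [(w, *GAZETTEER[w]) for w in words if w in GAZETTEER]

def reports_in_bbox(reports, lat_min, lat_max, lon_min, lon_max):
    """One logical view: keep only reports mentioning a place inside the box."""
    for text in reports:
        for place, lat, lon in geotag(text):
            if lat_min <= lat <= lat_max and lon_min <= lon <= lon_max:
                yield place, text
                break

reports = ["Shipment routed through Manila next week.",
           "Meeting in Boston about the budget."]
# Bounding box roughly covering Southeast Asia:
print(list(reports_in_bbox(reports, 0, 25, 100, 130)))   # -> the Manila report
```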

MOHOMINE [ http://www.mohomine.com ] ( Transforming Unstructured Multi-Language Data Into Actionable Information )

MOHOMINE, founded in 1999, is privately-held and located in San Diego, California, USA.

MOHOMINE technology has been deployed by United States national security organizations.

MOHOMINE mohoClassifier for National Security Organizations ™ reviews ‘text information’ in ‘cables’, ‘e-mails’, ‘system files’, ‘intranets’, ‘extranets’ and ‘internet’ providing ‘automated document classification’, ‘routing’ – based upon ‘learn-by-example pattern recognition’ technology – and ‘reports’ on user defined properties such as ‘topic’, ‘subject’, ‘tone’ ( ‘urgent’, plus others ), ‘author’, ‘source’ ( geographic locations, ‘country’, etc. ), and more.

MOHOMINE mohoClassifier users can easily set up ‘filters’ to automatically ‘identify’ and ‘prioritize’ ( ‘read first’ requirement ) documents, which are processed out of large volumes of other data and routed so prioritized information quickly reaches the proper people.
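
‘Learn-by-example’ document classification of this kind is commonly implemented with statistical text classifiers. The sketch below uses a minimal naive-Bayes classifier plus a routing rule to show the general technique; it is not MOHOMINE’s proprietary algorithm, and the labels and training examples are invented.

```python
# Minimal learn-by-example text classifier (naive Bayes with add-one
# smoothing) plus a routing rule -- the general technique, not MOHOMINE's
# proprietary implementation.
import math
from collections import Counter, defaultdict

class DocumentRouter:
    def __init__(self):
        self.word_counts = defaultdict(Counter)  # label -> word counts
        self.label_counts = Counter()            # label -> training doc count
        self.vocab = set()

    def train(self, text: str, label: str):
        words = text.lower().split()
        self.word_counts[label].update(words)
        self.label_counts[label] += 1
        self.vocab.update(words)

    def classify(self, text: str) -> str:
        total = sum(self.label_counts.values())
        best_label, best_score = None, float("-inf")
        for label, n_docs in self.label_counts.items():
            counts = self.word_counts[label]
            n_words = sum(counts.values())
            score = math.log(n_docs / total)          # class prior
            for w in text.lower().split():            # smoothed likelihoods
                score += math.log((counts[w] + 1) / (n_words + len(self.vocab)))
            if score > best_score:
                best_label, best_score = label, score
        return best_label

router = DocumentRouter()
router.train("attack planned urgent threat", "read-first")
router.train("quarterly budget meeting notes", "routine")
print(router.classify("urgent threat reported"))
# -> 'read-first': route to the proper people immediately
```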

MOHOMINE currently [ as of 2002 ] has more than one hundred fifty ( 150 ) Global 5000 customers across numerous vertical industries, including:

CITICORP; WELLS FARGO; INTEL; TEXAS INSTRUMENTS; PFIZER; BOEING; ORACLE; PEOPLESOFT; and, NIKE.

MOHOMINE investors, include:

IN-Q-TEL; HAMILTON APEX TECHNOLOGY VENTURES; and, WINDWARD VENTURES.

QYNERGY CORPORATION [ http://www.qynergy.com ] ( Long-Lasting Power Solutions For Multiple Applications And Small-Tech )

QYNERGY CORP., founded in 2001, is located in Albuquerque, New Mexico ( USA ).

QYNERGY technology originated at the U.S. National Laboratory at Sandia ( also known as ) Sandia National Laboratories ( New Mexico, USA ) and at the University of New Mexico ( New Mexico, USA ).

QYNERGY CORP. develops leading-edge energy solutions based on its proprietary QynCell™ technology, an exciting ‘materials science’ breakthrough – over other ‘battery’ or ‘portable energy’ devices – that gives QYNERGY several unique competitive advantages.

QYNERGY QynCell ™ is an ‘electrical energy device’ revolution, providing:

Long-lived Batteries – QynCell™ usable life potentially spans ‘several decades’ ( 10-year multiples ), during which time the QynCell device ‘does not require external charging’ ( see the illustrative sketch after this list );

Miniature and Micro Applications – QynCell™ technology is scaleable, thus can be ‘miniaturized’, for:

Micro Electro Mechanical Systems ( MEMS ); MicroPower™ applications; Small microelectronics; and, Power-on-a-chip applications.
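
The source does not describe the QynCell mechanism, but a short worked example shows why a decay-powered cell ( for instance, a betavoltaic device – an assumption here, not a claim from the source ) can run for decades without recharging: output falls with an isotope’s half-life rather than draining like a chemical battery.

```python
# Worked example (assumption: a decay-powered cell such as a betavoltaic
# device -- the QynCell mechanism itself is not described in the source).
# Power falls exponentially with the isotope's half-life:
#     P(t) = P0 * 2 ** (-t / t_half)
def power_at(p0_microwatts: float, t_half_years: float, t_years: float) -> float:
    return p0_microwatts * 2 ** (-t_years / t_half_years)

# Hypothetical numbers: 100 uW initial output, 10.8-year half-life (Kr-85).
for year in (0, 5, 10, 20, 30):
    print(year, round(power_at(100.0, 10.8, year), 1), "uW")
# Even after 30 years the cell still delivers roughly 15 uW -- enough for
# some MEMS / power-on-a-chip loads, with no external charging at any point.
```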

SAFEWEB [ http://www.safewebinc.com ] ( Secure Remote Access )

SAFEWEB, established in April 2000, is based in Emeryville, California, USA.

SAFEWEB built the world’s largest ‘online privacy network’; however, in 2001 its ‘free online service’ was discontinued so the company could focus on developing its ‘enterprise’ product.

SAFEWEB is a leading provider of innovative security and privacy technologies that are effective, economical and simple.

SAFEWEB Secure Extranet Appliance ( SEA ), the first [ 1st ] SAFEWEB enterprise security release, reduces the cost and complexity traditionally involved in securing corporate network resources.

SAFEWEB Secure Extranet Appliance ( SEA ), named Tsunami, is a fundamental ‘redesign of extranet architecture’ integrating disparate technologies into a ‘modular plug-in network appliance’ ( SEA Tsunami ).

SAFEWEB SEA Tsunami is an ‘all-in-one solution’ that simplifies implementation of ‘extranets’ and ‘Virtual Private Networks’ ( VPN ) and reduces Total Cost of Ownership ( TCO ) through innovative architecture, letting companies build ‘secure extranets’ in less than 1-hour so ‘remote stationed’ ‘employees’, ‘clients’ and ‘partners’ can access ‘internal applications’ and ‘secure data’ from anywhere using a standard internet website browser.

SAFEWEB delivers, through established strategic partnerships, customized versions of its Secure Extranet Appliance ( SEA ) Tsunami technology to U.S. intelligence [ CIA, etc. ] and communications agencies [ NSA, etc. ].
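
The browser-only secure-access pattern described above can be sketched as a toy gateway: a reverse proxy that admits only requests carrying a valid session token and relays them to an internal application that is never exposed directly. This is a generic Python illustration, not SAFEWEB’s actual SEA architecture; the token, port and internal URL are hypothetical, and a real deployment would terminate TLS in front of the gateway.

```python
# Toy "secure extranet gateway": a reverse proxy that admits only requests
# bearing a valid session token, then relays them to an internal application.
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

VALID_TOKENS = {"s3cr3t-demo-token"}          # hypothetical session store
INTERNAL_APP = "http://127.0.0.1:9000"        # app never exposed directly

class Gateway(BaseHTTPRequestHandler):
    def do_GET(self):
        token = self.headers.get("X-Session-Token", "")
        if token not in VALID_TOKENS:
            self.send_response(401)
            self.end_headers()
            self.wfile.write(b"authentication required\n")
            return
        # Authenticated: fetch from the internal app and relay the response.
        with urllib.request.urlopen(INTERNAL_APP + self.path) as upstream:
            body = upstream.read()
        self.send_response(200)
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # In production this listener would sit behind TLS (the "https://" the
    # remote employee's browser sees); plain HTTP keeps the sketch short.
    HTTPServer(("0.0.0.0", 8443), Gateway).serve_forever()
```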

SAFEWEB investors, include:

IN-Q-TEL; CHILTON INVESTMENTS; and, KINGDON CAPITAL.

STRATIFY INCORPORATED ( Unstructured Data Management Software )

In 1999, PURPLE YOGI was founded by former INTEL Microcomputer Research Laboratory scientists Ramana Venkata and Ramesh Subramonian.

PURPLE YOGI later became known as STRATIFY INCORPORATED ( a privately-held company ).

In early 2001, ORACLE CORPORATION veteran and senior executive Nimish Mehta became president and chief executive officer ( CEO ).

STRATIFY INC., headquartered in Mountain View, California ( USA ), is [ 2002 ] the ‘emerging’ leader in ‘unstructured data management’ software.

STRATIFY Discovery System is a ‘complete enterprise software platform’ helping today’s [ 2002 ] organizations ‘harness vast information overload’ by ‘automating the process’ of ‘organizing’, ‘classifying’ and ‘presenting’ business-critical unstructured information usually found in ‘documents’, ‘presentations’ and internet website pages.

STRATIFY Discovery System platform ‘transforms unstructured internal and external data’ into ‘immediately accessible relevant information’, automatically organizing millions of documents into an easily navigable display hierarchy.
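
Automatic organization of documents into a browsable hierarchy can be sketched with simple term statistics: weight each document’s words by tf-idf, then bucket documents by their most distinctive terms. The Python below is a stand-in for the general technique – STRATIFY’s actual algorithms are not public – and the stop-word list and sample documents are invented.

```python
# Toy "discovery" sketch: automatically organize documents into a two-level
# hierarchy keyed by their most distinctive (tf-idf-weighted) terms.
import math
from collections import Counter, defaultdict

STOP = {"the", "a", "of", "and", "to", "in", "for", "new"}

def top_terms(doc, docs, k=2):
    """Rank a document's words by tf-idf so common words don't dominate."""
    words = [w for w in doc.lower().split() if w not in STOP]
    tf = Counter(words)
    n = len(docs)
    def idf(w):
        df = sum(1 for d in docs if w in d.lower().split())
        return math.log((1 + n) / (1 + df))
    ranked = sorted(tf.items(), key=lambda kv: kv[1] * idf(kv[0]), reverse=True)
    return [w for w, _ in ranked[:k]]

def build_hierarchy(docs):
    tree = defaultdict(lambda: defaultdict(list))
    for doc in docs:
        terms = top_terms(doc, docs) + ["misc", "misc"]  # pad short docs
        tree[terms[0]][terms[1]].append(doc)
    return tree

docs = ["merger rumors in the telecom sector",
        "telecom earnings beat forecasts",
        "new vaccine trial results published"]
for topic, subtree in build_hierarchy(docs).items():
    for subtopic, items in subtree.items():
        print(f"{topic} / {subtopic}: {items}")
```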

STRATIFY INC. clients, include:

INLUMEN and INFOSYS TECHNOLOGIES LIMITED, named in 2001 as one ( 1 ) of The Red Herring 100.

STRATIFY INC. received funding, from:

IN-Q-TEL; H & Q AT INDIA ( also known as ) H & Q ASIA PACIFIC; SOFTBANK VENTURE CAPITAL ( now known as ) MOBIUS VENTURE CAPITAL; SKYBLAZE VENTURES LLC; and, INTEL CAPITAL.

SRD [ http://www.srdnet.com ] ( Near Real Time Data Warehousing and Link Analysis )

SYSTEMS RESEARCH & DEVELOPMENT ( SRD ), founded in 1983, develops software applications to combat fraud, theft, and collusion.

SYSTEMS RESEARCH & DEVELOPMENT Non-Obvious Relationship Awareness ™ ( NORA ™ ) was originally developed for the casino gaming industry.

SYSTEMS RESEARCH & DEVELOPMENT NORA software is designed to identify correlations across vast amounts of structured data, from hundreds or thousands of data sources, in near real-time, and alert users to potentially harmful relationships between and among people.

SRD NORA software technology leverages SYSTEMS RESEARCH & DEVELOPMENT proven expertise in ‘aggregating’, ‘warehousing’ and ‘leveraging people data’ and ‘transaction data’ to strengthen corporate management and security systems.
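
The ‘non-obvious relationship’ idea can be sketched directly: aggregate ( person, identifier ) records from many sources into a graph and search for short chains of shared identifiers between two people who never appear in the same record. A minimal Python illustration of link analysis in general – SRD’s matching logic is proprietary, and the records below are invented.

```python
# Toy link-analysis sketch in the spirit of "non-obvious relationship
# awareness": build a graph from records that share identifiers (phone,
# address) and search for a short path between two people.
from collections import defaultdict, deque

records = [  # (person, attribute) pairs aggregated from many sources
    ("alice", "phone:555-0100"), ("alice", "addr:12 elm st"),
    ("bob",   "phone:555-0100"),                      # shares Alice's phone
    ("bob",   "addr:99 oak ave"),
    ("carol", "addr:99 oak ave"),                     # shares Bob's address
]

graph = defaultdict(set)
for person, attr in records:        # person <-> attribute edges
    graph[person].add(attr)
    graph[attr].add(person)

def relationship_path(a, b):
    """Breadth-first search: shortest chain of shared identifiers, if any."""
    queue, seen = deque([[a]]), {a}
    while queue:
        path = queue.popleft()
        if path[-1] == b:
            return path
        for nxt in graph[path[-1]] - seen:
            seen.add(nxt)
            queue.append(path + [nxt])
    return None

# Alice and Carol never appear in the same record, yet a chain exists:
print(relationship_path("alice", "carol"))
# -> ['alice', 'phone:555-0100', 'bob', 'addr:99 oak ave', 'carol']
```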

SYSTEMS RESEARCH & DEVELOPMENT clients [ 2002 ], include:

U.S. Department of Defense ( DOD ); CENDANT; TARGET; MGM MIRAGE; MANDALAY BAY RESORT GROUP; and, Food Marketing Institute.

TACIT [ http://www.tacit.com ] ( Enterprise Expertise Automation )

TACIT, founded in 1997, is located in Palo Alto, California ( USA ) with regional sales offices in Virginia, Maryland, Pennsylvania and Illinois.

David Gilmour serves as president and chief executive officer ( CEO ).

TACIT Knowledge Systems is the pioneer and leader in ‘Enterprise Expertise Automation’.

TACIT products ‘automatically and continuously inventory’ the ‘skills’ and ‘work focus’ of an ‘entire organization’ for ‘dynamic location’ of ‘connections to expertise needed’ – when needed to make decisions, solve problems, and serve customers.
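
Expertise location of this kind can be sketched by continuously profiling the terms each person writes about and then ranking people against a query. A minimal Python illustration of the general pattern – not TACIT KnowledgeMail’s actual method – with invented names and messages.

```python
# Toy "expertise automation" sketch: continuously build per-person term
# profiles from the messages they write, then rank people for a query.
from collections import Counter, defaultdict

profiles = defaultdict(Counter)   # person -> term frequencies

def observe(person: str, message: str):
    """Update a person's profile from one outgoing message."""
    profiles[person].update(message.lower().split())

def who_knows(query: str, k=3):
    """Rank people by how often their writing touches the query terms."""
    terms = query.lower().split()
    scores = {p: sum(c[t] for t in terms) for p, c in profiles.items()}
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return [(p, s) for p, s in ranked[:k] if s > 0]

observe("dana", "kalman filter tuning for the radar tracker")
observe("dana", "updated the kalman filter noise model")
observe("raj",  "quarterly budget and travel approvals")
print(who_knows("kalman filter"))   # -> [('dana', 4)]
```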

TACIT products also include its award-winning flagship product KnowledgeMail™. In June 2000, TACIT was voted one of the “Hot 100 Private Companies” by Upside Magazine.

In 2000 and 2001, TACIT was named one ( 1 ) of the “100 Companies That Matter” by KM World [ Knowledge Management World ].

TACIT attracted a ‘world class’ advisory board, along with interest from ‘venture capital’ firms and Fortune 500 ‘enterprise’ clients and customers, including:

IN-Q-TEL; JP MORGAN; CHEVRON-TEXACO ( petroleum and chemical ); UNISYS; HEWLETT-PACKARD; NORTHROP-GRUMMAN ( aerospace & defense ); and, ELI LILLY ( pharmaceuticals ).

TACIT investors, include:

IN-Q-TEL; DRAPER FISHER JURVETSON; REUTERS GREENHOUSE FUND; and, ALTA PARTNERS.

TRACTION SOFTWARE [ http://www.tractionsoftware.com ] ( Harvest and Use Information from All Sources )

TRACTION SOFTWARE, founded in 1996, is located in Providence, Rhode Island ( USA ).

TRACTION® Software is the leader in ‘Enterprise Weblog’ software, bringing together working ‘communications’, ‘knowledge management’, ‘content management’, ‘collaboration’, and the ‘writable intranet portal’.

TRACTION TeamPage™ product addresses the need for ‘unified on-demand view’ of ‘team content’ and ‘team communication’ from ‘all document sources’ in ‘context’ and over ‘time’.

TRACTION TeamPage deploys quickly and easily on an existing network and delivers a ‘capstone communication system’ by turning ‘e-mail’ and ‘web browser’ into powerful tools for end-users.

TeamPage targets ‘program teams’ and ‘product management teams’ in ‘government’ and ‘business’.

TRACTION also supports a wide range of applications and business processes, including, but not limited to:

Business Intelligence and Market Research;

Collection Highlighting and Media Distribution;

Investor Relations and Public Relations E-Mail Triage and Response; and,

Exception Process Tracking and Reporting.

TRACTION SOFTWARE investors, include:

IN-Q-TEL; SLATER CENTER FOR INTERACTIVE TECHNOLOGY; and, private investors.

ZAPLET INCORPORATED [ http://www.zaplet.com ] ( Enterprise Collaboration Tools For Email )

ZAPLET INC., founded in 1999, is located in Redwood Shores, California ( USA ).

ZAPLET INC. is an enterprise software and services company and creator of the Zaplet Appmail System™ collaboration software that brings application functionality directly to a user’s inbox to complete business processes.

ZAPLET INC. Appmail, using a server-based platform, combines the power, ‘centralized control’ and ‘robust security’ of traditional enterprise application systems with the convenience and ease-of-use of e-mail.

The ZAPLET Appmail in-box becomes the gateway to a protected server where the application functionality and data securely reside.
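
The gateway idea – e-mail carries only a reference, while application state stays on a protected server – can be sketched with a signed action link: the server mints an HMAC-signed URL for the message body and verifies the signature before acting on any click. A generic Python illustration, not ZAPLET’s actual protocol; the key, host name and task IDs are hypothetical.

```python
# Toy "appmail" sketch: the e-mail carries only a signed token; application
# state stays on a protected server that verifies the token before acting.
import hashlib
import hmac

SERVER_KEY = b"hypothetical-secret-key"   # held only by the server

def make_action_link(task_id: str, action: str) -> str:
    """Server side: mint a tamper-evident link to embed in the e-mail."""
    payload = f"{task_id}:{action}"
    sig = hmac.new(SERVER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return f"https://appmail.example.com/act?p={payload}&sig={sig}"

def handle_click(payload: str, sig: str) -> str:
    """Server side: verify the signature before touching any data."""
    expected = hmac.new(SERVER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return "rejected: invalid or tampered link"
    task_id, action = payload.split(":")
    return f"applied '{action}' to task {task_id} on the protected server"

link = make_action_link("PO-1187", "approve")
print(link)                                # goes into the message body
print(handle_click("PO-1187:approve", link.rsplit("sig=", 1)[1]))
```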

Zaplet™ Appmail can be used to:

Manage and streamline mission-critical business processes; Require no additional client-side upgrades; and, Instantly expand to work teams ‘beyond’ the ‘enterprise’.

ZAPLET INC. has received numerous awards, including:

Red Herring 100; Enterprise Outlook – Investors’ Choice; and, Internet Outlook – Investors’ Choice.

ZAPLET INC. customers, include leading companies, in:

Finance; Telecommunications; High technology; and, Government.

ZAPLET INC. is backed by world class investors, including:

KLEINER PERKINS CAUFIELD & BYERS ( KPCB ); ACCENTURE TECHNOLOGY VENTURES; QUESTMARK PARTNERS L.P.; RESEARCH IN MOTION LIMITED ( RIM ); INTEGRAL CAPITAL PARTNERS; ORACLE CORPORATION; CISCO SYSTEMS INC.; and, NOVELL INC.

– –

Circa: 2010

IN-Q-TEL

Investments –

Portfolio of Companies ( 2010 ) – Partial List

3VR Security; AdaptivEnergy; Adapx; Arcxis; Asankya; Basis Technology; Bay Microsystems; CallMiner; Cambrios; Carnegie Speech; CleverSafe ( SAIC ); CopperEye; Destineer; Elemental Technologies; Ember Corporation; Endeca; Etherstack; FEBIT; FireEye; and, Fluidigm.

Reference(s)

http://web.archive.org/web/20020630223724/http://www.inqtel.com/news/attachments/InQTelInvestmentPortfolio.pdf

http://www.iqt.org/technology-portfolio/orionmagic.html

http://defense-ventures.com/in-q-tel/

– – – –

Research References

“Information Technology Trends And Their Impact On CIA,” January 1999, declassified report of the U.S. Central Intelligence Agency, by the CIA Chief Information Officer.

– – – –


Submitted for review and commentary by,


Concept Activity Research Vault ( CARV ), Host
E-MAIL: ConceptActivityResearchVault@Gmail.Com
WWW: http://ConceptActivityResearchVault.WordPress.Com
