CIA StarShip Speed 1 LightYear In 1 EarthYear


by, Concept Activity Research Vault ( CARV / Paul Collin )

April 26, 2013 13:22 ( PST ) Update ( Originally Published: July 14, 2012 )

Related Topics –

– Virgo A ( constellation, a cluster of 5 galaxies );
– Supernovae ( ’2 star collision’, a ‘dual galaxy collision’ );
– Supermassive Black-Hole ( SmBH – resultant effect from a Supernovae dual galaxy collision );
– Centaurus A ( Cen-A, A1847, NGC 5128, PKS 1322-427, 1RXH J132519.8-430312, CXOU J132519.9-43031 );
– Gamma-Ray Burst ( GRB – inverse / reverse Compton effect Cherenkov Muons, Cherenkov Light, and Cherenkov Radiation );
– Jet ( Whole Jet, Counter Jet – extragalactic SmBH inverse / reverse Compton effect Cherenkov multi-messenger neutron light radiation Cosmic Ray particle beam jets );
– Atmosphere, Atmospherics;
– Oxygen ( Oxygen molecules ), Ozone atmospheric layer;
– Tau Air-Shower ( invisible Cherenkov light muon Thomson scatter patterned atmospheric showers );
– Planetary magmatic superconductivity volcanic excitations;
– Earthquake, seismicity;
– BowShock, Local Interstellar Cloud ( LIC ), InterStellar Medium ( ISM );
– NASA PlanetQuest ( Planet Quest );
– Living With a Star ( LWS ) U.S. Presidential Executive Order Programs and Projects [ http://en.wikipedia.org/wiki/Living_With_a_Star ];
– U.S. President Executive Order ( 2004 – NASA missioned to locate exoplanets for Earth colonization ) [ http://georgewbush-whitehouse.archives.gov/news/releases/2004/01/20040130-7.html ];
– ExoPlanetary exploration ( ExoPlanet, planets outside Earth solar system ) [ http://govinfo.library.unt.edu/moontomars/docs/M2MReportScreenFinal.pdf ];
– InterStellar SpaceCraft ( ISSC );
– Interstellar Propulsion;
– AntiHydrogen Energy Dense AntiMatter Hydrogen Propellants developed by CERN ( Geneva, Switzerland ) [ http://arxiv.org/pdf/astro-ph/0410511.pdf ];
– RAND CORPORATION ( AntiHydrogen energy development research );
– RAYTHEON CORPORATION ( AntiHydrogen energy propulsion development and test-engineering );
– ORBITAL SCIENCES CORPORATION ( AntiHydrogen energy propulsion test platform and demonstrator );
– BOEING CORPORATION ( AntiHydrogen Propulsion );
– National Reconnaissance Office ( NRO ), U.S. Navy, U.S. Air Force ( USAF ), U.S. National Security Agency ( NSA );

Constellation Virgo A

Virgo A extragalactic resident attributes consist of:

– Five ( 5 ) Galaxies ( positioned in ‘1 cluster’ within ‘Virgo A’ );
– One ( 1 ) Supermassive Black-Hole ( SmBH ) resultant from a ‘dual galactic collision’ ( 2 );
– Two ( 2 ) InterStellar Medium ( ISM ) BowShock Radio-Frequency ( RF ) Wave ( Radio Wave ) Lobe Clouds; and,
– Two ( 2 ) Gamma-Ray Burst ( GRB ) Jets ( Whole Jet & Counter Jet ).

According to recent scientific reports, the central galactic universe source for ‘all Cosmic Rays’ ( CR ) continues ricocheting throughout our galactic universe today from, amongst many smaller extragalactic sources, one ( 1 ) lone Gamma-Ray Burst ( GRB ) “Whole Jet” ( ‘not’ its dual counterpart, the “Counter Jet” ). That jet occurred eons ago when two ( 2 ) Supermassive galaxies collided with one another, forming what is known as a ‘dual’ Supernovae so powerful and so bright that the GRB Cherenkov neutron light radiation particle beam jet was ‘visually observed with the human eye’ by an astronomer during the 16th Century ( 1522 ), whose notes were traced back to the “Scorpius-Centaur” extragalactic universe. Astronomers identify the jet as originating from near the center of “Centaurus A” ( also known as “Cen-A” ), where this gigantic particle beam jet diameter is about the same size as the diameter of Earth’s solar system, but wait – there’s more:

This same particle beam jet contains “Superluminous” light energy radiation particle elements moving at up to six ( 6 ) times faster than the speed of light. That’s correct, the jet light is moving 6 times faster than lightspeed. It was shot out of the extragalactic Supermassive Black-Hole in ‘forward time’ and began catching up with Earth time in December 2004, within 48-hours of the Indonesia Andaman – Sumatra 9.3 magnitude earthquake that sent a tsunami killing nearly 300,000 people and also burnt an incredibly huge hole right out-of the Earth’s ozone atmospheric layer. It is additionally unfortunate to note that not until very recently did these two ( 2 ) seemingly dissimilar Earth Events finally get put together by astrophysicists and geoastrophysicists – who never saw to publicly announcing these facts through mainstream news media. Why?
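For readers who want to see how figures like “six times the speed of light” arise in jet observations, the short sketch below computes the standard apparent transverse speed of a relativistic jet viewed at a small angle; the jet speed ( 0.99c ) and the viewing angle ( 12 degrees ) are illustrative assumptions, not values taken from this report.

```
import math

def apparent_speed(beta, theta_deg):
    """Apparent transverse speed (in units of c) of a jet blob moving at
    beta = v/c toward the observer at a viewing angle theta (degrees)."""
    theta = math.radians(theta_deg)
    return beta * math.sin(theta) / (1.0 - beta * math.cos(theta))

# Illustrative values only: a jet at 0.99c seen 12 degrees off the line of sight.
beta_app = apparent_speed(beta=0.99, theta_deg=12.0)
print(f"apparent speed ~ {beta_app:.1f} c")  # ~6.5 c, even though nothing locally exceeds c
```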

Earth ‘southern hemisphere’ telescope observatories view this galactic system hazard growing worse as it quickly nears our solar system. In fact, the National Science Foundation ( NSF ) has poured huge sums of global investment money into the South Pole Telescope ( SPT ), the Pierre Auger Observatory Sud ( Argentina ), the Cherenkov Telescope Array ( CTA ), and more.

NASA pressed Hubble Space Telescope ( HST ) retrofits and its new space-based replacement, the James Webb Space Telescope observatory, in addition to many other ground-based and space-based telescope observatories around the world trying to calculate the ‘date when’ the Cen-A GRB particle beam jet is expected to eliminate all Earth oxygen particles.

How can this be? No one has heard anything on the news about this! That is ‘why’ this report was written: to notify the public to begin researching it for themselves, because their governments are ‘no longer obligated to warn them of any pending natural disaster’ – a new law passed into effect in the United States during early 2012.

While the trajectory of this Cen-A GRB particle beam jet propagating velocity is directed just north of and above our solar system, the fact is that when this particle beam jet comes within a range of 1,000 light-years of Earth’s solar system it will become what scientists, astrophysicists and some members of government already know it has been labelled: a “Planetary Atmosphere Killer.”

Earth already received a sample of the power involved within the Cen-A GRB particle beam jet when only a small advance portion of this particle beam entered Earth’s upper ionosphere, from where it rained down into Earth’s oxygen atmosphere layer. Detectors registered what are called “Tau Air-Showers” – invisible pin-prick light streams characterized by straight-line patterned Thomson scatter, or self-ricocheting sprays of intense radiation some simply categorize as “Cosmic Rays” – that hit Earth during latter December 2004.

Unfortunately, much more of the same is headed our way, and such bombardments continue to grow in intensity and frequency on Earth day by day.

Enterprise Sustainability Black-Hole Economics

Public Private Partnership Program Project Sub-Contractor Sub-Program Sub-Projects

For more than 62-years, multiple trillions of dollars – allocated by authority of The World Bank through the International Monetary Fund ( IMF ) to multi-national foundations – have been and continue being spent under auspices of “Global Initiatives” instituting multi-national government agreements. These agreements utilize private-sector businesses, vendor sub-contractors, universities, colleges and institutes coordinated by liaisons with think-tanks conceptualizing, theorizing, calculating, researching, developing, testing, designing, building, launching, maintaining, retrofitting and relaunching ground-based and space-based platform monitoring stations. Those stations are equipped with ultra-advanced emerging-technology remote sensing collectors, systems and devices feeding incalculable amounts of raw data into high-performance supercomputers, with centralized network analytic technicians translating humanly comprehensible information back to yet other think-tanks developing plans to somehow carry a ‘few’ ( ‘not all’ ) human beings off Earth and out of harm’s way; away from the Cen-A GRB particle beam jet. That work has included, but not been limited to, some ( ‘not all’ ) of the following target areas ( below ):

– Earth-based Deserts ( high altitude and low-altitude ), Mountains ( high-altitude ), Waterways ( natural ocean and artificially controlled man-made lake ) and Polar Ice ( South Pole ) regions; plus,

– Space-based Satellite Spacecraft platforms monitoring Earth solar system local space regions, deep space and ultra-deep space extragalactic spectral imaging sensor and particle collector ‘downstreams’ to ground-stations, plus ‘classified spacecraft mission retrievables’ ( i.e. ‘particle collection unit replacement modules’ and ‘data platens’ ).

Detailed information, although publicly available, remains unpackaged and unknown for public comprehension.

People, especially in the United States of America ( USA ), are so busy with distractions that they barely allow time in their daily life for so much as even a ‘partial news broadcast’. So, there is a better-than-even chance that people as a whole are not going to ‘make time for researching anything’ ( ‘if they knew how’ ), or even bother opening their local telephone book Yellow Pages directory – where, unfortunately, ‘private researchers are not even listed’ in ‘local retail neighborhood directories’.

Just attempting to put all these galactic and technical observation puzzle pieces together into a ‘layman easily-read information package’ surrounding the Cen-A GRB particle beam headed toward Earth would not be difficult for the United States government. But why should government upset its citizenry status quo, and that of the market money hierarchy fueled by today’s rat-race society?

U.S. Government Printing Office ( Pueblo, Colorado, USA )

How does government sell a drastic change, to the way everyone currently lives their own lives today, without creating chaos?

How long will it take to fully adjust a global population?

You remember the pen-name author George Orwell, who wrote the book entitled “1984” ( written in 1948 ) showing how future shock took place for the citizen masses. Like any good snake-oil salesman of yore, the key is all in how you pitch the new idea.

Living With AstroParticles ( LWA ) – Next Earth New Life Guides

– Oxygen Alternatives ( OxyAlt );
– OxyAlt Vitamin Energy Boosters ( Recreational Biosynthetic Narcotics By Rx Only );
– OxyAlt Breathing Boosters;
– OxyAlt Cosmetics ( Men & Women ); and,
– Others.

The U.S. Government Printing Office ( Pueblo, Colorado, USA ) may very well hold a type of guidebook for population centers remaining in the dark on such matters involving “Carbonyl Sulfide” alternatives – which most human beings on Earth today know absolutely nothing about – that alter molecular structures defining new breathable gas mixtures, substituting what we breathe today ( oxygen ) for what we will be breathing ( carbonyl sulfide mixtures ) in the future.

Only a few looming questions remain:

1. “Will Government Ever Announce This?”
2. “Why Didn’t Government Announce The 2004 Particle Beam Ozone Destruction?”
3. “Has Government Announced ‘Your Preparatory Time’ ( 15-years, 20-years, 35-years )?”
4. “When Will You Prepare Your Timetable?”
5. “Will You Continue Waiting For A Government Announcement That Will Never Come?”
6. “Did You Elect Government Control Over Your Life And That Of Your Family?”

The Cen-A Cherenkov particle beam GRB jet possesses other enhanced characteristics still being studied; however, we do know this particular hazardous light radiation beam is:

– High-Directional ( light-beam focus length );
– Thick ( approx. diameter of Earth’s solar system );
– Planetary Atmosphere Killer ( ‘incredibly superbright’, ‘intensely hot photonic’ and ‘extremely hazardous radiation particles’ ); and,
– Distant Death Ray ( even as far away as 1,000 light-years from any planet, these ‘GRB jet light beams’ instantly disintegrate ‘oxygen molecules’ throughout entire atmospheric ( ozone ) layers ).

Few planets are rich in ‘oxygen molecules’; however, many planets in our solar system ‘do have some atmosphere’, although believed by many to be humanly unbreathable. Nevertheless, Gamma-Ray Burst ( GRB ) jets destroy oxygen molecules ‘above planets’ and – recently discovered – ‘within planets’ to some degree.

Space-borne “Ray” ( “Radiation” ) Relative Yet Confusing Terms –

– Cosmic Rays ( CR );
– Gamma-Rays ( GR );
– Gamma Ray Bursts ( GRB );
– Whole Jet ( WJ );
– Counter Jet ( CJ );
– Jet ( e.g. Cherenkov muon multi-messenger astroparticles – atomic and subatomic spaceborne particles );
– Stream ( e.g. Cherenkov muon multi-messenger astroparticles – atomic and subatomic spaceborne particles );
– Beam ( e.g. Cherenkov muon multi-messenger astroparticles – atomic and subatomic spaceborne particles );
– FlashLight ( e.g. Cherenkov muon multi-messenger astroparticles – atomic and subatomic spaceborne particles );
– Tau AirShowers ( e.g. space-borne invisible but powerful light energy ‘astroparticles’ penetrate Earth atmospherics, beneath which “Thomson scatter” ‘patterned astroparticles’ ricochet Ultra-High Energies ( UHE ) that penetrate Earth surfaces ( ground and waterway floor beds ) down to varying depths – including more than 1-mile deep );
– Sprites;
– Elves; and,
– Others.

Modern-day terms define any “vehicle” as an “engine-driven carrying machine” – a ‘generalized term identifying certain objects’. Further name labeling identifies “vehicle” object ‘types’, specifically:

– Automobile ( Car );
– Truck;
– Van;
– Motorcycle ( Motor-Bicycle / Motor Bike );
– ATV ( All Terrain Vehicle );
– Amphibian ( Amphibious Car / Boat / AS – Aerial Submersible );
– Boat ( Motorized Boat / Yacht / SailBoat / Jet-Ski );
– Submarine ( Submersible / Mini-Sub );
– Train ( Locomotive );
– Helicopter;
– Airplane; and,
– Spacecraft.

Cen-A GRB is our solar system’s major point of origin for ‘all’ “Cosmic Rays”

Cosmic Rays ( CR ) encompass, for the most part, extragalactic high-energy atomic and subatomic particle elements propagating at extreme power, light radiation levels and velocities.

Cosmic Ray GRB jets consist of intertwined powerful behaviors, involving:

– ‘Atomic particles’ are ‘semi-visible Matter’, involving photons, protons, electrons, atoms, etc. we’ve all heard mentioned before; and,

– ‘Subatomic particles’ are ‘invisible AntiMatter’, involving neutrinos, muons, leptons, neutralinos, etc. we rarely hear anything about.

Neutrino-type particles ‘penetrate planets and stars instantly’. In fact, in ‘far less time than a human eye blink’ neutrino-like particles will penetrate through solid rock or the thermonuclear dynamo of any planet or sun and pass right out the other side, shooting off into the rest of the universe.

Subatomic neutrinos were believed by the public to typically pass through all Earth life, including human beings, without any adverse effects, as nothing appeared to happen. However, recent discoveries link ‘subatomic particles’ ( penetrating incredibly hard objects and continuing into the rest of galactic space ) to ‘atomic particle radiation types’ ( e.g. Cherenkov muon light radiation, X-rays, Gamma-rays, Cosmic Rays ). The 4,000 light-year long beam from the “Centaurus A” ( Cen-A ) GRB is known as the ‘brightest object’ seen from Earth’s southern sky telescope observatories.

Earth bombardments therefrom consist of certain Compton-effect Cherenkov muon light radiation photonics bound with UltraHigh-Energy ( UHE ) extremely bright blue-white pin-pricks of light of incredibly extreme heat intensity, powered by multiple Trillion electron Volts ( TeV ), within two ( 2 ) Gamma-Ray Burst ( GRB ) jets streaming more than 4,000 light-years in length – forward in time – out-of the accretion disk of a Supermassive Black-Hole ( SmBH ) resulting from a ‘dual galaxy collision’ Supernovae eons ago. That collision additionally produced a ‘dual lobe’ InterStellar Medium ( ISM ) Local Interstellar Cloud ( LIC ) packing two ( 2 ) bowshock gravitational wave shells travelling with incredible velocity, already mushroomed now directly into the path of Earth’s solar system Sun.

What is being done to ‘protectively shelter Earth life’ and/or ‘prolong Earth life’?

Is Earth being abandoned? If so, ‘How’ and ‘When’?

Information was presented to the current President of the United States ( POTUS ), who hurriedly co-signed his predecessor’s previous U.S. Presidential Executive Order ( EO ) instituting the following:

Living With a Star ( LWS ) Program [ http://en.wikipedia.org/wiki/Executive_Order_13326 ].

Office of the President of the United States ( POTUS ) Executive Order ( EO ) 13326 Living With a Star Program [ http://en.wikipedia.org/wiki/Executive_Order_13326 ] involves something recently brought to light in the mainstream news media, surrounding:

– Exoplanetary Research, Exploration, Sustainability, and Colonization.

U.S. NASA Crazy? Like A Fox!

Former high-ranking U.S. military personnel retirees usually enter private-sector industrial U.S. defense complex companies funded by U.S. government citizen tax dollars [ http://georgewbush-whitehouse.archives.gov/news/releases/2004/01/20040130-7.html ], involving educational institution Programs building brilliantly talented student minds recruited into certain entities missioned with herculean tasks ( e.g. design, engineer, test, demonstrate and produce advanced projects for secret programs ). The BOEING interstellar spacecraft reality bought and built all of the aforementioned through [ http://patft.uspto.gov/netacgi/nph-Parser?Sect2=PTO1&Sect2=HITOFF&p=1&u=/netahtml/PTO/search-bool.html&r=1&f=G&l=50&d=PALL&RefSrch=yes&Query=PN/6813330 ] RAYTHEON ( and other entities ), where employees had no clue their Project was only a sub-Project for another larger Project that was part of a global network of sub-Programs tied into CERN ( Geneva, Switzerland ), which held five ( 5 ) other global Projects leading sub-Programs that traversed in and out of several multinational government space Science & Technology ( S&T ) Programs leading many multiple sub-Project ultra-deep space ground-based and space-based observatory platforms retrofitted with new advanced adaptive optics, replacement mirrors, spectral sensor data imagery converters and extragalactic particle collectors.

CIA InterStellar Propulsion SpaceCraft Attains 1 Light-Year In 1 Earth-Year

[ http://api.ning.com:80/files/dnFzKnIyGRYCzDJmMen9Ee5UUpegrAM3Ij6jeMowczzQ9hvavlbK9nkItmEfy8ATu1ySDg*k4pa0vsGaANzB1pp8fEmW0UUp/2008BOEINGElSegundoCaliforniaOrbitalSciencesandRaytheonSAS.jpg ]


What began to develop from using Soviet-bloc country physicists – secretly smuggled in and out of the MCDONNELL DOUGLAS Space Systems Center “Quad Building” ( Huntington Beach, California, USA ) through today’s current successor BOEING Space Systems Center ( Huntington Beach, California ), RAYTHEON CORPORATION ( Waltham, MA ) and RAYTHEON SAS ( El Segundo, California, USA ) – has been discovered [ http://patents.justia.com/assignee/RAYTHEONCOMPANY.html ], along with the selling of what other foreign nations spent billions of dollars on in CERN ( Geneva, Switzerland ) projects. Secretly, however, CERN knew well its sole mission for the United States of America: “Refine AntiMatter Hydrogen SubAtomic Production Into AntiHydrogen Propulsion Power for the U.S. Interstellar SpaceShip.” Sound ridiculous?

In 1986, RAND CORPORATION ( Santa Monica, California, USA ) was the U.S. Central Intelligence Agency ( C.I.A. ) Science and Technology ( S&T ) private-sector foreign intelligence contractor missioned to spy on and steal secrets from physicists, particularly inside Eastern Europe countries and governments ( Czechoslovakia, Soviet Union, et al. ) and on Western Europe U.S.-friendly foreign nations. The RAND CORPORATION primary intelligence mission was for the U.S. government military to obtain Anti-Hydrogen propulsion power for its own extragalactic starship ( one [ http://www.dtic.mil/cgi-bin/GetTRDoc?AD=ADA229279 ] of two [ http://www.dtic.mil/dtic/tr/fulltext/u2/a238011.pdf ] reports ).

In a novel published in 2000 and turned into the 2009 motion picture film entitled “Angels & Demons,” a hidden vat of potentially explosive AntiHydrogen was depicted buried under the Holy Roman Catholic Church St. Peter’s Basilica cathedral ( Rome, Italy ). The CERN LBNL UCB physicists’ actual experiment numbers ( 38 AntiHydrogen atoms ) and duration ( about 1/10th of 1-second ) do not actually threaten the Vatican.
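A rough back-of-the-envelope check, assuming complete annihilation of all 38 trapped antihydrogen atoms against ordinary matter, shows why those laboratory quantities pose no threat; the figures below are standard physical constants, not values taken from this report.

```
# Energy released if 38 antihydrogen atoms annihilate with normal matter.
N_ATOMS    = 38
M_HYDROGEN = 1.674e-27      # approximate mass of one (anti)hydrogen atom, kg
C          = 299_792_458.0  # speed of light, m/s

# Each annihilation converts the antihydrogen atom *and* an equal mass of
# ordinary matter entirely into energy: E = 2 * m * c^2 per atom.
energy_joules = N_ATOMS * 2 * M_HYDROGEN * C**2
print(f"total energy ~ {energy_joules:.2e} J")  # ~1.1e-8 J, far less than a falling grain of sand
```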

In 2002, CERN created the first ( 1st ) artificially produced Low-Energy ( LE ) antihydrogen atoms, each consisting of a positron ( AntiMatter Electron / AntiElectron ) orbiting an AntiProton nucleus, but at that time the atoms ( upon striking normal ‘Matter’ ) were instantly annihilated within microseconds, producing Gamma-Rays.

Another CERN sub-Program sub-Project experimental test ( CPT symmetry ) named ALPHA ( Antihydrogen Laser PHysics Apparatus ) involved a global collaboration, including physicists from Lawrence Berkeley National Laboratory ( LBNL – Berkeley, California ) and the University of California at Berkeley ( UCB ), TRAPping ( for a little more than 1/10th of 1-second ) a total of thirty-eight ( 38 ) AntiHydrogen atoms before they ( again ) instantly annihilated.

Another CERN sub-Project experiment named ATHENA – a larger global collaboration shut down in 2004 – made the first ( 1st ) detection of Cold AntiHydrogen.

Aladdin Magic Lamp Powerful Genie Trapped

CERN’s sole purpose was to create Interstellar Propulsion Power for the United States of America BOEING starship, which CERN successfully accomplished after ‘creating tens of millions of AntiHydrogen atoms’.

CERN used autoresonance on Cold AntiProtons compressed into two ( 2 ) individual tiny ( 20 mm long x 1.4 mm diameter / size of a ‘wooden match-stick head’ ) tubular-shaped bubbles.

In 2004, two ( 2 ) tiny tubular bubble clouds were positioned within an ‘expensive’ hollowed-out bottle – an octupole superconducting magnet wall bottle ( lamp ) – used to stabilize UltraHigh-Energy ( UHE ) Trillion electron Volt ( TeV ) plasmodic discharges.

One ( 1 ) tiny ( 20-mm x 1.4-mm ) tubular-shaped bubble cloud was tasked to flow over the other ( 2nd ) tiny tubular-shaped bubble when the AntiHydrogen power was finally trapped in the containment tank ( bottle ).

There is barely any weight involved at all for propulsion power.
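To put ‘barely any weight’ into numbers, the sketch below estimates the combined mass of the ‘tens of millions of AntiHydrogen atoms’ mentioned above; the atom count range is taken from this report’s own wording and the atomic mass is a standard constant.

```
# Approximate mass of "tens of millions" of antihydrogen atoms.
M_HYDROGEN = 1.674e-27   # mass of one (anti)hydrogen atom, kg
for n_atoms in (1e7, 1e8):                       # ten million to one hundred million atoms
    print(f"{n_atoms:.0e} atoms -> {n_atoms * M_HYDROGEN:.1e} kg")
# Roughly 1.7e-20 to 1.7e-19 kg in total -- an utterly negligible payload mass.
```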

AntiHydrogen SubAtomic power had been harnessed for almost 3-years before the BOEING U.S. interstellar starship was witnessed on November 11, 2011.

Interstellar starship staring-plane mosaic guidance systems and antihydrogen energy propellant have been confidentially reported as already having made interstellar flights in a spaceship that has travelled “1 light-year away from Earth in about 1 Earth-year,” and on November 11, 2011 at 8:15 p.m. such an interstellar starship was viewed by two ( 2 ) adult men.
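As a plain unit check, covering one light-year in one Earth-year means an average speed equal to the speed of light; the short sketch below just does the conversion with standard constants.

```
# Average speed implied by "1 light-year in 1 Earth-year".
LIGHT_YEAR_M  = 9.4607e15        # metres in one light-year
JULIAN_YEAR_S = 365.25 * 86400   # seconds in one Julian year

avg_speed = LIGHT_YEAR_M / JULIAN_YEAR_S
print(f"average speed ~ {avg_speed:.3e} m/s")  # ~2.998e8 m/s, i.e. the speed of light
```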

Believed to be under secret flight orders dispatched through the U.S. National Reconnaissance Office ( NRO ), which dispatches CIA overflight missions, the interstellar starship was recorded travelling silently above ground at an incredibly low altitude of approximately 1,000 feet and at an incredibly slow speed of approximately 35-miles per hour. The men remarked at how such an incredibly huge elongated black triangle craft could even remain airborne while exhibiting such characteristics.

One ( 1 ) of the witnesses was a former U.S. Air Force intelligence specialist who, the following morning ( November 12, 2011 at 9:30 a.m. ), was visited by a retired Lieutenant General of the United States Air Force. Details of their conversation have not been released to the public to-date.

AntiMatter force deflection, dematerializing space debris before it could penetrate the outer skin ( akin to ‘UltraViolet light zapping flying insects’ ), undoubtedly protected the structural integrity of the BOEING interstellar starship ( “Enterprise III” ).

Exoplanetary Mission & Interstellar Spaceship ( NOA ARC ) Living With a Star Program – [ http://www.nasa.gov/pdf/60736main_M2M_report_small.pdf ] Moon, Mars and Beyond

How ‘big’ would an ‘extragalactic Noah’s Ark’ interstellar spacecraft need to be to ‘transport all Earth life’ out-of Earth’s current solar system to ‘another planet’ chosen as ‘suitable for sustaining human life’?

Chances are more than likely today that, while much of the public is convinced ‘no human being ( living on Earth today )’ might ‘endure travelling at the speed of light’ ( or ‘live long enough throughout light-speed light-year distances’ ), the CERN ( Geneva, Switzerland ) primary purpose was to develop interstellar propulsion power, successfully achieved in 2004 on behalf of the RAYTHEON CORPORATION, which had already filed several [ http://patft.uspto.gov/netacgi/nph-Parser?Sect2=PTO1&Sect2=HITOFF&p=1&u=/netahtml/PTO/search-bool.html&r=1&f=G&l=50&d=PALL&RefSrch=yes&Query=PN/6813330 ] U.S. Patents. One ( 1 ) patent subsequently led up to the BOEING interstellar starship appearance on November 11, 2011 at a 1,000-foot altitude only a few miles south of RAYTHEON SAS ( El Segundo, California, USA ) and the Los Angeles Air Force Station Space Center, and 1-mile north of the U.S. National Petroleum Reserve facility adjacent to the Port of Los Angeles in San Pedro, California – more specifically, spotted by numerous witnesses at 8:15 p.m. navigating over Harbor City, California.

Are U.S. government secret spacecraft already travelling to other planets, powered by already perfected CERN AntiHydrogen refinery fuel?

An anonymous individual indicated, “Yes, indeed!”

Reasons ‘why they’re so sure’?

1. Officially presented with officially-verified contract source text information;

2. Officially presented with officially-verified contract precise final diagrammatics of the ‘interstellar propulsion antimatter containment tank’;

3. Officially presented with officially-verified color photographs of the antimatter propulsion system demonstrator test ( roof-mounted building bench test ), including a color photo of opposing-positioned camera equipment traced to multiple otherwise secret photographs; and,

4. Two ( 2 ) eyewitnesses who together experienced an ‘up-close’ ( less than 1,000-feet away ) and ‘very personal’ but short viewing timeframe ( 15-minutes ) of this particular U.S. antimatter propulsion interstellar spacecraft – a sharp-lined, semi-flat and elongated version of what amounted to a “Battlestar Galactica” ( former American television series and film ), but not as clumsy looking as the spacecraft depicted on the TV series. The anonymous individual believes the spacecraft seen was nothing more than a U.S. National Reconnaissance Office Advanced Programs Long-Range Spacecraft – ‘not’ a “NOA ARC” as pictured elsewhere.

Exoplanetary Earth Life Seedings

Today ( 2012 ) it is no longer beyond belief, and certainly within the realm of possibility, to extragalactically transport such things as ‘initial seedlings’:

A. “Early-Earth pure ( non-hybrid ) agricultural ( food ) seed storage;”

B. “Earth human genome structures ( genetic cryogenic ) storage;” and,

C. Humanoid tasking of onboard spacecraft controls, certain experiments, and ‘other taskings’; see, e.g., the International Space Station ( ISS ) Robonaut-2 ( R-2 ).

Earth Sudden Impact

Why are details being kept away from the global public?

Ask ‘difficult questions’, be prepared for ‘soft information’, and don’t be surprised when you see what Anonymous witnessed with others.

Non-status-quo independent conceptualists are an endangered, low-commodity human species that government would prefer not to see poisoning its sheep.

Baa, baa, black sheep, have you any ‘more’ wool – covering your eyes?

Submitted for review and commentary by,

Concept Activity Research Vault ( CARV ), Host
E-MAIL: ConceptActivityResearchVault@Gmail.com
WWW: http://ConceptActivityResearchVault.WordPress.Com

Research Note(s)

– 19890001573_1989001573.pdf


Earth Event Alerts

[ IMAGE ( above ): IBM Stratus and Cirrus supercomputers analyze Global Environmental Intelligence ( click to enlarge ) ]

by, Kentron Intellect Research Vault [ E-MAIL: KentronIntellectResearchVault@Gmail.Com ]

August 17, 2012 19:00:42 ( PST ) Updated ( Originally Published: March 23, 2011 )

MARYLAND, Fort George G. Meade – August 17, 2012 – IBM Stratus and IBM Cirrus supercomputers, as well as CRAY XK6m and CRAY XT5 ( Jaguar ) massively parallel and vector supercomputers, are securely controlled by the U.S. National Security Agency ( NSA ) for analyzing Global Environmental Intelligence ( GEI ) data extracted from ground-based ( terrestrial ) monitoring stations and space-based ( extraterrestrial ) spaceborne platforms studying Earth Event ( Space Weather ) effects via High-Performance Computing ( HPC ), as well as for:

– Weather Forecasting ( including: Space Weather );
– U.S. Government Classified Projects;
– Scientific Research;
– Design Engineering; and,
– Other Research.

[ IMAGE ( above ): CRAY XK6m supercomputers analyze Global Environmental Intelligence ( click to enlarge ) ]

CRAY INC.’s largest customers are U.S. government agencies, e.g. the U.S. Department of Defense ( DoD ) Defense Advanced Research Projects Agency ( DARPA ) and the U.S. Department of Energy ( DOE ) Oak Ridge National Laboratory ( ORNL ), which account for about 3/4 of CRAY INC. revenue; other CRAY supercomputers are used worldwide by academic institutions ( universities ) and industrial companies ( private-sector firms ).

CRAY INC. additionally provides maintenance, support services and sells data storage products from partners ( e.g. BlueArc, LSI and Quantum ).

Supercomputer competitors, of CRAY INC., are:

– IBM;
– HEWLETT-PACKARD; and,
– DELL.

On May 24, 2011 CRAY INC. announced its new CRAY XK6 supercomputer, a hybrid supercomputing system combining its Gemini InterConnect, AMD Opteron™ 6200 Series processors ( code-named: InterLagos ) and NVIDIA Tesla 20 Series GPUs into a tightly integrated upgradeable supercomputing system capable of more than 50 petaflops ( i.e. ‘quadrillions of computing operations’ per ‘second’ ), a multi-purpose supercomputer designed for the next-generation of many-core High Performance Computing ( HPC ) applications.
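For scale, the ‘quadrillions of computing operations per second’ phrasing can be checked directly, since one petaflop is 10^15 floating-point operations per second; the sketch below is simple arithmetic, not a benchmark of any actual CRAY system.

```
# "50 petaflops" expressed in plain numbers.
PETA = 10**15                      # one petaflop = 1e15 floating-point operations per second
peak_flops = 50 * PETA             # claimed peak of the upgraded XK6-class system
print(f"{peak_flops:.1e} operations per second")                      # 5.0e+16
print(f"= {peak_flops // 10**15} quadrillion operations per second")  # 50 quadrillion
```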

The SWISS NATIONAL SUPERCOMPUTING CENTRE ( CSCS ) – located in Manno, Switzerland – is CRAY INC.’s first ( 1st ) customer for the new CRAY XK6 system. CSCS ( Manno, Switzerland ) promotes and develops technical and scientific services in the field of High-Performance Computing ( HPC ) for the Swiss research community, and is upgrading its CRAY XE6m system ( nick-named: Piz Palu ) into a multiple-cabinet new CRAY XK6 supercomputer. The SWISS NATIONAL SUPERCOMPUTING CENTRE ( CSCS ) supports scientists working in:

– Weather Forecasting;
– Physics;
– Climatology;
– Geology;
– Astronomy;
– Mathematics;
– Computer Sciences;
– Material Sciences;
– Chemistry;
– Biology;
– Genetics; and,
– Experimental Medicine.

Data additionally analyzed by these supercomputers, include:

– Ultra-Deep Sea Volcanoes located in continental plate fracture zones several miles beneath ocean basin areas ( e.g. the Asia-Pacific Rim, also known as the “Pacific Ring of Fire”, where a circum-Pacific seismic belt of earthquakes frequently impacts areas far across the Pacific Ocean in the Americas ).

Global geoscience realizes that Earth ‘ground movement shaking’ earthquakes hide a lot of what people are actually walking on top of: large geographic land mass areas known as ‘continental shelves’ or “continental plates” that move ( tectonics ) because of superheated, pressurized, extra-superconducting magnetic energy properties released from within molten magma material violently exploding beneath the surface of the Earth, down in ultra-deep seas.

[ IMAGE ( above ): Global Tectonic Plate Boundaries & Major Volcano HotSpots ( click to enlarge ) ]

Significant volcanoes are positioned like dots along this global 25,000-mile circular region known as the “Pacific Ring of Fire”, extending from south of Australia up the ‘entire eastcoast’ of Japan, China and the Kamchatka Peninsula of Russia, across the Aleutian Islands of Alaska, and then south down the ‘entire westcoast’ of North America and Latin America.

[ IMAGE ( above ): Ultra-Deep Sea Pacific Ocean Basin ( click to enlarge ) ]

The March 11, 2011 Tohoku-chiho Taiheiyo-oki Japan 9.0 earthquake held several secrets, including U.S. government contractors simultaneously monitoring a significant “moment of magnitude” ( Mw ) Earth Event occurring parallel to the eastcoast of Japan beneath the Western Pacific Ocean, where an entire suboceanic mountain range was being split in half ( south to north ) 310-miles long and split open 100-feet wide ( east to west ) – which the public was unaware of and never told details about.

Interestingly, the March 11, 2011 Japan island earthquakes have not yet stopped, as the swarm of 4.0, 5.0, 6.0 and 7.0 Richter scale earthquakes continues as a direct and proximate result of erupting ‘suboceanic volcanoes‘ moving these large “plates”, which are beginning to force yet others to slam into one another thousands of miles away.

Japan’s Western Pacific Ocean ‘eastcoast’ has a ’continental plate’ slamming point meeting the ’westcoast’ of North America near the Cascade mountain range ‘plate’, which reacts in one ( 1 ) of two ( 2 ) ways, i.e. ’seaward’ ( plate thrusting toward Japan ) or ‘landward’ ( plate thrusting toward the Pacific Northwest of the United States and/or Canada ).

What The Public Never Knew

Government leadership, globally, is acutely familiar with these aforementioned types of major Earth Events, including ‘monstrous plate tectonic pushing matches’, which usually collapse one or more ‘national infrastructures’ and typically spell ‘death’ and ‘serious injuries’ for populations in developed areas.

Extremely familiar with the mass public panic resulting from Earth Event catastrophes, government ‘contingency actions’ pre-approved by ‘governing bodies’ and/or ‘national leadership’ Executive Order Directives – which, although not advertised, are a matter of public record – ‘immediately call’ upon ‘all military forces’ to carry out “risk reduction” ( ‘minimization of further damages and dangers’ ) through what is referred to as “mitigation” ( ‘disaster management’ ) within “National Disaster Preparedness Planning” ( ‘national contingency measures’ ) – details citizens are unaware of. Government decision-makers know a “national emergency” can bring temporary suspension of Constitutional Rights and a loss of freedoms – a volatile subject few care to discuss, because ’any significant natural disaster’ will result in government infringement on many civil liberties most populations are accustomed to enjoying.

Before 1-minute and 40-seconds had passed into the March 11, 2011 Tohoku, Japan earthquake ( Richter scale: M 9.0 ), key U.S. government decision-makers discussed the major Earth Event unfolding off Japan’s eastcoast Plate-Boundary subduction zone beneath the ultra-deep sea of the Western Pacific Ocean, where Japan’s monstrous volcano mountain range had split at least 100-feet wide open and cracked 310-miles long in a northern direction, headed straight for the Aleutian Islands of Alaska in the United States.

U.S. Military Contingent Standby “Red Alert” Notification

The U.S. Air Force ( USAF ) ‘subordinate organization’ Air and Space Operations ( ASO ) Communications Directorate ( A6 ) ‘provides support’ over ‘daily operations’, ‘contingency actions’ and ‘general’ Command, Control, Communication and Computer Intelligence ( C4I ) for the U.S. Air Force Weather Agency ( USAFWA ). The USAFWA saw its 1st Weather Group ( 1ST WXG ) Directorate ready its 25th Operational Weather Squadron ( OWS ) at Davis-Monthan Air Force Base ( Tucson, Arizona ), responsible for conjunctive communication notification issuance of an Earth Event “Red Alert” immediately issued directly to U.S. Army Western Command ( WESTCOM ) with a “Standby-Ready” clause pausing Western Region ( CONUS ) mobilization of Active, Reserve and National Guard military forces at specific installations – based on the Japan Earth Event “moment of magnitude” ( Mw ) Plate-Boundary consequential rebound expected to strike against the North America westcoast Plate-Boundary of the Cascadia Range, reactionarily triggering its subduction zone into a Cascadia ‘great earthquake’.

CALTECH Public News Suppression Of Major Earth Event

Officials, attempting to diminish any clear public understanding of the facts – with the public only knowing a Richter scale earthquake ‘magnitude’ and never knowing or hearing about what a major Earth Event “moment of magnitude” ( Mw ) entailed – only served up ‘officially-designed double-speak psycho-babble terms’ unfamiliar to the public as ‘creative attention distraction’, announcing that the “Japan earthquake experienced,” a:

– “Bilateral Rupture;” and,

– “Slip Distribution.”

The facts are that the Japan ‘earthquake’ would ‘never have occurred’ ‘unless’:

1ST – “Bilateral Rupture” ( ‘suboceanic subterranean tectonic plate split wide open’ ) occurred; followed by,

2ND – “Slip Distribution” ( ‘tectonic plate movement’ ); then finally,

3RD – “Ground Shaking” ( ‘earthquake’ ) response.

Officials failed to give the public any notification that a major Earth Event “moment of magnitude” ( Mw ) on the “Pacific Ring of Fire” ( circum-Pacific seismic belt ) in the Western Pacific Ocean had a huge:

1. Continental Plate Break Off;

2. Undersea Plate Mountain Range Crack Wide Open; plus,

3. Mountain Range Split Open 310-Miles Long.

There are some, lying at rest, who might ‘not consider’ the aforementioned three ( 3 ) major Earth Event occurrences significant – except those ‘still living’ on Earth.

Asia-Pacific Rim

This western Pacific Ocean huge ‘undersea mountain range’ moved ‘east’, crushing into the smaller portion of its tectonic plate toward the continent of Asia. That commenced continuous streams of day-and-night significant earthquakes, still registering 5.0 + and 6.0 + Richter scale levels of magnitude, now and for over 12-days throughout the area surrounding Japan – the point nearest where the tectonic plate meets the continent of Asia within the western Pacific Ocean, from where this ‘monstrous undersea mountain range’ suddenly split, sending the ‘eastern half’ – with the ‘tectonic plate’ broken beneath it – slamming into the continent of Asia.

Simultaneously pushed – even harder, with more force outward ( note: explosives, like those fired out-of a cannon or from a force-shaped charge, project blasts outward when the ‘initial explosive blast’ is blunted by a back-stop ) away-from the Asia continent – was this ‘monstrous undersea mountain range’ split-off ( 310-miles / 500-kilometers long ) ‘western half’, which slammed west up against the Americas’ ‘western tectonic plates’.

This ‘is’ the ‘major’ “Earth Event” that will have consequential global impact repercussions, ‘officially minimized’ by ‘focusing public attention’ on a ‘surface’ Earth Event earthquake of 9.0 Richter scale magnitude ( once ), while even further diminishing the hundreds of significant earthquakes that were still occurring 12-days after the initial earthquake.

Asia-Pacific Rim “Ring Of Fire”

Many are unaware the “Asia-Pacific Rim” is ( also known as ) the “Ring of Fire”, under which the ‘ultra-deep sea Pacific Ocean’ holds ‘numerous gigantic volatile volcanoes’ positioned in an ‘incredibly large circle’ ( “ring” ) around a ‘huge geographic land mass area’ comprised of ‘tectonic plates’ that ‘connect’ the ‘Eastern Asias’ to the ‘Western Americas’.

Yellowstone National Park Super Volcano

Many people are still wondering ‘why’ the Japan earthquakes have not yet stopped, and why they are being plagued by such a long swarm of significant earthquakes still to this very day, nearly 60-days later. The multiple color video clips ( below ) provide information on unusual earthquake swarm patterns and reversals observed while studying the World’s largest supervolcano in Wyoming ( USA ), located at Yellowstone National Park – a global public attraction for viewing natural underground volcano steam vents known as geyser eruptions:

[ PHOTO ( above ): Major HotSpot at Yellowstone displays Half Dome cap of granite rock above unerupted volcano magma. ]

Ultra-Deep Sea Volcanoes

When huge undersea volcanoes erupt, they dynamically force incredibly large geographic land mass plates to move, whereupon simultaneous and consequential movement is experienced on ‘surface land areas’ people know as ’earthquakes’, with their ’aftermath measurements’ provided in “Richter scale level” measurements that most do not understand. These Richter scale measurements are only ‘officially provided estimates’, as ’officials are never presented with totally accurate measurements’ because many are ‘not obtained with any great precision for up-to 2-years after the initial earthquake’.

Rarely are ‘precise measurements’ publicly provided, and at any time during that 2-year interim the public may hear their previously reported earthquake Richter scale level measurement was either “officially upgraded” or “officially downgraded.” Often this becomes apparent when one sees ’many other countries contradicting U.S. public news announcements’ about the magnitude of a particularly controversial earthquake. An example of this was seen surrounding the March 11, 2011 earthquake in Japan:

– Japan 1st public announcement: 9.2 Richter scale;

– United States 1st public announcement: 8.9 Richter scale;

– United States 2nd public announcement: 9.0 Richter scale; and,

– United States 3rd public announcement: 9.1 Richter scale.

What will the March 11, 2011 Japan earthquake be officially reported as in 2-years? Who knows?
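What a revision of a few tenths of a magnitude unit actually means can be made concrete with the standard magnitude-energy relation ( log10 E ≈ 1.5 M + 4.8, E in joules ); the sketch below compares the announced values, and is illustrative arithmetic rather than any official recalculation.

```
def radiated_energy_joules(magnitude):
    """Approximate radiated seismic energy from magnitude (Gutenberg-Richter relation)."""
    return 10 ** (1.5 * magnitude + 4.8)

announced = [8.9, 9.0, 9.1, 9.2]   # values publicly announced for the 2011 Japan earthquake
for m in announced:
    print(f"M {m}: ~{radiated_energy_joules(m):.2e} J")

# Each 0.1 step multiplies the energy by 10**0.15 (~1.4x); the spread between
# 8.9 and 9.2 therefore covers roughly a factor of 10**0.45 (~2.8x) in energy.
print(f"energy ratio M9.2 / M8.9 ~ {10 ** (1.5 * (9.2 - 8.9)):.1f}x")
```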

Never publicly announced, however, are measurements of an earthquake’s ‘force strength pressure accumulation’ transmitted through suboceanic tectonic plates grinding against one another – a major Earth Event ‘geographic pushing process’ – as seen by U.S. NSA supercomputers from global ground-based and space-based monitoring analysis surrounding the “Asia-Pacific Rim Ring of Fire”, stretching from the ‘Eastern Asias’ to the ‘Western Americas’ and beyond.

This ‘domino plate tectonic principle’ results from combined amounts of ‘volcanic magmatic eruptive force strength’ and ‘tectonic plate accumulative pressure build-up’ against ‘adjacent tectonic plates’, causing ‘suboceanic, subterranean and surface land to move’; ‘how significant such amounts are determines the strength’ of both consequential ‘earthquakes’ and resultant ‘tsunamis’.

Waterway Tsunamis

When most of the public ‘hears about’ a “tsunami”, they ‘think’ of ‘high waves’ near “ocean” coastal regions bringing significant floods over residents of nearby cities. Few realize the ‘vast majority of Earth’s population predominantly lives along ocean coastal regions’. Few realize one ( 1 ) ‘gigantic tsunami’ could ‘kill vast populations living near oceans’ in the wake of popular beaches – a tough trade-off for some, while logical others choose ‘living further inland’, away from large bodies of water. Yet even ‘large lakes’, where ‘tide levels are also affected by the gravitational pull of the moon’, can see a ‘vast deep lake body’ produce a tsunami, dependent on which direction tectonic plates move; a ‘force-directionalized earthquake’ can create a ‘tsunami’ with significant inundating floods over residents living in cities near those ‘large shoreline’ areas too.

What most of the public does not yet fully realize is that ‘large river bodies of water’, like the Mississippi River – a ‘north’ to ‘south’ directional river – could easily see ‘east’ to ‘west’ directional ‘tectonic plates’ move adjacent states along the New Madrid Fault subduction zone with significant ‘earthquakes’; such tectonic plate movement is easily capable of squeezing the side banks of even the Mississippi River, forcing huge amounts of water hurled out onto both ‘east’ and ‘west’ sides and resulting in ‘seriously significant inland flooding’ over residents living in ‘low-lying’ states of the Central Plains of the United States.

Japan “Pacific Ring Of Fire” Earthquakes To Americas Cascadia Fault Zone

Japan accounts of a co-relative tsunami suggest the Cascadia Fault rupture occurred from one ( 1 ) single earthquake triggering a 9-Mw Earth Event on January 26, 1700, where geological evidence obtained from a large number of coastal sites from northern California ( USA ) up to southern Vancouver Island ( Canada ), plus historical records from Japan, show the 1,100 kilometer length of the Cascadia Fault subduction zone ruptured ( split, causing that earthquake ) in a major Earth Event at that time. While the sizes of earlier Cascadia Fault earthquakes are unknown, some “ruptured adjacent segments” ( ‘adjacent tectonic plates’ ) within the Cascadia Fault subduction zone over periods of time ranging from as little as ‘hours’ to ‘years’ – as has historically happened in Japan.

Over the past 20-years, scientific progress in understanding Cascadia Fault subduction zone behavior has been made; however, only 15-years ago scientists were still debating whether ‘great earthquakes’ occurred at ‘fault subduction zones’. Today, most scientists realize ‘great earthquakes’ actually ‘do occur in fault subduction zone regions’.

Now, scientific discussions focus on subjects, of:

– Earth crust ‘structural changes’ when a “Plate Boundary” ruptures ( splits );
– Related tsunamis; and,
– Seismogenic zone ( tectonic plate ‘locations’ and ‘width sizes’ ).

Japan America Earthquakes And Tsunamis Exchange

Great Cascadia earthquakes generate tsunamis; most recently, at least a ’32-foot high tidal wave’ struck the Pacific Ocean westcoast of Washington, Oregon and California ( northern portion of the state ), and that Cascadia earthquake tsunami sent a consequential 16-foot high tidal wave onto Japan.

These Cascadia Fault subduction zone earthquake tsunamis threaten coastal communities all around the Pacific Ocean “Ring of Fire”, but have their greatest impact on the United States westcoast and Canada, which are struck within ’15-minutes’ to ’40-minutes’ shortly ‘after’ a Cascadia Fault subduction zone earthquake occurs.
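The ’15-minutes to 40-minutes’ window is consistent with how fast a tsunami travels in open water – roughly the square root of ( gravity x water depth ). The sketch below assumes an illustrative offshore rupture distance and water depths, which are not figures from this report.

```
import math

G = 9.81  # gravitational acceleration, m/s^2

def tsunami_speed(depth_m):
    """Shallow-water wave speed: sqrt(g * depth), valid when wavelength >> depth."""
    return math.sqrt(G * depth_m)

# Illustrative only: a rupture ~100 km offshore in water ~2,500 m deep,
# followed by a ~50 km continental shelf averaging ~200 m deep.
deep_leg_s  = 100_000 / tsunami_speed(2_500)   # ~11 minutes
shelf_leg_s =  50_000 / tsunami_speed(200)     # ~19 minutes
print(f"estimated arrival ~ {(deep_leg_s + shelf_leg_s) / 60:.0f} minutes")  # ~30 minutes
```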

Deposits from past Cascadia Fault earthquake tsunamis have been identified at ‘numerous coastal sites’ in California, Oregon, Washington and even as far ‘north’ as British Columbia ( Canada ), where the distribution of these deposits – based on sophisticated computer software simulations of tsunamis – indicates many coastal communities in those same areas are well within the flooding inundation zones of past Cascadia Fault earthquake tsunamis.

California, Oregon, Washington and even as far ‘north’ as British Columbia ( Canada ) westcoast communities are indeed threatened by future tsunamis from Cascadia great earthquake events. ‘Tsunami arrival times’ depend on the distance from the ‘point of rupture’ ( tectonic plate split, causing the earthquake ) – within the Cascadia Fault subduction zone – to the westcoast “landward” side.

Cascadia Earthquake Stricken Damage Zone Data

Strong ground shaking from a “moment of magnitude” ( Mw ) “9″ Plate-Boundary earthquake will last 3-minutes or ‘more’, dominated by ‘long-period’ shaking and ‘further earthquakes’, and ‘ground shaking movement damage’ will occur as far inland as the cities of Portland, Oregon; Seattle, Washington; and Vancouver, British Columbia ( Canada ).

Tsunami Optional Wave Patterns “Following Sea”

Large cities within 62-miles to 93-miles of the nearest point of the Cascadia Plate-Boundary zone inferred rupture will not only experience ‘significant ground shaking’ but also ‘extreme duration ground shaking’ lasting far longer, in addition to far more powerful tsunamis carrying far more seawater because of their consequential “lengthened wave periods” ( ‘lengthier distances’ between ‘wave crests’ or ‘wave curls’ ). These bring inland something akin to what fishermen describe as a deadly “following sea” ( swallowing everything within an ‘even more-so powerful waterpath’ ), the result of which inland causes ‘far more significant damage’ to many ‘tall buildings’ and ‘lengthy structures’ where ‘earthquake magnitude strength’ will be felt ‘strongest’ – all along the United States of America Pacific Ocean westcoast regional areas.

Data Assessments Of Recurring Cascadia Earthquakes

Cascadia ‘great earthquakes’ have a “mean recurrence interval” ( ‘time period occurring between one earthquake and the next’ ) – specific ‘at the point’ of the Cascadia Plate-Boundary – of between 500-years and 600-years; however, Cascadia Fault earthquakes in the past have occurred well within a 300-year interval, or even less time, and roughly 300-years have passed since the Cascadia ‘great earthquake’ of 1700. Time intervals between ‘successive great earthquakes’ – from only a few centuries up-to 1,000 years – have little ‘well-measured data’ as to ‘recurrence interval’, because the number of recorded Cascadia earthquakes has rarely measured over five ( 5 ). Data additionally indicates Cascadia earthquake intermittency, with irregular intervals when they did occur, plus the data lacks ‘random distribution’ ( ‘tectonic plate shift’ or ‘earth movement’ ) or ‘clustering’ of these Cascadia earthquakes over a lengthier period of time, so ‘more accurate assessments are unavailable’ for knowing anything more about them. Hence, because the Cascadia earthquake ‘recurrence pattern’ is so ‘poorly known’, the probability of the next Cascadia earthquake occurrence is unfortunately unclear, with extremely sparse ‘interval information’ details.
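One crude way to turn a ‘500-year to 600-year mean recurrence interval’ into a number is a memoryless ( Poisson ) estimate of the chance of a great earthquake within a given window; the 50-year window below is an illustrative assumption, and real hazard models use time-dependent renewal statistics rather than this simplification.

```
import math

def poisson_probability(mean_recurrence_years, window_years):
    """Chance of at least one event in the window, assuming events are memoryless."""
    return 1.0 - math.exp(-window_years / mean_recurrence_years)

for mean_interval in (500, 600):                       # report's quoted recurrence range
    p = poisson_probability(mean_interval, window_years=50)
    print(f"mean interval {mean_interval} yr -> ~{p * 100:.0f}% chance in the next 50 years")
```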

Cascadia Plate-Boundary “Locked” And “Not Locked”

The Cascadia Plate-Boundary zone is ‘currently locked’ off the U.S. westcoast shoreline, where it has been accumulating plate tectonic pressure build-up from other tectonic plates crashing into it for over 300-years.

The Cascadia Fault subduction zone, at its widest point, is located northwest just off the coast of the State of Washington, where the maximum area of seismogenic rupture is approximately 1,100 kilometers long and 50 kilometers up-to 150 kilometers wide. Cascadia Plate-Boundary seismogenic portion location and size data is key-critical for determining earthquake magnitude, tsunami size, and the strength of ground shaking.
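Those rupture dimensions are exactly what feeds the magnitude estimate: seismic moment is rigidity x rupture area x average slip, and moment magnitude follows from the standard Mw formula. The sketch below uses an assumed average slip of 15 metres and a typical crustal rigidity, neither of which appears in this report.

```
import math

RIGIDITY_PA = 3.0e10       # typical crustal shear rigidity, Pa (assumed)
LENGTH_M    = 1_100e3      # rupture length quoted in the report, m
WIDTH_M     = 100e3        # mid-range of the 50-150 km width quoted, m
SLIP_M      = 15.0         # assumed average slip, m (illustrative)

seismic_moment   = RIGIDITY_PA * LENGTH_M * WIDTH_M * SLIP_M           # N*m
moment_magnitude = (2.0 / 3.0) * (math.log10(seismic_moment) - 9.1)
print(f"M0 ~ {seismic_moment:.2e} N*m -> Mw ~ {moment_magnitude:.1f}")  # ~Mw 9.1
```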

The Cascadia Plate-Boundary “landward limit” – of only its “locked” portion, where ‘no tectonic plate shift has yet occurred’ – is located between the Juan de Fuca tectonic plate and the North America tectonic plate, where it came to be “locked” between Cascadia earthquakes; however, this “locked” notion has only been delineated from ‘geodetic measurements’ of ‘surface land deformation’ observations. Unfortunately, its “seaward limit” has ‘very few constraints’ up-to ‘no constraints’ for travelling – on its so-called “locked zone” portion – which could certainly move at any time.

Cascadia Plate Continues Sliding

The Cascadia transition zone, separating its “locked zone” from its “continuous sliding zone” headed east into the continent of North America, is poorly constrained ( held-back ), so a Cascadia rupture may extend an unknown distance – from its now “locked zone” into its “continuously sliding transition zone.”

On some Earth crust faults near coastal regions, earthquakes may also bring ‘additional Plate-Boundary earthquakes’, ‘increased tsunami tidal wave size’, plus ‘intensification of local area ground shaking’.

Earth Event Mitigation Forces Global Money Flow

The primary ‘government purpose’ in ‘establishing international’ “risk reduction” is solely to ‘minimize global costs from damages’ associated with major magnitude Earth Events similar to – but even greater than – what happened on March 11, 2011 all over Japan.

Historical earthquake damages assist in predictive projections; damage loss studies suggest disastrous future losses will occur in the Pacific Northwest from a Cascadia Fault subduction ‘great earthquake’. National ‘loss mitigation efforts’ – studying ‘other seismically active regions’ – plus ‘national cost-benefit studies’ indicate that ‘earthquake damage loss mitigation’ may effectively ‘reduce losses’ and ‘assist recovery’ efforts in the future. Accurate data acquisition, geological and geophysical research, and immediate ‘technological information transfer’ to ‘national key decision-makers’ were meant to reduce Pacific Northwest Cascadia Fault subduction zone risks, in addition to those of the Western North America coastal region.

Damage, injuries and loss of life from the next great earthquake on the Cascadia Fault subduction zone will indeed be ‘great’ and ‘widespread’, and will significantly ‘impact national economies’ ( Canada and United States ) for years to decades into the future – which has seen a concerted global increase in:

– International Cooperative Research;
– International Information Exchanges;
– International Disaster Preparedness;
– International Damage Loss Mitigation Planning;
– International Technology Applications; and,
– More.

Tectonics Observatory

The CALTECH Advanced Rapid Imaging and Analysis ( ARIA ) Project – a collaboration of the NASA Jet Propulsion Laboratory ( JPL ) and the California Institute of Technology ( Pasadena ) Tectonics Observatory – saw ARIA Project members and CALTECH scientists Shengji Wei and Anthony Sladen ( of GEOAZUR ) model the Japan Tohoku earthquake fault zone sub-surface ( below surface ) ‘tectonic plate movement’, derived from:

– TeleSeismic Body Waves ( long-distance observations ); and,

– Global Positioning Satellites ( GPS ) ( near-source observations ).

A 3D image of the fault moving can be viewed in Google Earth ( an internet website link to that KML file is found in the “References” at the bottom of this report ), which projects the fault rupture in three-dimensional images that can be viewed from any point of reference. ‘That analysis’ depicts the rupture ( ground splitting open 100-feet ) resulting in the earthquake itself, ‘triggered from 15-miles ( 24-kilometers ) beneath the ultra-deep sea of the Western Pacific Ocean’, with the ‘entire island of Japan being moved east’ by 16-feet ( 5 meters ) from its ‘before earthquake location’.

[ IMAGE ( above ): NASA JPL Project ARIA Tectonic Plate Seismic Wave Direction Map ( click image to enlarge and read ) ]

The National Aeronautics and Space Administration ( NASA ) Jet Propulsion Laboratory ( JPL ) at the California Institute of Technology ( CALTECH – Pasadena ) Project Advanced Rapid Imaging and Analysis ( ARIA ) used GEONET RINEX data with JPL GIPSY-OASIS software to obtain kinematic “precise point positioning solutions” from a bias-fixing method of a ‘single station’ matched-up to JPL orbit and clock products, producing their seismic displacement projection map details. Those details carry an inherent ’95% error-rating’ that is itself an ‘estimate’, which ‘proves’ these U.S. government organization claims – that ‘all they supposedly know’ ( after spending billions of dollars ) and are ‘only willing to publicly provide’ – may be ‘only 5% accurate’. So much for what these U.S. government organizations ‘publicly announce’ as their “precise point positioning solutions.”

Pay Any Price?

More ‘double-speak’ and ‘psycho-babble’ serves only to distract the public away from the ‘truth’ as to ‘precisely what’ U.S. taxpayer dollars are ‘actually producing’. ‘Now knowing this’, if ‘those same officials’ ever ‘worked for a small business’ they would either be ‘arrested’ for ‘fraud’ or ‘fired’ for ‘incompetence’; however, since ‘none of them’ will ever ‘admit to their own incompetence’, their ‘leadership’ needs to see ‘those responsible’ virtually ‘swing from’ the end of an ‘incredibly long U.S. Department of Justice rope’.

Unfortunately, the facts surrounding all this only get worse.

[ IMAGE ( above ): Tectonic Plates ( brown color ) Sinking and Sunk On Earth Core. ]

Earthquake Prediction Fallacy

Earthquake prediction will ‘never be an accomplished finite science for people to ever rely upon’, even though huge amounts of money are being wasted on ‘technology’ for ‘detection sensors’ reading “Seismic Waveforms” ( also known as ) “S Waves” that ‘can be detected and stored in computer databases’. The reason is a significant fact that will never be learned no matter how much money or time may be devoted to trying to solve the unsolvable problem: the Earth’s sub-crustal regions consist primarily of ‘molten lake regions’ filled with ‘floating tectonic plates’ that are ‘moving while sinking’ and that ‘cannot be tested’ for ‘rock density’ or ‘accumulated pressures’ existing ‘far beneath’ the ‘land surface tectonic plates’.

The very best, and all, that technology can ever perform for the public is to record ‘surface tectonic plates grinding against one another’, where ‘only that action’ ( alone ) does in-fact emit upward ‘acoustic wave form patterns’ named ‘seismic waves’ or ‘s-waves’ that ‘do occur’ but ‘only when tectonic plates are moving’.

While a ‘public early warning’ might be helpful for curtailing ‘vehicular traffic’ crossing an ‘interstate bridge’ that might collapse, or for stopping ‘train traffic’ travel, and thousands of people’s lives could be saved, it would fail to serve millions more living in buildings that collapse.

Early Warning Exclusivity

Knowing governments have, using publicly unfamiliar terms, ‘statistically analyzed’ “international economics” related to “national infrastructure preparedness” ( ‘early warning systems’ ) through two ( 2 ) channels – “public” ( i.e. ‘utility companies’ via ‘government’ with ‘industrial leadership’ meetings ) and “private” ( i.e. ‘residents’ via ‘television’, ‘radio’, ‘newspaper’ and ‘internet’ only ‘commercial advertisements’ ) – “national disaster mitigation” sees ‘primary designated provisions’ of “early warning” for “high density population centers” near “coastal or low-lying regions” ( ‘large bodies of ocean, lake and river water’ ), but for only one ( 1 ) of those channels, the “public” one ( i.e. ‘utility companies’ via ‘government’ with ‘industrial leadership’ meetings ), “in the interest of national security” limiting ‘national economic burdens’ from any significant Earth Event impact ‘aftermath’.

In short, and without all the governmentese ‘psycho-babble’ and ‘double-speak’, costs continue being spent on ‘high technology’ efforts to ‘perfect’ a “seismic early warning” for “exclusive use” ( ‘national government control’ ) that “provides” ( ‘control over’ ) “all major utility company distribution points” ( facilities from where ‘electrical power is generated’ ) able to “interrupt power” ( ‘stop the flow of electricity nationwide’ from ‘distribution stations’ ), thus “saving additional lives” from “disastrous other problems” ( ‘aftermath loss of lives and injuries’ caused by ‘nuclear fallout radiation’, ‘exploding electrical transformers’, and ‘fires associated with overloaded electrical circuits’ ).

Logically, ‘much’ – but ‘not all’ – of the aforementioned ‘makes perfect sense’, except for “John Doe” or “Jane Doe”, ‘exemplified anonymously’ ( herein ) as individuals who, if ‘earlier warned’, could have ‘stopped their vehicle before crossing the bridge that collapsed’ or simply ‘stepped out of the way of a huge sign falling on them’ rather than being ‘killed’ or ‘maimed’; however, one might additionally consider ‘how many more would otherwise be killed or maimed’ after an ‘ensuing mass public mob panics’ upon ‘receiving’ an “early warning.” Tough call for many, but few.

Earth Data Publicly Minimized

Tohoku-oki earthquake ‘seismic wave form data’, showing the Japan eastcoast tectonic plate “bilaterally ruptured” ( split in-half for a distance of over 310-miles ), was obtained from the USArray seismic stations ( United States ), then analyzed and later modelled by Caltech scientists Lingsen Meng and Jean-Paul Ampuero, who created preliminary data animation demonstrating a ‘super major’ Earth Event simultaneously occurring when the ‘major’ earthquake struck Japan.

U.S. National Security Stations Technology Systems Projects

Per the United States Seismic Array ( USArray ) Data Management Plan, EarthScope is composed of three ( 3 ) Projects:

1. The “United States Seismic Array ( USArray )” Project, ‘managed’ – through its Data Management Center ( DMC ) – by the Incorporated Research Institutions for Seismology ( IRIS ), a National Science Foundation ( NSF ) consortium of universities;

2. UNAVCO INC. ‘implemented’ “Plate-Boundary Observatory ( PBO )” Project; and,

3. U.S. Geological Survey ( USGS ) ‘operated’ “San Andreas Fault Observatory at Depth ( SAFOD )” Project at Stanford University ( California ).

Simultaneous Earth Data Management

The data management plan for the USArray component of “EarthScope” is held by the USArray IRIS DMC.

USArray consists of four ( 4 ) data generating components:

Permanent Network

Advanced National Seismic System ( ANSS ) BackBone ( BB ) is a joint effort – between IRIS, USArray and USGS – to establish a ‘Permanent Network’ of approximately one-hundred ( 100 ) Earth monitoring ‘receiving stations’ ( alone ) located in the Continental United States ( CONUS ), or lower 48 states of America, in addition to ‘other stations’ located in the State of Alaska ( alone ).

Earth Data Multiple Other Monitors

USArray data contribution to the Advanced National Seismic System ( ANSS ) BackBone ( BB ) consists, of:

Nine ( 9 ) new ‘international Earth data accumulation receiving stations’ akin to the Global Seismic Network ( GSN );

Four ( 4 ) “cooperative other stations” from “Southern Methodist University” and “AFTAC”;

Twenty-six ( 26 ) ‘other receiving stations’ from the Advanced National Seismic System ( ANSS ) with ‘upgrade funding’ taken out-of the USArray Project “EarthScope;” plus,

Sixty ( 60 ) additional stations of the Advanced National Seismic System ( ANSS ) BackBone ( BB ) network that ‘currently exist’, ‘will be installed’ or ‘will be upgraded’ so that ‘data channel stream feeds’ can and ‘will be made seamlessly available’ through the IRIS DMC, where ‘data can be continuously recorded’ at forty ( 40 ) samples per second and where 1 sample per second can and ‘will be continuously transmitted in real-time’ back into the IRIS DMC; quality assurance is held at facilities located in ‘both’ Albuquerque, New Mexico and Golden, Colorado, with the U.S. Geological Survey ( USGS ) handling ‘some operational responsibilities’ thereof.

Albuquerque Seismological Laboratory ( ASL ) –

Albuquerque Seismological Laboratory ( ASL ) supports operation and maintenance of seismic networks for the U.S. Geological Survey ( USGS ) portion of the Global Seismographic Network ( GSN ) and Advanced National Seismic System ( ANSS ) Backbone network.

ASL runs the Advanced National Seismic System ( ANSS ) depot facility supporting the Advanced National Seismic System ( ANSS ) networks.

ASL also maintains the PASSCAL Instrument Center ( PIC ) facility at New Mexico Tech ( Socorro, New Mexico ) developing, testing, and evaluating seismology monitoring and recording equipment.

Albuquerque Seismological Laboratory ( ASL ) staff are based in ‘both’ Albuquerque, New Mexico and Golden, Colorado.

Top-Down Bottom-Up Data Building Slows Earthquake Notifications

Seismic waveform ( ‘seismic wave form frequency’ ) data is received by the Global Seismic Network ( GSN ) and Advanced National Seismic System ( ANSS ) BackBone ( BB ) network through electronic transmissions sent ‘slower than real-time’, delivering only ‘near-time data’ ( e.g. tape and compact disc recordings ) to the National Earthquake Information Center ( NEIC ) ‘station’ of the U.S. Geological Survey ( USGS ), which is ‘officially heralded’ for so-called “rapid earthquake response.”

Unbelievably, in addition to the aforementioned ‘slow Earth Event data delivery process’, a number of ‘data receiving stations’ have absolutely ‘no data streaming telemetry’ transmission capabilities whatsoever, so those station data recordings – on ‘tapes’ and ‘compact discs’ – are delivered by ‘other even more time consuming routes’ before that data can even reach the U.S. Geological Survey ( USGS ). In short, the huge amounts of money being spent go toward increasing computer technologies, sensors, satellites, ‘data stream channel networks’ and ‘secure facility building stations’ from the ‘top, down’ instead of building ‘monitoring stations’ and ‘recording stations’ from the ‘bottom, up’ until the entire earthquake monitoring and notification system is finally built properly. As it currently stands, the ‘apple cart stations continue being built more and more’ while the ‘apple tree stations are not receiving the proper technological nutrients’ to ‘deliver apples’ ( ‘data’ ) to be ‘fed into notification markets’ ( ‘public’ ) where all this could do some good.

U.S. National Security Reviews Delay Already Slow Earthquake Notifications

The IRIS Data Management Center ( DMC ) – after processing all incoming data streams from reporting stations around the world – then distributes seismic waveform data ‘back to’ both the Global Seismic Network ( GSN ) and Advanced National Seismic System ( ANSS ) BackBone ( BB ) network operations, but only ‘after’ seismic waveform data has been ‘thoroughly screened’ by what U.S. national security government Project leadership has deemed its ‘need to control all data’ through “limiting requirements” ( ‘red tape’ ), because ‘all data must undergo’ a long, arduous ‘secure data clearing process’ before any data ‘can be released’. Amusingly to some, the U.S. government – in its race to create another ‘official acronym’ of ‘double-speak’ – ever so aptly named that national security requirement clearing process:

“Quality Assurance Framework” ( QUACK )

Enough said.

Let the public decide what to do with ‘those irresponsible officials’; after all, ‘only mass public lives’ are ‘swinging in the breeze’ at the very end-of a now-currently endless ‘disinformation service rope’ being paid for by the tax-paying public.

In the meantime, while we all wait for another Earth Event to take place, what ( besides this report ) might ‘slap the official horse’, spurring it to move quickly?

How about us? What should we do? Perhaps brushing-up on a little basic knowledge might help.

Inner Earth Deeper Structure Deep Focus Earthquakes Rays And Related Anomalies

There is no substitute for knowledge – information technology ( IT ) sits at the focal point of many new discoveries aided by supercomputing, modelling and analytics – but common sense does pretty well too.

The following information, although an incredibly brief overview of such a wide variety of topics surrounding a great deal of the in’s and out’s of planet Earth, scratches more than just the surface – reaching deep structure and deep focus impacting a multitude of generations from as far back as 700 years before the birth of Christ ( B.C. ).

Clearly referenced “Encyclopaedia Britannica” general public access information is all second-hand observations of records from other worldwide information collection sources, such as:

– Archives ( e.g. governments, institutions, public and private );

– Symposiums ( e.g. white papers );

– Journals ( professional and technical publications );

– Other information collection sources; and,

– Other information publications.

Encyclopaedias, available in a wide variety of styles and formats, are ’portable catalogs containing a large amount of basic information on a wide variety of topics’ available worldwide to billions of people for increasing their knowledge.

Encyclopedia information formats for ‘volume reading’ vary within:

– Paper ‘books’ with either ’printed ink’ ( sighted ) or ’embossed dots’ ( Braille );

– Plastic ‘tape cartridges’ ( ‘electromagnetic’ media ) or ‘compact discs’ ( ‘optical’ media ) with ‘electronic device display’; or,

– Electron ‘internet’ ( ‘signal computing’ via ‘satellite’ or ‘telecomputing’ via ’landline’ or ‘node’ networking ) with ‘electronic device display’.

After thoroughly reviewing the Encyclopedia Britannica ‘specific compilation’, independent review found reasonable a facsimile of the original, reformatted for easier public comprehension ( reproduced further below ).

Surprisingly, after that Encyclopedia Britannica ‘specific compilation’ information was reformatted for clearer reading comprehension, the inner Earth ‘deep-structure’ geophysical studies formed an amazing correlation with additional factual activities within an equally amazing date chronology of man-made nuclear fracturing reformations of Earth geology and geophysics – activities documented worldwide more than 1/2 century ago but somehow forgotten; either by chance or secret circumstance.

How could the Encyclopedia Britannica, or for that matter anyone else, have missed something on such a grand scale that is now so obvious?

… [ TEMPORARILY EDITED-OUT FOR REVISION PURPOSES ONLY –  ] …

For more details, about the aforementioned, Click: Here!

Or,

To understand how all this relates, ‘begin with a better basic understanding’ by continuing to read the researched information ( below ):

====

Circa: March 21, 2012

Source:  Encyclopaedia Britannica

Earthquakes

Definition, Earthquake: sudden shaking of the ground caused by the passage of seismic waves through Earth’s rocks.

Seismic waves are produced when some form of energy stored in the Earth’s crust is suddenly released, usually when masses of rock straining against one another suddenly fracture and “slip.” Earthquakes occur most often along geologic faults, narrow zones where rock masses move in relation to one another. Major fault lines of the world are located at the fringes of the huge tectonic plates that make up the Earth’s crust. ( see table of major earthquakes further below )

Little was understood about earthquakes until the emergence of seismology at the beginning of the 20th Century ( early 1900s ); seismology, involving the scientific study of all aspects of earthquakes, is now yielding answers to long-standing questions as to why and how earthquakes occur.

About 50,000 earthquakes, large enough to be noticed without the aid of instruments, occur every year over the entire Earth, and of these approximately one-hundred ( 100 ) are of sufficient size to produce substantial damage if their centers are near human habitation.

Very great earthquakes occur on average about once a year; however, over the centuries these earthquakes have been responsible for millions of deaths and an incalculable amount of property damage.

Earthquakes A -Z

Earth’s major earthquakes occur primarily in belts coinciding with tectonic plate margins, a pattern apparent since early earthquake catalogs ( dating back to 700 B.C. ), and now more readily discernible on modern seismicity maps depicting instrumentally determined earthquake epicentres.

Most important is the Circum-Pacific Belt, an earthquake belt affecting many populated coastal regions around the Pacific Ocean, namely:

– South America;

– North America & Alaska;

– Aleutian Islands;

– Japan;

– New Zealand; and,

– New Guinea.

An estimated 80% of the energy presently released in earthquakes comes from those whose epicentres are in the Circum-Pacific Belt.

Seismic activity is by no means uniform throughout the belt, and there are a number of branches at various points. Because at many places the Circum-Pacific Belt is associated with volcanic activity, it has been popularly dubbed the “Pacific Ring of Fire.”

A second ( 2nd ) belt, known as the Alpide Belt, passes through the Mediterranean region eastward through Asia and joins the Circum-Pacific Belt in the East Indies; energy released in earthquakes from the Alpide Belt is about 15% of the world total.

There are also striking connected belts of seismic activity, primarily along oceanic ridges, including those in the:

– Arctic Ocean;

– Atlantic Ocean;

– Indian Ocean ( western ); and along,

– East Africa rift valleys.

This global seismicity distribution is best understood in terms of its plate tectonic setting.

Forces

Earthquakes are caused by sudden releases of energy within a limited region of Earth rocks, and that energy can be released by:

– Elastic strain;

– Gravity;

– Chemical Reactions; and / or,

– Massive rock body motion.

Of all these, release of elastic rock strain is most important because this form of energy is the only kind that can be stored in sufficient quantities within the Earth to produce major ground disturbances.

Earthquakes, associated with this type of energy release, are called: Tectonic Earthquakes.

Tectonics

Tectonic plate earthquakes are explained by the so-called elastic rebound theory, formulated by the American geologist Harry Fielding Reid after the San Andreas Fault ruptured in 1906, generating the great San Francisco earthquake.

According to Reid’s theory of elastic rebound, a tectonic earthquake occurs when energy strains in rock masses have accumulated ( built-up ) to a point where the resulting stresses exceed the strength of the rocks, at which point sudden fracturing results.

Fractures propagate ( travel ) rapidly ( see speeds further below ) through the rock, usually tending in the same direction and sometimes extending many kilometres along a local zone of weakness.

In 1906, for instance, the San Andreas Fault slipped along a plane 270-miles ( 430 kilometers ) long, a line along which the ground was displaced horizontally as much as 20-feet ( 6 meters ).

As a fault rupture progresses along or up the fault, rock masses are flung in opposite directions, and thus spring back to a position where there is less strain.

At any one point this movement may take place not at-once but rather in irregular steps where these sudden slowings and restartings give rise to vibrations that propagate as seismic waves.

Such irregular properties of fault rupture are now included in ‘physical modeling’ and ‘mathematical modeling’ of earthquake sources.

Earthquake Focus ( Foci )

Roughnesses along the fault are referred to as asperities, and places where the rupture slows or stops are said to be fault barriers. Fault rupture starts at the earthquake focus ( foci ), a spot that ( in many cases ) lies from 5 kilometers to 15 kilometers ‘under the surface’; the rupture propagates ( travels ) in one ( 1 ) or both directions over the fault plane until stopped ( or slowed ) at a barrier ( boundary ).

Sometimes, instead of being stopped at the barrier, the fault rupture recommences on the far side; at other times the stresses in the rocks break the barrier, and the rupture continues.

Earthquakes have different properties depending on the type of fault slip that causes them.

The usual ‘fault model’ has a “strike” ( i.e., direction, from north, taken by a horizontal line in the fault plane ) and a “dip” ( i.e. angle from the horizontal shown by the steepest slope in the fault ).

Movement parallel to the dip is called dip-slip faulting.

In dip-slip faults, if the hanging-wall block moves downward relative to the footwall block, it is called “normal” faulting; the opposite motion, with the hanging wall moving upward relative to the footwall, produces reverse or thrust faulting. The lower wall ( of an inclined fault ) is the ‘footwall’, and lying over the footwall is the hanging wall.

When rock masses slip past each other ( parallel to the strike area ) movement is known as strike-slip faulting.

Strike-slip faults are right lateral or left lateral, depending on whether the block on the opposite side of the fault from an observer has moved to the right or left.

All known faults are assumed to have been the seat of one or more earthquakes in the past, though tectonic movements along faults are often slow, and most geologically ancient faults are now a-seismic ( i.e., they no longer cause earthquakes ).

Actual faulting associated with an earthquake may be complex, and it is often unclear whether, in one ( 1 ) particular earthquake, the total energy is being issued from a single ( 1 ) fault plane.

Observed geologic faults sometimes show relative displacements on the order of hundreds of kilometres over geologic time, whereas the sudden slip offsets that produce seismic waves may range from only several centimetres to tens of metres.

During the 1976 Tangshan earthquake ( for example ), a surface strike-slip of about 1 meter was observed along the causative fault east of Beijing, China, and later ( as another example ) during the 1999 Taiwan earthquake the Chelung-pu fault slipped vertically up to 8 meters.

Volcanism & Earthquake Movement

A separate type of earthquake is associated with volcano activity known as a volcanic earthquake.

Although likely even in such cases, the disturbance is officially believed to result from a sudden slip of rock masses adjacent to the volcano and the consequent release of elastic rock strain energy; however, the stored energy may be partially of hydrodynamic origin, due to heat provided by magma flowing ( tidally ) through underground reservoirs beneath volcanoes, or to the release of gas under pressure. There is, nonetheless, a clear correspondence between the geographic distribution of volcanoes and major earthquakes, particularly within the Circum-Pacific Belt and along ocean ridges.

Volcano vents, however, are generally several hundred kilometres from the epicentres of most ‘major shallow earthquakes’, and it is believed that ‘many earthquake sources’ occur ‘nowhere near active volcanoes’.

Even in cases where an earthquake focus occurs directly below structures marked by volcanic vents, officially there is probably no immediate causal connection between the two ( 2 ) activities; most likely both are resultant from the same tectonic processes.

Earth Fracturing

Artificially Created Inductions

Earthquakes are sometimes caused by human activities, including:

– Nuclear Explosion ( large megaton yield ) detonations underground;

– Oil & Gas wells ( deep Earth fluid injections );

– Mining ( deep Earth excavations );

– Reservoirs ( deep Earth voids filled with incredibly heavy large bodies of water ).

In the case of deep mining, the removal of rock produces changes in the strain around the tunnels.

Slip on adjacent preexisting faults, or outward shattering of rock into the new cavities, may occur.

In fluid injection, the slip is thought to be induced by premature release of elastic rock strain, as in the case of tectonic earthquakes after fault surfaces are lubricated by the liquid.

Large underground nuclear explosions have been known to produce slip on already strained faults in the vicinity of test devices.

Reservoir Induction

Of the various earthquake-causing activities cited above, the filling of large reservoirs ( see China ) is the most prominent.

More than 20 significant cases have been documented in which local seismicity has increased following the impounding of water behind high dams. Often, causality cannot be substantiated, because no data exists to allow comparison of earthquake occurrence before and after the reservoir was filled.

Reservoir-induction effects are most marked for reservoirs exceeding 100 metres ( 330 feet ) in depth and 1 cubic km ( 0.24 cubic mile ) in volume. Three ( 3 ) sites where such connections have very probably occurred, are the:

– Hoover Dam in the United States;

– Aswan High Dam in Egypt; and,

– Kariba Dam on the border between Zimbabwe and Zambia in Africa.

The most generally accepted explanation for earthquake occurrence in such cases assumes that rocks near the reservoir are already strained from regional tectonic forces to a point where nearby faults are almost ready to slip. Water in the reservoir adds a pressure perturbation that triggers the fault rupture. The pressure effect is perhaps enhanced by the fact that the rocks along the fault have lower strength because of increased water-pore pressure. These factors notwithstanding, the filling of most large reservoirs has not produced earthquakes large enough to be a hazard.

Specific seismic source mechanisms associated with reservoir induction have been established in a few cases. For the main shock at the Koyna Dam and Reservoir in India ( 1967 ), the evidence favours strike-slip faulting motion. At both the Kremasta Dam in Greece ( 1965 ) and the Kariba Dam in Zimbabwe-Zambia ( 1961 ), the generating mechanism was dip-slip on normal faults.

By contrast, thrust mechanisms have been determined for sources of earthquakes at the lake behind Nurek Dam in Tajikistan. More than 1,800 earthquakes occurred during the first 9-years after water was impounded in this 317 meter deep reservoir in 1972, a rate amounting to four ( 4 ) times the average number of shocks in the region prior to filling.

Nuclear Explosion Measurement Seismology Instruments

By 1958, representatives from several countries, including the United States and the Soviet Union, met to discuss the technical basis for a nuclear test-ban treaty; amongst the matters considered was the feasibility of developing effective means to detect underground nuclear explosions and to distinguish them seismically from earthquakes.

After that conference, much special research was directed to seismology, leading to major advances in seismic signal detection and analysis.

Recent seismological work on treaty verification has involved using high-resolution seismographs in a worldwide network, estimating the yield of explosions, studying wave attenuation in the Earth, determining wave amplitude and frequency spectra discriminants, and applying seismic arrays. The findings of such research have shown that underground nuclear explosions, compared with natural earthquakes, usually generate seismic waves through the body of the Earth that are of much larger amplitude than the surface waves. This telltale difference, along with other types of seismic evidence, suggests that an international monitoring network of two-hundred and seventy ( 270 ) seismographic stations could detect and locate all seismic events over the globe of magnitude 4.0 and above ( corresponding to an explosive yield of about 100 tons of TNT ).

Earthquake Effects

Earthquakes have varied effects, including changes in geologic features, damage to man-made structures, and impact on human and animal life. Most of these effects occur on solid ground, but, since most earthquake foci are actually located under the ocean bottom, severe effects are often observed along the margins of oceans.

Surface Phenomena

Earthquakes often cause dramatic geomorphological changes, including ground movements – either vertical or horizontal – along geologic fault traces; rising, dropping, and tilting of the ground surface; changes in the flow of groundwater; liquefaction of sandy ground; landslides; and mudflows. The investigation of topographic changes is aided by geodetic measurements, which are made systematically in a number of countries seriously affected by earthquakes.

Earthquakes can do significant damage to buildings, bridges, pipelines, railways, embankments, and other structures. The type and extent of damage inflicted are related to the strength of the ground motions and to the behaviour of the foundation soils. In the most intensely damaged region, called the meizoseismal area, the effects of a severe earthquake are usually complicated and depend on the topography and the nature of the surface materials. They are often more severe on soft alluvium and unconsolidated sediments than on hard rock. At distances of more than 100 km (60 miles) from the source, the main damage is caused by seismic waves traveling along the surface. In mines there is frequently little damage below depths of a few hundred metres even though the ground surface immediately above is considerably affected.

Earthquakes are frequently associated with reports of distinctive sounds and lights. The sounds are generally low-pitched and have been likened to the noise of an underground train passing through a station. The occurrence of such sounds is consistent with the passage of high-frequency seismic waves through the ground. Occasionally, luminous flashes, streamers, and bright balls have been reported in the night sky during earthquakes. These lights have been attributed to electric induction in the air along the earthquake source.

Tsunamis

Following certain earthquakes, very long-wavelength water waves in oceans or seas sweep inshore. More properly called seismic sea waves or tsunamis ( tsunami is a Japanese word for “harbour wave” ), they are commonly referred to as tidal waves, although the attractions of the Moon and Sun play no role in their formation. They sometimes come ashore to great heights—tens of metres above mean tide level—and may be extremely destructive.

The usual immediate cause of a tsunami is sudden displacement in a seabed sufficient to cause the sudden raising or lowering of a large body of water. This deformation may be the fault source of an earthquake, or it may be a submarine landslide arising from an earthquake.

Large volcanic eruptions along shorelines, such as those of Thera (c. 1580 bc) and Krakatoa (ad 1883), have also produced notable tsunamis. The most destructive tsunami ever recorded occurred on December 26, 2004, after an earthquake displaced the seabed off the coast of Sumatra, Indonesia. More than 200,000 people were killed by a series of waves that flooded coasts from Indonesia to Sri Lanka and even washed ashore on the Horn of Africa.

Following the initial disturbance to the sea surface, water waves spread in all directions. Their speed of travel in deep water is given by the formula √( gh ), where h is the sea depth and g is the acceleration of gravity.

This speed may be considerable—100 metres per second ( 225 miles per hour ) when h is 1,000 metres ( 3,300 feet ). However, the amplitude ( i.e., the height of disturbance ) at the water surface does not exceed a few metres in deep water, and the principal wavelength may be on the order of hundreds of kilometres; correspondingly, the principal wave period—that is, the time interval between arrival of successive crests—may be on the order of tens of minutes. Because of these features, tsunami waves are not noticed by ships far out at sea.
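
As a quick cross-check of the deep-water speed formula quoted above, the following is a minimal sketch ( in Python, for illustration only; the function name and unit conversions are assumptions of this report, not taken from the source text ):

import math

def tsunami_deep_water_speed(depth_m, g=9.8):
    # deep-water tsunami speed from the formula quoted above: sqrt( g * h )
    return math.sqrt(g * depth_m)

speed = tsunami_deep_water_speed(1000.0)                            # about 99 metres per second when h = 1,000 metres
print(round(speed), "m/s, roughly", round(speed * 2.23694), "mph")  # roughly 99 m/s, about 221 mph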

When tsunamis approach shallow water, however, the wave amplitude increases. The waves may occasionally reach a height of 20 to 30 metres above mean sea level in U- and V-shaped harbours and inlets. They characteristically do a great deal of damage in low-lying ground around such inlets. Frequently, the wave front in the inlet is nearly vertical, as in a tidal bore, and the speed of onrush may be on the order of 10 metres per second. In some cases there are several great waves separated by intervals of several minutes or more. The first of these waves is often preceded by an extraordinary recession of water from the shore, which may commence several minutes or even half an hour beforehand.

Organizations, notably in Japan, Siberia, Alaska, and Hawaii, have been set up to provide tsunami warnings. A key development is the Seismic Sea Wave Warning System, an internationally supported system designed to reduce loss of life in the Pacific Ocean. Centred in Honolulu, it issues alerts based on reports of earthquakes from circum-Pacific seismographic stations.

Seiches

Seiches are rhythmic motions of water in nearly landlocked bays or lakes that are sometimes induced by earthquakes and tsunamis. Oscillations of this sort may last for hours or even for 1-day or 2-days.

The great Lisbon earthquake of 1755 caused the waters of canals and lakes in regions as far away as Scotland and Sweden to go into observable oscillations. Seiche surges in lakes in Texas, in the southwestern United States, commenced between 30 and 40 minutes after the 1964 Alaska earthquake, produced by seismic surface waves passing through the area.

A related effect is the result of seismic waves from an earthquake passing through the seawater following their refraction through the seafloor. The speed of these waves is about 1.5 km (0.9 mile) per second, the speed of sound in water. If such waves meet a ship with sufficient intensity, they give the impression that the ship has struck a submerged object. This phenomenon is called a seaquake.

Earthquake Intensity and Magnitude Scales

The violence of seismic shaking varies considerably over a single affected area. Because the entire range of observed effects is not capable of simple quantitative definition, the strength of the shaking is commonly estimated by reference to intensity scales that describe the effects in qualitative terms. Intensity scales date from the late 19th and early 20th centuries, before seismographs capable of accurate measurement of ground motion were developed. Since that time, the divisions in these scales have been associated with measurable accelerations of the local ground shaking. Intensity depends, however, in a complicated way not only on ground accelerations but also on the periods and other features of seismic waves, the distance of the measuring point from the source, and the local geologic structure. Furthermore, earthquake intensity, or strength, is distinct from earthquake magnitude, which is a measure of the amplitude, or size, of seismic waves as specified by a seismograph reading ( see below Earthquake magnitude )

A number of different intensity scales have been set up during the past century and applied to both current and ancient destructive earthquakes. For many years the most widely used was a 10-point scale devised in 1878 by Michele Stefano de Rossi and François-Alphonse Forel. The scale now generally employed in North America is the Mercalli scale, as modified by Harry O. Wood and Frank Neumann in 1931, in which intensity is considered to be more suitably graded.

A 12-point abridged form of the modified Mercalli scale is provided below. Modified Mercalli intensity VIII is roughly correlated with peak accelerations of about one-quarter that of gravity ( g = 9.8 metres, or 32.2 feet, per second squared ) and ground velocities of 20 cm ( 8 inches ) per second. Alternative scales have been developed in both Japan and Europe for local conditions.

The European ( MSK ) scale of 12 grades is similar to the abridged version of the Mercalli.

Modified Mercalli scale of earthquake intensity

  I. Not felt. Marginal and long-period effects of large earthquakes.

  II. Felt by persons at rest, on upper floors, or otherwise favourably placed to sense tremors.

  III. Felt indoors. Hanging objects swing. Vibrations are similar to those caused by the passing of light trucks. Duration can be estimated.

  IV. Vibrations are similar to those caused by the passing of heavy trucks (or a jolt similar to that caused by a heavy ball striking the walls). Standing automobiles rock. Windows, dishes, doors rattle. Glasses clink, crockery clashes. In the upper range of grade IV, wooden walls and frames creak.

  V. Felt outdoors; direction may be estimated. Sleepers awaken. Liquids are disturbed, some spilled. Small objects are displaced or upset. Doors swing, open, close. Pendulum clocks stop, start, change rate.

  VI. Felt by all; many are frightened and run outdoors. Persons walk unsteadily. Pictures fall off walls. Furniture moves or overturns. Weak plaster and masonry cracks. Small bells ring (church, school). Trees, bushes shake.

  VII. Difficult to stand. Noticed by drivers of automobiles. Hanging objects quivering. Furniture broken. Damage to weak masonry. Weak chimneys broken at roof line. Fall of plaster, loose bricks, stones, tiles, cornices. Waves on ponds; water turbid with mud. Small slides and caving along sand or gravel banks. Large bells ringing. Concrete irrigation ditches damaged.

  VIII. Steering of automobiles affected. Damage to masonry; partial collapse. Some damage to reinforced masonry; none to reinforced masonry designed to resist lateral forces. Fall of stucco and some masonry walls. Twisting, fall of chimneys, factory stacks, monuments, towers, elevated tanks. Frame houses moved on foundations if not bolted down; loose panel walls thrown out. Decayed pilings broken off. Branches broken from trees. Changes in flow or temperature of springs and wells. Cracks in wet ground and on steep slopes.

  IX. General panic. Weak masonry destroyed; ordinary masonry heavily damaged, sometimes with complete collapse; reinforced masonry seriously damaged. Serious damage to reservoirs. Underground pipes broken. Conspicuous cracks in ground. In alluvial areas, sand and mud ejected; earthquake fountains, sand craters.

  X. Most masonry and frame structures destroyed with their foundations. Some well-built wooden structures and bridges destroyed. Serious damage to dams, dikes, embankments. Large landslides. Water thrown on banks of canals, rivers, lakes, and so on. Sand and mud shifted horizontally on beaches and flat land. Railway rails bent slightly.

  XI. Rails bent greatly. Underground pipelines completely out of service.

  XII. Damage nearly total. Large rock masses displaced. Lines of sight and level distorted. Objects thrown into air.

With the use of an intensity scale, it is possible to summarize such data for an earthquake by constructing isoseismal curves, which are lines that connect points of equal intensity. If there were complete symmetry about the vertical through the earthquake’s focus, isoseismals would be circles with the epicentre (the point at the surface of the Earth immediately above where the earthquake originated) as the centre. However, because of the many unsymmetrical geologic factors influencing intensity, the curves are often far from circular. The most probable position of the epicentre is often assumed to be at a point inside the area of highest intensity. In some cases, instrumental data verify this calculation, but not infrequently the true epicentre lies outside the area of greatest intensity.

Earthquake Magnitude

Earthquake magnitude is a measure of the “size” or amplitude of the seismic waves generated by an earthquake source and recorded by seismographs.

Types and nature of these waves are described in Seismic waves ( further below ).

Because the size of earthquakes varies enormously, it is necessary for purposes of comparison to compress the range of wave amplitudes measured on seismograms by means of a mathematical device.

In 1935, American seismologist Charles F. Richter set up a magnitude scale of earthquakes as the logarithm to base 10 of the maximum seismic wave amplitude ( in thousandths of a millimetre ) recorded on a standard seismograph ( the Wood-Anderson torsion pendulum seismograph ) at a distance of 60-miles ( 100 kilometers ) from the earthquake epicentre.

Reduction of amplitudes observed at various distances to the amplitudes expected at the standard distance of 100 kilometers ( 60-miles ) is made on the basis of empirical tables.

Richter magnitudes ML are computed on the assumption the ratio of the maximum wave amplitudes at two ( 2 ) given distances is the same for all earthquakes and is independent of azimuth.
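
As an illustration of the definition just described, here is a minimal sketch ( Python ); the function name is an assumption of this report, and the amplitude is assumed to have already been reduced to the standard 100-kilometer distance:

import math

def richter_local_magnitude(max_amplitude_mm):
    # ML is the base-10 logarithm of the maximum trace amplitude
    # expressed in thousandths of a millimetre at the standard distance
    return math.log10(max_amplitude_mm * 1000.0)

print(richter_local_magnitude(1.0))    # a 1 mm amplitude at 100 km gives ML = 3.0
print(richter_local_magnitude(0.001))  # a 0.001 mm ( 1 micrometre ) amplitude gives ML = 0.0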

Richter first applied his magnitude scale to shallow-focus earthquakes recorded within 600 km of the epicentre in the southern California region. Later, additional empirical tables were set up, whereby observations made at distant stations and on seismographs other than the standard type could be used. Empirical tables were extended to cover earthquakes of all significant focal depths and to enable independent magnitude estimates to be made from body- and surface-wave observations.

A current form of the Richter scale is shown in the table.

Richter scale of earthquake magnitude

magnitude level | category | effects | earthquakes per year
less than 1.0 to 2.9 | micro | generally not felt by people, though recorded on local instruments | more than 100,000
3.0-3.9 | minor | felt by many people; no damage | 12,000-100,000
4.0-4.9 | light | felt by all; minor breakage of objects | 2,000-12,000
5.0-5.9 | moderate | some damage to weak structures | 200-2,000
6.0-6.9 | strong | moderate damage in populated areas | 20-200
7.0-7.9 | major | serious damage over large areas; loss of life | 3-20
8.0 and higher | great | severe destruction and loss of life over large areas | fewer than 3

At the present time a number of different magnitude scales are used by scientists and engineers as a measure of the relative size of an earthquake. The P-wave magnitude (Mb), for one, is defined in terms of the amplitude of the P wave recorded on a standard seismograph. Similarly, the surface-wave magnitude (Ms) is defined in terms of the logarithm of the maximum amplitude of ground motion for surface waves with a wave period of 20 seconds.

As defined, an earthquake magnitude scale has no lower or upper limit. Sensitive seismographs can record earthquakes with magnitudes of negative value and have ‘recorded magnitudes up to’ about ‘9.0’ ( 1906 San Francisco earthquake, for example, had a Richter magnitude of 8.25 ).

A scientific weakness is that there is no direct mechanical basis for magnitude as defined above. Rather, it is an empirical parameter analogous to stellar magnitude assessed by astronomers. In modern practice a more soundly based mechanical measure of earthquake size is used—namely, the seismic moment (M0). Such a parameter is related to the angular leverage of the forces that produce the slip on the causative fault. It can be calculated both from recorded seismic waves and from field measurements of the size of the fault rupture. Consequently, seismic moment provides a more uniform scale of earthquake size based on classical mechanics. This measure allows a more scientific magnitude to be used called moment magnitude (Mw). It is proportional to the logarithm of the seismic moment; values do not differ greatly from Ms values for moderate earthquakes. Given the above definitions, the great Alaska earthquake of 1964, with a Richter magnitude (ML) of 8.3, also had the values Ms = 8.4, M0 = 820 × 10^27 dyne centimetres, and Mw = 9.2.
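
To make the logarithmic relationship concrete, here is a minimal sketch ( Python ) using the widely published Hanks-Kanamori moment-magnitude relation, Mw = ( 2/3 ) log10( M0 ) − 10.7 with M0 in dyne centimetres; that particular formula is supplied here for illustration and is not quoted from the encyclopaedia text above, though it reproduces the 1964 Alaska values:

import math

def moment_magnitude(seismic_moment_dyne_cm):
    # Hanks-Kanamori relation: Mw grows with the logarithm of the seismic moment
    return (2.0 / 3.0) * math.log10(seismic_moment_dyne_cm) - 10.7

# the 1964 Alaska earthquake value quoted above: M0 = 820 x 10^27 dyne centimetres
print(round(moment_magnitude(820e27), 1))  # prints 9.2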

Earthquake Energy

Energy in an earthquake passing a particular surface site can be calculated directly from the recordings of seismic ground motion, given, for example, as ground velocity. Such recordings indicate an energy rate of 10^5 watts per square metre ( 9,300 watts per square foot ) near a moderate-size earthquake source. The total power output of a rupturing fault in a shallow earthquake is on the order of 10^14 watts, compared with the 10^5 watts generated in rocket motors.

The surface-wave magnitude Ms has also been connected with the surface energy Es of an earthquake by empirical formulas. These give Es = 6.3 × 10^11 and 1.4 × 10^25 ergs for earthquakes of Ms = 0 and 8.9, respectively. A unit increase in Ms corresponds to approximately a 32-fold increase in energy. Negative magnitudes Ms correspond to the smallest instrumentally recorded earthquakes, a magnitude of 1.5 to the smallest felt earthquakes, and one of 3.0 to any shock felt at a distance of up to 20 km ( 12 miles ). Earthquakes of magnitude 5.0 cause light damage near the epicentre; those of 6.0 are destructive over a restricted area; and those of 7.5 are at the lower limit of major earthquakes.
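
The two Es values quoted above follow the classical Gutenberg-Richter energy relation log10( Es ) = 11.8 + 1.5 Ms ( Es in ergs ); the brief sketch below ( Python, supplied for illustration only ) reproduces both values and the roughly 32-fold step per magnitude unit:

def surface_wave_energy_ergs(ms):
    # Gutenberg-Richter empirical relation between surface-wave magnitude and energy
    return 10.0 ** (11.8 + 1.5 * ms)

print("%.1e" % surface_wave_energy_ergs(0.0))                                    # about 6.3e+11 ergs
print("%.1e" % surface_wave_energy_ergs(8.9))                                    # about 1.4e+25 ergs
print(round(surface_wave_energy_ergs(6.0) / surface_wave_energy_ergs(5.0), 1))   # 31.6, the ~32-fold step per unit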

The total annual energy released in all earthquakes is about 10^25 ergs, corresponding to a rate of work between 10,000,000 ( 10 million ) and 100,000,000 ( 100 million ) kilowatts. This is approximately one ( 1 ) one-thousandth of the ‘annual amount of heat escaping from the Earth interior’.

90% of the total seismic energy comes from earthquakes of ‘magnitude 7.0 and higher’ – that is, those whose energy is on the order of 10^23 ergs or more.

Frequency

There also are empirical relations for the frequencies of earthquakes of various magnitudes. Suppose N to be the average number of shocks per year for which the magnitude lies in a range about Ms. Then log10 N = a − bMs fits the data well both globally and for particular regions; for example, for shallow earthquakes worldwide, a = 6.7 and b = 0.9 when Ms > 6.0. The frequency for larger earthquakes therefore increases by a factor of about 10 when the magnitude is diminished by one unit. The increase in frequency with reduction in Ms falls short, however, of matching the decrease in the energy E. Thus, larger earthquakes are overwhelmingly responsible for most of the total seismic energy release. The number of earthquakes per year with Mb > 4.0 reaches 50,000.
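
A minimal sketch ( Python, for illustration only ) of the frequency-magnitude relation just quoted, using the worldwide shallow-earthquake constants a = 6.7 and b = 0.9 given above:

def shocks_per_year(ms, a=6.7, b=0.9):
    # Gutenberg-Richter frequency-magnitude relation: log10( N ) = a - b * Ms
    return 10.0 ** (a - b * ms)

print(round(shocks_per_year(7.0), 1))   # roughly 2.5 shocks per year near Ms = 7.0
print(round(shocks_per_year(8.0), 2))   # roughly 0.32 per year near Ms = 8.0, i.e. one every few years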

Earthquake Occurrences & Plate Tectonic associations

Global seismicity patterns had no strong theoretical explanation until the dynamic model called plate tectonics was developed during the late 1960s. This theory holds that the Earth’s upper shell, or lithosphere, consists of nearly a dozen large, quasi-stable slabs called plates. The thickness of each of these plates is roughly 50-miles ( 80 km ). Plates move horizontally relative to neighbouring plates at a rate of 0.4 to 4 inches ( 1-cm to 10-cm ) per year over a shell of lesser strength called the asthenosphere. At the plate edges where there is contact between adjoining plates, boundary tectonic forces operate on the rocks, causing physical and chemical changes in them. New lithosphere is created at oceanic ridges by the upwelling and cooling of magma from the Earth’s mantle. The horizontally moving plates are believed to be absorbed at the ocean trenches, where a subduction process carries the lithosphere downward into the Earth’s interior. The total amount of lithospheric material destroyed at these subduction zones equals that generated at the ridges. Seismological evidence ( e.g. location of major earthquake belts ) is everywhere in agreement with this tectonic model.

Earthquake Types:

– Shallow Earthquakes;

– Intermediate Earthquakes;

– Deep Focus ( Deep-Foci ) Earthquakes; and

– Deeper Focus ( Deeper-Foci ) Earthquakes.

Earthquake sources are concentrated along oceanic ridges, corresponding to divergent plate boundaries.

At subduction zones, associated with convergent plate boundaries, deep-focus earthquakes and intermediate focus earthquakes mark locations of the upper part of a dipping lithosphere slab.

Focal ( Foci ) mechanisms indicate stresses aligned with dip of the lithosphere underneath the adjacent continent or island arc.

IntraPlate Seismic Event Anomalies

Some earthquakes associated with oceanic ridges are confined to strike-slip faults, called transform faults, that offset the ridge crests. The majority of earthquakes occurring along such horizontal shear faults are characterized by strike-slip motions.

Also in agreement with plate tectonics theory is the high seismicity encountered along the edges of plates where they slide past each other. Plate boundaries of this kind, sometimes called fracture zones, include the:

– San Andreas Fault system in California; and,

– North Anatolian fault system in Turkey.

Such plate boundaries are the site of interplate earthquakes of shallow focus.

Low seismicity within plates is consistent with the plate tectonic description. Small to large earthquakes do occur in limited regions well within the boundaries of plates; however, such ‘intraplate seismic events’ can be explained by tectonic mechanisms other than plate boundary motions and their associated phenomena.

Most parts of the world experience at least occasional shallow earthquakes – those that originate within 60 km ( 40 miles ) of the Earth’s outer surface. In fact, the great ‘majority of earthquake foci ( focus ) are shallow’. It should be noted, however, that the geographic distribution of smaller earthquakes is less completely determined than that of more severe quakes, partly because the ‘availability of relevant data is dependent on the distribution of observatories’.

Of the total energy released in earthquakes, 12% comes from intermediate earthquakes—that is, quakes with a focal depth ranging from about 60 to 300 km. About 3 percent of total energy comes from deeper earthquakes. The frequency of occurrence falls off rapidly with increasing focal depth in the intermediate range. Below intermediate depth the distribution is fairly uniform until the greatest focal depths, of about 700 km (430 miles), are approached.

Deeper-Focus Earthquakes

Deeper-focus earthquakes commonly occur in patterns called Benioff zones, dipping into the Earth and indicating the presence of a subducting slab; dip angles of these slabs average about 45° – with some shallower and others nearly vertical.

Benioff zones coincide with tectonically active island arcs, such as:

– Aleutian islands;

– Japan islands;

– Vanuatu islands; and

– Tonga.

Island arcs are normally ( but not always ) associated with:

Ultra-Deep Sea Ocean Trenches, such as those along the:

– South America ( Andes mountain system ).

Exceptions to this rule, include:

– Romania ( East Europe ) mountain system; and

– Hindu Kush mountain system.

In most Benioff zones, deep-earthquake foci and intermediate-earthquake foci are usually found within a narrow layer; however, recent more precise hypocentral locations – in Japan and elsewhere – indicate two ( 2 ) distinct parallel bands of foci only 12 miles ( 20 kilometers ) apart.

Aftershocks, Swarms and Foreshocks

A major or even moderate earthquake of shallow focus is followed by many lesser-size earthquakes close to the original source region. This is to be expected if the fault rupture producing a major earthquake does not relieve all the accumulated strain energy at once. In fact, this dislocation is liable to cause an increase in the stress and strain at a number of places in the vicinity of the focal region, bringing crustal rocks at certain points close to the stress at which fracture occurs. In some cases an earthquake may be followed by 1,000 or more aftershocks a day.

Sometimes a large earthquake is followed by a similar one along the same fault source within an hour or perhaps a day. An extreme case of this is multiple earthquakes. In most instances, however, the first principal earthquake of a series is much more severe than the aftershocks. In general, the number of aftershocks per day decreases with time.

Aftershock frequency is ( roughly ):

Inversely proportional to the time since the occurrence of the largest earthquake in the series.
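
That rough inverse-time decay is commonly written as the modified Omori relation, n( t ) = K / ( c + t )^p; the relation and the sample constants in the sketch below ( Python ) are supplied for illustration only and are not taken from the encyclopaedia text above:

def aftershocks_per_day(days_since_mainshock, k=1000.0, c=0.05, p=1.0):
    # modified Omori relation: the aftershock rate decays roughly as 1 / time
    return k / (c + days_since_mainshock) ** p

for day in (1, 10, 100):
    print(day, round(aftershocks_per_day(day)))  # about 952, 100 and 10 aftershocks per day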

Most major earthquakes occur without detectable warning, but some principal earthquakes are preceded by foreshocks.

Japan 2-Years of Hundreds of Thousands Of Earthquakes

In another common pattern, large numbers of small earthquakes may occur in a region for months without a major earthquake.

In the Matsushiro region of Japan, for instance, there occurred ( between August 1965 and August 1967 ) a ‘series of earthquakes’ numbering in the hundreds of thousands – some sufficiently strong ( up to Richter magnitude 5.0 ) causing property damage but no casualties.

Maximum frequency? 6,780 small earthquakes on just April 17, 1966.

Such series of earthquakes are called earthquake swarms.

Earthquakes, associated with volcanic activity often occur in swarms – though swarms also have been observed in many nonvolcanic regions.

Study of Earthquakes

Seismic waves

Principal types of seismic waves

Seismic Waves ( S Waves ), generated by an earthquake source, are commonly classified into three ( 3 ) ‘leading types’.

The first two ( 2 ) leading types, which propagate ( travel ) within the body of the Earth, are known as:

P ( Primary ) Seismic Waves; and,

S ( Secondary ) Seismic Waves ( S / S Waves ).

The third ( 3rd ) leading type, which propagates ( travels ) along the surface of the Earth, comprises:

L ( Love ) Seismic Waves; and,

R ( Rayleigh ) Seismic Waves.

During the 19th Century, the existence of these types of seismic waves was mathematically predicted, and modern comparisons show close correspondence between such ‘theoretical calculations’ and ‘actual measurements’ of seismic waves.

P seismic waves travel as elastic motions at the highest speeds, and are longitudinal waves transmitted by both solid and liquid materials within inner Earth.

With P waves, the particles of the medium vibrate in a manner ‘similar to sound waves’, the transmitting media being alternately compressed and expanded.

The slower type of body wave, the S wave, travels only through solid material. With S waves, the particle motion is transverse to the direction of travel and involves a shearing of the transmitting rock.

Focus ( Foci )

Because of their greater speed, P waves are the first ( 1st ) to reach any point on the Earth’s surface. The first ( 1st ) P-wave onset ‘starts from the spot where an earthquake originates’. This point, usually at some depth within the Earth, is called the focus ( also known as ) hypocentre.
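
Because P waves outrun S waves, the lag between the two arrivals at a single station is routinely used to estimate the distance to the focus; the minimal sketch below ( Python ) assumes typical crustal speeds of about 6 km/s for P waves and 3.5 km/s for S waves, values supplied here for illustration rather than taken from the text above:

def distance_from_sp_lag(sp_lag_seconds, vp_km_s=6.0, vs_km_s=3.5):
    # straight-path approximation: distance = lag * ( Vp * Vs ) / ( Vp - Vs )
    return sp_lag_seconds * (vp_km_s * vs_km_s) / (vp_km_s - vs_km_s)

print(round(distance_from_sp_lag(10.0)))  # an S-minus-P lag of 10 seconds puts the focus roughly 84 km away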

Epicenter

Point ‘at the surface’ ( immediately ‘above the Focus / Foci’ ) is known as the ‘epicenter’.

Love waves and Rayleigh waves, guided by the free surface of the Earth, trail after P and S waves have passed through the body of planet Earth.

Rayleigh waves ( R Waves ) and Love waves ( L Waves ) both involve ‘horizontal particle motion’, however ‘only Rayleigh waves’ exhibit ‘vertical ground displacements’.

Rayleigh waves ( R waves ) and Love waves ( L waves ) travel ( propagate ) and disperse into long wave trains which, when occurring away-from ‘alluvial basin sources’ at substantial distances, cause much of the Earth surface ground shaking felt during earthquakes.

Seismic Wave Focus ( Foci ) Properties

At all distances from the focus ( foci ), mechanical properties of rocks, such as incompressibility, rigidity and density, play roles in the:

– Speed of ‘wave travel’;

– Duration of ‘wave trains’; and,

– Shape of ‘wave trains’.

Layering of the rocks and the physical properties of surface soil also affect wave characteristics.

In most cases, ‘elastic behaviors’ occur in earthquakes; however, strong shaking of surface soils from the incident seismic waves sometimes results in ‘nonelastic behavior’, including slumping ( i.e., downward and outward movement of unconsolidated material ) and liquefaction of sandy soil.

A seismic wave that encounters a boundary separating ‘rocks of different elastic properties’ undergoes reflection and refraction; a special complication exists because conversion between wave types usually also occurs at such a boundary, where an incident P or S wave can yield reflected P and S waves and refracted P and S waves.

Between Earth structural layers, boundaries give rise to diffracted and scattered waves, and these additional waves are partially responsible for complications observed in ground motion during earthquakes.

Modern research is concerned with computing ‘synthetic records of ground motion’ that are realistic in comparison with observed actual ground shaking, using wave theory in complex structures.

Grave Duration Long-Periods, Audible Earthquake Frequencies, and Other Earth Anomalies

The frequency range of seismic waves is widely varied, from ‘High Frequency’ ( HF ) in the ‘audible range’ ( i.e. greater than 20 hertz ) to ‘Low Frequency’ ( LF ) as subtle as the ‘free oscillations of planet Earth’ – with the gravest Long-Period being 54-minutes ( see below Long-Period oscillations of the globe ).

Seismic wave attenuation in rock imposes High-Frequency ( HF ) limits, and in small to moderate earthquakes the dominant frequencies in Surface Waves extend from about 1.0 Hz to 0.1 Hertz.

Seismic wave amplitude range is also great in most earthquakes.

Ground displacement ranges from 10^-10 to 10^-1 metre ( 4 × 10^-9 to 4-inches ).

Great Earthquake Speed

Great Earthquake Ground Acceleration Can Exceed 32.2-Feet Per Second Squared ( 9.8 Metres Per Second Squared ) –

In the greatest earthquakes, the ground amplitude of predominant P waves may be several centimetres at periods of 2-seconds to 5-seconds; however, very close to the seismic sources of ‘great earthquakes’, investigators have measured ‘large wave amplitudes’ with ‘accelerations of the ground’ exceeding that of gravity – 32.2 feet per second squared ( 9.8 meters per second, squared ) – at High Frequencies ( HF ), and ground displacements of 1 metre at Low Frequencies ( LF ).

Seismic Wave Measurement

Seismographs and Accelerometers

Seismographs ‘measure ground motion’ in both ‘earthquakes’ and ‘microseisms’ ( small oscillations described below ).

Most of these instruments are of the pendulum type. Early mechanical seismographs had a pendulum of large mass ( up to several tons ) and produced seismograms by scratching a line on smoked paper on a rotating drum.

In later instruments, seismograms were recorded via ‘rays of light bounced off a mirror’ within a galvanometer, using electric current generated by electromagnetic induction ‘when the pendulum of the seismograph moved’.

Technological developments in electronics have given rise to ‘higher-precision pendulum seismometers’ and ‘sensors of ground motion’.

In these instruments, electric voltages produced by motions of the pendulum or the equivalent are passed through electronic circuitry to ‘amplify’ and ‘digitize’ the ground motion for more exact readings.

Seismometer Nomenclature Meanings

Seismographs are divided into three ( 3 ) types of instruments, often confused by the public because of their varied names:

– Short-Period;

– Intermediate-Period ( also known as Long-Period );

– Long-Period ( also known as Intermediate-Period );

– Ultra-Long-Period ( also known as Broadband or Broad-Band ); or,

– Broadband ( also known as Ultra Long-Period or UltraLong-Period ).

Short-Period instruments are used to record P and S body waves with high magnification of the ground motion. For this purpose, the seismograph response is shaped to peak at a period of about 1-second or less.

Intermediate-period instruments, the type used by the World-Wide Standardized Seismographic Network ( WWSSN ) – described in the section Earthquake observatories – had about a 20-second ( maximum ) response.

Recently, in order to provide as much flexibility as possible for research work, the trend has been toward the operation of very broadband seismographs with digital representation of signals. This is usually accomplished with very long-period pendulums and electronic amplifiers that pass signals in the band between 0.005 and 50 hertz.

When seismic waves close to their source are to be recorded, special design criteria are needed. Instrument sensitivity must ensure that the largest ground movements can be recorded without exceeding the upper scale limit of the device. For most seismological and engineering purposes the wave frequencies that must be recorded are higher than 1 hertz, and so the pendulum or its equivalent can be small. For this reason accelerometers that measure the rate at which the ground velocity is changing have an advantage for strong-motion recording. Integration is then performed to estimate ground velocity and displacement. The ground accelerations to be registered range up to two times that of gravity. Recording such accelerations can be accomplished mechanically with short torsion suspensions or force-balance mass-spring systems.
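Because accelerometers record the rate of change of ground velocity, velocity and displacement are recovered by integration, as noted above. A minimal Python sketch of that step, using a synthetic record ( the sampling rate and waveform are illustrative assumptions, not real data ):

import numpy as np
# Integrate a synthetic accelerogram twice: acceleration -> velocity -> displacement.
dt = 0.01                                            # sample interval, s ( 100 Hz, assumed )
t = np.arange(0.0, 20.0, dt)
accel = 0.3 * 9.8 * np.sin(2 * np.pi * 1.5 * t) * np.exp(-0.2 * t)   # m/s^2, synthetic
velocity = np.concatenate(([0.0], np.cumsum((accel[1:] + accel[:-1]) / 2.0) * dt))
displacement = np.concatenate(([0.0], np.cumsum((velocity[1:] + velocity[:-1]) / 2.0) * dt))
print(f"peak velocity     : {np.max(np.abs(velocity)):.3f} m/s")
print(f"peak displacement : {np.max(np.abs(displacement)):.3f} m")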

Because many strong-motion instruments need to be placed at unattended sites in ordinary buildings for periods of months or years before a strong earthquake occurs, they usually record only when a trigger mechanism is actuated with the onset of ground motion. Solid-state memories are now used, particularly with digital recording instruments, making it possible to preserve the first few seconds before the trigger starts the permanent recording and to store digitized signals on magnetic cassette tape or on a memory chip. In past design absolute timing was not provided on strong-motion records but only accurate relative time marks; the present trend, however, is to provide Universal Time ( the local mean time of the prime meridian ) by means of special radio receivers, small crystal clocks, or GPS ( Global Positioning System ) receivers from satellite clocks.
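The pre-event memory and trigger described above can be sketched in a few lines of Python; the sampling rate, trigger threshold, and sample values are assumptions for illustration only:

from collections import deque
SAMPLE_RATE = 100                            # samples per second ( assumed )
PRETRIGGER_SECONDS = 5
TRIGGER_LEVEL = 0.05                         # trigger threshold, m/s^2 ( assumed )
pre_event = deque(maxlen=SAMPLE_RATE * PRETRIGGER_SECONDS)   # rolling pre-event memory
permanent_record = []
triggered = False
def on_sample(acceleration):
    """Handle one digitized sample from the strong-motion sensor."""
    global triggered
    if triggered:
        permanent_record.append(acceleration)
        return
    pre_event.append(acceleration)
    if abs(acceleration) > TRIGGER_LEVEL:
        triggered = True
        permanent_record.extend(pre_event)   # preserve the seconds before the trigger
# Quiet background followed by a strong onset ( synthetic values ).
for sample in [0.001] * 600 + [0.2, 0.3, 0.25]:
    on_sample(sample)
print(f"samples preserved: {len(permanent_record)}")   # 502: buffered samples plus two post-trigger samples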

Prediction of strong ground motion and response of engineered structures in earthquakes depends critically on measurements of the spatial variability of earthquake intensities near the seismic wave source. In an effort to secure such measurements, special arrays of strong-motion seismographs have been installed in areas of high seismicity around the world.

Large-aperture seismic arrays of strong-motion accelerometers – with linear dimensions on the order of about 0.6 to 6 miles ( 1 to 10 kilometres ) – are now used to improve estimates of the speed, direction of propagation, and types of seismic wave components.

Particularly important for a full understanding of seismic wave patterns at the ground surface is measurement of the variation of wave motion with depth; to aid in this effort, special digitally recording seismometers have been installed in deep boreholes.

Ocean-Bottom Measurements

Because about 70% of Earth's surface is covered by water, ocean-bottom seismometers augment ( add to ) the global land-based system of recording stations.

Field tests have established the feasibility of extensive long-term recording by instruments on the seafloor.

Japan has had a semi-permanent seismograph system of this type since 1978, placed on the seafloor off the Pacific Ocean coast of central Honshu and connected to land by cable.

Because of mechanical difficulties maintaining ‘permanent ocean-bottom instrumentation’, different systems have been considered.

They ‘all involve placement of instruments on the ocean bottom’, though they employ various mechanisms for data transmission.

Signals may be transmitted to the ocean surface for retransmission by auxiliary apparatus or transmitted via cable to a shore-based station. Another system is designed to release its recording device automatically, allowing it to float to the surface for later recovery.

Ocean bottom seismograph use should yield much-improved global coverage of seismic waves and provide new information on the seismicity of oceanic regions.

Ocean-bottom seismographs will enable investigators to determine the details of the crustal structure of the seafloor and, because of the relative ‘thinness of the oceanic crust‘, should make possible collection of clear seismic information about Earth’s upper mantle.

Ocean bottom seismograph systems are also expected to provide new data, on Earth:

– Continental Shelf Plate Boundaries;

– MicroSeisms ( origins and propagations ); and,

– Ocean-to-continent margin behaviour.

MicroSeisms Measurements

Microseisms, also known as small ground motions, are commonly recorded by seismographs. These small, weak seismic wave motions are not generated by earthquakes, but in some instances they can complicate accurate earthquake recording. Microseisms are of scientific interest because their form relates to Earth's surface structure.

Some microseisms have a local cause, for example:

Microseisms due to traffic ( or machinery ) or local wind effects, storms and rough surf against an extended steep coastline.

Another class of microseisms exhibits features that are very similar on records traced at earthquake observatories that are widely separated, including approximately simultaneous occurrence of maximum amplitudes and similar wave frequencies. These microseisms may persist for many hours and have more or less regular periods of about five to eight seconds.

The largest amplitudes of such microseisms are on the order of 10⁻³ cm ( 0.0004 inch ) and occur in coastal regions. Amplitudes also depend to some extent on local geologic structure.

Some microseisms are produced when large standing water waves are formed far out at sea. The period of this type of microseism is half that of the standing wave; standing waves with periods of roughly 10 to 16 seconds thus account for the 5-to-8-second microseisms noted above.

Observations of Earthquakes

Earthquake Observatories

During the late 1950s, there were only about seven-hundred ( 700 ) seismographic stations worldwide, equipped with seismographs of various types and frequency responses. Few of these instruments were calibrated; actual ground motions could not be measured, and timing errors of several seconds were common.

The World-Wide Standardized Seismographic Network ( WWSSN ) became the first modern worldwide standardized system, established to remedy that situation.

Each WWSSN station had six ( 6 ) seismographs – three ( 3 ) short-period and three ( 3 ) long-period – with timing accuracy maintained by quartz crystal clocks and a calibration pulse placed daily on each record.

By 1967, the WWSSN consisted of about one-hundred twenty ( 120 ) stations throughout sixty ( 60 ) countries, resulting in data that provided the basis for significant advances in research, on:

– Earthquakes ( mechanisms );

– Plate Tectonics ( global ); and,

– Deep-Structure Earth ( interior ).

By the 1980s a further upgrading of permanent seismographic stations began with the installation of digital equipment by a number of organizations.

Global digital seismograph station networks, now in operation, consist of:

– Seismic Research Observatories ( SRO ) within boreholes drilled 330 feet ( 100 metres ) deep in Earth ground; and,

– Modified high-gain long-period earthquake observatories located on Earth ground surfaces.

The Global Digital Seismographic Network in particular has remarkable capability, recording all motions from Earth ocean tides to microscopic ground motions at the level of local ground noise.

At present there are about 128 sites. With this system, the long-term seismological goal of equipping global observatories with seismographs that can record every small earthquake anywhere, over a broad band of frequencies, will have been accomplished.

Epicentre Earthquakes Located

Many observatories make provisional estimates of the epicentres of important earthquakes. These estimates provide preliminary information locally about particular earthquakes and serve as first approximations for the calculations subsequently made by large coordinating centres.

If an earthquake’s epicentre is less than 105° away from an earthquake observatory, the epicentre position can often be estimated from the readings of three ( 3 ) seismograms recording perpendicular components of the ground motion.

For a shallow earthquake the epicentral distance is indicated by the interval between the arrival times of P and S waves; the azimuth and angle of wave emergence at the surface are indicated by comparing the sizes and directions of the first ( 1st ) movements shown on the seismograms and the relative sizes of later waves – particularly surface waves.
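The distance estimate from the P-to-S interval can be illustrated with a minimal Python sketch; the crustal velocities below are typical assumed values, not figures from this article:

# Epicentral distance from the S-minus-P arrival interval, assuming uniform velocities.
VP = 6.0   # P-wave velocity, km/s ( assumed typical crustal value )
VS = 3.5   # S-wave velocity, km/s ( assumed typical crustal value )
def distance_from_sp_interval(t_sp_seconds):
    """Distance (km) at which the S wave lags the P wave by t_sp_seconds."""
    # t_sp = d/VS - d/VP  =>  d = t_sp * VP * VS / (VP - VS)
    return t_sp_seconds * VP * VS / (VP - VS)
print(f"{distance_from_sp_interval(30.0):.0f} km")   # ~252 km for a 30-second interval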

Anomaly

It should be noted, however, that in certain regions the first ( 1st ) wave movement at a station arrives from a direction differing from the azimuth toward the epicentre. This ‘anomaly is usually explained’ by ‘strong variations in geologic structures’.

When data from more than one observatory are available, an earthquake’s epicentre may be estimated from the times of travel of the P and S waves from source to recorder. In many seismically active regions, networks of seismographs with telemetry transmission and centralized timing and recording are common. Whether analog or digital recording is used, such integrated systems greatly simplify observatory work: multichannel signal displays make identification and timing of phase onsets easier and more reliable. Moreover, online microprocessors can be programmed to pick automatically, with some degree of confidence, the onset of a significant common phase, such as P, by correlation of waveforms from parallel network channels. With the aid of specially designed computer programs, seismologists can then locate distant earthquakes to within about 10 km (6 miles) and the epicentre of a local earthquake to within a few kilometres.
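As a toy illustration of locating an epicentre from arrival times at several stations ( not the actual location programs referred to above ), the following grid search assumes a uniform P velocity and invented station coordinates and arrival times:

import numpy as np
VP = 6.0                                                        # km/s, assumed uniform P velocity
stations = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 120.0], [80.0, 90.0]])   # x, y in km
true_epicentre, origin_time = np.array([60.0, 40.0]), 5.0       # used only to fake observations
dx = stations[:, 0] - true_epicentre[0]
dy = stations[:, 1] - true_epicentre[1]
arrivals = origin_time + np.hypot(dx, dy) / VP                  # synthetic P arrival times, s
best, best_misfit = None, np.inf
for x in np.arange(0.0, 150.0, 1.0):
    for y in np.arange(0.0, 150.0, 1.0):
        travel = np.hypot(stations[:, 0] - x, stations[:, 1] - y) / VP
        # Demeaning both series removes the unknown origin time from the comparison.
        residual = (arrivals - arrivals.mean()) - (travel - travel.mean())
        misfit = np.sum(residual ** 2)
        if misfit < best_misfit:
            best, best_misfit = (x, y), misfit
print(f"best-fitting epicentre near x = {best[0]:.0f} km, y = {best[1]:.0f} km")   # ~ (60, 40)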

Catalogs of earthquakes felt by humans and of earthquake observations have appeared intermittently for many centuries. The earliest known list of instrumentally recorded earthquakes with computed times of origin and epicentres is for the period 1899 – 1903, after which the cataloging of earthquakes became more uniform and complete.

Especially valuable is the service provided by the International Seismological Centre ( ISC ) in Newbury, UK, which each month receives more than 1,000,000 seismic readings from more than 2,000 seismic monitoring stations worldwide, along with preliminary location estimates for approximately 1,600 earthquakes from national and regional agencies and observatories.

The ISC publishes a monthly bulletin – with a delay of about two ( 2 ) years – that provides all available information on each of more than 5,000 earthquakes.

Various national and regional centres control networks of stations and act as intermediaries between individual stations and the international organizations.

Examples of long-standing national centers include, the:

Japan Meteorological Agency; and,

U.S. National Earthquake Information Center ( NEIC ), a subdivision of the U.S. Geological Survey ( USGS ).

Centers, such as the aforementioned, normally make ‘local earthquake estimates’, of:

– Magnitude;

– Epicentre;

– Time origin; and,

– Focal depth.

Global seismicity data are continually accessible via the Incorporated Research Institutions for Seismology ( IRIS ) website.

An important research technique infers the character of faulting ( in an earthquake ) from recorded seismograms.

For example, observed distributions ( of the directions of the first onsets in waves arriving at the Earth’s surface ) have been effectively used.

Onsets are called “compressional” or “dilatational,” according to whether the direction is ‘away from’ or ‘toward’ the focus, respectively.

A polarity pattern becomes recognizable when the directions of the P-wave onsets are plotted on a map – there are broad areas in which the first onsets are predominantly compressions, separated from predominantly dilatational areas by nodal curves near which the P-wave amplitudes are abnormally small.

In 1926 the American geophysicist Perry E. Byerly used patterns of P onsets over the entire globe to infer the orientation of the fault plane in a large earthquake. The polarity method yields two P-nodal curves at the Earth's surface; one curve is in the plane containing the assumed fault, and the other is in the plane ( called the auxiliary plane ) that passes through the focus and is perpendicular to the direction of fault slip.

The recent availability of worldwide broadband digital recording has enabled computer programs to be written that estimate the fault mechanism and seismic moment from the complete pattern of seismic wave arrivals.

Given a well-determined pattern at a number of earthquake observatories, it is possible to locate two ( 2 ) planes, one ( 1 ) of which is the plane containing the fault.

Earthquake Prediction

Earthquake Observations & Interpretations

Statistical studies of earthquake occurrence have been theorized but are not widely accepted as having detected periodic cycles. Records used to search for periodicities in the time and space of major and great earthquakes go back as far as 700 B.C., with China holding the world's most extensive catalog – approximately one-thousand ( 1,000 ) destructive earthquakes – in which magnitude ( size ) was assessed from damage reports, the experienced periods of shaking, and other observations determining the intensity of those earthquakes.

Earthquake Attributions to Postulation

Precursor predictability approaches involve what some believe is sheer postulation about the initial trigger mechanisms that force Earth ruptures; where this becomes bizarre, however, is where such forces have been attributed, to:

– Weather Severity;

– Volcano Activity; and,

– Ocean Tide Force ( Moon ).

EXAMPLE: Correlations between such physical phenomena and earthquake occurrence are assumed to provide trigger mechanisms for earthquake repetition.

Professionals believe such correlations must always be tested to discover whether a causative link is actually present, and they further believe that, to date, no proposed trigger mechanism for moderate to large earthquakes has unequivocally satisfied the various necessary criteria.

Statistical methods also have been tried with populations of regional earthquakes. It has been suggested, but never generally established, that the slope b of the regression line between the logarithm of the number of earthquakes and the magnitude for a region may change characteristically with time.

Specifically, the claim is that the b value for the population of ‘foreshocks of a major earthquake’ may be ‘significantly smaller’ than the mean b value for the region averaged ‘over a long interval of time’.
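The b-value regression just described can be sketched on a synthetic catalogue; the magnitude distribution below is generated to have b ≈ 1 purely for illustration:

import numpy as np
# Gutenberg-Richter fit: log10 N( >= M ) = a - b * M, estimated by linear regression.
rng = np.random.default_rng(0)
mags = 3.0 + rng.exponential(scale=1.0 / np.log(10), size=2000)   # synthetic catalogue, b ~ 1
bins = np.arange(3.0, 6.0, 0.1)
cum_counts = np.array([(mags >= m).sum() for m in bins])
keep = cum_counts > 0
slope, intercept = np.polyfit(bins[keep], np.log10(cum_counts[keep]), 1)
print(f"estimated b value: {-slope:.2f}")   # close to 1.0 for this synthetic catalogue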

Elastic rebound theory of earthquake sources allows rough prediction of the occurrence of large shallow earthquakes. Harry F. Reid, for example, gave a crude forecast of the next great earthquake near San Francisco ( the theory also predicted, of course, that the place would be along the San Andreas Fault or an associated fault ). Geodetic data indicated that during an interval of 50 years relative displacements of 3.2 metres ( 10-1/2 feet ) had occurred at distant points across the fault. The maximum elastic-rebound offset along the fault in the 1906 earthquake was 6.5 metres. Therefore, ( 6.5 ÷ 3.2 ) × 50, or about 100 years, would elapse before sufficient strain accumulated for an earthquake comparable to that of 1906; the premises are that regional strain grows uniformly and that the various constraints have not been altered by the great 1906 rupture itself ( such as by the onset of slow fault slip ).
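The arithmetic behind Reid's forecast, restated as a tiny script ( the numbers are taken directly from the paragraph above ):

# Reid's crude recurrence estimate for a 1906-size San Francisco earthquake.
slip_accumulation_rate = 3.2 / 50     # metres of relative displacement per year, from geodesy
offset_released_1906 = 6.5            # metres of fault offset in the 1906 earthquake
recurrence_years = offset_released_1906 / slip_accumulation_rate
print(f"~{recurrence_years:.0f} years to re-accumulate the 1906 offset")   # about 100 years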

Such strain rates are now, however, being more adequately measured along a number of active faults ( e.g. the San Andreas Fault ) using networks of GPS sensors.

Earthquake Prediction Research

For many years prediction research has been influenced by the basic argument that 'strain accumulates in rock masses in the vicinity of a fault, resulting in crustal deformation'.

Deformations have been measured in ‘horizontal directions’ along active faults via ‘trilateration’ and ‘triangulation’ and in ‘vertical directions’ via ‘precise leveling and tiltmeters’.

Some investigators believe that ground-water level changes occur prior to earthquakes, with a variety of such reports coming from China.

Ground-water levels respond to an array of complex factors ( e.g. rainfall ), whose effects would have to be removed if changes in water level were to be studied in relation to earthquakes.

Phenomena Precursor Premonitories

Dilatancy theory ( i.e., the volume increase of rock prior to rupture ) once occupied a central position in discussions of premonitory phenomena of earthquakes, but it now receives less support, based on observations that many solids exhibit dilatancy during deformation. For earthquake prediction, the significance of dilatancy, if real, lies in its effects on various measurable quantities of Earth's crust, i.e. seismic velocity, electric resistivity, and ground and water levels. The consequences of dilatancy for earthquake prediction are summarized in the table ( below ):

The best-studied consequence is the effect on seismic velocities. The influence of internal cracks and pores on the elastic properties of rocks can be clearly demonstrated in laboratory measurements of those properties as a function of hydrostatic pressure. In the case of saturated rocks, experiments predict – for shallow earthquakes – that dilatancy occurs as a portion of the crust is stressed to failure, causing a decrease in the velocities of seismic waves. Recovery of velocity is brought about by subsequent rise of the pore pressure of water, which also has the effect of weakening the rock and enhancing fault slip.

Strain buildup in the focal region may have measurable effects on other observable properties, including electrical conductivity and gas concentration. Because the electrical conductivity of rocks depends largely on interconnected water channels within the rocks, resistivity may increase before the cracks become saturated. As pore fluid is expelled from the closing cracks, the local water table would rise and concentrations of gases such as radioactive radon would increase. No unequivocal confirming measurements have yet been published.

Geologic methods of extending the seismicity record back from the present also are being explored. Field studies indicate that the sequence of surface ruptures along major active faults associated with large earthquakes can sometimes be constructed.

An example is the series of large earthquakes in Turkey in the 20th century, which were caused mainly by successive westward ruptures of the North Anatolian Fault.

Liquefaction effects preserved in beds of sand and peat have provided evidence ( using radiometric dating methods ) for large paleoearthquakes back more than 1,000 years in many seismically active zones, including the U.S. Northwest Pacific Ocean Coastal Region.

Less well-grounded precursory phenomena, particularly earthquake lights and animal behaviour, sometimes draw more public attention than the precursors discussed above.

Reports of unusual lights in the sky and of abnormal animal behaviour preceding earthquakes are known to seismologists – mostly in anecdotal form.

Both phenomena are usually explained away in terms of there being, prior to earthquakes:

– Gaseous emissions from Earth ground;

– Electric stimuli ( various ), e.g. HAARP, etcetera, from Earth ground; and,

– Acoustic stimuli ( various ), e.g. Seismic Wave subsonic emissions from Earth ground.

At the present time, there is no definitive experimental evidence supporting reported claims of animals sometimes sensing an approaching earthquake.

… [ CENSORED-OUT ] …

Earthquake Hazard Reduction Methods

Considerable work has been done in seismology to explain the characteristics of the recorded ground motions in earthquakes. Such knowledge is needed to predict ground motions in future earthquakes so that earthquake-resistant structures can be designed.

Although earthquakes cause death and destruction via secondary effects ( i.e. landslides, tsunamis, fires and fault rupture ), the greatest losses – of human lives and property – result from the collapse of man-made structures amidst violent ground shaking.

The most effective way to mitigate ( minimize ) damage from earthquakes – from an engineering standpoint – is to design and construct structures capable of withstanding ‘strong ground motions’.

Interpreting recorded ground motions

Most ‘elastic waves’ recorded ( close to an extended fault source ) are complicated and difficult to interpret uniquely.

Understanding such near-source motion can be viewed as a three ( 3 ) part problem.

The first ( 1st ) part concerns the generation of elastic waves radiating from the slipping fault as the moving rupture sweeps out an area of slip along the fault plane within a given time.

The wave pattern produced depends on several parameters, such as:

Fault dimension and rupture velocity.

Elastic waves ( various types ) radiate, from the vicinity of the moving rupture, in all directions.

Geometric and frictional properties of the fault critically affect the pattern of waves radiated from it.

The second ( 2nd ) part of the problem concerns the passage of the waves through the intervening rocks to the site and the effect of geologic conditions.

The third ( 3rd ) part involves the conditions at the recording site itself, such as topography and highly attenuating soils. All these questions must be considered when estimating likely earthquake effects at a site of any proposed structure.

Experience has shown that strong-motion recordings of the ground have variable patterns in detail but predictable regular shapes in general ( except in the case of strong multiple earthquakes ).

EXAMPLE: Actual ground shaking ( acceleration, velocity and displacement ) recorded during an earthquake ( see figure below ).

In a strong horizontal shaking of the ground near the fault source, there is an initial segment of motion made up mainly of P waves, which frequently manifest themselves strongly in the vertical motion. This is followed by the onset of S waves, often associated with a longer-period pulse of ground velocity and displacement related to the near-site fault slip or fling. This pulse is often enhanced in the direction of the fault rupture and normal to it. After the S onset there is shaking that consists of a mixture of S and P waves, but the S motions become dominant as the duration increases. Later, in the horizontal component, surface waves dominate, mixed with some S body waves. Depending on the distance of the site from the fault and the structure of the intervening rocks and soils, surface waves are spread out into long trains.

Expectant Seismic Hazard Maps Constructed

In many regions, seismic expectancy maps or hazard maps are now available for planning purposes. The anticipated intensity of ground shaking is represented by a number called the peak acceleration or the peak velocity.

To avoid weaknesses found in earlier earthquake hazard maps, the following general principles are usually adopted today:

The map should take into account not only the size but also the frequency of earthquakes.

The broad regionalization pattern should use historical seismicity as a database, including the following factors: major tectonic trends, acceleration attenuation curves, and intensity reports.

Regionalization should be defined by means of contour lines with design parameters referred to ordered numbers on neighbouring contour lines ( this procedure minimizes sensitivity concerning the exact location of boundary lines between separate zones ).

The map should be simple and not attempt to microzone the region.

The mapped contoured surface should not contain discontinuities, so that the level of hazard progresses gradually and in order across any profile drawn on the map.

Developing resistant structures

Developing engineered structural designs that are able to resist the forces generated by seismic waves can be achieved either by following building codes based on hazard maps or by appropriate methods of analysis. Many countries reserve theoretical structural analyses for the larger, more costly, or critical buildings to be constructed in the most seismically active regions, while simply requiring that ordinary structures conform to local building codes. Economic realities usually determine the goal, not of preventing all damage in all earthquakes but of minimizing damage in moderate, more common earthquakes and ensuring no major collapse at the strongest intensities. An essential part of what goes into engineering decisions on design and into the development and revision of earthquake-resistant design codes is therefore seismological, involving measurement of strong seismic waves, field studies of intensity and damage, and the probability of earthquake occurrence.

Earthquake risk can also be reduced by rapid post-earthquake response. Strong-motion accelerographs have been connected in some urban areas, such as Los Angeles, Tokyo, and Mexico City, to interactive computers.

Recorded waves are correlated with seismic intensity scales and rapidly displayed graphically on regional maps via the World Wide Web.

Exploration of the Earth’s interior with seismic waves

Seismological Tomography

Seismological data on Earth's deep structure come from several sources, including:

– Nuclear explosions containing P-Waves and S-Waves;

– Earthquakes containing P-Waves and S-Waves;

– Earth ‘surface wave dispersions’ from ‘distant earthquakes’; and,

– Earth ‘planetary vibration’ from ‘Great Earthquakes’.

One of the major aims of seismology has been to infer a minimum set of properties of the planet's interior that might explain recorded seismic ‘wave trains’ in detail.

Exploration of Earth's deep structure made tremendous progress during the first half of the 20th century ( 1900s – 1950s ), yet the realization of its goals was severely limited until the 1960s because of the laborious effort required just to evaluate theoretical models and to process the large amounts of recorded earthquake data.

Today, supercomputers apply high-speed processing to enormous quantities of stored data, and their information retrieval capabilities have opened information technology ( IT ) pathways leading to major advances in data handling for advanced theoretical modeling, research analytics, and developmental prototyping.

Since the middle 1970s, researchers' realistic modeling studies of Earth structure have included continental and oceanic boundaries, mountains, and river valleys rather than simple structures such as those involving variation only with depth, and various technical developments have benefited observational seismology.

EXAMPLE: Significant exploration of Earth's deep structure now uses 3D ( three-dimensional ) imaging with equally impressive display ( monitor ) equipment, made possible by advances in microprocessor architecture, new materials, and new concepts, while seismic exploration techniques developed by the petroleum industry ( e.g. seismic reflection ) have become widely adopted procedures.

The major method for determining the structure of the planet's interior is detailed analysis of seismograms of seismic waves; earthquake readings additionally provide estimates of Earth's internal:

– Wave velocities;

– Density; and,

– Parameters of ‘elasticity’ ( stretchable ) and ‘inelasticity’ ( fixed ).

Earthquake Travel Time

The primary procedure is to measure the travel times of various wave types, such as P and S, from their source to the recording seismograph. First, however, each wave type must be identified with its ray path through the Earth.

Seismic rays for many paths of P and S waves leaving the earthquake focus F are shown in the figure.

Deep-Focus Deep-Structure Earth Coremetrics

Rays corresponding to waves that have been reflected at the Earth’s outer surface (or possibly at one of the interior discontinuity surfaces) are denoted as PP, PS, SP, PSS, and so on. For example, PS corresponds to a wave that is of P type before surface reflection and of S type afterward. In addition, there are rays such as pPP, sPP, and sPS, the symbols p and s corresponding to an initial ascent to the outer surface as P or S waves, respectively, from a deep focus.

An especially important class of rays is associated with a discontinuity surface separating the central core of the Earth from the mantle at a depth of about 2,900 km (1,800 miles) below the outer surface. The symbol c is used to indicate an upward reflection at this discontinuity. Thus, if a P wave travels down from a focus to the discontinuity surface in question, the upward reflection into an S wave is recorded at an observing station as the ray PcS and similarly with PcP, ScS, and ScP. The symbol K is used to denote the part (of P type) of the path of a wave that passes through the liquid central core. Thus, the ray SKS corresponds to a wave that starts as an S wave, is refracted into the central core as a P wave, and is refracted back into the mantle, wherein it finally emerges as an S wave. Such rays as SKKS correspond to waves that have suffered an internal reflection at the boundary of the central core.

The discovery of the existence of an inner core in 1936 by the Danish seismologist Inge Lehmann made it necessary to introduce additional basic symbols. For paths of waves inside the central core, the symbols i and I are used analogously to c and K for the whole Earth; therefore, i indicates reflection upward at the boundary between the outer and inner portions of the central core, and I corresponds to the part (of P type) of the path of a wave that lies inside the inner portion. Thus, for instance, discrimination needs to be made between the rays PKP, PKiKP, and PKIKP. The first of these corresponds to a wave that has entered the outer part of the central core but has not reached the inner core, the second to one that has been reflected upward at the inner core boundary, and the third to one that has penetrated into the inner portion.

By combining the symbols p, s, P, S, c, K, i, and I in various ways, notation is developed for all the main rays associated with body earthquake waves.
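A small lookup-table sketch in Python that spells out the ray-path symbols described above; it is illustrative only and does not check whether a given combination is physically meaningful:

# Expand a body-wave phase name, letter by letter, using the symbols defined above.
SYMBOLS = {
    "p": "initial ascent to the surface as a P wave from a deep focus",
    "s": "initial ascent to the surface as an S wave from a deep focus",
    "P": "P-wave leg through the mantle or crust",
    "S": "S-wave leg through the mantle or crust",
    "c": "reflection upward at the core-mantle boundary",
    "K": "P-type leg through the liquid outer core",
    "i": "reflection upward at the inner-core boundary",
    "I": "P-type leg through the solid inner core",
    "J": "S-type leg through the inner core ( no confirmed observations, per the text )",
}
def describe(phase):
    print(phase)
    for letter in phase:
        print(f"  {letter}: {SYMBOLS[letter]}")
describe("PKIKP")   # P wave that penetrates the inner core
describe("ScP")     # S wave reflected at the core-mantle boundary, emerging as a P wave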

Hidden Inner Earth Deep Structure Anomalies

The symbol J has been introduced to correspond with S waves travelling within Earth's inner core, but it will be used only if evidence for such waves is ever found. The use of times of travel along rays to infer a hidden structure is analogous to the use of X-rays in medical tomography. The method involves reconstructing an image of internal anomalies from measurements made at the outer surface. Nowadays, hundreds of thousands of travel times of P and S waves are available in earthquake catalogs for the tomographic imaging of the Earth's interior and the mapping of internal structure.
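The tomographic idea can be demonstrated with a toy straight-ray inversion; the 2 × 2 grid, ray geometry, and ‘true’ slowness values are invented for illustration, whereas real studies use curved rays, enormous travel-time catalogs, and regularized inversions:

import numpy as np
# Recover slowness ( 1/velocity ) in a 2 x 2 grid of cells from straight-ray travel times.
true_slowness = np.array([0.20, 0.25, 0.25, 0.30])    # s/km in cells 0..3 ( invented )
L = np.array([                                        # km of path length in each cell, per ray
    [10.0, 10.0,  0.0,  0.0],    # ray across the top row of cells
    [ 0.0,  0.0, 10.0, 10.0],    # ray across the bottom row
    [10.0,  0.0, 10.0,  0.0],    # ray down the left column
    [ 0.0, 10.0,  0.0, 10.0],    # ray down the right column
    [14.14, 0.0,  0.0, 14.14],   # diagonal ray
])
travel_times = L @ true_slowness                      # synthetic observed travel times
estimated, *_ = np.linalg.lstsq(L, travel_times, rcond=None)
print("recovered slowness per cell:", np.round(estimated, 3))   # matches true_slowness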

Inner Earth Deep Structure

Thinnest & Thickest Part of Earth’s Crust

Inner Earth, based on earthquake records and imaging studies, is officially represented as:

A solid but slowly flowing mantle, about 1,800 miles ( 2,900 kilometres ) thick at its thickest, overlain by a crust that at its thinnest is less than 6 miles ( 10 kilometres ) thick beneath the seafloor of the ultra-deep ocean.

The thin surface rock layer surrounding the mantle is the crust, whose lower boundary is called the Mohorovičić discontinuity. In normal continental regions the crust is about 30 to 40 km thick; there is usually a superficial low-velocity sedimentary layer underlain by a zone in which seismic velocity increases with depth. Beneath this zone there is a layer in which P-wave velocities in some places fall from 6 to 5.6 km per second. The middle part of the crust is characterized by a heterogeneous zone with P velocities of nearly 6 to 6.3 km per second. The lowest layer of the crust ( about 10 km thick ) has significantly higher P velocities, ranging up to nearly 7 km per second.

In the deep ocean there is a sedimentary layer that is about 1 km thick. Underneath is the lower layer of the oceanic crust, which is about 4 km thick. This layer is inferred to consist of basalt that formed where extrusions of basaltic magma at oceanic ridges have been added to the upper part of lithospheric plates as they spread away from the ridge crests. This crustal layer cools as it moves away from the ridge crest, and its seismic velocities increase correspondingly.

Below the mantle lies a shell about 1,400 miles ( 2,255 kilometres ) thick which, seismic waves indicate, has the properties of a liquid ( the outer core ).

At the very centre of the planet is a separate solid core with a radius of 1,216 km. Recent work with observed seismic waves has revealed three-dimensional structural details inside the Earth, especially in the crust and lithosphere, under the subduction zones, at the base of the mantle, and in the inner core. These regional variations are important in explaining the dynamic history of the planet.

Long-Period Global Oscillations

Sometimes earthquakes are so great that the entire planet Earth vibrates like a ringing bell. The deepest tone of vibration ever recorded – measured as the length of time between the arrival of successive crests in the wave train – has a period of 54-minutes, a mode seismologists refer to as the ‘gravest’ ( i.e., lowest-frequency ) free oscillation.

Knowledge of these vibrations has come from a remarkable extension in the range of periods of ground movements that can now be recorded by modern digital long-period seismographs, spanning the entire spectrum of earthquake wave periods: from ordinary P waves ( with periods of tenths of seconds ) to vibrations with periods on the order of 12 and 24 hours, i.e. the movements occurring within Earth's ocean tides.

The measurements of vibrations of the whole Earth provide important information on the properties of the interior of the planet. It should be emphasized that these free vibrations are set up by the energy release of the earthquake source but continue for many hours and sometimes even days. For an elastic sphere such as the Earth, two types of vibrations are known to be possible. In one type, called S modes, or spheroidal vibrations, the motions of the elements of the sphere have components along the radius as well as along the tangent. In the second [ 2nd ] type, which are designated as T modes or torsional vibrations, there is shear but no radial displacements. The nomenclature is nSl and nTl, where the letters n and l are related to the surfaces in the vibration at which there is zero motion. Four ( 4 ) examples are illustrated in the figure. The subscript n gives a count of the number of internal zero-motion ( nodal ) surfaces, and l indicates the number of surface nodal lines.

Several hundred types of S and T vibrations have been identified and the associated periods measured. The amplitudes of the ground motion in the vibrations have been determined for particular earthquakes, and, more important, the attenuation of each component vibration has been measured. The dimensionless measure of this decay constant is called the quality factor Q. The greater the value of Q, the less the wave or vibration damping. Typically, for oS10 and oT10, the Q values are about 250.
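How Q controls the decay of a free oscillation can be sketched with the standard relation A(t) = A0·exp( −πt / QT ); the Q ≈ 250 figure comes from the paragraph above, while the 600-second period is an assumed illustrative value:

import math
Q = 250.0        # quality factor quoted above for oS10 / oT10
T = 600.0        # oscillation period in seconds ( assumed, for illustration )
A0 = 1.0         # initial relative amplitude
for hours in (0, 5, 10, 20):
    t = hours * 3600.0
    amplitude = A0 * math.exp(-math.pi * t / (Q * T))
    print(f"after {hours:2d} h: relative amplitude = {amplitude:.2f}")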

The rate of decay of the vibrations of the whole Earth with the passage of time can be seen in the figure, where they appear superimposed for 20 hours of the 12-hour tidal deformations of the Earth. At the bottom of the figure these vibrations have been split up into a series of peaks, each with a definite frequency, similar to that of the spectrum of light.

Such a spectrum indicates the relative amplitude of each harmonic present in the free oscillations. If the physical properties of the Earth’s interior were known, all these individual peaks could be calculated directly. Instead, the internal structure must be estimated from the observed peaks.

Recent research has shown that observations of long-period oscillations of the Earth discriminate fairly finely between different Earth models. In applying the observations to improve the resolution and precision of such representations of the planet’s internal structure, a considerable number of Earth models are set up, and all the periods of their free oscillations are computed and checked against the observations. Models can then be successively eliminated until only a small range remains. In practice, the work starts with existing models; efforts are made to amend them by sequential steps until full compatibility with the observations is achieved, within the uncertainties of the observations. Even so, the resulting computed Earth structure is not a unique solution to the problem.

Extraterrestrial Seismic Phenomena

Space vehicles have carried seismic recording equipment to the surfaces of the Moon and Mars, and seismologists on Earth receive telemetry signals of seismic events from both.

By 1969, seismographs had been placed at six sites on the Moon during the U.S. Apollo missions. Recording of seismic data ceased in September 1977. The instruments detected between 600 and 3,000 moonquakes during each year of their operation, though most of these seismic events were very small. The ground noise on the lunar surface is low compared with that of the Earth, so that the seismographs could be operated at very high magnifications. Because there was more than one station on the Moon, it was possible to use the arrival times of P and S waves at the lunar stations from the moonquakes to determine foci in the same way as is done on the Earth.

Moonquakes are of three types. First, there are the events caused by the impact of lunar modules, booster rockets, and meteorites. The lunar seismograph stations were able to detect meteorites hitting the Moon’s surface more than 1,000 km (600 miles) away. The two other types of moonquakes had natural sources in the Moon’s interior: they presumably resulted from rock fracturing, as on Earth. The most common type of natural moonquake had deep foci, at depths of 600 to 1,000 km; the less common variety had shallow focal depths.

Seismological research on Mars has been less successful. Only one of the seismometers carried to the Martian surface by the U.S. Viking landers during the mid-1970s remained operational, and only one potential marsquake was detected in 546 Martian days.

Historical Major Earthquakes

Major historical earthquakes are listed chronologically in the table ( below ).

 

Major Earthquake History

Year | Region / Area Affected | Mag.* | Intensity | Human Deaths ( approx. ) | Remarks

c. 1500 BCE | Knossos, Crete ( Greece ) | – | X | – | One of several events that leveled the capital of Minoan civilization, this quake accompanied the explosion of the nearby volcanic island of Thera.
27 BCE | Thebes ( Egypt ) | – | – | – | This quake cracked one of the statues known as the Colossi of Memnon, and for almost two centuries the “singing Memnon” emitted musical tones on certain mornings as it was warmed by the Sun’s rays.
62 CE | Pompeii and Herculaneum ( Italy ) | – | X | – | These two prosperous Roman cities had not yet recovered from the quake of 62 when they were buried by the eruption of Mount Vesuvius in 79.
115 | Antioch ( Antakya, Turkey ) | – | XI | – | A centre of Hellenistic and early Christian culture, Antioch suffered many devastating quakes; this one almost killed the visiting Roman emperor Trajan.
1556 | Shaanxi ( province ) China | – | IX | 830,000 | Possibly the deadliest earthquake ever recorded.
1650 | Cuzco ( Peru ) | 8.1 | VIII | – | Many of Cuzco’s Baroque monuments date to the rebuilding of the city after this quake.
1692 | Port Royal ( Jamaica ) | – | – | 2,000 | Much of this British West Indies port, a notorious haven for buccaneers and slave traders, sank beneath the sea following the quake.
1693 | southeastern Sicily ( Italy ) | – | XI | 93,000 | Syracuse, Catania, and Ragusa were almost completely destroyed but were rebuilt with a Baroque splendour that still attracts tourists.
1755 | Lisbon ( Portugal ) | – | XI | 62,000 | The Lisbon earthquake of 1755 was felt as far away as Algiers and caused a tsunami that reached the Caribbean.
1780 | Tabriz ( Iran ) | 7.7 | – | 200,000 | This ancient highland city was destroyed and rebuilt, as it had been in 791, 858, 1041, and 1721 and would be again in 1927.
1811 – 1812 | New Madrid, Missouri ( USA ) | 7.5 – 7.7 | XII | – | A series of quakes at the New Madrid Fault caused few deaths, but the New Madrid earthquake of 1811 – 1812 rerouted portions of the Mississippi River and was felt from Canada to the Gulf of Mexico.
1812 | Caracas ( Venezuela ) | 9.6 | X | 26,000 | A provincial town in 1812, Caracas recovered and eventually became Venezuela’s capital.
1835 | Concepción ( Chile ) | 8.5 | – | 35 | British naturalist Charles Darwin, witnessing this quake, marveled at the power of the Earth to destroy cities and alter landscapes.
1886 | Charleston, South Carolina ( USA ) | – | IX | 60 | This was one of the largest quakes ever to hit the eastern United States.
1895 | Ljubljana ( Slovenia ) | 6.1 | VIII | – | Modern Ljubljana is said to have been born in the rebuilding after this quake.
1906 | San Francisco, California ( USA ) | 7.9 | XI | 700 | San Francisco still dates its modern development from the San Francisco earthquake of 1906 and the resulting fires.
1908 | Messina and Reggio di Calabria ( Italy ) | 7.5 | XII | 110,000 | These two cities on the Strait of Messina were almost completely destroyed in what is said to be Europe’s worst earthquake ever.
1920 | Gansu ( province ) China | 8.5 | – | 200,000 | Many of the deaths in this quake-prone province were caused by huge landslides.
1923 | Tokyo-Yokohama ( Japan ) | 7.9 | – | 142,800 | Japan’s capital and its principal port, located on soft alluvial ground, suffered severely from the Tokyo-Yokohama earthquake of 1923.
1931 | Hawke Bay ( New Zealand ) | 7.9 | – | 256 | The bayside towns of Napier and Hastings were rebuilt in an Art Deco style that is now a great tourist attraction.
1935 | Quetta ( Pakistan ) | 7.5 | X | 20,000 | The capital of Balochistan province was severely damaged in the most destructive quake to hit South Asia in the 20th century.
1948 | Ashgabat ( Turkmenistan ) | 7.3 | X | 176,000 | Every year, Turkmenistan commemorates the utter destruction of its capital in this quake.
1950 | Assam ( India ) | 8.7 | X | 574 | The largest quake ever recorded in South Asia killed relatively few people in a lightly populated region along the Indo-Chinese border.
1960 | Valdivia and Puerto Montt ( Chile ) | 9.5 | XI | 5,700 | The Chile earthquake of 1960, the largest quake ever recorded in the world, produced a tsunami that crossed the Pacific Ocean to Japan, where it killed more than 100 people.
1963 | Skopje ( Macedonia ) | 6.9 | X | 1,070 | The capital of Macedonia had to be rebuilt almost completely following this quake.
1964 | Prince William Sound, Alaska ( USA ) | 9.2 | – | 131 | Anchorage, Seward, and Valdez were damaged, but most deaths in the Alaska earthquake of 1964 were caused by tsunamis in Alaska and as far away as California.
1970 | Chimbote ( Peru ) | 7.9 | – | 70,000 | Most of the damage and loss of life resulting from the Ancash earthquake of 1970 was caused by landslides and the collapse of poorly constructed buildings.
1972 | Managua ( Nicaragua ) | 6.2 | – | 10,000 | The centre of the capital of Nicaragua was almost completely destroyed; the business section was later rebuilt some 6 miles ( 10 km ) away.
1976 | Guatemala City ( Guatemala ) | 7.5 | IX | 23,000 | Rebuilt following a series of devastating quakes in 1917–18, the capital of Guatemala again suffered great destruction.
1976 | Tangshan ( China ) | 7.5 | X | 242,000 | In the Tangshan earthquake of 1976, this industrial city was almost completely destroyed in the worst earthquake disaster in modern history.
1985 | Michoacán state and Mexico City ( Mexico ) | 8.1 | IX | 10,000 | The centre of Mexico City, built largely on the soft subsoil of an ancient lake, suffered great damage in the Mexico City earthquake of 1985.
1988 | Spitak and Gyumri ( Armenia ) | 6.8 | X | 25,000 | This quake destroyed nearly one-third of Armenia’s industrial capacity.
1989 | Loma Prieta, California ( USA ) | 7.1 | IX | 62 | The San Francisco–Oakland earthquake of 1989, the first sizable movement of the San Andreas Fault since 1906, collapsed a section of the San Francisco–Oakland Bay Bridge.
1994 | Northridge, California ( USA ) | 6.8 | IX | 60 | Centred in the urbanized San Fernando Valley, the Northridge earthquake of 1994 collapsed freeways and some buildings, but damage was limited by earthquake-resistant construction.
1995 | Kobe ( Japan ) | 6.9 | XI | 5,502 | The Great Hanshin Earthquake destroyed or damaged 200,000 buildings and left 300,000 people homeless.
1999 | Izmit ( Turkey ) | 7.4 | X | 17,000 | The Izmit earthquake of 1999 heavily damaged the industrial city of Izmit and the naval base at Golcuk.
1999 | Nan-t’ou county ( Taiwan ) | 7.7 | X | 2,400 | The Taiwan earthquake of 1999, the worst to hit Taiwan since 1935, provided a wealth of digitized data for seismic and engineering studies.
2001 | Bhuj, Gujarat ( state ) India | 8.0 | X | 20,000 | The Bhuj earthquake of 2001, possibly the deadliest ever to hit India, was felt across India and Pakistan.
2003 | Bam ( Iran ) | 6.6 | IX | 26,000 | This ancient Silk Road fortress city, built mostly of mud brick, was almost completely destroyed.
2004 | Aceh ( province ), Sumatra ( Indonesia ) | 9.1 | – | 200,000 | The deaths resulting from this offshore quake actually were caused by a tsunami originating in the Indian Ocean that, in addition to killing more than 150,000 in Indonesia, killed people as far away as Sri Lanka and Somalia.
2005 | Azad Kashmir ( Pakistan-administered Kashmir ) | 7.6 | VIII | 80,000 | The Kashmir earthquake of 2005, perhaps the deadliest shock ever to strike South Asia, left hundreds of thousands of people exposed to the coming winter weather.
2008 | Sichuan ( province ) China | 7.9 | IX | 69,000 | The Sichuan earthquake of 2008 left over 5 million people homeless across the region, and over half of Beichuan city was destroyed by the initial seismic event and the release of water from a lake formed by nearby landslides.
2009 | L’Aquila ( Italy ) | 6.3 | VIII | 300 | The L’Aquila earthquake of 2009 left more than 60,000 people homeless and damaged many of the city’s medieval buildings.
2010 | Port-au-Prince ( Haiti ) | 7.0 | IX | 316,000 | The Haiti earthquake of 2010 devastated the metropolitan area of Port-au-Prince and left an estimated 1.5 million survivors homeless.
2010 | Maule ( Chile ) | 8.8 | VIII | 521 | The Chile earthquake of 2010 produced widespread damage in Chile’s central region and triggered tsunami warnings throughout the Pacific basin.
2010 | Christchurch ( New Zealand ) | 7.0 | VIII | 180 | Most of the devastation associated with the Christchurch earthquakes of 2010–11 resulted from a magnitude-6.3 aftershock that struck on February 22, 2011.
2011 | Honshu ( Japan ) | 9.0 | VIII | 20,000 | The powerful Japan earthquake and tsunami of 2011, which sent tsunami waves across the Pacific basin, caused widespread damage throughout eastern Honshu.
2011 | Erciş and Van ( Turkey ) | 7.2 | IX | – | The Erciş-Van earthquake of 2011 destroyed several apartment complexes and shattered mud-brick homes throughout the region.

Data Sources: National Oceanic and Atmospheric Administration ( NOAA ), National Geophysical Data Center ( NGDC ), Significant Earthquake Database ( SED ), a searchable online database using the Catalog of Significant Earthquakes 2150 B.C. – 1991 A.D. ( with Addenda ), and U.S. Geological Survey ( USGS ), Earthquake Hazards Program. * Measures of magnitude may differ from other sources.


Additional Reading

Earthquakes are covered mainly in books on seismology.

Recommended introductory texts, are:

Bruce A. Bolt, Earthquakes, 4th ed. (1999), and Earthquakes and Geological Discovery (1993); and,

Jack Oliver, Shocks and Rocks: Seismology and the Plate Tectonics Revolution (1996).

Comprehensive books on key aspects of seismic hazards, are:

Leon Reiter, Earthquake Hazard Analysis – Issues and Insights (1990); and,

Robert S. Yeats, Kerry Sieh, and Clarence R. Allen, The Geology of Earthquakes (1997).

A history of the discrimination between underground nuclear explosions and natural earthquakes is given by:

Bruce A. Bolt, “Nuclear Explosions and Earthquakes: The Parted Veil” ( 1976 ).

More advanced texts that treat the theory of earthquake waves in detail, are:

Agustín Udías, Principles of Seismology (1999);

Thorne Lay and Terry C. Wallace, Modern Global Seismology (1995);

Peter M. Shearer, Introduction to Seismology (1999); and,

K.E. Bullen and Bruce A. Bolt, An Introduction to the Theory of Seismology, 4th ed. (1985).



Citations


MLA Style: “earthquake.” Encyclopædia Britannica. Encyclopædia Britannica Online. Encyclopædia Britannica Inc., 2012. Web. 21 Mar. 2012. http://www.britannica.com/EBchecked/topic/176199/earthquake

Reference

http://www.britannica.com/EBchecked/topic/176199/earthquake/247989/Shallow-intermediate-and-deep-foci?anchor=ref105456

– – – –

Feeling ‘educated’? Think you’re out of the earthquake and tsunami waters on this subject?

News from March 23, 2012, however, contradicts decades of professional scientific knowledge and studies, so if you were just feeling ‘overly educated’ about earthquakes and tsunamis – don’t be. You are now lost at sea, in the same proverbial ‘boat’, with all those global government scientific and technical ( S&T ) professionals who thought they understood the previous information surrounding earthquakes and tsunamis.

After comparing the Japan 9.0 earthquake directional arrows depicted on the charts ( further above ) with ocean currents, tidal charts, and trade winds from the global jet stream, there is a problem that cannot easily be explained: on March 23, 2012, British Columbia, Canada reported that its northwest Pacific Ocean coastal waters held a 100-foot fishing boat still afloat – more than 1 year after the tsunami generated by Japan’s 9.0 earthquake on March 11, 2011.

[ IMAGE ( above ): 11MAR11 Japan 9.0 earthquake tsunami victim fishing boat ( 50-metre ) found more than 1-year later still adrift in the Pacific Ocean – but thousands of miles away – off the North America Pacific Ocean west coast territory of Haida Gwaii, British Columbia, Canada ]

– – – –

Source: CBS News – British Columbia ( Canada )

Tsunami Linked Fishing Boat Adrift Off B.C.

Nobody Believed Aboard 50-Meter Vessel Swept Away In 2011 Japanese Disaster – CBC News

March 23, 2012 21:35 ( PST ) Updated from: 23MAR12 18:59 ( PST )

A Japanese fishing boat that was washed out to sea in the March 2011 Japanese tsunami has been located adrift off the coast of British Columbia ( B.C. ), according to the federal Transport Ministry.

The 50-metre vessel was spotted by the crew of an aircraft on routine patrol about 275 kilometres off Haida Gwaii, formerly known as the Queen Charlotte Islands, ministry spokeswoman Sau Sau Liu said Friday.

“Close visual aerial inspection and hails to the ship indicate there is no one on board,” Liu said. “The owner of the vessel has been contacted and made aware of its location.”

U.S. Senator Maria Cantwell, of Washington, said in a release that the boat was expected to drift slowly southeast.

“On its current trajectory and speed, the vessel would not [ yet ] make landfall for approximately 50-days,” Cantwell said. Cantwell did not specify where landfall was expected to be.

First large debris

The boat is the first large piece of debris found following the earthquake and tsunami that struck Japan one year ago.

Scientists at the University of Hawaii say a field of about 18 million tonnes of debris is slowly being carried by ocean currents toward North America. The field is estimated to be about 3,200 kilometres long and 1,600 kilometres wide.

Scientists have estimated some of the debris would hit B.C. shores by 2014.

Some people on the west coast of Vancouver Island believe ‘smaller pieces of debris have already washed ashore there’.

The March 11, 2011, tsunami was generated after a magnitude 9.0 earthquake struck off the coast of northern Japan. The huge waves and swells of the tsunami moved inland and then retreated back into the Pacific Ocean, carrying human beings, wreckage of buildings, cars and boats.

Nearly 19,000 people were killed.

Reference

http://www.cbc.ca/news/canada/british-columbia/story/2012/03/23/bc-fishing-boat-tsunami-debris.html?cmp=rss

– – – –

Submitted for review and commentary by,

Kentron Intellect Research Vault

E-MAIL: KentronIntellectResearchVault@Gmail.Com

WWW: http://KentronIntellectResearchVault.WordPress.Com

References

http://earthquake.usgs.gov/earthquakes/world/japan/031111_M9.0prelim_geodetic_slip.php
http://en.wikipedia.org/wiki/Moment_magnitude_scale
http://www.gsi.go.jp/cais/topic110315.2-index-e.html
http://www.seismolab.caltech.edu
http://www.tectonics.caltech.edu/slip_history/2011_taiheiyo-oki
http://supersites.earthobservations.org/ARIA_japan_co_postseismic.pdf
ftp://sideshow.jpl.nasa.gov/pub/usrs/ARIA/README.txt
http://speclib.jpl.nasa.gov/documents/jhu_desc
http://earthquake.usgs.gov/regional/pacnw/paleo/greateq/conf.php
http://www.passcal.nmt.edu/content/array-arrays-elusive-ets-cascadia-subduction-zone
http://wcda.pgc.nrcan.gc.ca:8080/wcda/tams_e.php
http://www.pnsn.org/tremor
http://earthquake.usgs.gov/earthquakes/recenteqscanv/Quakes/quakes_all.html
http://nthmp.tsunami.gov
http://wcatwc.arh.noaa.gov
http://www.pnsn.org/NEWS/PRESS_RELEASES/CAFE/CAFE_intro.html
http://www.pnsn.org/WEBICORDER/DEEPTREM/summer2009.html
http://earthquake.usgs.gov/prepare
http://www.passcal.nmt.edu/content/usarray
http://www.iris.washington.edu/hq
http://www.iris.edu/dms/dmc
http://www.iris.edu/dhi/clients.htm
http://www.iris.edu/hq/middle_america/docs/presentations/1026/MORENO.pdf
http://www.unavco.org/aboutus/history.html
http://earthquake.usgs.gov/monitoring/anss
http://earthquake.usgs.gov/regional/asl
http://earthquake.usgs.gov/regional/asl/data
http://www.usarray.org/files/docs/pubs/US_Data_Plan_Final-V7.pdf
http://neic.usgs.gov/neis/gis/station_comma_list.asc
http://earthquake.usgs.gov/research/physics/lab
http://earthquake.usgs.gov/regional/asl/data
http://pubs.usgs.gov/gip/dynamic/Pangaea.html
http://coaps.fsu.edu/scatterometry/meeting/docs/2009_august/intro/shimoda.pdf
http://coaps.fsu.edu/scatterometry/meeting/past.php
http://eqinfo.ucsd.edu/dbrecenteqs/anza
http://www.ceri.memphis.edu/seismic
http://conceptactivityresearchvault.wordpress.com/2011/03/28/global-environmental-intelligence-gei
http://www.af.mil/information/factsheets/factsheet_print.asp?fsID=157&page=1
https://login.afwa.af.mil/register/
https://login.afwa.af.mil/amserver/UI/Login?goto=https%3A%2F%2Fweather.afwa.af.mil%3A443%2FHOST_HOME%2FDNXM%2FWRF%2Findex.html
http://www.globalsecurity.org/space/library/news/2011/space-110225-afns01.htm
http://www.wrf-model.org/plots/wrfrealtime.php

MSN Warns Disasters


MSN Warns Disasters by, Concept Activity Research Vault

March 7, 2012 18:22:42 ( PST ) Updated ( Originally Published: May 16, 2011 )

Los Angeles – May 16, 2011 – MSN Slate News reported ( read article below ) that a host of disasters are coming, which ‘the public should not become overly worried about’, but suggests throwing a celebration-like “18th Century Weekend” in-advance so ‘people can experience what a solar flare disaster might be like to live through’.

While the suggested celebratory '18th Century Weekend' affair 'concept' is 'unique', MSN suggesting the public 'stock up on batteries' was a bit off because apparently the journalist did not realize 'batteries become drained' subsequent to an environmental anomaly 'overcharging' from an ambient auroral current attributable to what occurs during a Solar Energetic Particle Event ( SEPE ); 'candles' or 'luminescent gel sticks' ( shake lights ) would work amidst such. However, mainstream news media broadcasts and print media, without thoroughly researching facts first, have a habit of passing inaccurate information on to the general public, and of doing it mostly in a whimsical fashion so it can be easily swallowed by the public. That type of reporting does 'not' help the public; it only serves to provide the illusion that what is being reported about will probably never happen. Big mistake!

News reporting, as a public service, should take far more care when reporting about emergency disaster preparedness on ‘what to do’ and just ‘how to prepare’; especially when it comes to mentioning a ‘significant’ solar flare ( also known as ) a Solar Energetic Particle Event ( SEPE ) that could quickly and very seriously disable the national electricity infrastructure without warning.

To let CARV readers review how MSN Slate News recently put it to the general public, we cordially invite 'you' ( our readers ) to review the MSN Slate News report ( below ) so you can be the judge of whom to rely on for your emergency disaster preparedness information.

– –

Source: MSN Slate News

Meltdowns. Floods. Tornadoes. Oil spills. Grid crashes. Why more and more things seem to be going wrong, and what we can do about it.

The Century of Disasters by, Joel Achenbach

May 13, 2011 5:56 PM ( EST )

This will be the century of disasters.

In the same way that the 20th century was the century of world wars, genocide, and grinding ideological conflict, the 21st century will be the century of natural disasters and technological crises and unholy combinations of the two.

It will be the century when the things we count on to go right will – for whatever reason – go wrong.

Late last month ( April 2011 ), as the Mississippi River rose in what is destined to be the worst flood in decades, residents of Alabama and other states rummaged through the debris of a historic tornado outbreak.

Physicists at a meeting in Anaheim, California had a discussion about the dangers posed by the Sun.

Solar flares, scientists believe, are a disaster waiting to happen. Thus one of the sessions at the American Physical Society annual meeting was devoted to discussing the hazard of electromagnetic pulses ( EMP ) caused by solar flares – or terrorist attacks. Such pulses ( EMP ) could fry transformers and knock out the electrical grid over much of the nation. Last year the Oak Ridge National Laboratory released a study saying the damage might take years to fix and cost trillions of dollars.
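
For readers wondering how a solar storm actually 'fries transformers', the usual explanation is that the storm-time geoelectric field drives quasi-DC 'geomagnetically induced currents' ( GIC ) along long transmission lines and through transformer neutrals. The sketch below is a minimal illustration of that idea only – the field strength, line length and resistance are assumed round numbers, not values from the Oak Ridge study:

```python
# Minimal illustration of a geomagnetically induced current ( GIC ) estimate.
# All numbers below are assumed round values for illustration only.

E_FIELD_V_PER_KM = 2.0       # assumed storm-time geoelectric field along the line
LINE_LENGTH_KM = 500.0       # assumed length of an east-west transmission line
LOOP_RESISTANCE_OHM = 5.0    # assumed total line + transformer + grounding resistance

driving_voltage = E_FIELD_V_PER_KM * LINE_LENGTH_KM     # volts induced end-to-end
gic_amps = driving_voltage / LOOP_RESISTANCE_OHM        # quasi-DC current in the circuit

print(f"Induced driving voltage: {driving_voltage:.0f} V")
print(f"Geomagnetically induced current: {gic_amps:.0f} A (quasi-DC)")
# Quasi-DC current of this size flowing through transformer neutrals can push the
# cores toward half-cycle saturation, which is the heating and reactive-power
# mechanism behind the transformer-damage scenario described above.
```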

But maybe even that is not the disaster people should be worrying about.

Maybe they should worry instead about the “ARkStorm.” That’s the name the U.S. Geological Survey ( USGS ) Multihazards Demonstration Project ( MDP ) gave to a hypothetical storm that would essentially turn much of the California Central Valley into a bathtub. It has happened before, in 1861 – 1862, when it rained for 45-days continuously. USGS explains, “The ARkStorm draws heat and moisture from the tropical Pacific, forming a series of “Atmospheric Rivers” ( AR ) that approach the ferocity of hurricanes and then slam into the United States West Coast over several weeks.” The result, the USGS determined, could be a flood that would cost $725 billion in direct property losses and economic impact.

While pondering this, don’t forget the Cascadia subduction zone, the plate boundary off the coast of the Pacific Northwest, which could generate a tsunami much like the one that devastated Japan in March 2011. The Cascadia subduction zone runs from Vancouver Island to northern California and last ruptured in a major tsunami-spawning earthquake on January 26, 1700. It could break at any moment, with catastrophic consequences.

All of these things have the common feature of low probability and high consequence.

They are known as “black swan” events.

They are unpredictable in any practical sense.

There are also things ordinary people probably should not worry about on a daily basis.

You can’t fear the Sun.

You cannot worry a rock will fall out of the sky and smash the Earth, or that the ground will open up and swallow you like a vitamin.

A key element of maintaining one’s sanity is ‘knowing how to ignore risks’ that are highly improbable at any given point in time.

And yet in the coming century, these or other ‘black swan events’ will seem to occur with surprising frequency.

There are several reasons for this.

We have chosen to engineer the planet.

We have built vast networks of technology.

We have created systems that, in general, work very well, but are still vulnerable to catastrophic failures.

It is harder and harder for any one person, institution, or agency to perceive all the interconnected elements of the technological society.

Failures can cascade.

There are unseen weak points in the network.

Small failures can have broad consequences.

Most importantly, we have more people and more stuff standing in the way of calamity.

We are not suddenly having more earthquakes, but there are now 7,000,000,000 ( 7 billion ) of us, a majority living in cities.

In 1800, only Beijing, China could count 1,000,000 inhabitants, but at last count there were 381 cities with at least 1,000,000 people.

Many are MegaCities in seismically hazardous places – Mexico City; Caracas, Venezuela; Tehran, Iran; and Kathmandu, Nepal are among those with a lethal combination of weak infrastructure ( unreinforced masonry buildings ) and shaky foundations.

Natural disasters will increasingly be accompanied by technological crises, and the other way around.

In March 2011, the Japan earthquake triggered the Fukushima Dai-Ichi nuclear power plant meltdown.

Last year ( 2010 ), a technological failure on the Deepwater Horizon drilling rig – in the Gulf of Mexico – led to the environmental crisis of the oil spill. ( I chronicle the Deepwater Horizon blowout and the ensuing crisis management in a new book: A Hole at the Bottom of the Sea: The Race to Kill the BP Oil Gusher. )

In both the Deepwater Horizon and Fukushima disasters, the safety systems were not nearly as robust as the industries believed.

In these technological accidents, there are hidden pathways for the gremlins to infiltrate the operation.

In the case of Deepwater Horizon, a series of decisions by BP ( oil company ) and its contractors led to a loss of well control – the initial blowout. The massive blowout preventer on the sea floor was equipped with a pair of pinchers known as ‘blind shear rams’. They were supposed to cut the drillpipe and seal the well. The forensic investigation indicated the initial eruption of gas buckled the pipe and prevented the blind shear rams from getting a clean bite on it so, the “backup” plan – of cutting the pipe – was effectively eliminated in the initial event; the loss of well control.

Fukushima also had a backup plan that was not far enough back. The nuclear power plant had backup generators – in case the grid went down – but the generators were on ‘low’ ground and were blasted by the tsunami.

Without electricity the power company had no way to cool the nuclear fuel rods.

In a sense, it was a very simple problem: a power outage.

Some modern reactors coming online have passive cooling systems for backups that rely on gravity and evaporation to circulate the cooling water.

Charles Perrow, author of Normal Accidents, told me that computer infrastructure is a disaster in the making.

“Watch out for failures in cloud computing,” he said by e-mail, “They will have consequences for medical monitoring systems and much else.”

Technology also mitigates disasters, of course.

Pandemics remain a threat, but modern medicine can help us stay a step ahead of evolving microbes.

Satellites and computer models helped meteorologists anticipate the deadly storms of April 27, 2011 and warn people to find cover in advance of the twisters.

Better building codes save lives in earthquakes. Chile, which has strict building codes, was hit with a powerful earthquake last year ( 2010 ) but suffered only a fraction of the fatalities and damage that impoverished Haiti endured just weeks earlier.

The current ( 2011 ) Mississippi flood is an example of technology at work for better and for worse.

As I write, the Army Corps of Engineers are poised to open the Morganza spillway and flood much of the Atchafalaya basin. That’s not a “disaster” but a solution of sorts, since the alternative is the flooding of cities downstream and possible levee failure. Of course, the levees might still fail. We’ll see. But this is how the system is ‘supposed’ to work.

On the other hand, the broader drainage system of the Mississippi River watershed is set up in a way that makes floods more likely. Corn fields, for example in parts of the upper Midwest, have been “tiled” with pipes that carry excess rainwater rapidly to the rip-rap ( small stone laden ) streams and onward down to rivers lined with levees. We gave up natural drainage decades ago.

The Mississippi is like a catheter, at this point. Had nature remained in charge, the river would have mitigated much of its downstream flooding by spreading into natural floodplains further up river ( and the main channel would have long ago switched to the Atchafalaya river basin — see John McPhee “The Control of Nature” — and New Orleans would no longer be a riverfront city).

One wild card for how disastrous this century will become is climate change.

There’s been a robust debate on the blogs about whether the recent weather events ( tornadoes and floods ) can be attributed to climate change.

It is a briar patch of an issue and I’ll exercise my right to skip past it for the most part.

But I think it’s clear that climate change will exacerbate natural disasters in general in coming years, and introduce a new element of risk and uncertainty into a future in which we have plenty of risks and uncertainties already. This, we don’t need.

And by the way, any discussion of “geoengineering” as a solution to climate change needs to be examined with the understanding that engineering systems can and will fail.

You don’t want to bet the future of the planet on an elaborate technological fix in which everything has to work perfectly. If failure is not an option, maybe you ‘should not’ try it to begin with.

So if we cannot engineer our way out of our ‘engineered disasters’, and if ‘natural disasters’ are going to keep pummeling us – as they have since the dawn of time – what is our strategy? Other than, you know, despair? Well, that has always worked for me, but here are a few more practical thoughts to throw in the mix:

First [ 1st ], we might want to try some regulation by people with no skin in the game. That might mean, for example, government regulators who make as much money as the people they’re regulating. Or it could even mean a ‘private-sector regulatory apparatus policing the industry’, cracking down on rogue operators. The point is, we don’t want every risky decision made by people with pecuniary interests.

Second [ 2nd ], we need to keep things in perspective. The apparent onslaught of disasters does not portend the end of the world. Beware of ‘disaster hysteria in the news media’. The serial disasters of the 21st century will be – to some extent – a matter of perception. It will feel like we are bouncing from disaster-to-disaster in-part because of the shrinking of the world and the ubiquity of communications technology. Anderson Cooper and Sanjay Gupta are always in a disaster zone somewhere – demanding to know why the cavalry [ emergency first responders ] has not shown up.

Third [ 3rd ], we should think in terms of ‘how we can boost’ our “societal resilience;” the buzz-word in the ‘disaster preparedness industry’.

Think of what you would do, and what your community would do, after a disaster.

You cannot always dodge the disaster, but perhaps you can still figure-out how to recover quickly.

How would we ‘communicate’ if we got [ solar ] flared by the Sun and the [ electricity ] grid went down over 2/3rds of the country?

How would we even know what was going on?

Maybe we need to have the occasional “18th Century weekend” – to see how people might get through a couple of days without the [ electricity ] grid, cell [ telephone ] towers, cable TV [ television ], iTunes downloads – the full Hobbesian nightmare. And make an emergency plan: Buy some ‘batteries’ [ < ? > NOTE: solar flare effects, during a Solar Energetic Particle Event ( SEPE ), renders ‘all batteries dead’. ] and jugs of water – just for starters.

Figure-out how things around you work.

Learn about your community infrastructure.

Read about science, technology, engineering and ‘do not worry if you do not understand all the jargon’.

And then – having done that – go on about your lives, pursuing happiness on a planet that, though sometimes dangerous, is by-far the best one we’ve got.

Reference

http://www.slate.com/id/2294013/pagenum/all/#p2

– –

Hopefully, people will take an opportunity to read the CARV report on Solar Energetic Particle Event Effects so they can ‘really know what to prepare for soon’ – ‘before celebrating’ an “18th Century weekend” affair – complete with “batteries” – as suggested by the MSN Slate News article ( above ).

Although the aforementioned Slate News article indicates, “Solar flares, scientists believe, are a disaster waiting to happen. Thus one of the sessions at the American Physical Society annual meeting was devoted to discussing the hazard of electromagnetic pulses ( EMP ) caused by solar flares – or terrorist attacks. Such pulses ( EMP ) could fry transformers and knock out the electrical grid over much of the nation. Last year the Oak Ridge National Laboratory released a study saying the damage might take years to fix and cost trillions of dollars. But maybe even that is not the disaster people should be worrying about,” – we actually ‘may’ have something “people should be worrying about,” as MSNBC puts it, or “concerned about,” according to NASA, in light of the following MSNBC Space.Com report ( below ):

– – – –

 

Source: MSNBC.COM

 

Huge Solar Flare’s Magnetic Storm May Disrupt Satellites, Power Grids

 

March 7, 2012 13:19 Eastern Standard Time ( EST )

 

A massive solar flare that erupted from the Sun late Tuesday ( March 6, 2012 ) is unleashing one of the most powerful solar storms in more than 5-years, ‘a solar tempest that may potentially interfere with satellites in orbit and power grids when it reaches Earth’.

 

“Space weather has gotten very interesting over the last 24 hours,” Joseph Kunches, a space weather scientist at the National Oceanic and Atmospheric Administration ( NOAA ), told reporters today ( March 7, 2012 ). “This was quite the Super Tuesday — you bet.”

 

Several NASA spacecraft caught videos of the solar flare as it hurled a wave of solar plasma and charged particles, called a Coronal Mass Ejection ( CME ), into space. The CME is not expected to hit Earth directly, but the cloud of charged particles could deliver a glancing blow to the planet.

 

Early predictions estimate that the Coronal Mass Ejections ( CMEs ) will reach Earth tomorrow ( March 8, 2012 ) at 07:00 a.m. Eastern Standard Time ( EST ), with the ‘effects likely lasting for 24-hours and possibly lingering into Friday ( March 9, 2012 )’, Kunches said.
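
The quoted arrival forecast implies a Sun-to-Earth transit time of roughly a day and a half, which in turn implies a mean CME speed on the order of 1,000 – 1,200 km/s. The sketch below works that arithmetic through; the launch timestamp and the straight-line 1 AU distance are approximations for illustration, not NOAA's model inputs:

```python
from datetime import datetime, timezone

AU_KM = 1.496e8  # mean Sun-Earth distance, km

# Assumed timestamps (UTC) for illustration: eruption late on March 6-7 and the
# forecast arrival of 07:00 EST on March 8 (12:00 UTC).
launch = datetime(2012, 3, 7, 0, 30, tzinfo=timezone.utc)
arrival = datetime(2012, 3, 8, 12, 0, tzinfo=timezone.utc)

transit_hours = (arrival - launch).total_seconds() / 3600
mean_speed_km_s = AU_KM / (transit_hours * 3600)

print(f"Transit time: {transit_hours:.1f} h")
print(f"Implied mean Sun-to-Earth speed: ~{mean_speed_km_s:.0f} km/s")
# Roughly 1,200 km/s -- several times the typical ~400 km/s solar-wind speed,
# which is part of what makes this class of CME geoeffective.
```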

 

The solar eruptions occurred late Tuesday night ( March 6, 2012 ) when the sun let loose two ( 2 ) huge X-Class solar flares that ‘ranked among the strongest type’ of sun storms. The biggest of those 2 flares registered as an X-Class Category 5.4 ( X5.4 ) solar flare on the space weather scale, making it ‘the strongest sun eruption so far this year’.
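
The 'X-Class Category 5.4' designation comes from the GOES soft X-ray flare scale, where the letter ( A, B, C, M, X ) sets a decade of peak 1 – 8 Angstrom flux and the number multiplies it. A minimal helper, written here only to make that scale concrete:

```python
# Convert a GOES flare class string (e.g. "X5.4") to its peak 1-8 Angstrom
# X-ray flux in watts per square metre. The letter sets the decade, the
# number scales it: A=1e-8, B=1e-7, C=1e-6, M=1e-5, X=1e-4 W/m^2.

DECADES = {"A": 1e-8, "B": 1e-7, "C": 1e-6, "M": 1e-5, "X": 1e-4}

def flare_class_to_flux(flare_class: str) -> float:
    letter, magnitude = flare_class[0].upper(), float(flare_class[1:])
    return DECADES[letter] * magnitude

for cls in ("X1.1", "X5.4"):
    print(f"{cls}: peak flux ~{flare_class_to_flux(cls):.1e} W/m^2")
# X5.4 is ~5.4e-4 W/m^2 -- roughly five times the flux of the X1.1 flare
# from the same sunspot group mentioned later in this article.
```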

 

Typically, Coronal Mass Ejections ( CMEs ) contain 10,000,000,000 ( 10 billion ) tons of solar plasma and material, and the CME triggered by last night’s ( March 6, 2012 ) X-Class Category 5.4 solar flare is ‘the one’ that could disrupt satellite operations, Kunches said.

 

“When the shock arrives, the expectation is for heightened geomagnetic storm activity and the potential for heightened solar radiation,” Kunches said.

 

This heightened geomagnetic activity and increase in solar radiation could impact satellites in space and ‘power grids on the ground’.

 

Some high-precision GPS ( Global Positioning System ) users could also be affected, he said.

 

“There is the potential for ‘induced currents in power grids’,” Kunches said. “‘Power grid operators have all been alerted’. It could start to ’cause some unwanted induced currents’.”

 

Airplanes that fly over the polar caps could also experience communications issues during this time, and some commercial airliners have already taken precautionary actions, Kunches said.

 

Powerful solar storms can also be hazardous to astronauts in space, and NOAA is working closely with NASA’s Johnson Space Center to determine if the six ( 6 ) residents of the International Space Station ( ISS ) need to take shelter in more protected areas of the orbiting laboratory, he added.

 

The flurry of recent space weather events could also supercharge aurora displays ( also known as the Northern Lights and Southern Lights ) for sky-watchers at high latitudes.

 

“Auroras are probably the treat that we get when the sun erupts,” Kunches said.

 

Over the next couple days, Kunches estimates that brightened auroras could potentially be seen as far south as the southern Great Lakes region, provided the skies are clear.

 

Yesterday’s ( March 6, 2012 ) solar flares erupted from the giant active sunspot AR1429, which spewed an earlier X Class Category 1.1 solar flare on Sunday ( March 4, 2012 ). The CME from that one ( 1 ) outburst mostly missed Earth, passing Earth by last night ( March 6, 2012 ) at around 11 p.m. EST, according to the Space Weather Prediction Center ( SWPC ), which is jointly managed by NOAA and the National Weather Service ( NWS ).

 

This means that the planet ( Earth ) is ‘already experiencing heightened geomagnetic and radiation effects in-advance’ of the next oncoming ( March 8, 2012 thru March 9, 2012 ) Coronal Mass Ejection ( CME ).

 

“We’ve got ‘a whole series of things going off’, and ‘they take different times to arrive’, so they’re ‘all piling on top of each other’,” Harlan Spence, an astrophysicist at the University of New Hampshire, told SPACE.com. “It ‘complicates the forecasting and predicting’ because ‘there are always inherent uncertainties with any single event’ but now ‘with multiple events piling on top of one another’, that ‘uncertainty grows’.”

 

Scientists are closely monitoring the situation, particularly because ‘the AR1429 sunspot region remains potent’. “We think ‘there will be more coming’,” Kunches said. “The ‘potential for more activity’ still looms.”

 

As the Sun rotates, ‘the AR1429 region is shifting closer to the central meridian of the solar disk’, where flares and associated Coronal Mass Ejections ( CMEs ) may ‘pack more of a punch’ because ‘they are more directly pointed at Earth’.

 

“The Sun is waking up at a time in the month when ‘Earth is coming into harm’s way’,” Spence said. “Think of these ‘CMEs somewhat like a bullet that is shot from the sun in more or less a straight line’. ‘When the sunspot is right in the middle of the sun’, something ‘launched from there is more or less directed right at Earth’. It’s kind of like how getting sideswiped by a car is different than ‘a head-on collision’. Even still, being ‘sideswiped by a big CME can be quite dramatic’.” Spence estimates that ‘sunspot region AR1429 will rotate past the central meridian in about 1-week’.

 

The sun’s activity ebbs and flows on an 11-year cycle. The sun is in the midst of Solar Cycle 24, and activity is expected to ramp up toward the height of the solar maximum in 2013.

 

Reference

 

http://www.msnbc.msn.com/id/46655901/

 

– – – –

Do we need “Planetary Protection?” NASA has a specific website, referenced here ( below ) as do others ( below ), including The Guardians of the Millennium.

Submitted for review and commentary by,

Concept Activity Research Vault

E-MAIL: ConceptActivityResearchVault@Gmail.Com

WWW: http://ConceptActivityResearchVault.WordPress.Com

Reference

http://planetaryprotection.nasa.gov/about/ [ Planetary Protection ]
http://www.lpi.usra.edu/captem/ [ CAPTEM ]
http://www.nrl.navy.mil/pao/pressRelease.php?Y=2008&R=39-08r [ U.S. Naval Research Laboratory ]
http://hesperia.gsfc.nasa.gov/sftheory/imager.htm [ RHESSI ]

 

X-CIA Files Archives 2

Xcia Files main page image

[ NOTE: Legacy ( circa 1998 – 2003 ) X-CIA FILES website reports, briefs and images ( below ) ]

WARNING & DISCLAIMER: THIS IS NOT A GOVERNMENT WEBSITE

X-CIA Files Archive 2

” Details, Usually ‘Unavailable’ Elsewhere, Are Typically ‘Available’ Here! “

ExtraTerrestrial Technologies ( ETT )

Plasma Torch Rectenna Propulsion

3D Penrose Tiling Structures

Quasi-Crystal Materials Sciences

Lenticular Tactical Aerospace Vehicles ( LTAV )

Unmanned Combat Aerial Vehicle ( UCAV ) Linear Engines

Single-Stage To Orbit ( STO ) Vehicle Propulsion

Space-Time Continuum Manipulations

INTEL ( DOCILE ) Digital IC Orbit Communication Technologies

ExtraTerrestrial Biological Entities ( EBE )

Rare Unidentified Flying Objects ( UFO )

Linear AeroSpike Engines for Unmanned Combat Aerial Vehicles ( UCAV )

ROCKETDYNE linear AeroSpike XRS-2200 ( RS-2200 ) engines utilize a design first developed in the early 1970s incorporating Apollo space mission era hardware from J-2S engines. Although this AeroSpike engine design began almost 30-years ago, it is of strategic importance today in the F-117-E Stealth reconnaissance aircraft as it will be for future aerospace vehicles now under development.

21st Century propulsion technology was derived from a combination of 1960s era hardware developed from several decades of engine designs plus 1990s era design / analysis tools and fiscal realities that ushered in an entirely new era of commercial and military space flight where ‘old technology’ was found to be a primary key to developing even newer technological advancements.

The AeroSpike team located vendors who, more than 30-years ago, manufactured the original J-2S legacy engine hardware that the AeroSpike turbo-machinery was based on.

Vendors, still in existence, were contracted to provide the Program with newly built J-2S hardware.

In cases where vendors had gone out-of business, new generation vendors were identified for producing new hardware from 30-year old legacy designs.

Panels, that make up the unique AeroSpike nozzle, presented a huge design challenge.

AeroSpike nozzles have a form like a large curved ramp – unlike traditional bell shaped nozzle characteristics for most rocket engines.

AeroSpike nozzle thermal and structural loads required development of new manufacturing processes and toolings to fabricate and assemble AeroSpike nozzle hardware.

The AeroSpike team found a way to accommodate huge thermal expansions and induced mechanical forces in the mechanical and brazed joints of the assemblies; appropriate materials and attachment techniques were identified, and effective manufacturing processes were developed.

In early 1997, a small half-span model of a Lifting-Body Vehicle [ SR-74 Scramp ( see further below ) ] equipped with an AeroSpike 8 thrust-celled nozzle engine – called the LASRE experiment – was piggy-back mounted onto a LOCKHEED SR-71 BlackBird high-altitude reconnaissance aircraft that was then tasked to operate like a ‘flying wind test tunnel’ to determine how a Single-Stage-To-Orbit ( STO ) Reusable Launch Vehicle ( RLV ) onboard AeroSpike engine plume would affect the aerodynamics of the Lifting-Body Vehicle shape at specific altitudes and speeds, initially reaching approximately 750-mph.

The interaction of the aerodynamic flow with the engine plume could create drag and design refinements minimized that interaction.

The lifting body model, the eight ( 8 ) nozzle AeroSpike engine and the canoe were collectively called the “pod.” The entire pod was 41-feet in length and weighed 14,300 pounds. The experimental pod, mounted onto an Air Force LOCKHEED SR-71 BlackBird stealth reconnaissance aircraft loaner, was completed in November 1998.

Successful brazing of the first flight ramp was accomplished, as well as the fabrication and assembly of the parts for the first thrusters.

With completion of those milestones, the AeroSpike engine proceeded through fabrication, test and delivery, enabling support for the planned first flight in 1999.

Now, the AeroSpike XRS-2200 linear thrust-vectoring engine’s eventual ‘new placement’ was set for the sub-orbital technology air/space vehicle – referred to as the X-33 VentureStar – and a ROCKETDYNE team anticipated delivery of their first XRS-2200 AeroSpike flight engine by September of 1999.

Also, a “combined industry and government team” at LOCKHEED-MARTIN Skunk Works ( Palmdale, California ) was developing the X-33 for its AeroSpike XRS-2200 engine flight out of Edwards Air Force Base ( California ) scheduled for December of 1999.

The Linear Aerospike XRS-2200 ( RS-2200 ) engine was developed by the ROCKETDYNE PROPULSION AND POWER UNIT of the BOEING COMPANY indicating they completed the engine in early 2000 although the F-117A Stealth fighter was already secretly flying ‘long before’ their official public information release.

The difference between the linear AeroSpike engine and conventional rocket engines is the shape of the nozzle: unlike conventional rocket engines, which use a bell shaped nozzle to constrict expanding gases, the AeroSpike nozzle is V-shaped and called a “ramp”.

Electro-Mechanical Actuators ( EMA ) are used in propellant valving for these engines. EMA is seen as a technology of choice for new rocket engines that would be developed under the Space Launch Initiative ( SLI ).

The XRS-2200 gas generator operated successfully in the flow-rate range of proposed X-33 operating conditions. The gas generator essentially was a J2 gas generator modified for the higher chamber pressure and flow-rate required for the XRS-2200.

The gas generator must be able to operate in conditions significantly higher than normal J2 operating conditions.

A review of the data showed the gas generator operated in these conditions and that the combustor shell wall temperatures were within acceptable tolerances. Post test inspections also found the hardware to be in good operating condition, which showed signs of marked improvement from past hardware weakening. Engineers at Marshall Space Flight Center ( MSFC ) were able to demonstrate the gas generator could be started with a softer ramp, to minimize overpressure of the combustor shell, by accurately sequencing valve timings and ramps on the XRS-2200 AeroSpike engine.

Successful component tests followed a series of AeroSpike multi-cell engine tests at Marshall Space Flight Center that successfully demonstrated hydrogen-oxygen combustion at full power, emergency power, and low throttle conditions.

The pressure fed thrusters and AeroSpike nozzles were developed at the Rocketdyne Division of Boeing under a technology agreement with NASA and Lockheed-Martin who was set to build the VentureStar X-33 transport aerospace vehicle.

The XRS-2200 AeroSpike engine shoots hot gases from multiple linearly placed chamber nozzles along the outside of the ramp surface. This unusual design allows the engine to be more efficient and effective than today’s rocket engines by ‘modulating the thrust to variously positioned sets of these nozzles acting in concert with vectoring’ – shaping the direction of engine propulsion / thrust.
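
A toy calculation can make the 'modulating the thrust to sets of nozzles' idea concrete. The sketch below assumes a simplified upper / lower pair of thruster banks with made-up thrust and geometry numbers; it illustrates differential throttling in general, not the actual XRS-2200 control law:

```python
# Minimal sketch of thrust vectoring by differential throttling, assuming a
# simplified upper/lower pair of thruster banks (illustrative numbers only).

BANK_MAX_THRUST_N = 450_000.0   # assumed max thrust per bank
MOMENT_ARM_M = 2.0              # assumed offset of each bank from the vehicle centreline

def bank_commands(total_throttle: float, pitch_command: float):
    """total_throttle in [0, 1]; pitch_command in [-1, 1] biases thrust between banks."""
    upper = total_throttle * (1.0 + 0.5 * pitch_command)
    lower = total_throttle * (1.0 - 0.5 * pitch_command)
    return min(upper, 1.0), min(lower, 1.0)

upper, lower = bank_commands(total_throttle=0.8, pitch_command=0.25)
thrust_upper = upper * BANK_MAX_THRUST_N
thrust_lower = lower * BANK_MAX_THRUST_N

net_thrust = thrust_upper + thrust_lower
pitch_moment = (thrust_upper - thrust_lower) * MOMENT_ARM_M

print(f"Net thrust: {net_thrust/1e3:.0f} kN, pitching moment: {pitch_moment/1e3:.0f} kN*m")
```

Biasing more flow to one bank shifts the thrust centroid, which steers the vehicle without gimbaling the engine.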

Hot test firings were performed on the powerpack at the John C. Stennis Space Center, which included the turbo-machinery and gas generator, for a program duration of 45-seconds: a start to the 80% power level, transition to mainstage operation at 100% power, and then a throttle-down to 57% power.

Test data indicated normal shutdown with no anomalies for the ROCKETDYNE AeroSpike XRS-2200 linear engine designed for use onboard the VentureStar X-33 – Reusable Launch Vehicle ( RLV ) – prior to delivery at the LOCKHEED-MARTIN VentureStar X-33 assembly facility ( Palmdale, California ) where the X-33 was to be flown from Edwards Air Force Base ( California ) into outer space followed by a return touchdown at one ( 1 ) of two ( 2 ) landing sites ( i.e. Utah or Montana ).

Some VentureStar X-33 and XRS-2200 ROCKETDYNE engine project participants, were:

– Gene Austin, Program Manager for NASA VentureStar X-33 at Marshall Space Flight Center;
– Cleon Lacefield, Vice-President LOCKHEED-MARTIN Space Systems ( Palmdale, California ) VentureStar X-33;
– Don Chenevert, Program Manager, NASA X-33, Aerospike Engine Testing at Stennis Space Center, MS;
– Mike McKeon, Program Manager X-33 Aerospike Engine, ROCKETDYNE Propulsion and Power Unit, BOEING ( Canoga Park, California ); and,
– Steve Bouley, Division Director, Propulsion Development, ROCKETDYNE Propulsion & Power Unit, BOEING.

Instead of hydraulics, future propulsion systems may use EMAs to control major propellant valves so gaining performance data in ‘real world testing’ has significant value.

There are six ( 6 ) EMAs – on each AeroSpike test engine – used to deliver propellants to the thruster banks and gas generators. Two ( 2 ) engines will use forty ( 40 ) thrusters – 20 per XRS-2200 AeroSpike engine – to achieve vehicle velocities exceeding Mach 13 +.

A total of seven ( 7 ) variations of the Advanced Linear AeroSpike Engines – built by the ROCKETDYNE DIVISION of BOEING – were to power the X-33 VentureStar RLV to have been built by LOCKHEED-MARTIN.

There were three ( 3 ) additional powerpack assemblies and four ( 4 ) full-up AeroSpike XRS-2200 linear engines – including two ( 2 ) flight units that existed during the remainder of the development program.

ROCKETDYNE developed the XRS-2200 Aerospike linear engine at its Canoga Park, California facility for the later cancelled ( 2001 ) VentureStar X-33 Single-Stage To Orbit ( STO ) Reusable Launch Vehicle ( RLV ) space transport program. A joint BOEING and NASA team at Stennis Space Center did the final XRS-2200 AeroSpike engine assembly.

The RS-2200 Linear Aerospike Engine is being developed for use on the LOCKHEED-MARTIN Skunk Works Reusable Launch Vehicle ( RLV ).

The Aerospike linear engine allows the smallest lowest cost RLV ( Reusable Launch Vehicle ) to be developed because the engine fills the base ( reducing base drag ) and is integral to the vehicle – reducing installed weight when compared to a bell shaped conventional rocket engine.

The AeroSpike is somewhat the same as bell shaped rocket engines, except that its nozzle is open to the atmosphere. The open plume compensates for decreasing atmospheric pressure as the vehicle ascends – keeping engine performance very high along the entire trajectory.
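
The altitude compensation claim can be made concrete with the standard thrust equation, F = mdot * v_e + ( p_e - p_a ) * A_e. A fixed bell nozzle has a single exit pressure p_e, so away from its design altitude it either runs over-expanded or leaves available expansion unused; the aerospike's open plume lets its effective exit pressure track ambient. The numbers in the sketch below are illustrative assumptions, not XRS-2200 data:

```python
# Thrust equation sketch: F = mdot * v_e + (p_e - p_a) * A_e.
# A fixed bell nozzle has one exit pressure p_e, so away from its design altitude
# it runs over-expanded (p_e < p_a: losses and separation risk at low altitude) or
# under-expanded (p_e > p_a: available expansion left unused at high altitude).
# The aerospike's open plume keeps its effective exit pressure near ambient, so it
# stays close to the ideally expanded condition along the whole trajectory.
# All numbers below are illustrative assumptions.

MDOT = 250.0              # kg/s, assumed propellant mass flow
V_EXIT = 3_400.0          # m/s, assumed effective exhaust velocity
A_EXIT = 1.5              # m^2, assumed bell exit area
P_EXIT_DESIGN = 40_000.0  # Pa, assumed bell design exit pressure (~7 km design altitude)

momentum_thrust_kn = MDOT * V_EXIT / 1e3

for label, p_ambient in (("sea level", 101_325.0), ("7 km", 41_000.0), ("20 km", 5_500.0), ("vacuum", 0.0)):
    mismatch_kn = (P_EXIT_DESIGN - p_ambient) * A_EXIT / 1e3
    regime = "near design" if abs(mismatch_kn) < 5 else ("over-expanded" if mismatch_kn < 0 else "under-expanded")
    print(f"{label:>9}: momentum {momentum_thrust_kn:.0f} kN, "
          f"pressure term {mismatch_kn:+6.1f} kN ({regime} for the fixed bell)")
```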

This altitude compensating feature allows a simple low-risk gas generator cycle to be used. Over $500,000,000 ( $500 million ) has been invested to-date in AeroSpike engines, and full size linear engines have accumulated seventy-three ( 73 ) tests and over 4,000 seconds of operation.

Following the series of tests, XRS-2200 AeroSpike engines were removed from the test stand facility and put into storage at the Stennis Space Center – awaiting NASA instructions on engine final dispositions.

The precursor to the AURORA Transport-Lift Vehicle X-43 placed three ( 3 ) such X-43A aerospace vehicles inside the Dryden Flight Research Center facility at Edwards Air Force Base, California, where a 12-foot-long under-wing test vehicle existed for the NASA “Hyper-X” multi-year hypersonic research program [ AURORA ] to demonstrate “airframe integrated and air breathing ( AeroSpike ) engine technologies” that promise to increase payload capacity for future vehicles by consuming ambient oxygen at altitudes higher than previously possible.

This will remove the need for carrying oxygen tanks onboard to promote combustion, as traditional rockets must do now.

Two flights are planned at Mach 7 ( approximately 5,000 mph ) and one ( 1 ) flight at Mach 10 ( almost 7,200 mph ) to a top speed of Mach 13 +.

By comparison, the world’s fastest “air-breathing plane” – to date – was the LOCKHEED SR-71 Blackbird that could fly at an ‘unclassified airspeed’ of Mach 3 + to an estimated top speed of Mach 7.
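
Since Mach number is speed relative to the local speed of sound, which falls with altitude, round mph equivalents like the ones quoted above depend on the assumed atmosphere. The sketch below converts a few Mach numbers using standard-atmosphere speeds of sound; the two altitudes shown are assumptions for illustration:

```python
# Mach number is speed divided by the local speed of sound, which depends on air
# temperature and therefore altitude, so a given Mach number maps to different mph
# figures at different altitudes. Values below are standard-atmosphere approximations.

MPH_PER_MS = 2.23694

SPEED_OF_SOUND_MS = {
    "sea level (15 C)": 340.3,
    "stratosphere (~20 km, -56.5 C)": 295.1,
}

for mach in (3, 7, 10, 13):
    row = " | ".join(
        f"{label}: {mach * a * MPH_PER_MS:,.0f} mph" for label, a in SPEED_OF_SOUND_MS.items()
    )
    print(f"Mach {mach:>2}: {row}")
```

The round "5,000 mph" and "7,200 mph" figures quoted above fall between these sea-level and high-altitude conversions.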

Future generations of EMAs will be even more compact – than those currently in operation – which will pave the way for linear AeroSpike acceleration thrust-vectoring engines to be deployed onboard ‘newly designed’ Unmanned Combat Aerial Vehicles ( UCAV ).

August 8, 2001 – The NASA Second Generation Reusable Launch Vehicle Program – also known as the Space Launch Initiative ( SLI ) – is making advances in propulsion technology with this third and final successful engine hot-fire designed to test electro-mechanical actuators. Information learned from this hot-fire test series about new electro-mechanical actuator technology – which controls the flow of propellants in rocket engines – could provide key advancements for the propulsion systems of future spacecraft.

The test of twin ( 2 ) Linear Aerospike XRS-2200 engines, originally built for the X-33 program, was performed Monday, August 6, 2001 at the NASA Stennis Space Center, Mississippi where the engines were fired for the planned 90-seconds and reached a planned maximum power of 85%.

The test was originally slated to attain full power during 100-seconds of testing. Prior to the test, engineers determined the necessary results could be achieved at reduced duration and power. Based on this determination, both planned duration and planned power were reduced. Two ( 2 ) shorter hot-fires of the AeroSpike engines were performed last month [ July 2001 ] in preparation for the final test firing on August 6, 2001.

The Second Generation Reusable Launch Vehicle ( RLV ) Program, led by the NASA Marshall Space Flight Center in Huntsville, Alabama is a technology development program designed to increase safety and reliability while reducing costs for space travel.

“Because every engine proposed by industry for a second generation vehicle has electro-mechanical actuators, we took advantage of these AeroSpike engines already on the test stand to explore this relatively new technology now – saving us valuable time later,” said Garry Lyles, Propulsion Projects Office manager of the Second Generation Reusable Launch Vehicle Program at the Marshall Center. “This data is critical toward developing the confidence required to support the use of these actuators on future launch vehicles.”

Electro-mechanical actuators electronically regulate the amount of propellant (fuel and oxidizer) flow in the engine. The new technology is a potential alternative and improvement to the older pneumatic and hydraulic fluid systems currently used by the aerospace industry to drive and control critical rocket engine valves.
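
As a rough illustration of what such an actuator has to do, the sketch below closes a simple proportional position loop that drives a valve toward a commanded opening, with a slew-rate limit standing in for the motor's physical limits. The gain, limit and time step are illustrative assumptions, not flight values or the actual controller used in these tests:

```python
# Minimal sketch of an electro-mechanical actuator position loop driving a
# propellant valve toward a commanded opening. Simple proportional control with
# a rate limit; gains and limits are illustrative assumptions, not flight values.

KP = 4.0                 # proportional gain (1/s)
MAX_SLEW = 0.5           # maximum valve travel per second (fraction of full open)
DT = 0.01                # controller time step, seconds

def step_valve(position: float, command: float) -> float:
    """Advance the valve position one controller tick toward the command."""
    rate = KP * (command - position)               # proportional response
    rate = max(-MAX_SLEW, min(MAX_SLEW, rate))     # respect the actuator slew limit
    return position + rate * DT

position, command = 0.0, 0.8   # valve closed, commanded to 80% open
for tick in range(300):        # simulate 3 seconds
    position = step_valve(position, command)

print(f"Valve position after 3 s: {position:.3f} (command {command})")
```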

“This series of engine firings tested the actuator control system in what we call a ‘real condition of use’ environment,” said Dr. Donald Chenevert, electro-mechanical actuator project manager at the Stennis Center. “Firing allows us to see how the integrated system handles the extreme cold of cryogenic propellants, the stress loads of the propellants pushing through the valves, and the dynamic response to commanded flow rate changes. Additionally, we have many other unique conditions such as shock and vibration loads not found in a lab, so we capture more realistic data about the true performance of the actuators.” Engineers are performing engine post-test inspections, and early indications are that all test objectives have been met, Chenevert said.

The final data is to be fed directly into the engine systems being considered for a second-generation reusable launch vehicle, Lyles said. “Propulsion is one of the highest and most critical technology areas that we are exploring,” said Dennis Smith, manager of the Second Generation Reusable Launch Vehicle Program Office at the Marshall Center. “Our goal also is to find, improve or develop technologies such as airframes, avionics, health management systems and ground operations – all to make getting people and payloads into space safer and cheaper.”

The Rocketdyne Propulsion and Power Unit of The Boeing Company in Canoga Park, California developed the AeroSpike engine with engine test support conducted at Stennis Space Center.

RESEARCH ( Full Photo Gallery ): https://web.archive.org/web/20081020050929/http://unwantedpublicity.media.officelive.com/Gallery.aspx

– –

[ PHOTO ( Insert Here ): No photo ( to-date ) exists to provide even an approximate exemplification ( for display as a header image ) depicting two ( 2 ) certain large dark triangle craft I have personally witnessed as a trained intelligence professional and observer, and that I have only briefly described within some of my reported publishings. ]

Southern California High Desert Community Town Hall Meeting On ExtraTerrestrial Events –

USA, California, Hesperia – 1997 – Local area residents of the Victorville Valley area of southern California formed a collective, focusing new attention toward the sky after a rash of ‘satellite failures’, unidentified flying object ( UFO ) sightings – including one ( 1 ) in which a very large triangle craft hovered at night, actually blocking out starlight and stopping 2-way traffic for 10-minutes along a popular California interstate – and, further south, a few extraterrestrial biological entity ( EBE ) sightings.

Local newspapers only reported that local area residents, curious from too many unexplained sightings, were holding an open to the public Town Hall Meeting ( Main Street in Hesperia, California ) to discuss and compare what they were encountering.

The town hall meeting, held in a small retail center with a fast food store, unfortunately did not discuss UFO sightings because a woman – operating an overhead transparency slide projector – put up images of ‘foreign’ ( Mexico ) area Chupacabra sightings for discussion. Frustrated by the obviously lengthy Chupacabra distraction, which took no local area UFO questions, most residents in attendance abandoned the town hall office meeting to stretch their legs outside, where some began talking amongst themselves about the ‘UFO and alien local encounters’ they thought the town hall meeting was supposed to be allowing open discussion of.

Noticing town hall meeting attendees pouring outside, freelance reporter Paul Collin interviewed the disenfranchised residents who had left their off-topic town hall meeting. Eyewitnesses came with family members, some providing additional eyewitness accounts. In one-on-one interviews limited to first-hand reports describing details, residents were also allowed to personally sketch drawings of their UFO and alien entity encounters.

A good investigative journalist may play unknowledgeable while subtly and quite effectively being able to quickly assess normal human frailties from purposeful deceit in getting to the bottom of the truth. Easy to spot are armchair storytellers ( with plenty of time on their hands, who invariably stray from the topic to talk about what they did or do for a living ), narcissists ( rambling on about themselves while exhibiting rather odd personal quirks ), and weirdos and opportunists ( some wearing partial Star Trek or Wonder Woman costumes, alien face masks, or spring-wired tinfoil antenna balls sprouting from headbands, or constantly looking in their compact mirrors to see if their make-up is still on correctly ); one can then move on to interview others whose purpose stems from serious concerns as resident members of the community.

Even then, trying to determine fact from fictionalized accounts is not an easy task. You look deep into these people’s faces as they convey their stories. “Did they really see what they’re claiming?” Look at their faces, closer, any micromomentary facial expressions? Also look carefully at their eyes and the direction they quickly snap ‘just before beginning to answer your question’. Look carefully at their reactions after throwing their own statement back at them, but with a purposeful small inaccuracy, to see whether they correct it, become exasperated by your having just twisted what they just conveyed, or continue as though that’s what they said – but actually didn’t. Can they provide details as to what they were doing ‘just before the time of their encounter’? Do they appear to be easily disturbed emotionally or do they offer light-hearted concerns while discussing their more serious concerns on-topic?

Most were rather ‘original’, several did not match what was mostly being reported, and some interviewees were very apologetic for not having more than just a little to report. The culmination of many reports served to quickly narrow the scope of ‘believable encounters’ from those ‘otherwise’.

Analysis of all boiled down to the following six ( 6 ) essential facts:

1. High-volume ( PUBLIC ) sightings;

2. Short-term duration ( 30-DAY ) reportings;

3. Small region ( HIGH DESERT ) locations;

4. Near ground Low Earth Orbit ( LEO ) altitudes.

5. Limited design ( UFO ) triangles; and,

6. Incident ( MAJOR ) highways.

Over all reports, only four ( 4 ) really stood-out:

A. There was the Hesperia, California family in their minivan – homebound east on Main Street ( Hesperia, California ) with a clear sunset behind them, having just left a soccer game – when all occupants began to comment about what appeared outside their windshield in the low horizon distance, where a slimline triangle shaped UFO just lingered ( for about 5-minutes ) but then suddenly ( in seconds ) snapped its location due south and shot upward where, all of a sudden and in ‘mid-sky’, it just blinked-out before even reaching the upper darkening sky. The triangle UFO exhibited ‘no contrails’, ‘no sound barrier boom’, nothing. Just a brief low earth hover, a quick snap south and then up out of sight in the blink of an eye. While the kids were all excited, the parents tried calming them down – along with their own unsettled nerves – explaining it all away as only being some new Air Force jet. Deep inside, the parents knew it was ‘not any aircraft’, but only one ( 1 ) of the other unexplained sightings plaguing yet other residents over the past month;

B. All alone, a middle-aged man traveling west on Main Street ( Hesperia, California ), homebound for Phelan, California, spotted in the southwest sky over the Wrightwood mountains a large triangle craft slowly moving upward. Stopping at the California state highway 395 traffic light, he looked back up out his windshield and saw nothing there anymore, but it gave him something to tell his wife when he arrived home. The wife, rolling her eyes, put dinner on the table, but interestingly was also present by his side at the town hall meeting as well. Residents wanted to know what was going on in their local community, especially after local UFO sightings appeared to begin registering in their local paper;

C. Further southwest and beyond the Wrightwood mountains – in Azusa, California – a grandmother and her live-in daughter nurse both witnessed – on two ( 2 ) separate occasions while driving home slowly down their semi-rural neighborhood street at night – two ( 2 ) glowing red eyes in the head of what appeared to them to be a small 2-legged ape-like creature hunched down by the side of their road where, although well ahead of their vehicle, the creature suddenly darted across the street at what they both claimed was a ‘frightening blur’ of a pace. The women also spotted what they believed was the same 2-legged ape-like creature with red eyes three ( 3 ) additional times, but inside the furthest corner of their backyard where it seemed to be glaring at them both through the rear kitchen window. Immediately scared to death, both residents – by the time they thought to call police – then witnessed the creature pop-up, rather unusually, bounding over and outside their backyard fence. I had to ask if they remained in the town hall meeting for the Chupacabra discussion, and they glanced at each other and let me know that what they saw was ‘not’ a Chupacabra. I asked, “Could it have been a baby Chupacabra?” They looked at each other and then back at me, shaking their heads in the negative. Their ‘thing’ was ‘not hairy’, did ‘not have head horn spikes’, was ‘not a color shade of grey, blue, or eggshell’. It was ‘black’, ‘short’, and when it moved – it moved ‘extremely fast’ with an ‘odd blur’ you couldn’t focus-in on. I thought to myself, “Probably darn hard to target fire onto;” and,

D. What brought the freelance reporter to the meeting in the first place was his own personal encounter in the same general area of the High Desert of southern California where, 1-week earlier at night while 20-minutes northeast of Victorville, California in the middle of the desert on California Interstate 15 ( I-15 ) on his way to Las Vegas, Nevada, he noticed traffic on ‘both sides of that highway’ pulling over and stopping. He figured a serious accident occurred and pulled over to exit his vehicle to look out into the desert along the highway, but didn’t see any vehicles there. He walked back a couple of cars and noticed a group of people talking together and asked where the accident was. He was told to look up just a little off to the east of the interstate highway to see what was stopping traffic, and there ‘it’ was – an incredibly huge black triangle shaped object just hovering without any lights on. The oddest thing about it was that it was so huge that a whole section of the night sky had no starlight while all around the flying object anyone ‘could easily see starlight all around’ but no starlight directly above the behemoth. I asked the group what the thing was doing and what had it been doing. They said they didn’t know what it was doing now, but that it had been exhibiting a low hum, which stopped, and it was just continuing to linger where it had been for what they estimated had been 15-minutes. I called the California Highway Patrol office and they said they were already responding to it. I heard no emergency sirens and saw no red lights. I waited another 15-minutes and nothing happened. It just lingered a few hundred feet above ground off in the desert. Not being too much braver I decided to get back into my car and turn around and go back home.

The following week, I located one ( 1 ) particular Blockbuster video store on Bear Valley Road in southwest Victorville, California. That particular store carried an unusually large selection of UFO documentary videos placed in a special section. I decided to watch over 100 of those videos to determine if anyone else might have seen any huge triangle UFOs. At the time ( 1997 ) there were unfortunately ‘no flying triangle videos’ I could lay my hands on.

Apparently my frequent selections attracted the attention of the store owner, John Pflughoft of MPM INVEST, who eventually approached me and politely asked why I was interested in watching so many UFO videos. I think he knew something had startled me into that habit so, I conveyed what I had seen the previous week.

I also shared my late night experiences during 1972 while assigned to the Intelligence Section Station at George Air Force Base and later Edwards Air Force Base in the High Desert. I told him about strange red, orange, and yellow ‘firelight’ coming out the tops of some of the smaller mountains scattered between George AFB and Edwards AFB out in the middle of the desert. The video proprietor asked if I knew what the ‘firelights’ were. I told him I figured it was just rocket engine testing going on inside some of those small mountains.

He asked if I had ever seen any UFOs before last week. All I had to convey was an experience in 1976 while camping at night with a couple of military buddies of mine up in the Iron Mountain range between Randsburg, California and Mojave, California, and while we ‘saw nothing’, all three ( 3 ) of us ‘heard’ a very unusual ‘electronic whirring sound’ that seemed to be travelling up and down both sides of the foothills a few hundred feet from where we were trying to sleep. I told him we walked in the direction of where we last heard the whirring sound coming from but saw nothing. Then we returned to our camp, where 30-minutes later we all heard it start back up again so, we left in the middle of the night and drove 90-miles to get home. He smiled and said, “Well, I guess that until last week you’ve been pretty lucky to have remained out-of the UFO experience.”

He then asked if an upcoming town hall meeting in Hesperia, California – where residents were going to discuss their own personal UFO and extraterrestrial encounters during the recent month – might interest me. I knew nothing of any other sightings so, he suggested I attend and asked if I would report back to him what I learned. I agreed, attended the meeting, but when it began being abandoned, dug out a yellow legal pad of paper and began interviewing attendees upon exit. A final report was prepared and, along with resident sketches, placed in a manila envelope, sealed up and dropped off at the Blockbuster store for his later review.

– –

[ PHOTO ( above ): Circa 12OCT62 – Ames Langley Research Center lenticular vehicle aero-space body designs ( click on image to enlarge ) ]

As far back as October 12, 1962 the Ames, Langley and Dryden Flight Research Centers began feasibility studies and designs for developing a lenticular design space re-entry air vehicle with speeds capable of reaching Mach 25 + to Mach 50 +.

[ photo ( above ) TR-3B Astra – Flying Triangle ( click to enlarge ) ]

In 1995, at Nellis Air Force Base Test Range S-4 ( near Papoose Lake, Nevada ) the TR3-B ( a lenticular-shaped aerial vehicle ) was seen and reported to be between 300-feet and 500-feet in diameter.

Reportedly, the TR3B flies at speeds of Mach 15 + and reflects a bright blue grey color believed to be biological electro-chromatic 3-D Penrose tiling polymer material providing highly advanced stealth qualities.

TR3B is also believed to have carried the INTEL company Direct Orbital Communication & Intelligence Link Electronics ( DOCILE ) computer processor unit ( CPU ) system.

TR3B is believed to have derived partial funding from Strategic Defense Initiative ( SDI – Star Wars ) links with the super secret AURORA Program Office global security defense operations mission.

TR3-B is believed to use a quasi-crystalline molecular property energy containment storage core driving a plasma-fluidic propulsion system employing combinatoric development of its Magnetic Field Disruptor ( MFD ) quantum-flux transduction field generator technology.

TR3B reportedly emits cyclotron radiation, performs pulse detonation acceleration, and carries EPR quantum receivers.

TR3B craft reportedly resembles a ‘very large triangle’ ( shaped ) air vehicle.

TR3B (aka) TR3-B (aka) TIER III B craft in no way resembled the TR-3/A MANTA air vehicle.

The flight-testing of all experimental and first-model military aircraft occurred here along an ancient dry lake now called Rogers Dry Lake, located on the western edge of Southern California’s Mojave desert – south of Highway 58, between the two ( 2 ) towns of Mojave, California and Boron, California ( where the world’s largest open-pit borax mine is ) – along one of the first immigrant trails through California.

The first permanent settlers of this desert region were the Corum family, who located near this large dry lake area in 1910. Local residents later tried to get the local U.S. Post Office to name the settlement “Corum, California,” however another town with the similar name “Coram, California” already existed so, the name “Corum” was spelled in reverse as “Muroc” – which is how this Mojave desert area saw the U.S. Army Air Corps later name “Muroc Field” and the subsequent naming of the NASA Muroc Flight Test Unit ( MFTU ) in this California area.

This dry lake was ideal as what would become the major site of aviation flight-test history because, at about 2,300-feet above sea level, Rogers Dry Lake fills an area of about 44-square miles ( nearly twice as large as New York’s Manhattan Island ), making it one of the largest and best natural ( flat and hard surfaced ) landing sites on Earth. The arid desert weather also promotes excellent flying conditions on almost every day of the year ( about 320-days out of the year ).
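
A quick arithmetic check of that size comparison, assuming Manhattan Island covers about 22.8 square miles ( 59 km^2 ):

```python
# Quick check of the lakebed-vs-Manhattan size comparison above.
# Manhattan's ~22.8 square-mile land area is an assumed reference value.

ROGERS_DRY_LAKE_SQ_MI = 44.0
MANHATTAN_SQ_MI = 22.8

ratio = ROGERS_DRY_LAKE_SQ_MI / MANHATTAN_SQ_MI
print(f"Rogers Dry Lake is ~{ratio:.1f}x the area of Manhattan Island")  # ~1.9x
```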

Rogers Dry Lake is the sediment filled remnant of an ancient lake formed eons ago. Several inches of water can accumulate on the lakebed when it rains, and the water in combination with the desert winds creates a natural smoothing and leveling action across the surface. When the water evaporates in the desert sun, a smooth and level surface appears across the lakebed, one far superior to that made by humans.

[ photo ( above ) DOUGLAS AIRCRAFT Black Horse Project Manta ( click to enlarge ) ]

AURORA Program Office consisted of lenticular shaped and wingless aerospace vehicle Projects that industry sleuths speculate secretly held a billion dollar high-speed high-altitude surveillance air space vehicle that leaves a ‘contrail’ behind it resembling ‘doughnut clouds on a string’.

According to some reports, AURORA Program aerospace vehicles are capable of high-speed maneuvering, allowing abrupt course corrections within their own flight path.

The information ( below ) consists of excerpts from two ( 2 ) individuals, Robert “Bob” Lazar and Edgar Fouche, speaking during different time periods at different locations. These excerpts should serve to familiarize readers with, and highlight interesting similarities between, ETT ( ExtraTerrestrial Technologies ), the reverse-engineering ( backward engineering ) of extraterrestrial spacecraft seized by the U.S. government military, and current-day advanced technologies controlled by U.S. Department of Defense ( DOD ) and Defense Advanced Research Projects Agency ( DARPA ) programs and projects worldwide.

Obvious template similarities appear to have been exploited to produce fully operational, high-performance defense and observation flightcraft for exclusive U.S. government use, examples of which may be viewed in the “NEWS ALERTS!” section of this website.

[ photo circa: 1995 ( above ) Area 51, Groom Lake Nevada ]

While Bob Lazar ( below ) provides interviews recounting eyewitness accounts, coupled with largely seamless theories covering time and space folding with alien spacecraft technology, inter alia ETT [ ExtraTerrestrial Technologies ], drawn from his previous work at the Nellis Air Force Base, Nevada Test Range Site S-4, Area 51 ( near Groom Lake, Nevada ), Edgar Fouche provides another arena of detailed information that coincides in some areas with what Bob Lazar experienced around ExtraTerrestrial technology ( ETT ). Edgar Fouche describes how ETT was converted into operational flying craft for United States government use.

Skeptics need no longer speculate about which lenticular aerospace craft the U.S. has developed, or about what is planned for the not-too-distant future, although most of these highly classified Programs and Projects will remain cloaked for some time to come.

[ NOTE: For a look at current lenticular crafts, developed by the U.S., search this website for photos and details on UAV, UCAV, MCAV, MAV and High-Energy Weapons ( HEW ) and Directed Energy Weapon ( DEW ) research and development. ]

This culmination of technology data and relevant associated theories has never before been gathered into one ( 1 ) reading area until now ( here ). At first glance the following data may seem fictionalized, due in large part to its unfamiliarity to most readers; however, these technologies are very much a part of reality according to what research provides, and only a very small percentage of it is presented in the multiple interviews ( below ):

Interview Excerpts of Bob Lazar ( 09DEC89 ) on KLAS TV ( Las Vegas, Nevada ), below:

Producer / Host: George Knapp Lazar: The first thing was hands-on experience with the anti-matter reactor. Knapp: Explain what that is, how it works, and what it does. Lazar: It’s a plate about 18-inches in diameter with a sphere on top. Knapp: We have a tape of a model that a friend of yours made. You can narrate along. There it is… Lazar: Inside that tower is a chip of Element 115 they just put in there. That’s a super-heavy element. The lid goes on top. And as far as any other of the workings of it, I really don’t know, you know, [ such as ] what’s inside the bottom of it ( i.e. Element 115 ), sets up a gravitational field around the top. That little waveguide, you saw being put on the top, it essentially siphons off the gravitywave – and that’s later amplified in the lower portion of the craft. But, just in general, the whole technology is virtually unknown. Knapp: Now we saw the model. We saw the pictures of it there. It looks really, really simple, almost too simple to actually do anything. Lazar: Right. Knapp: Working parts? Lazar: None detectable. Essentially what the job was, to back-engineer [ reverse engineer ] everything, where you have a finished product and to step backwards and find out how it was made or how it could be made with earthly materials. There hasn’t been very much progress. Knapp: How long do you think they’ve had this technology up there? Lazar: It seems like quite a while, but I really don’t know. Knapp: What could you do with an anti-matter generator? What does it do? Lazar: It converts anti-matter . . . It DOESN’T convert anti-matter! There’s an annihilation reaction. It’s an extremely powerful reaction, a 100% conversion of matter to energy, unlike a fission or fusion reaction which is somewhere around eight-tenths of one percent conversion of matter to energy. Knapp: How does it work? What starts the reaction going? Lazar: Really, once the 115 [ Element 115 ] is put in, the reaction is initiated. Knapp: Automatic. Lazar: Right. Knapp: I don’t understand. I mean, there’s no button to push or anything? Lazar: No, there’s no button to push or anything. Apparently, the 115 under bombardment with protons lets out an anti-matter particle. This anti-matter particle will react with any matter whatsoever, which I imagine there is some target system inside the reactor. This, in turn, releases heat, and somewhere within that system there is a one-hundred-percent-efficient thermionic generator, essentially a heat-to-electrical generator. Knapp: How is this anti-matter reactor connected to gravity generation that you were talking about earlier? Lazar: Well, that reactor serves two purposes; it provides a tremendous amount of electrical power, which is almost a by-product. The gravitational wave gets formed at the sphere, and that’s through some action of the 115, and the exact action I don’t think anyone really knows. The wave guide siphons off that gravity wave, and that’s channeled above the top of the disk to the lower part where there are three gravity amplifiers, which amplify and direct that gravity wave. Knapp: In essence creating their own gravitational field. Lazar: Their own gravitational field. Knapp: You’re fairly convinced that science on earth doesn’t have this technology right now? We have it now at S-4, I guess, but we didn’t create it? Lazar: Right. Knapp: Why not? Why couldn’t we? Lazar: The technology’s not even — We don’t even know what gravity IS! Knapp: Well, what is it? What have you learned about what gravity is? Lazar: Gravity is a wave. 
There are many different theories, wave included. It’s been theorized that gravity is also particles, gravitons, which is also incorrect. But gravity is a wave. The basic wave they can actually tap off of an element: why that is I’m not exactly sure. Knapp: So you can produce your own gravity. What does that mean? What does that allow you to do? Lazar: It allows you to do virtually anything. Gravity distorts time and space. By doing that, now you’re into a different mode of travel, where instead of traveling in a linear method — going from Point A to B — now you can distort time and space to where you essentially bring the mountain to Mohammed; you almost bring your destination to you without moving. And since you’re distorting time, all this takes place in between moments of time. It’s such a far-fetched concept! Knapp: Of course, what the UFO skeptics say is, yeah, there’s life out there elsewhere in the universe; it can never come here; it’s just too darn far. With the kind of technology you’re talking about, it makes such considerations irrelevant about distance and time and things like that. Lazar: Exactly, because when you are distorting time, there’s no longer a normal reference of time. And that’s what producing your own gravity does. Knapp: You can go forward or backward in time? Is that’s what you’re saying? Lazar: No not essentially. It would be easier with a model. On the bottom side of the disk are the three gravity generators. When they want to travel to a distant point, the disk turns on its side. The three gravity generators produce a gravitational beam. What they do is they converge the three gravity generators onto a point and use that as a focal point; and they bring them up to power and PULL that point towards the disk. The disk itself will attach ONTO that point and snap back — AS THEY RELEASE SPACE BACK TO THAT POINT! Now all this happens in the distortion of time, so time is not incrementing. So the SPEED is essentially infinite. Knapp: We’ll get into the disks in a moment. But the first time you saw the anti-matter reactor in operation or a demonstration — you had a couple of demonstrations — tell me about that. Lazar: The first time I saw it in operation, we just put — a friend I worked with, Barry — put the fuel in the reactor, put the lid on as, as was shown there. Immediately, a gravitational field developed, and he said, “Feel it!” And it felt like you bring two like poles of a magnet together; you can do that with your hand. And it was FASCINATING to do that, impossible, except on something with great mass! And obviously this is just a . . . And it was a REPULSION field. In fact, we kind of fooled around with it for a little while. And we threw golf balls off it. And it was just a really unique thing. Knapp: And you had other demonstrations to show you that this is pretty wild stuff, right? Lazar: Yeah, they did. They were able to channel the field off in a demonstration that they created an INTENSE gravitational area. And you began to see a small little black disk form, and that was the bending of the light. Knapp: Just like a black hole floating around? Lazar: Yeah, well, a black hole is a bad analogy, but yeah, essentially.
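
As a rough arithmetic check on the conversion figures quoted in the interview above ( a claimed 100% matter-to-energy conversion for annihilation versus a fraction of one percent for fission or fusion ), the sketch below applies nothing more than E = mc². The 1 kg fuel mass and the fission / fusion fractions are illustrative textbook-scale values, not numbers taken from the interview:

```python
# Rough arithmetic sketch of the mass-to-energy figures discussed in the interview,
# using only E = m * c^2. The fuel mass and the fission/fusion conversion fractions
# below are illustrative textbook-scale values, not figures from the interview.
# Note that matter-antimatter annihilation also consumes an equal mass of ordinary
# matter alongside the "fuel".
C = 299_792_458.0  # speed of light, m/s

def energy_released_joules(fuel_mass_kg: float, converted_fraction: float) -> float:
    """Energy released when `converted_fraction` of the fuel mass becomes energy."""
    return fuel_mass_kg * converted_fraction * C**2

if __name__ == "__main__":
    m = 1.0  # kg of fuel (arbitrary example mass)
    for label, fraction in [
        ("matter-antimatter annihilation (~100% of fuel mass)", 1.0),
        ("typical fission (~0.1% of fuel mass)", 0.001),
        ("hydrogen fusion (~0.7% of fuel mass)", 0.007),
    ]:
        print(f"{label}: {energy_released_joules(m, fraction):.2e} J per kg of fuel")
```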

Interview ( MAR – APR 1990 ) Excerpts of Bob Lazar at KLAS TV ( Las Vegas, NV ), below:

Lazar: The craft does not create an “antigravity” field, as some have surmised. “It’s a gravitational field that’s out of phase with the current one,” Lazar explained in a 1989 radio interview. “It’s the same gravitational wave. The phases vary from 180 degrees to zero … in a longitudinal propagation.” Assuming they’re in space, they will focus the three [ 3 ] gravity generators on the point they want to go to. Now, to give an analogy: If you take a thin rubber sheet, say, lay it on a table and put thumbtacks in each corner, then take a big stone and set it on one end of the rubber sheet and say that’s your spacecraft, you pick out a point that you want to go to -which could be anywhere on the rubber sheet – pinch that point with your fingers and pull it all the way up to the craft. That’s how it focuses and pulls that point to it. When you then shut off the gravity generator[s], the stone (or spacecraft) follows that stretched rubber back to its point. There’s no linear travel through space; it actually bends space and time and follows space as it retracts. In the first mode of travel – around the surface of a planet – they essentially balance on the gravitational field that the generators put out, and they ride a “wave”, like a cork does in the ocean. In that mode they’re very unstable and are affected by the weather. In the other mode of travel – where they can travel vast distances – they can’t really do that in a strong gravitational field like Earth, because to do that, first of all, they need to tilt on their side, usually out in space, then they can focus on the point they need to with the gravity generators and move on. If you can picture space as a fabric, and the speed of light is your limit, it’ll take you so long, even at the speed of light, to get from point A to point B. You can’t exceed it – not in this universe anyway. Should there be other parallel universes, maybe the laws are different, but anyone that’s here has to abide by those rules. The fact is that gravity distorts time and space. Imagining that you’re in a spacecraft that can exert a tremendous gravitational field by itself, you could sit in any particular place, turn on the gravity generator, and actually warp space and time and “fold” it. By shutting that off, you’d click back and you’d be at a tremendous distance from where you were, but time wouldn’t have even moved, because you essentially shut it off. It’ s so farfetched. It’s difficult for people to grasp, and as stubborn as the scientific community is, they’ll never buy it that this is in fact what happens.

According to Lazar, the propulsion system he worked on at S-4 gives rise to certain peculiar effects, including INVISIBILITY of the craft: “You can be looking straight up at it, and if the gravity generators are in the proper configuration you’d just see the sky above it – you won’t see the craft there. That’s how there can be a group of people and only some people can be right under it and see it. It just depends how the field is bent. It’s also the reason why the crafts appear as if they’re making 90- degree turns at some incredible speed; it’s just the time and space distortion that you’re seeing. You’re not seeing the actual event happening.” If the crafts look like they’re flying at seven thousand miles per hour and they make a right-angled turn, it’s not necessarily what they’re doing. They can ‘appear’ that way because of the gravitational distortion. I guess a good analogy is that you’re always looking at a mirage – [ it’s only when ] the craft is shut off and sitting on the ground, ‘that is’ what it ‘looks like’. Otherwise, you’re just looking at a tremendously distorted thing, and it will appear like it is changing shape, stopping or going, and it could be flying almost like an airplane, but it would never look that way to you. Knapp: How close do you think you have to get before time distortion takes place? Lazar: It’s tough to say, because it depends on the configuration of the craft. If the craft is hovering in the air, and the gravity amplifiers are focused down to the ground and it’s standing on its gravity wave, you would have to get into that focused area. If you’re directly underneath the craft at any time there’s a tremendous time distortion, and that’s in proportion to the proximity of the craft. Lazar: I don’t know if I mentioned it before, but the amplifiers always run at 100%. They are always outputting a maximum gravity wave, and that wave is phase-shifted from zero to 180 degrees. That’s essentially the attraction and repulsion, and it’s normally at a null setting somewhere in between. It’s a very straightforward system. It looks more like a coal-fired engine than very hi-tech.

Interview ( JUN – JUL 1990 ) Excerpts of Bob Lazar at KLAS TV ( Las Vegas, NV ), below:

Lazar: …And there are two specific different types of Gravity: Gravity A and Gravity B. Gravity A works on a smaller, micro scale while Gravity B works on a larger, macro scale. We are familiar with Gravity B. It is the big gravity wave that holds the Earth, as well as the rest of the planets, in orbit around the Sun and holds the moon, as well as man-made satellites, in orbit around the Earth. We are not familiar with Gravity A. It is the small gravity wave, which is the major contributory force that holds together the mass that makes up all protons and neutrons. Gravity A is what is currently being labeled as the Strong Nuclear Force in mainstream physics, and Gravity A is the wave that you need to access and amplify to enable you to cause space-time distortion for interstellar travel. To keep them straight, just remember that Gravity A works on an atomic scale, and Gravity B is the big gravity wave that works on a stellar or planetary level. However, don’t mistake the size of these waves for their strength, because Gravity A is a much stronger force than Gravity B. You can momentarily break the Gravity B field of the Earth simply by jumping in the air, so this is not an intense gravitational field. Locating Gravity A is no problem because it is found in the nucleus of every atom of all matter here on Earth, and all matter everywhere else in our universe. However accessing Gravity A with the naturally occurring elements found on Earth is a big problem. Actually, I’m not aware of any way of accessing the Gravity A wave using any Earth element, whether naturally occurring or synthesized, and here’s why. We’ve already learned that Gravity A is the major force that holds together the mass that makes up protons and neutrons. This means the Gravity A wave we are trying to access is virtually inaccessible as it is located within matter, or at least the matter we have here on Earth. The most important attribute of these heavier stable elements is that the Gravity A wave is so abundant that it actually extends past the perimeter of the atom. These heavier, stable elements literally have their own Gravity A field around them in addition to the Gravity B field that is native to all elements. No naturally occurring atoms on Earth have enough protons and neutrons for the cumulative Gravity A wave to extend past the perimeter of the atom so you can access it. Even though the distance the Gravity A wave extends is infinitesimal, it IS accessible and has amplitude, wavelength and frequency just like any other wave in the electromagnetic spectrum. Once you can access the Gravity A wave, you can amplify it just like we amplify any other electromagnetic wave. So, back to our power source. Inside the reactor, element 115 is bombarded with a proton that plugs into the nucleus of the 115 atom and becomes element 116 which immediately decays and releases or radiates small amounts of antimatter. The antimatter is released in a vacuum into a tuned tube that keeps it from reacting with the matter that surrounds it. It is then directed toward the gaseous matter target at the end of the tube. The matter and antimatter collide and annihilate, totally converting to energy. The heat from this reaction is converted into electrical energy in a near 100% efficient thermoelectric generator. This is a device that converts heat directly into electrical energy. Many of our satellites and space probes use thermoelectric generators, but their efficiency is very, very low. 
All of these actions and reactions inside of the reactor are orchestrated perfectly like a tiny little ballet, and in this manner the reactor provides an enormous amount of power. So, back to our original question: What is the power source that provides the power required for this type of travel? The power source is a reactor that uses element 115 as a fuel, and uses a total annihilation reaction to provide the heat which it converts to energy, making it a compact, lightweight, efficient, onboard power source. I’ve got a couple of quick comments, on Element 115, for those of you that are interested. By virtue of the way it’s used – in the reactor – it depletes very slowly, and only 223 grams ( just under ½ pound ) of Element 115 can be utilized for a period of 20 to 30-years. Element 115 melting point is 1740 C. I need to state here that even though I had hands-on experience with Element 115, I didn’t melt any of it down and I didn’t use any of it for twenty to thirty years to see if it depleted. Now when a disk travels near another source of gravity, such as a planet or moon, it doesn’t use the same mode of travel that we learned about in our science lesson. When a disk is near another source of gravity, like Earth, the Gravity A wave, which propagates outward from the disk, is phase-shifted into the Gravity B wave propagating outward ( from the Earth ), creates lift. The gravity amplifiers ( of the disk ), can be focused independently, and they are pulsed and do not remain ‘on’ continuously. When all three [ 3 ] of these amplifiers are being used for travel, they are in the delta wave configuration, and when only one [ 1 ] is being used, for travel, it is in the omicron wave configuration. As the intensity of the gravitational field around the disk increases, the distortion of space-time around the disk also increases. And if you could see the space-time distortion, this is how it would look [ Bob Lazar draws a side-view picture of saucer hovering above ground, with field surrounding it and running straight down to the ground. Picture a disk on the end of a pole, then throw a sheet over it. ] As you can see, as the output of the gravitational amplifiers becomes more intense, the form of space-time around the disk not only bends upward but – at maximum distortion – actually folds over into almost a ‘heart shape design around the top’ of the disk. Now remember, this space-time distortion is taking place 360 degrees around the disk, so if you were looking at the disk from the top, the space-time distortion would be in the shape of a doughnut. When the gravitational field around the disk is so intense, that the space-time distortion around the disk achieves maximum distortion, and is folded up into this heart shaped form, the disk cannot be seen from any angle vantage point – and for all practical purposes is invisible. All you could see would be the sky surrounding it.
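
Taking the figure quoted above at face value – 223 grams of Element 115 powering a craft for 20 to 30 years – the back-of-the-envelope sketch below converts that mass to energy with E = mc² and averages it over the stated period. It assumes complete annihilation of the 223 grams ( and ignores the equal mass of ordinary matter consumed with it ), so it is only an upper-bound illustration of scale, not a design figure:

```python
# Back-of-the-envelope sketch of the claim above: 223 grams of fuel lasting 20-30
# years. This simply applies E = m * c^2 and averages over the stated period; it
# assumes complete annihilation of the 223 g, so it is an upper bound on scale only.
C = 299_792_458.0               # speed of light, m/s
SECONDS_PER_YEAR = 365.25 * 24 * 3600

def average_power_watts(mass_kg: float, years: float) -> float:
    """Average power if `mass_kg` were fully converted to energy over `years`."""
    return mass_kg * C**2 / (years * SECONDS_PER_YEAR)

if __name__ == "__main__":
    mass = 0.223  # kg (223 grams, as quoted in the interview)
    for years in (20, 30):
        print(f"{mass*1000:.0f} g over {years} years: "
              f"{average_power_watts(mass, years)/1e6:.1f} MW average")
    # Works out to roughly 21-32 MW of continuous power on these assumptions.
```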

Interview ( 28DEC89 ) of Bob Lazar at KVEG Radio Station ( below ):

KVEG Radio Incoming Caller: With the gravity generators running, is there thermal radiation?

Lazar: No, not at all. I was never down on the bottom ‘while’ the gravity generators were running, but the reactor itself – there’s no thermal radiation whatsoever. That was one of the really shocking things because that violates the first law of thermodynamics. Lazar: In fact, I’m in the process of fabricating the gravity amplifier, but then I’m at a tremendous shortage for power. So yeah, I have even tried to do that stuff on my own. Caller: Is there any electronics, as we know it, chips or transistors? Lazar: No, nothing like that. Because, of the tremendous power involved too, there was ‘no direct connection between the gravity amplifiers and the reactor’ itself. Caller: Are the waveguides similar to what we use with microwaves? Lazar: Very similar. Caller: In regard to the long-range method of travel, isn’t a ‘propulsion unit’ the wrong idea? I feel this device is creating a situation where it is diminishing or removing the localized gravitational field, and the long-distance body – that they’re heading toward – is actually ‘pulling’ the vehicle rather than it [ the vehicle ] being pushed. Am I correct in this? Lazar: The vehicle is not being pushed. But being ‘pulled’ implies it’s being pulled by something externally; it’s pulling something else to ‘it’. ‘It’ is ‘creating the gravitational field’. Caller: Is there any relation to the ‘monopoles’, which [ scientists ] have been looking for? Lazar: Well, they’ve been looking for the ‘monopole magnet’, but then this [ the UFO force ] is a gravitational force. Caller: What is the top speed of the craft? Lazar: It’s tough to say a top speed because to say ‘speed’ you have to ‘compare distance and time’. And when you’re screwing around with time, and distorting it, you can ‘no longer judge a velocity’. They’re ‘not traveling in a linear mode’ – where they just fly and cover a certain distance in a certain time. That’s the ‘real definition of speed’. They’re ‘bending and distorting space’ and then essentially snapping it back with the craft so, the ‘distances they can travel’ are phenomenal – in ‘little or no time’. So ‘speed has little bearing’. Caller: You’ve mentioned anti-gravity generator and anti-matter generator. Are they different? Lazar: It’s ‘not a gravity generator’ – it’s a ‘gravity amplifier’. I get tongue-twisted all too often. The ‘anti-matter reactor provides the power’ for the craft and the basic ‘low-amplitude gravitational wave’ – ‘too low of amplitude’ to do anything – is ‘piped into the gravity amplifiers’ – found at the bottom of the craft – amplify that to an ‘extremely powerful wave’, and ‘that is what the craft travels along’. But there is ‘an anti-matter reactor’ that ‘provides the power’. Caller: I understand there’s an antenna section in this device; what is the resonant frequency that that operates at? Lazar: The resonant frequency of the gravity wave I ‘do know’ but I don’t know it off hand – I just cannot recall it just now. Mark: Can you give me a ballpark, like 2,000 kilohertz? Lazar: I really don’t remember. It’s a really odd frequency. Mark: Is it measured in kilohertz or gigahertz or megahertz? Lazar: I really don’t remember. Burt: You were talking about the low- and high-speed modes and the control factors in there. Can you describe those modes and what the ship looks like each time it is going through those modes? 
Lazar: The low-speed mode — and I REALLY wish I could remember what they call these, but I can’t, as I can’t remember the frequency of the wave –The low-speed mode: The craft is very vulnerable; it bobs around. And it’s sitting on a weak gravitational field, sitting on three gravity waves. And it just bounces around. And it can focus the waves behind it and keep falling forward and hobble around at low speed. The second mode: They increase the amplitude of the field, and the craft begins to lift, and it performs a ROLL maneuver: it begins to turn, roll, begins to turn over. As it begins to leave the earth’s gravitational field, they point the bottom of the craft at the DESTINATION. This is the second mode of travel, where they converge the three gravity amplifiers — FOCUS them — on a point that they want to go to. Then they bring them up to full power, and this is where the tremendous time-space distortion takes place, and that whips them right to that point. Burt: Did you actually bench-test a unit away from the craft itself? Lazar: The reactor, yeah. Burt: About how large is this, and could you describe it? Lazar: The device itself is probably a plate about 18-inches square; I said diameter before but it is square. There’s a half-sphere on top where the gravity wave is tapped off of, but that’s about the size of it. Caller Jim ( Las Vegas, NV ): On TV [ television ], you spoke of observing a demonstration of this anti-matter gravity wave controller device. And you made a mock-up copy? Lazar: A friend made one, yeah. Jim: I heard you speak of bouncing golf balls off of this anti-gravity field? Lazar: Yeah. Jim: And also about the candle, the wax, and the flame stood still? Lazar: Right. Jim: And then the hole that you saw appear – Lazar: It wasn’t a hole; it was a little disk. Jim: Under what conditions did you see this demonstrated? Elaborate on this. And how large was the force field? Lazar: The force field where the candle was? Jim: The force field created by the anti-matter device. Lazar: It was about a 20-inch radius from the surface of the sphere. Jim: Where was this area, just above the device? Lazar: Yeah, surrounding the sphere. Jim: Did the sphere surround the device? Lazar: No, the sphere sits in the center of the device. It’s a half-sphere sitting on a plate, and a field surrounds the half-sphere. Jim: And you just place a candle in there? Lazar: No, no, no. That was a separate demonstration. I’m just telling you from where the field extends. Jim: Oh, that’s what I’m curious about. Lazar: No, they tap the field off using a wave-guide, off of the sphere. And this is a completely different setup, where they had a mockup small gravity amplifier, and there were three focused into a point, and that area of focus was probably nine or ten inches in diameter. Jim: They displaced this area or moved this area? Lazar: No, it wasn’t displaced; it’s just where the field was generated. Jim: And in there you put the candle? Lazar: Right. Jim: And that thing can actually bounce golf balls off of it? Lazar: No, no. The golf ball thing, again, had nothing to do with that setup. The golf ball thing had something to do with just when the reactor was energized, before the wave-guide was put on or anything. We were just pushing on the field; it was being demonstrated to me; and we just bounced a golf ball off the top.

Interview Excerpts of Bob Lazar at Seminar; Rachel, Nevada ( 01MAY1993 ), below:

Question: I’m interested in a little bit more about the physics of the power generation from the development of the anti-matter to the Gravity “A” wave and the amplification and the process of generation of that and being able to fold space. Lazar: Well, it’s… I can give you, I guess, a brief overview of essentially how that works. If you want an in-depth description, you can give me your address and I can send you a paper on it. Essentially, what the reactor does is provide electrical power and the base gravity wave to amplify, and it does that by interacting matter and antimatter, essentially. The way it does that is injecting an accelerated proton into a piece of 115. That spontaneously generates anti-hydrogen, essentially. That’s reacted in a small area. It’s a compressed gas, probably compressed atmospheric gas, and the antimatter reacting with matter produces the energy, mainly heat energy, and that is converted into electrical energy by a thermionic [ thermal ion / thermion ] generator that appeared to be 100% efficient, which is a difficult concept to believe anyway. Also, the reactor has two functions. That’s one of them; the other function is, it provides the basic gravity wave that’s amplified, and that appears at the upper sphere of the amplifier itself, and that’s tapped off with a wave-guide, similar to microwaves, and is amplified and focused, essentially. Question: So how is the electrical energy related to the amplification of the gravitational “A” wave energy? Lazar: The electrical energy is transmitted essentially without wires, and I related it to almost a Tesla setup. It seemed like each sub component on the craft was attuned to the frequency that the reactor was operating at, so essentially the amplifiers themselves received the electrical energy, like a Tesla coil transmits power to a fluorescent tube, and what was the rest of the question? Question: Yeah, in other words, what is the relationship between… I think you basically answered it. Lazar: Yeah, that’s how the amplifiers receive the power and through the wave-guide to receive the basic wave. It’s almost…It’s very, very similar to a microwave amplifier… Question: Was the local means of propulsion the same as these across-space distances? What was the local means of propulsion? Lazar: The local means of propulsion is essentially them balancing on a out of phase gravity wave, and it’s not as stable as you would think. When the craft took off, it wobbled to some degree. I mean a modern day Hawker Harrier or something along those lines of vertical takeoff craft is much more stable than then in the omicron [omicrom?] configuration, which is that mode of travel. The delta configuration is where they use the three amplifiers. Those are the only two methods I know about for moving the craft. Question: When you listen to some abduction reports, whether or not people believe it or not, there seems to be a common thread of people being hit by blue beams of light…. Lazar: Any of the three gravity amplifiers could do that, could lift something off the ground, or for that matter compact it into the ground. That’s not a problem, because the craft can operate on one amplifier, in omicron mode, hovering. That would leave the other three (?) amplifiers free to do anything. So I imagine they could pick up cows or whatever else they want to do. 
On the craft I worked on there was absolutely no provision for anything to come in through the bottom of the craft, or anything along those lines… Question: So what was the course of energy? How did it go from one area to another area? Lazar: The best guess is essentially it operated like a Tesla coil does. A transmitter and essentially a receiver tuned to the transmitting frequency, receives electrical power. There again, that’s not real advanced technology. Tesla did that in the 30s, I think. Question: You mentioned the photon earlier. Do you think that physics is taking a wrong turn by looking for exchange particles, when you’re talking about the strong force of gravity again? I’m not clear why you’re skeptical about the graviton? Lazar: About the graviton? Question: Every other force seems to have exchange particles connected with it. Lazar: No, not necessarily. I mean, they make it have one, but as time goes on, that really hasn’t held true. The bottom line is, they don’t…First of all, they don’t even believe there’s a graviton anymore, so I’m not the only one. As far as exchange particles, still, though some of them like the zeta particle, maybe that’s an actual thing, but when they’re looking at transfers of energy, I think these are scapegoats for the most part. A lot of experiments that I was doing at Los Alamos essentially were along these same lines, but other exchange particles like the intermediate vector bozon, I don’t believe that thing exists. I really don’t. I think they’re grabbing at straws and just coming up with excuses. Question: What about the small gravity, the Gravity “A”; how can you detect that one? What is the frequency of that? Lazar: Well, the frequency that the actual reactor operates at is like 7.46 Hertz. It’s a very low frequency. Question: That’s the frequency of Earth’s gravity, or universally, all gravity? Lazar: That’s the frequency the reactor operates at. Question: I can understand a reactor functioning – theoretically I can understand a reactor functioning at, say, (unintelligible word) 7.46 Hertz. There’s a wave-guide involved. I don’t buy 7.46… Lazar: No, that’s the basic… The frequency of the gravity wave that’s produced, it has to be higher frequency, because you’re in a microwave range to follow a conduit like that. Question: I understand from Lear’s lecture that it had a tendency to conduct on the outside also of the reactor. Lazar: Right. Well, that’s all… this was the electric field we were talking about. The basic frequency, I think, was the way the reactor was operating. The pulses that we detected out of it were probably, instead of a straight DC power supply, it was more along the lines of a pulse, as if we were getting a burst of particles coming out: An antimatter emission, then a reaction, a pulse of energy, and that would repeat. That’s about seven and a half Hertz, something along those lines. Question: Bob, the microwave frequency going to the wave-guide is electromagnetic, or that’s gravitational? Lazar: They’re one in the same. Question: I don’t understand what you mean by that. Lazar: Gravity is… Unfortunately, physics hasn’t gotten to that part yet, but gravity essentially is part of the electromagnetic spectrum. Question: Then what frequency is it? Lazar: That’s something I’m reserving for myself. Question: Something about the microwave range? Lazar: Something about the microwave range. Well, you can sort of figure it out by the dimensions of the waveguide itself, and that’s about it. 
Question: Positive energy versus regular photon? Lazar: No, it’s not photon. Question: Electromagnetic Energy? Lazar: Right. I’m not trying to be secret, but this is part of the equipment that I’m working on, and I want to get it operating before… Question: I hope we’ll find out one day. Lazar: Absolutely.
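
Where the transcript says the operating frequency can “sort of be figured out by the dimensions of the waveguide itself,” the conventional relation is the rectangular-waveguide cutoff formula, f_c = c / ( 2a ) for the dominant TE10 mode. The sketch below is a minimal illustration of that formula only – the 22.9 mm broad-wall width used in the example is the dimension of ordinary WR-90 X-band waveguide, not a measurement from any craft:

```python
# A minimal sketch of the standard relation the transcript alludes to: for a hollow
# rectangular waveguide, the lowest (TE10) cutoff frequency depends only on the
# broad-wall width a, via f_c = c / (2 * a). The dimension below is that of common
# WR-90 X-band waveguide, chosen only to show the formula.
C = 299_792_458.0  # speed of light, m/s

def te10_cutoff_hz(broad_wall_width_m: float) -> float:
    """Cutoff frequency of the dominant TE10 mode in a rectangular waveguide."""
    return C / (2.0 * broad_wall_width_m)

if __name__ == "__main__":
    a = 0.0229  # meters (22.9 mm broad wall, standard WR-90 guide)
    print(f"TE10 cutoff for a = {a*1000:.1f} mm: {te10_cutoff_hz(a)/1e9:.2f} GHz")
    # ~6.6 GHz; signals well below cutoff will not propagate in such a guide.
```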

Speech of Edgar Fouché at International UFO Congress ( Summer 1998 ), below:

[ photo ( above ) LOCKHEED SR-71 Blackbird surveillance aircraft ]

I’m here to speak about government technology, special programs, and the TR-3B Flying Triangle. Thousands of sightings of the Flying Triangle have been reported, photographed, and investigated around the world – yet the USAF denies having such a vehicle. It also denies having replaced the strategic reconnaissance spy plane, the SR-71 Blackbird. Keep this in mind as I proceed:

Astronauts Edgar Mitchell and Gordon Cooper say that new investigations into UFOs are warranted.

Edgar Mitchell, who became the sixth [ 6th ] man on the moon during the Apollo 14 mission said, “The evidence points to the fact that Roswell [ New Mexico crash of UFO ] was a real incident and that indeed an alien craft did crash and that material was recovered from that crash site.” Mitchell doesn’t say he’s seen a UFO, but he says he’s met with high-ranking military officers who admitted involvement with alien technology and hardware.

Gordon Cooper recently told a United Nations ( UN ) committee: “Every day in the USA, our radar instruments capture objects of ‘form’ and ‘composition’ unknown to us.” Cooper speculates that public skepticism toward UFOs will shift dramatically.

Now, a little about my background:

I’ve held positions within the United States Air Force ( USAF ) that required me to have Top Secret and ‘Q’ clearances, and Top Secret Crypto access clearances.

I’ll show you pictures of some of the aircraft programs I’ve worked on. I’ll also show you some pictures of classified aircraft. And I’ll share with you some of the information and stories I’ve gathered through my research in developing [ my book ] Alien Rapture. In many cases I’ve been able to obtain actual details of this black technology [ covered in my book, “Alien Rapture – The Chosen” ].

I was born to fifth [ 5th ] generation French-Americans, and many of my relatives – for generations – have historically been involved with the government in fields of intelligence, black programs, cryptography, and classified development projects.

This is true, as far back as the French revolution, where Joseph Fouché was Prime Minister under Napoleon. He was the head of the French secret National Police Force and was a direct ancestor of mine. Joseph Fouché started and controlled the world’s first professionally organized intelligence agency with agents throughout Europe.

The CIA, the Russian KGB ( now FSB ), the UK’s MI-5 ( and MI-6 ), Israel’s Mossad, and many other intelligence agencies have used and expanded on his methods of intelligence gathering, networking information, and political survival.

I have also worked on intelligence and cryptography-related programs but, because of oaths of secrecy, I will ‘not be able to share any details of this work’.

My career background spans 30-years, and since the government isn’t about to support my claims, you will see from the positions I’ve held and the Programs I worked that I was in a position to gather the information I am presenting.

Before I gave a presentation to the International UFO Congress in Laughlin, Nevada during August [ 1998 ], I brought over 200 documents as an offer of proof to substantiate my credibility. These documents contained information on the positions and assignments I held in the U.S. Air Force and as a DOD [ U.S. Department of Defense ] contractor. They also detailed clearances I held, classified and non-classified ( DOD and military ) technical training I received ( over 4,000 hours ), and performance reviews from 1968 to 1995.

As a civilian, from 1987 to 1995, I performed as engineering program manager, site manager, and Director of Engineering for several DOD contractors.

Ken Seddington, of the International UFO Congress, Jim Courrant, a UFO investigator, and Tim Shawcross and John Purdie, of Union Pictures in London, England viewed these documents. Some of these documents are shown in “Riddle of the Skies” special, which will be on The Learning Channel next month.

With my training and experiences with intelligence equipment, special electronics, black programs, and crypto-logical areas, I received other government opportunities. I filled positions as major command liaison, headquarters manager, and DOD factory representative for TAC, SAC, ATC, and PACAF following the Viet Nam War.

Later in my career, as a manager of defense contractors, I dealt with classified ‘black programs’ developing state-of-the-art electronics, avionics, and automatic test equipment [ ATE ].

I was considered an Air Force expert on classified electronic countermeasures test equipment, certain cryptological equipment – owned by the National Security Agency – and Automatic Test Equipment [ ATE ].

I’ve worked with many of the leading military aircraft and electronics manufacturers in the U.S. At different times I participated as a key member in the design, development, production, and flight operational test and evaluation of classified aircraft development programs and state-of-the-art avionics, including electronic countermeasures, satellite communications, and cryptologic support equipment.

During my military career, I was ‘hand picked’ ( Development Cadre ) for many of the Air Force’s newest fighter and bomber development programs. I also represented many of these programs for TAC, SAC, PACAF, and ATC.

Other research and development programs I worked as far back as the 1970s are still classified Top Secret.

My involvement with black programs, developing stealth aircraft, is classified.

I am perhaps the only person who has actually worked at the Top Secret Groom Lake Air Base, within Area 51 of the Nellis Range, and has proved that I had the position, training, and clearances to be there.

[ photo circa: 1974 ( above ) DOD DARPA Have Blue Project F-117 stealth fighter ]

This [ NOTE: ‘not the photo above’ ] is an F-117 Stealth fighter being readied at Groom Air Base at night. Notice the fog engines at work for cover.

My last position for the Air Force was as a Strategic Air Command Headquarters’ Liaison. As a Defense Contractor-Manager, I performed as an engineering program manager and site manager for DOD contractors involved in classified development, logistics support, electronic engineering, and technical data development from 1987 – 1995.

I have completely disassociated myself from the defense industry. I consider myself a writer and inventor now.

I undertook this trip to do research for my book Alien Rapture, which included a meeting with five [ 5 ] close friends who had agreed to release confidential information to me and discuss their closely guarded personal experiences.

I also interviewed other contacts that had worked classified programs or flown classified military aircraft to gather information about UFO sightings and contact.

Later, I was blessed to team-up with a great man and a great writer, Brad Steiger. I had decided to get out of the defense industry, as I felt that fraud, waste, and abuse was rampant – both on the government and contractor sides.

Who were the five [ 5 ] friends and co-conspirators and a host of other insiders?

It started when some old friends of mine met in the spring of 1990 in Las Vegas [ Nevada ]. There were five [ 5 ] of us then; all of us had remained close following the Vietnam War. I’ve always been the networker for my DOD, military, and contractor friends so, I’m the one who set up the meeting with the five [ 5 ]:

1. The first friend, Jerald *, was a former NSA or TREAT Team member. T.R.E.A.T. stands for Tactical Reconnaissance Engineering Assessment Team. Jerald * worked for the DOE [ U.S. Department of Energy ] as a national security investigator. That was his cover, but he really worked for the NSA [ U.S. National Security Agency ]. His job required him to manage a team – to ‘watch employees’ with Top Secret and Q clearances in the mid-west, in Los Alamos, Sandia, and White Sands ( New Mexico ), and in the Nevada Test Site and Nellis Range, which includes Area 51. Area 51 is where the most classified aerospace testing in the world takes place. You may know the base as Groom Lake Air Base, Watertown, The Ranch, or Dreamland. He [ Jerald ] was found dead of a heart attack 1-year after our last meeting.

2. The second friend, Sal *, was a person who had worked directly for the NSA [ U.S. National Security Agency ] with Electronic Intelligence ( ELINT ) and became a defense contractor after his retirement.

3. The third friend, Doc *, was a former SR-71 spy plane pilot and USAF test pilot at Edwards Air Force Base [ California ].

4. The fourth friend, Dale *, and I were in the service together during the Viet Nam conflict [ war ], and I’ve known him [ Dale ] since the early 1970s. His father worked for over 20-years for the NSA [ U.S. National Security Agency ] and he [ Dale ] is the one who sent me the MJ-12 [ Majestic ] documents his father had obtained. These documents, the New MJ-12 Charter signed by proxy during the Reagan [ Ronald Reagan ] Administration and Attachment D to the Eisenhower [ U.S. Army General Dwight D. Eisenhower ] MJ-12 briefing document, which is the Autopsy Report from Roswell [ New Mexico ], are included as attachments in my book Alien Rapture.

5. The fifth friend, Bud *, was a DOD contractor and electronics engineer. He [ Bud ] had worked on Top Secret development programs dealing with Electronic CounterMeasures [ ECM ], radar homing and warning, ECM ( Electronic CounterMeasure ) jammers, and Infra-Red [ IR ] receivers. He [ Bud ] retired as a program manager and later died of a brain tumor within 30-days after his symptoms appeared.

*All names and identifying factors have been changed.

It bothered each of us that we had experiences with unusual phenomena, extremely advanced technology, and witnessed unidentified aerial contact that had not been previously reported. We sat at a table in a dark corner of the Silver Dollar Saloon and Casino in Las Vegas [ Nevada ], discussing our experiences and swapping knowledge.

In 1990, I had no intention of writing about programs I was involved-with due to the Secrecy Act and classification documents I had signed.

Jerald asked me if I had ever heard of the ‘Flying Triangle’.

Of course I had heard rumors of Delta shaped and bat winged shaped prototypes being tested at Groom Air Base.

He [ Jerald ] said that an early test model of the Flying Triangle was sighted by hundreds of people over the Hudson Valley in the mid-1980s, and that there had been a major flap in Belgium – the year [ 1989 ] before our meeting [ 1990 ] – during which thousands of people witnessed the Triangle and the F-16 chase that followed. He definitely piqued my curiosity.

Over the next 4-years, each member of the group wrote down as much information as he could remember about unusual phenomena and personal sightings.

From my close friends came their contacts. I agreed to interview these contacts in person. I interviewed four [ 4 ] other [ LOCKHEED ] SR-71 pilots, two [ 2 ] [ LOCKHEED ] U-2 [ Dragon Lady ] pilots, one [ 1 ] TR-1 pilot, and about two dozen [ 24 ] bomber and fighter jocks [ jockeys ]. None of the people I interviewed wanted to be known or quoted, and they wanted me to swear never to reveal their names. I have honored, and will continue to honor, their wishes.

Many were afraid of what the government would do to them for talking about the Top Secret ‘Black Programs’ they were involved with, and others were just worried about losing their retirement pensions.

I’ll share some of these secrets and unusual phenomena with you:

[ photo ( above ) LOCKHEED A-12 ( single and dual cockpit versions ) ]

The SR-71 was designed as a spy plane for the CIA in the 1960s and designated the A-12.

[ extremely rare photo circa: 1954 ( above ) LOCKHEED YF-12 Prototype ]   [ rare photo circa: 1958 ( above ) LOCKHEED A-12 – NOTE: USAF brand new 1958 Edsel station wagon ( blue with white top ) and Dodge Power Wagon ( blue with white top ) pick-up truck ( mid image far right ) ]

[ photo ( above ) LOCKHEED A-11 ]

The Mach 3 plus aircraft first flew in 1962 [ ? ], taking off from Groom AFB [ ? ] in Area 51.

[ rare photo circa: 1960 ( above ) LOCKHEED SR-71 ]

Later, once the Air Force operated it as a reconnaissance plane, it was designated the SR-71 BlackBird.

My friend Chuck, an SR-71 pilot, related to me an in-flight incident he experienced in the 1970s. He was returning from a reconnaissance flight and, while at an altitude of 74,000 feet at a speed of almost Mach 3 ( three times the speed of sound ), he noticed something flickering in his peripheral vision. Hovering over his left wing tip was a ball of dense, plasma-like light. It was so bright that when he stared at it for more than a few seconds, his eyes hurt.

Chuck tried to use his UHF-HF and VHF communications sets to no avail. There was nothing but static. Repeatedly glancing briefly at the ball of light, he watched in amazement as it moved effortlessly about his aircraft.

At one point the light positioned itself a few feet in front of the large spiked cone at the air Intake Inlet. The enormous amount of air rushing into the engines should have sucked in and shredded almost anything in its path, but the light orb was mysteriously unaffected.

The light, he noted, acted in a curious manner, if something inanimate could act at all. It moved from time to time to other parts of the vehicle, staying with him until his approach to Beale AFB in California. He was in sight of the Air Base when the light swung away from his aircraft in a wide arc with ever-increasing speed.

Of course, after reading his incident report, his operations commander told him never to speak about his experience. When Chuck related the story to me, he told me he was absolutely convinced that the ball of light was controlled by some form of intelligence. I have about two dozen [ 24 ] stories from pilots of similar in-flight incidents with UFOs and plasma balls. There have been thousands of reported sightings of plasma balls, energy-filled orbs, or foo fighters, as they were named during World War II.

In 1944, while fighting the Japanese and Germans, pilots started reporting strange flares and bright orange and red lights. These lights moved rapidly, were under intelligent control, and could come to a complete stop, remain stationary, and then disappear in an instant.

‘Foo’ comes from the French ‘feu’, meaning ‘fire’. The pilots coined the term ‘foo fighters’ for the haunting glowing balls that doggedly paced their aircraft. Most were unnerved by the radical maneuvers of the foo fighters, which could climb vertically, accelerate, and make high-G turns at speeds far beyond any known Allied aircraft.

Not far from the Royal Air Force base, MacRahanish, a triangular shaped aircraft was spotted off Western Scotland. MacRahanish has been rumored to be a base for black aircraft operations for a number of years. It’s also a NATO standby base.

RAF personnel have admitted that they witnessed the operation of large triangular aircraft from RAF Boscombe Down in February 1997.

It was widely reported that a secret U.S. spy plane crash landed at Boscombe Down in 1994. It had been rumored for some time that the Triangle spotted over Belgium was based at Boscombe Down and Royal Naval Air Station ( RNAS ) Yeovilton where other sightings of the Triangle were reported.

The British RAF [ Royal Air Force ] has a long history of close involvement with U.S. aerospace black programs. Key RAF officers and British scientists have been involved at Groom Air Base [ Nevada ] since 1957 and the start of the [ LOCKHEED ] U-2 [ DragonLady ] program.

In 1995 and 1996 the National UFO Reporting Center alone received forty-three [ 43 ] reports of sightings of a Triangular aircraft:

– 11 in the State of Washington;
– 8 in the State of California; and,
– 18 from other states – from the State of Hawaii to the State of New York.

A few years ago the British magazine UFO Reality published this information:

“A top BBC [ British Broadcast Corporation ] executive let slip recently that there is a D-Notice on media reporting of the so-called ‘Black Triangle. The executive is the former producer of a very popular BBC science program. He told one of our team that the black Triangle ‘craft’ – first witnessed by the hundreds in the Hudson Valley region of the U.S. in the mid-1980s, then by the thousands in Belgium in 1989 – 1990, and more in Britain – has been ‘heavily D-Noticed’ by the government. For this reason the BBC will NOT be reporting on the enigmatic craft, no matter how many witness reports there are. According to this producer, the government’s restrictive notice on reporting the Triangle, was authorized under secrecy laws, in order to protect secret new military projects.”

From 1973 through 1976, I was home-based out of Edwards AFB. It is near Lancaster, California and even nearer to the San Andreas fault [ plate tectonic earthquake demarcation line ].

Edwards [ AFB ] has a long history with secret technology and experimental aircraft. The YB-49, which looks a lot like the B-2 Stealth Bomber, was flown at Edwards AFB in 1948.

[ photo ( above ) NORTH AMERICAN XB-70 Valkyrie with 6 GENERAL ELECTRIC engines ]

The XB-70, flown in 1964, looks a lot like the still Top Secret SR-75 that the Air Force says doesn’t exist.

Edwards A.F.B. is the home of the U.S. Air Force Test Pilot School and is responsible for Flight Operational Test and Evaluation [ FOTE ] of the Air Force’s newest aircraft.

Edwards [ AFB ] hosts a number of tenant organizations, from NASA to the Jet Propulsion Laboratory [ Pasadena, California ] facility.

Edwards [ AFB ] saw the development of various versions of the flying wing [ shaped aircraft ] – from the B-35 and YB-49 to the B-2 [ Spirit ] – as well as exotic aircraft sometimes ahead of their time, like the XB-70, F-117, and YF-22.

I worked with the F-111 swing-wing bomber, the F-15 air superiority fighter, the F-16 fighter, the A-10 [ Warthog ] close air support attack aircraft, and the B-1 bomber. I was involved with these and other classified development programs when they were just a gleam in some pilot trainee’s eyes.

One night – in the mid-1970s – a long-time friend of mine and I were standing on top of the Fairchild A-10 [ Wart Hog ] hangar at Edwards AFB in southern California. It was about 02:00, a clear night with millions of stars visible to the naked eye. I noticed a group of stars that seemed to be shifting in color. I pointed out to my friend that the three [ 3 ] bright stars in triangular formation were not part of the Big Dipper.

We watched as the strobing stars shifted from bright blue to a red-yellow color. After about 20 minutes we could tell the objects probably weren’t stars, because they were getting larger. This was somewhat unnerving.

* It was further unnerving when ‘the space in-between the enlarging lights began blocking out the stars in the background’. [ See, e.g. beginning of this report, i.e. Victor Valley northeast area of California alongside Interstate 15 ( further above ) ]

We decided it probably was a Top Secret Air Force vehicle of some type; still, we weren’t sure. The vehicle had gone from half the size of the Big Dipper to twice its size in under a half hour [ 30 minutes ] and had moved from the west to the east, toward the base.

About the time we could make out a silhouette or outline of the triangular vehicle, the lights – or possibly exhausts – flared brighter and vanished from the sky in an instant.

This experience wasn’t my first sighting, but it was one of the few where I had a witness. In the summer of 1976, I relocated to Nellis Air Force Base – north of Las Vegas. I spent the next 3-1/2 years there. I worked primarily with the F-15, electronics countermeasures, and Automatic Test Equipment [ ATE ]. I had heard rumors of airbases located in the desert, at places called:

Mercury; Indian Springs; and others that didn’t even have names.

Before the collapse of the [ Russia ] USSR ( CCCP ), no one talked about their classified work experience, nor did anyone repeat rumors of Top Secret technology and aircraft.

Most of us who had Top Secret clearances never even told our wives what we were doing or where we were going when on these types of projects. I once spent six [ 6 ] months in Viet Nam while my ex-wife thought I was attending a classified technical school in Colorado.

The military, in a court of law, actually denied the existence of a classified Air Force Base inside the Nellis Range out in the Nevada Desert. Don’t you know, the Plaintiffs – who had worked at Groom – and their lawyer were surprised to hear this, but that’s another story.

I was one of the few personnel at Nellis who had a Top Secret clearance with Crypto access. I was certified to work on Mode 4 IFF ( an aircraft system which responds to classified, encrypted codes ). I was also certified to work on other Crypto equipment that I cannot discuss.
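
Mode 4 IFF is, in essence, a cryptographic challenge-and-response system: the interrogator transmits a challenge, and only a transponder loaded with the correct key can return the expected reply. The toy sketch below illustrates that general idea using ordinary HMAC-SHA256; it is purely illustrative and bears no relation to the actual ( classified ) Mode 4 algorithms, key material, or waveforms:

```python
# A toy sketch of the challenge-response idea behind a crypto IFF mode: the
# interrogator sends a random challenge, and only a transponder holding the shared
# secret key can compute the expected reply. This uses ordinary HMAC-SHA256 purely
# for illustration; it bears no relation to the real (classified) Mode 4 system.
import hashlib
import hmac
import os

SHARED_KEY = os.urandom(32)  # stand-in for a loaded crypto key

def transponder_reply(challenge: bytes, key: bytes = SHARED_KEY) -> bytes:
    """Compute the reply a friendly transponder would send for this challenge."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def interrogator_check(challenge: bytes, reply: bytes, key: bytes = SHARED_KEY) -> bool:
    """Verify the reply in constant time; True means 'friend'."""
    return hmac.compare_digest(reply, transponder_reply(challenge, key))

if __name__ == "__main__":
    challenge = os.urandom(12)               # random interrogation
    reply = transponder_reply(challenge)     # friendly aircraft answers
    print("friend" if interrogator_check(challenge, reply) else "unknown")
```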

Due to a combination of coincidences and my technical experience, I was requested for a temporary assignment at a place that had no name. My commander told me to report to an office on the base, and he didn’t have a clue where I was going or what I was going to be working on. And let me tell you, he wasn’t too happy about being left in the dark.

I left one Monday morning long before sunrise. It was 4:30 AM when I boarded a dark blue Air Force bus with all of the windows blacked out. There were 28 other people on the bus, not including the two [ 2 ] security policemen holding M-16 automatic weapons and the bus driver. We were each told when boarding, “Do not speak on this bus unless you are spoken to.” Not one of us uttered a word. Believe me, there is nothing that can inspire compliance like an M-16 sticking in your face, I assure you!

The bus drove through the desert for several hours – this much I knew from the poor air-conditioning and the amount of fine dust that came through every crack in the old vehicle – and it was soon obvious where I was headed.

In the 1950s, the U.S. government started building the super secret Groom Lake facilities – for the CIA U-2 ( DragonLady ), a high-altitude global surveillance aircraft. Acquired in 1951 – with $300,000,000 ( $300 million ) in seed money from the CIA – the Groom Lake, Nevada site sits in the north central part of the Nellis Air Force Base ( AFB ) Range, in the area designated Area 51. Insiders use the names S-4 and Site 4 for the Papoose Lake, Nevada facilities south of Groom Lake, Nevada. Construction of facilities within the Nellis Range continues even today.

The SR-71 Blackbird surveillance aircraft, TR-1, F-117 stealth fighter, B-2 bomber, TR-3A Manta, and TR-3B Astra or flying triangle were tested at Groom Lake, Nevada. Had persons sighted these craft under certain circumstances, they would perhaps have been looked upon as unidentified flying object ( UFO ) extraterrestrial spacecraft.

SR-75, replaced the SR-71 Blackbird.

SR-74 SCRAMP is a drone that appears to ride under the SR-75.

TR-3B Flying Triangle are operated there – as well as other Top Secret prototype and operational aerospace vehicles.

Many of these aircraft have been misidentified as UFOs.

When we reached Groom Lake, the bus pulled into a hangar and they shut the doors. The security personnel checked me in – while other security personnel dispatched the others to their places of work. I was given a pair of heavy glasses to wear, which can only be described as looking like welder’s goggles. The lenses were thick, and the sides of the goggles were covered to block my peripheral vision. Once I had these goggles on I could only see about 30-feet in front of me. Anything beyond that distance became increasingly blurred. If an M1 Tank barrel had been pointed at me from about 50-feet away, I would not have seen it. It was very disconcerting to have to wear those glasses.

The whole time I was there – some 10 consecutive days, followed by several follow-up visits – the routine was the same: leave Nellis before sunrise, and return home after dark every day.

Only once did I get a chance to see the whole base, and that was when I was flown up – from Nellis – in a helicopter to Groom Lake for emergency repairs of their crypto test equipment. For those stationed at Groom Lake, or commuting there daily – with flight schedules posted for classified flights – everyone ‘not cleared for that particular program and flight’ must be off the ramp – inside 30-minutes – prior to the scheduled operation.

A couple of thousand personnel are flown into Area 51 daily, from McCarran Airport in Las Vegas, Nevada and – from Edwards AFB in California – on contractor aircraft. Several hundred commute from Tonopah and Central Nevada via the north entrance near Rachel, Nevada. Other commuters use its [ Area 51 ] south entrance via Mercury, Nevada or Indian Springs, Nevada west of Las Vegas, Nevada.

While at Groom Lake I made contacts and met people from other programs. Over time, a few became friends and we exchanged stories.

On my 3rd day on the job at Groom Lake, I had to remove a module from a multi-bay piece of satellite communications equipment used to support certain special mission aircraft. I noticed while inside the bay checking out the wiring that it contained a sealed unit about the size of a large briefcase. It had a National Security Agency ID plate on it. The nomenclature on the nameplate was, Direct Orbital Communication Interface Link Emitter [ DOCILE ]. I thought this was strange, as the unit was part of a digital communications link used solely to communicate with classified Air Force vehicles. I was unaware – at the time – of any military orbital missions not related to NASA. Remember, this was in the late 1970s – the shuttle didn’t fly until 1981.

I disconnected the unit, and – out of curiosity – I removed the rear access cover. To my amazement, there were some half-dozen [ 6 ] large hybrid integrated circuit [ IC processor ] chips inside. The largest chip had over 500 hair-thin leads attached and was approximately the size of a Zippo lighter. The ‘paper inspection stamp’ on the chip was dated 1975.

In 1975, the most advanced processor speeds – on the most classified projects – were equivalent to an Intel 8088, which ran at roughly 4 million cycles per second ( 4 MHz ), but this unit had a processor speed of 1 billion cycles per second ( 1 GHz ).
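
[ NOTE: As a rough check on the figures above – a sketch only, with the clock rates rounded to the 4 MHz and 1 GHz values implied by the text – the short Python calculation below puts the described chip at roughly 250 times the raw clock rate of a late-1970s class processor: ]

# Rough comparison of the clock rates described above ( approximate values ).
clock_1970s_hz = 4e6    # ~4 million cycles per second ( 4 MHz )
clock_claimed_hz = 1e9  # ~1 billion cycles per second ( 1 GHz )

ratio = clock_claimed_hz / clock_1970s_hz
print(f"Described chip is roughly {ratio:.0f}x faster in raw clock rate")  # ~250x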

More than a dozen [ 12 ] years passed before I saw comparable integrated circuit chip technology, and then it was at a Top Secret avionics development project at the ITT company.

In the mess hall at Groom Lake, I heard words like:

Lorentz Forces; Pulse Detonation; Cyclotron Radiation; Quantum Flux Transduction Field Generators; Quasi-Crystal Energy Lens; and, EPR Quantum Receivers.

I was told ‘quasi-crystals’ were the key to a whole new field of propulsion and communication technologies.

To this day, I’d be hard pressed to explain to you the unique electrical, optical and physical properties of Quasi-Crystals and why so much of the research is classified.

Even the unclassified research [ on quasi crystals ] is funded by agencies like the Department of Energy [ DOE ] and the Department of Defense [ DOD ].

Why is the U.S. Department of Energy and Ames Laboratory so vigorously pursuing research with Quasi-Crystals?

What is the DOE new Initiative in Surface and Interface Properties of Quasi-crystals?

“Our goal is to understand, and facilitate exploitation of, the special properties of Quasi-crystals. These properties include ( but are not limited to ) low thermal and electrical conductivity, high-hardness, low friction, and good oxidation resistance.” That’s the ‘unclassified’ part.

What are Quasi-Crystals?

In 1984, a paper was published, which marked the discovery of quasi crystals, two ( 2 ) distinctly different metallic crystals joined symmetrically together.

[ NOTE: See, e.g. Project CARET ( 1984 ), at: http://upintelligence.wordpress.com/2010/10/23/secret-extraterrestrial-information-technologies/ ]

By 1986, several Top Secret advanced studies were going on – funded by DARPA – with leading scientists already working in the field.

In classic crystallography, a crystal is defined as a three dimensional ( 3-D ) periodic arrangement of atoms with translational periodicity along three ( 3 ) principal axes.

Since Quasi-crystals ‘lose periodicity’ in at least one [ 1 ] dimension, it is not possible to describe them in 3D-space as easily as normal crystal structures. Thus, it becomes more difficult to find mathematical formulae for interpretation and analysis of diffraction data.

After the official release of the discovery of Quasi-crystals in 1984, a close resemblance was noted between icosahedral quasi-crystals and 3-D Penrose patterning.

Before quasi-crystals were discovered in 1984, British mathematician Roger Penrose devised a way to cover a plane in a non-periodic fashion using two ( 2 ) different types of tiles arranged according to certain matching rules; the three-dimensional version – built from rhombohedra instead of rhombi – later became known as 3D Penrose Tiling.

12-years later, ‘Penrose Tiling’ became the prototype of very powerful models explaining structures within Quasi-crystals discovered in rapidly quenched metallic alloys.

14-years of quasi-crystal research established the existence of a wealth of stable and meta-stable quasi-crystals with 5-, 8-, 10-, and 12-fold symmetry, with strange ‘structures’ and interesting ‘properties’. New tools had to be developed for the study and description of these extraordinary materials.
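
[ NOTE: As a simple illustration of order without periodicity – a one-dimensional analogue only, not the icosahedral case discussed above – the short Python sketch below generates the Fibonacci word, a sequence built by a strict substitution rule that never repeats itself, which is loosely how the aperiodic-but-ordered tilings used to model quasi-crystals behave: ]

# 1-D analogue of quasi-crystalline order: the Fibonacci word.
# Generated by the deterministic substitution rule L -> LS, S -> L;
# it is never periodic, yet perfectly ordered.
def fibonacci_word(iterations: int) -> str:
    word = "L"
    for _ in range(iterations):
        word = "".join("LS" if tile == "L" else "L" for tile in word)
    return word

print(fibonacci_word(8))  # LSLLSLSLLSLLS... ordered but non-repeating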


I’ve discovered classified research showing quasi-crystals as promising candidates for high-energy storage materials, metal matrix components, thermal barriers, exotic coatings, infrared sensors, high-energy laser applications and electromagnetics. Some high strength alloy surgical tools are already on the market.

One of the stories I was told more than once was that one of the crystal pairs used in the propulsion of the Roswell crash was a Hydrogen Crystal. Until recently, creating a hydrogen crystal was beyond the reach of our scientific capabilities. That has now changed. In one Top Secret Black Program, under the DOE, a method to produce hydrogen crystals was discovered, and then manufacturing began in 1994.

The lattice of hydrogen quasi-crystals, and another material not named, formed the basis for the plasma shield propulsion of the Roswell [ New Mexico, USA UFO crash ] craft and was an integral part of the bio-chemically engineered vehicle.

A myriad of advances in crystallography – undreamed-of by mainstream science – were discovered by the scientists and engineers who evaluated, analyzed, and attempted to reverse-engineer the technology presented with the Roswell vehicle and the eight [ 8 ] more vehicles which have crashed since then.

I wrote down everything I saw, heard and touched in my log – every night before going to bed. By the way, the food at the Groom Lake mess hall was excellent, but what would you expect – there was no cable TV, no alcohol and no women. I guess they figured they’d better do something right.

Later, while back at the base, my routine went on as normal – as did my part-time job that summer at the Silver Dollar Salon [ Las Vegas, Nevada ].

My NSA friend, Jerald – who managed a team that investigated and ‘watched’ those with highly classified jobs at the Nevada Test Site and the Nellis Range, among other highly classified facilities – happened to show up there.

I met Jerald in 1976 when I was part of a team that moved the F-15 operation from Edwards AFB to Nellis AFB and set up the Joint NAV – AIR AIM – VAL / ACE – VAL program.

He was checking up on a guy who had a drinking problem who worked at the Nevada Test Site ( S-4 ) where they set off underground atomic explosions.

He [ Jerald ] happened to mention a vehicle that could be boosted into orbit and return and land in the Nevada desert. This was during the late 1970s. It was an Unmanned Reconnaissance Vehicle ( URV ) – launched from a B-52 bomber – and used booster rockets to place it in temporary low earth orbit ( LEO ) for taking high-altitude reconnaissance photographs. I thought he was feeding me a line of bull. Then he said, “This vehicle is remote-piloted ( unmanned ) with communications via the DOCILE system at Groom.”

I’m not usually too slow, but it didn’t hit me until he repeated, “You know, the Direct Orbital Code Interface Link Emitter ( DOCILE ).” Bingo, the light bulb went on, I had seen – at Groom Lake [ Nevada ] a part of the DOCILE equipment – the NSA briefcase sized unit with the large chips.

These are old pictures of the Virtual Reality Laboratory ( VRL ) at Brooks Air Force Base ( BAFB ) where the software – to remotely fly exotic aircraft – was developed.

Let me get back to the development of [ my book ] Alien Rapture – The Chosen.

After I agreed to write my co-conspirator’s story, I talked to several military Judge Advocate General ( JAG ) lawyers, whom I told that I wanted to write about some of my experiences in the military and that I had been on many classified projects. I was told I had to write my story as ‘fiction’, which I have.

I was told, I couldn’t name any real individuals with clearances or covers, or use their working names, which I haven’t.

I was also told, I couldn’t discuss any secrets of programs I had been personally assigned to, which I have ‘not done’ and ‘will never do’.

Then I was told that, so long as I followed those rules, I could damn well write anything I wanted to. Of course I did ‘not’ tell them that I was going to write about the government conspiracy to cover-up UFO contact and reverse engineering of alien technology, or that I was interviewing pilots who flew classified aircraft, plus others who had worked Black Programs.

In the summer of 1992, we again met in Las Vegas, Nevada. I had compiled my notes from our first meeting, my interviews, and input that five ( 5 ) friends had passed on to me, of whom each reached out to ‘their friends and contacts’ uncovering even more information.

We agreed I was the only one who could get away with writing about our experiences, since I was planning on getting out of the D.O.D. ( Department of Defense ) industry.

My friends for the most part, were still connected.

Bud, one of my co-conspirators and close friends, informed me he had a cancerous tumor and was going through some severe depression. He was dead 30-days later. It was a real blow to us. We lost Jerrold, 1-year before, due to his heart attack.

Of the remaining three ( 3 ) friends, Sal dropped off of the face of the earth and none of his – nor my contacts – have been able to locate him for 2-years now. Sal was extremely paranoid about the two ( 2 ) deaths ( of Bud and Jerrold ), and had second thoughts about the book. Sal said he was going to move and didn’t know when or if he would contact me next. I like to think of him sipping a tropical drink on some Pacific island.

Let me talk about my friend Doc, who has a theory that UFOs were drawn to fast aircraft. Doc, an SR-71 pilot whom I knew well, was stationed at Kadena AFB ( KAFB ) on the SAC ( Strategic Air Command ) side of the base during 1973.

While flying back across the South China Sea from a reconnaissance mission, Doc encountered a shadow over his cockpit. Doc said his avionics systems went totally haywire, and he felt the aircraft nose down slightly, which can be dangerous at 2,000 miles per hour ( roughly 33 miles per minute ). When Doc looked up, he was so startled that he almost panicked and immediately made an evasive maneuver to the right and down – one of many maneuvers made if an approaching missile is detected.

Doc said the object was so big that it totally blocked out the sun. His estimate was that it was 250-feet to 300-feet across, oval shaped and appeared bright blue-grey in color, however Doc wasn’t sure because it also had a shimmering halo of energy surrounding the vehicle.

About 3-minutes later, and some thousands of feet lower, the vehicle reappeared to the side of his left wing tip. He tried his UHF radio and all he could pick up was a deep electrical hum. He abandoned his attempts to use his radio, as his immediate survival was more important.

For the next 10-minutes, the large oval vehicle moved from his left wing tip to the rear of the SR-71 and then to his right wing tip. The movement from the left, to the rear, to the right wing tip took about 2-minutes, and then it reversed the movement. During the UFO’s last swing to the rear of his SR-71 his aircraft started buffeting wildly, which is terrifying at Mach 3, then it stopped after about 15-seconds, and then he never saw it again.

Doc said he heard a sound in his head, ‘like a swarm of bees in my brain,’ as he described it. When Doc returned from the mission he immediately went to his debriefing. The minute he mentioned the incident with the Unidentified Aerospace Vehicle ( UAV ) to his commander, he was pulled away from the debriefing and taken to his commander’s office. His commander, a colonel, filled out an incident report, in detail, and then told Doc not to mention the incident to anyone or he would be subject to severe and speedy penalty under military regulations.

Doc told me he knew of no single other SR-71 pilot or astronaut who hadn’t had a close encounter or a UFO sighting. Doc felt none would ever go on record with their experiences because of fear of retaliation from the U.S. Department of Defense ( DOD ) and loss of their retirement pay and benefits for breaking the U.S. oath of Secrets.

During the 9-years after this in-flight incident, Doc related that a few of his trusted friends related similar incidents, with the same type of vehicles, or glowing orbs of dense light, dancing around their SR-71.

Then Doc told me another story about his friend Dave, another SR-71 Blackbird pilot, who – while drunk on sake in Japan – told him in whispers that he hadn’t been a drinker until 6-months earlier, when he made a reconnaissance flight over the Eastern border of Russia and returned delirious and semi-conscious.

Dave’s SR-71 crew had to pull him out of the cockpit. The Flight Surgeon attributed his symptoms to loss of oxygen. Dave didn’t share his nightmares with the U.S. Air Force base physicians, for fear the Flight Surgeon would ground him and he would lose his flying status; however, under the influence of alcohol – in a quiet bar – with a trusted fellow SR-71 Blackbird pilot ( Doc ) friend, Dave opened up.

Dave tearfully related – in an emotional story – having frequent nightmares that something had gotten to him during his flight over Russia. For Dave, what made matters worse was he had absolutely no memory of the flight from the time he lifted off from Osan Air Base ( Korea ) until the day after he returned and found himself in the Naval Regional Hospital in Okinawa, Japan.

I managed to track down Dave, who lives in Southern California, who confirmed off-the-record the incident relayed by Doc to me was true. Dave said he was actually happy someone was writing about stories of contact and sightings by military pilots, and was sure he had some type of contact with a UFO.

One day, while still at Nellis Air Force Base test range S-4, we were informed there was an F-15 that crashed on the Nellis Range, where Area 51 is located. The F-15 crash happened in 1977. A Lieutenant Colonel and Doc Walters, the Hospital Commander, actually flew into the side of a mountain while doing a routine Functional Check Flight.

A sergeant who worked for me, and who had earlier been assigned to the Accident Investigation Team, recovered the F-15 Heads-Up Display film canister. He told me that a guy ( in a dark jump suit ) who was out of Washington D.C. personally took that film canister away from him – unusual, because everything else was picked up and logged first, then taken back to the assigned hangar for analysis. A ‘prototype video camera’ ( also on the aircraft ) and the flight data recorder were likewise handed over to the guy from Washington, D.C.

One night a couple of weeks after the crash, my U.S. National Security Agency ( NSA ) friend, Jerald ( Gerald ? Jerrold ? ) relayed to me – at the Silver Dollar Saloon ( Las Vegas, Nevada ) – that the aforementioned Lieutenant Colonel had radioed the Nellis Tower that he had an ‘extremely large thing’ hovering over his aircraft and was experiencing loss of flight systems. His communications went dead, and a few seconds later his aircraft exploded into the side of a mountain-top.

Jerald, who was the most ‘connected’ person I’ve ever known, told me the ‘video’ showed some type of oval vehicle – of tremendous size – so close to the F-15 that its camera appeared to have gone out of focus.

When Doc Walters and the Lieutenant Colonel ejected, the UFO was still above them; their bodies were torn to shreds. Officially, it was determined – as is always the case – that ‘pilot error’ caused the perfectly functional aircraft – in clear airspace with maximum visibility – to crash.

Nevada calls itself the silver state, the battle-born state, and the sagebrush state. A more appropriate motto would be, the conspiracy state.

Of the 111,000 square miles of land in Nevada, over 80% is controlled by the federal government – the highest percentage of any state in the U.S. If it were not for the gaming industry in Nevada, the federal government would be the largest employer in the State of Nevada, with 18,000 federal and military personnel and another 20,000 government contractors and suppliers.

The Nevada Test Site, Nellis Air Force Base and Range [ Nevada ], Fallon Naval Air Station [ Nevada ], the Tonopah Range, and the aerospace industry eat-up a lot of U.S. tax dollars.

The Nevada Test Site and the Nellis Range have myriad secrets yet to be revealed, including a super secret laboratory: the Defense Advanced Research Center ( DARC ).

DARC is located, inside the Nellis Range, 10 stories underground.

DARC was built in the 1980s with SDI ( Strategic Defense Initiative ) money.

DARC is next to a mountain – near Papoose Lake, south of Groom Lake, Nevada – where the TR-3Bs ( TIER III B ) are stored in an aircraft hangar built into a side of a mountain near DARC. The Nellis Range covers more than 3.5 million acres.

EG&G, a government contractor, provides classified research and development services for the military and government, and supplies technical and scientific support for nuclear testing and energy research and development programs.

EG&G provided large diameter drilling, mining and excavation for DARC underground and mountainside facilities.

EG&G built these DARC hidden bunkers, mountain hangars and vast underground facilities at Groom Lake, Papoose Lake, and Mercury for the government – facilities and observation posts well camouflaged inside the Nevada Test Site and the Nellis Range.

Starting in 1971 and continuing through 1975, a massive amount of excavation took place at the Groom Lake facility and Papoose Lake facility where most subsequent construction also took place underground.

In 1972, EG&G was granted an ‘indefinite contract’ called “Project Red-Light” to support the U.S. Department of Energy ( DOE ) and the military. This contract gave EG&G responsibility to assist in the recovery of nuclear materials in cases of mishaps and to provide aerial and ground security for highly classified government and military sites.

My sources say DOE and NSA are primarily responsible, to the MJ-12 committee, for reacting to UFO sightings and recovering UFO artifacts in cases of a crash.

So what’s going on more recently, you may ask? Let’s talk about the newest secrets and rumors:

  [ photo ( above ) AVRO VZ-9 AeroCar ( 1957 BELL Laboratory ( Canada ) contractor for CIA ) ]

The Hillary platform, the AVRO saucer and the NORTHROP sections – where aerospace vehicles with advanced technology were developed and tested – each emulated some characteristic of UFOs as described by the late Dr. Paul Hill, a NASA UFO investigator. Hill’s posthumously published book, Unconventional Flying Objects, discusses UFO technology extensively. If you have not read this illustrious tome, I suggest you do so.

Newly declassified documents show that AVRO [ Canada ] built and tested a number of saucers – at Area 51 in Nevada [ USA ] – contrary to DOD lies that the program was canceled because it failed to meet expectations. LOCKHEED Advanced Developmental Projects Division ( ADPD ) – also known as the “Skunk Works” – developed the A-12 for the U.S. Central Intelligence Agency ( CIA ), and a later version, the SR-71, for the USAF in the early 1960s.

30-years later, the SR-71 was still breaking world speed records. The sleek matte-black stiletto shaped spy plane SR-71 Blackbird – traveling 2,000 miles in 1-hour 4-minutes ( including multiple orbits, touch-down and take-off ) – broke the world air speed record from Los Angeles, California to Washington, D.C. on its retirement flight in 1990.

Area 51 ( the Groom Lake Air Base, Nevada facilities ) has a 6-mile long runway, the longest in the United States, where the U.S. Department of Defense ( DOD ) and CIA’s most exotic aerospace vehicles are tested and modified. Area 51 is a place about which curious outsiders circulate rumors of aliens and extra-terrestrial technology being utilized to accelerate the various programs at Area 51.

Why a 6-mile long runway? You need a runway this long if the minimum, or stall, speed of an aircraft is very high. Aircraft without conventional wings – wedge shaped lifting bodies, for example, or designs with 75 degree swept back wings – have very high stall speeds; they lift off very fast, and they land even faster.
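
[ NOTE: A back-of-the-envelope sketch – the touchdown speeds and braking deceleration below are purely hypothetical, chosen only to show the scaling – illustrates why stall speed drives runway length: for a constant deceleration, ground roll grows with the square of touchdown speed. ]

# Illustrative ground-roll estimate: distance ~ v^2 / ( 2 * a ) for constant deceleration.
# All numbers are hypothetical, for scaling only.
def ground_roll_ft(touchdown_mph: float, decel_g: float = 0.3) -> float:
    v = touchdown_mph * 0.44704           # mph -> m/s
    a = decel_g * 9.81                    # braking deceleration, m/s^2
    return (v * v) / (2 * a) / 0.3048     # metres -> feet

for speed in (140, 250):                  # conventional jet vs. a high-stall-speed shape
    print(f"{speed} mph touchdown -> ~{ground_roll_ft(speed):,.0f} ft of ground roll")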

My sources estimate that up to 35% of SDI ( Strategic Defense Initiative – “Star Wars” ) funding was siphoned-off to provide primary expenditures for the U.S. Air Force most secret ‘Black Program’, begun in 1982, the AURORA Program.

AURORA, the Program, ‘is’ the ‘code name’ of the ‘ongoing Program’ to have advanced aerospace vehicles built and tested.

AURORA, contrary to popular belief, is ‘not’ the name of an individual aircraft.

The AURORA Program is the namesake of the “aurora borealis” – excited gas in the upper atmosphere. As early as 1992, the Air Force had already made contingency plans to move some of its aircraft out of Groom Air Base ( Area 51 ); the public eye was on the base, and they ( government officials ) did ‘not’ like that one bit.

Everything needing a long runway, like the SR-75, was moved by early 1992 to other bases in Utah, Colorado, Alaska, Greenland, Korea, Diego Garcia, and remote islands in the Pacific.

Short take-off and landing vehicles ( STOL ), especially the bat wing TR-3A ( TIER III A ) Manta and TR-3B ( TIER III B ) Astra – the ‘Flying Triangle’ – were relocated to Papoose Lake, Nevada ( USA ) in the southern part of the Nellis Air Force Base ( AFB ) Test Range Site called Area S-4. Other than the SR-75 – still being dispersed to other locations – more research and development ( R&D ) and flight operational test and evaluation continues in Area 51, more-so now than ever before.

For the last few years, high-tech buffs have speculated that at least one ( 1 ) new and exotic aerospace vehicle existed: the SR-75, the first operational AURORA Program vehicle, which went operational in 1989 – 2-years ‘after’ flight testing and modifications were completed in 1987.

SR-75 is a hypersonic strategic reconnaissance ( SRV ) spy plane called the Penetrator, a mother ship, that I will explain shortly.

Hypersonic speeds start at approximately Mach 5.
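
[ NOTE: For reference – a rough conversion only, assuming a speed of sound of roughly 660 miles per hour, typical of the cold air at high altitude – the Mach figures quoted below translate to miles per hour about as follows: ]

# Approximate Mach-to-mph conversion at high altitude, where the speed of
# sound is roughly 660 mph ( it varies with air temperature ).
SPEED_OF_SOUND_MPH = 660.0

for mach in (3.3, 5, 7, 15):
    print(f"Mach {mach:>4} ~ {mach * SPEED_OF_SOUND_MPH:,.0f} mph")
# Mach 5 ~ 3,300 mph and Mach 7 ~ 4,600 mph, close to the figures quoted below.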

SR-75 replaced the SR-71 spy plane, retired in 1990 by the U.S. Air Force that said, “there is no replacement, all we really need is our spy satellites to do the job.” Hmm?

The DOD, upon analysis of Desert Storm, admitted satellites ( alone ) could not provide the necessary quick response real-time reconnaissance information needed by various military agencies. Yet they have repeatedly fought some U.S. Congressional efforts to bring back the SR-71. Why? The answer should be obvious.

SR-75, can position itself anywhere on the globe in less than 3-hours, and is equipped with multi-spectral sensors ( optical, radar, infrared, and laser ) collecting images, electronics intelligence ( ELINT ), signals intelligence ( SIGINT ), illuminating ( lighting-up via laser ) targets, and more.

SR-75, far exceeds the classified military speed and altitude records set by its predecessor, the SR-71 – whose flight records are still classified at Mach 3.3+ ( confidently add another 3 to 4 Mach ) with ceiling altitudes of ( at least ) 85,000 feet.

SR-75, attained altitudes of over 120,000 feet and speeds exceeding Mach-5 ( 5 times faster than the speed of sound ) – equating to more than 3,300 miles per hour.

SR-75, from take-off to landing, can make the round trip from central Nevada to Northeast Russia and back in under 3-hours.

SR-75, is 162-feet long and has a wing span of 98-feet.

SR-75, fuselage begins at 10-feet above ground.

SR-75, carries a flight crew of three ( 3 ), i.e. pilot, reconnaissance officer and launch control officer, the latter of whom doubles as the SR-75 Penetrator electronic warfare ( EW ) officer. Two ( 2 ) methane and LOX fueled high bypass turbo-ramjet ( combined cycle ) engines are housed in bays under each wing; the bays run 40-feet, terminating at the trailing edge of the wing.

SR-75 Penetrator Pulse Wave Detonation engines ( 2 ) push speeds above Mach 5 – now reported at Mach 7+, or 4,500-mph ( miles per hour ), with new engine modifications. Although this plane has been sighted on numerous occasions, picked up on military radar, and its pulse wave detonation contrail has been seen, the Air Force vehemently denies its existence.

SR-75 large Pulse Wave Detonation engine bay inlets ( located under each wing of the black mother ship – hang downward 7-feet under the wings ) – are 12-feet wide.

SR-71 Blackbird, SR-75 Penetrator ( black mothership ), and its daughter ship, the SR-74 ( Scramp ), were all built by LOCKHEED Advanced Development Company ( Skunk-Works ).

SR-74 Scramp, daughtership of the SR-75, is a ‘drone’ launched from ‘under’ the SR-75 black mothership fuselage.

SR-74 Scramp, after launch, utilizes a supersonic combustion ram-jet engine ( Scramjet ).

Jerald witnessed the flight of the big black Air Force SR-75 carrying the little unmanned SR-74 while inside Area 51 where it was initially positioned piggyback ( on its upper raised platform ) atop the SR-75 Penetrator.

[ photo ( above ) LOCKHEED YF-12-A / SR-71 Blackbird drone: D-21 ( drone ) ( click to enlarge ) ]

Remember, the SR-74 Scramp can ‘not’ launch itself from the ground.

SR-75 talk, was heard by me, as far back as the late 1970s when I was at Groom [ Groom Lake ], and I’ve 2 additional friends who saw the SR-75 at Groom Lake, Nevada.

SR-74 Scramp, from its SR-75 mother ship, can only launch at an altitude above 100,000-feet + from where it can then attain orbital altitudes over 800,000-feet ( 151-miles + ).

The U.S. Air Force SR-74 Scramp is for launching small highly classified ‘ferret satellites’ for the U.S. National Security Agency ( NSA ). SR-74 Scramp, can launch at least two ( 2 ) 1,000-pound satellites measuring 6-feet by 5-feet.

SR-74 Scramp is roughly the equivalent size and weight of a F-16 fighter.

SR-74, can easily attain speeds of Mach 15, or a little less than 10,000 miles per hour. NASA Space Shuttle is ‘technologically antique’ by comparison – taxpayer joke.

If you think these rumors are far-fetched, look at the YB-49 and XB-70 flown in 1948 and 1964 respectively, then look at the SR-75 that has been spotted numerous times.

Some say, “The government cannot keep a secret.” Anyone who thinks the government can’t is wrong.

There are new rumors that the U.S. placed two ( 2 ) new vehicles in permanent orbit. One of these, being the Space Orbital Nuclear – Service Intercept Vehicle ( SON-SIV ) code name LOCUST.

The SR-74 SCRAMP and the TR-3B [ TIER III B ] can deliver Spare Replacement Units ( SRU ), service fuels, fluids and chemicals to the SON-SIV.

SON-SIV robotic uses those deliverables to service, calibrate, repair and replace parts on the newer NSA, CIA and NRO ( National Reconnaissance Office ) satellites built to be maintained ‘in space’.

Finally, I’ve saved the best for last. The Operational model of the TR-3B [ TIER III B ].

The early information I gathered from interviewing my contacts and their closest friends who worked black programs resulted in the basic specifications of the TR-3B [ TIER III B ] Flying Triangle. I had this simple drawing by late 1990.

On the night of March 30, 1990, a Captain of the Belgian ( Belgique ) National Police decided to pursue incoming reports about a ‘triangular’ shaped UFO. Two ( 2 ) radar installations – one a NATO defense group and the other a Belgian ( Belgique ) civilian and military radar – verified the triangle UFO.

Excellent atmospheric conditions prevailed, and there was no possibility of false echoes due to temperature inversions. At 5 AM, 2 dispatched F-16 fighters spotted the Triangle on their radar screens and locked onto the target.

Six seconds later the object sped up from an initial velocity of 280 kilometers per hour to 1,800 kilometers per hour, at the same time descending from an altitude of 3,000 meters to 1,700 meters, then down to 200-meters, causing the F-16 radars to lose lock-on. This maneuver happened all in a matter of 1-second. The 40 G acceleration of the Triangle was some 32 gravitational forces higher than what a human pilot can stand.
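
[ NOTE: A quick check of those numbers, taking the reported figures at face value: a jump from 280 km/h to 1,800 km/h in about one second works out to roughly 43 g – in the same range as the 40 G figure quoted. A short Python sketch: ]

# Sanity check of the reported acceleration: delta-v over one second, expressed in g.
v_initial_kmh, v_final_kmh = 280.0, 1_800.0
delta_v_ms = (v_final_kmh - v_initial_kmh) * 1000.0 / 3600.0   # km/h -> m/s
duration_s = 1.0                                               # "a matter of 1-second"

acceleration_g = (delta_v_ms / duration_s) / 9.81
print(f"Average acceleration ~ {acceleration_g:.0f} g")        # ~43 g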

Contrary to normal aeronautical expectations, no sonic boom was heard. This phenomenal game of hide and seek was observed by twenty [ 20 ] Belgium National Policemen and hundreds of other witnesses who all saw the Triangular vehicle and the F-16 fighters. The chase was repeated twice more during the next hour.

The Belgians have made all information about this event public, unlike our U.S. government, which admits nothing and denies everything to do with UFOs – even when some of them are ours.

[ photo ( above ) another Triangle craft believed over Turkey enroute to Iraq ( click to enlarge ) ]

The original TR-3B photo was taken with a digital camera by a U.S. Air Force Special Operations sergeant aboard a special black operations C-130 cargo plane flying mission support for the TR-3B.

I’ve seen this picture personally and have interviewed several people who worked on the program. I’m sure of my facts and specifications.

You can see for yourselves, from the Belgian pictures – a computer generated software composite rendition resulting from the Europe sightings, plus my original schematic taken from interviews – that this ‘is an accurate rendition’ of the TR-3 [ TIER III ].

From the ‘original digital photograph’ of the TR-3B, a ‘computer graphic enhanced representation’ – made using 3D graphic rendered software – hangs on the wall inside a ‘black vault’ at the AURORA Program Office. I am not at liberty to divulge further details about the digital picture except to say a friend took a great career risk taking it and showing it to me.

We have used these highly accurate computer graphic pictures of the Prototype and Operational models of the TR-3B to get further verification of their accuracy. You will not get a clearer picture of what the Flying Triangles are until one lands amidst ‘public domain’ and is captured by CNN or other news media.

Jerald said he would never forget the sight of the alien looking TR-3B based at Papoose Lake, Nevada where the pitch black triangular shaped craft was rarely mentioned – and then only in hushed whispers – at the Groom Lake facility where he worked.

The TR3B [ TIER 3 B ] flew over the Groom Lake runway in complete silence, and magically stopped above Area S-4, where it hovered silently in the same position for about 10-minutes before gently settling vertically to the tarmac.

At times a corona of silver blue light glowed around the circumference of the massive TR-3B. The operational model of the TR3-B is 600-feet across.

TR3B ‘prototype’ ( 200-foot ) and TR3B ‘operational’ ( 600-foot ) version crafts are code named ASTRA.

TR3B tactical reconnaissance version first [ 1st ] ‘operational flight’ was in the early 1990s.

TR-3A Manta [ TIER III A Manta ] is a subsonic reconnaissance vehicle shaped like a bat wing and is in no way related to the TR-3B.

The nomenclature for the TR-3B is unconventional, named thusly to confuse those tracking black budget Programs and Projects; rumors are leaked to be purposefully confusing, as most are in the aerospace industry. One would think there ‘must be a relationship’ between the TR-3A and the TR-3B, although there is ‘no relationship’.

The TR3B triangular shaped nuclear powered aerospace platform was developed under the Top Secret AURORA Program with SDI ( Strategic Defense Initiative – Star Wars ) and black budget monies. At least three [ 3 ] TR3Bs, at over $1,000,000,000 ( $1 billion ) each, were flying by 1994.

AURORA is the most classified aerospace development program in existence – with its TR-3B being the most exotic vehicle created from the AURORA Program – funded and operationally tasked by the National Reconnaissance Office ( NRO ), NSA, and CIA.

TR-3B flying triangle is ‘not fiction’ and was built with technology available in the mid 1980s and uses more reversed alien technology than any vehicle ever before, but ‘not’ “every” UFO spotted is one of theirs.

The TR-3B vehicle outer coating is electro-chemically reactive; it can fluctuate under radar electrical frequency ( EF ) stimulation, causing reflective color spectrum changes and radar absorption.

The TR3B is the first [ 1st ] U.S. aerospace vehicle employing quasi-crystals within its polymer skin, used in conjunction with the TR-3B Electronic Counter-CounterMeasures ( ECCM ) that can make the vehicle look like a ‘small aircraft’ or ‘flying cylinder’ and can also fool radar receivers into falsely detecting either a ‘variety of aircraft’, ‘no aircraft’, or ‘several aircraft placed in various locations’.

Research on electro-chromatic polymer skins can be uncovered in the open literature, and it points to these stealth material properties.

A couple in Ohio spotted the Triangle in early 1995. The man first noticed an orange ball of light and then the triangle shape, with three [ 3 ] bright spots at each corner. As it moved slowly southward they were awestruck by its enormous size. “The damn thing is real,” he exclaimed to his wife, “It’s the flying triangle.” The man said it was the size of “two [ 2 ] football fields” – making it 200 yards, or 600-feet, across – the same as the TR3B operational version craft.

From the collection of pictures, analysis and further refinement we now have a better schematic layout of the Top Secret USAF Flying Triangle [ TR 3 B ] seen by thousands that the U.S.  Department of Defense ( DOD ) and U.S. government claims ‘does not exist’.

A circular, plasma filled accelerator ring called the Magnetic Field Disrupter, surrounds the rotatable crew compartment, far ahead of any imaginable technology.

Sandia & Livermore U.S. National Laboratories developed reverse engineered MFD technology

The government will go to any lengths to protect this technology. The plasma, mercury based, is pressurized at 250,000 atmospheres at a temperature of 150 degrees Kelvin, and accelerated to 50,000 rpm to create a super-conductive plasma with the resulting gravity disruption.

MFD generates a magnetic vortex field, which disrupts or neutralizes the effects of gravity on mass within proximity, by 89 percent. Do not misunderstand. This is ‘not’ anti-gravity. Anti-gravity provides a ‘repulsive force’ that can be ‘used for propulsion’.

The MFD creates a disruption – of the ‘Earth gravitational field’ – upon the mass within the circular accelerator.

The mass of the circular accelerator and all mass within the accelerator, such as the crew capsule, avionics, MFD systems, fuels, crew environmental systems, and the nuclear reactor, are reduced by 89%.

A side note to the Magnetic Field Disruptor development; one source that worked at GD Convair Division in the mid 1960s described a mercury based plasma that was cooled to super-conductive temperatures, rotated at 45,000-rpm and pressurized at thousands of atmospheres. This would be considered state-of-the-art technology even by today’s standards, some 30 years after he worked this project.

He related that the project achieved its objective. Instruments and test objects within the center of the accelerator showed a 50 percent loss of weight, attributed to a reduction in the gravitational field. He had worked on MFD as far back as 1965 and was told by a senior scientist that the research had been going on for a decade. See: Convair, notes from Gravitics research and Gravity RAND article.

The current MFD in the TR-3B causes the effect of making the vehicle extremely light, and able to outperform and outmaneuver any craft yet constructed, except of course, the UFOs we did not build.

The TR-3B is a high altitude, stealth, reconnaissance platform with an indefinite loiter time. Once you get it up there at speed, it doesn’t take much propulsion to maintain altitude.

At Groom Lake there have been whispered rumors of a new element that acts as a catalyst to the plasma.

Recently NASA and Russia have admitted breakthroughs in technology that would use a plasma shield for exotic aerospace vehicles. If you know anything about the history of classified black programs, you know that by the time NASA starts researching something, it’s either proven or old technology. They are the poor stepchildren when it comes to research and development technology and funding.

With the vehicle mass reduced by 89%, the craft can travel at Mach 9, vertically or horizontally. My sources say the performance is limited only by the stresses that the human pilots can endure – which is a lot, really, considering that along with the 89% reduction in mass, the G forces are also reduced by 89%.

The crew of the TR-3B should be able to comfortably take up to 40 Gs – the same flight characteristics described in the Belgian sightings and many other sightings. Reduced by 89%, the occupants would feel about 4.2 Gs.
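
[ NOTE: Taken at face value, the arithmetic of that claim is straightforward – if mass, and with it inertial loading, is cut by 89 percent, only 11 percent of any maneuver’s G-load would be felt. A short Python sketch of the claim as stated: ]

# Felt G-load under the claimed 89% mass / inertia reduction ( claim taken at face value ).
REDUCTION = 0.89

def felt_g(maneuver_g: float) -> float:
    return maneuver_g * (1.0 - REDUCTION)

print(f"40 G maneuver -> ~{felt_g(40):.1f} G felt")   # ~4.4 G, in line with the ~4.2 G figure above
print(f"9 G fighter turn -> ~{felt_g(9):.1f} G felt") # ~1.0 G felt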

The TR-3Bs propulsion is provided by 3 multimode thrusters mounted at each bottom corner of the triangular platform. The TR-3 is a sub-Mach 9 vehicle until it reaches altitudes above 120,000 feet – then who knows how fast it can go.

The three [ 3 ] multimode rocket engines mounted under each corner of the craft use hydrogen or methane and oxygen as a propellant. In a liquid oxygen / hydrogen rocket system, 85% of the propellant mass is oxygen.
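
[ NOTE: That 85 percent figure is consistent with ordinary rocket arithmetic. At a typical liquid oxygen / liquid hydrogen mixture ratio of roughly 6 parts oxygen to 1 part hydrogen by mass – the 6:1 ratio is a common engine value, assumed here only for illustration – oxygen makes up about 86 percent of the propellant load: ]

# Oxidizer mass fraction for a LOX / LH2 engine at an assumed 6:1 mixture ratio ( O/F by mass ).
mixture_ratio = 6.0   # kg of oxygen per kg of hydrogen ( typical value, assumed )
oxygen_fraction = mixture_ratio / (mixture_ratio + 1.0)
print(f"Oxygen is ~{oxygen_fraction:.0%} of total propellant mass")   # ~86%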

The nuclear thermal rocket engine uses a hydrogen propellant, augmented with oxygen for additional thrust.

The reactor heats the liquid hydrogen and injects liquid oxygen in the supersonic nozzle, so that the hydrogen burns concurrently in the liquid oxygen afterburner.

The multimode propulsion system can operate in the atmosphere, with thrust provided by the nuclear reactor; in the upper atmosphere, with hydrogen propulsion; and in orbit, with combined hydrogen / oxygen propulsion.

What you have to remember is that the 3 multi-mode rocket engines only have to propel 11% of the mass of the Top Secret TR-3B. Rockwell reportedly builds the engines.
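
[ NOTE: To put that 11 percent figure in perspective – the vehicle mass below is purely hypothetical, chosen only for illustration, and is not a figure from the text – the thrust needed just to hover scales directly with effective weight: ]

# Hover-thrust comparison at full mass versus the claimed 11% effective mass.
# The takeoff mass is a hypothetical placeholder, not a figure from the text.
g = 9.81
hypothetical_mass_kg = 100_000.0          # assumed vehicle mass, for illustration only

full_weight_kn = hypothetical_mass_kg * g / 1000.0
effective_weight_kn = 0.11 * full_weight_kn
print(f"Hover thrust at full mass:      ~{full_weight_kn:,.0f} kN")
print(f"Hover thrust at 11% effective:  ~{effective_weight_kn:,.0f} kN")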

From the evolution of exotic materials, advanced avionics, and newer propulsion engines the stealth aircraft were born. Leaps in technology have been obtained with reverse engineering of Alien Artifacts as described in the newly released MJ-12 Revised Charter, signed during the Reagan [ Ronald Reagan ] Administration.

According to Jerald’s account, the technology developed at Papoose far exceeded any known within the world scientific community. Jerald was in his late 50s when I first met him in LV. He had actually spoken to scientists who analyzed the Roswell vehicle and technology–technology that we can assuredly assume was developed from reverse engineering of recovered alien artifacts. The control of all Alien Artifacts, i.e. the research, the reverse engineering, and analysis of the Extraterrestrial Biological Entities (aka) EBE, were transferred to the super-secret laboratory, called the Defense Advanced Research Center ( DARC ), in Area S-4.

Many sightings of triangular UFOs are not alien vehicles but the top secret TR-3B. The NSA, NRO, CIA, and USAF have been playing a shell game with aircraft nomenclature.

Creating the TR-3, modified to the TR-3A, the TR-3B, and the Tier 2, 3, and 4, with suffixes like Plus or Minus added on to confuse even further the fact that each of these designators is a different [ Unmanned Aerial vehicle ( UAV ) ] aircraft, and not the same aerospace vehicle. [ See, e.g. “ DARPA Photos ” on this website for more on UAV and UCAV with Tier designations provided. ]

A TR-3B is as different from a TR-3A as a banana is from a grape. Some of these vehicles are manned and others are unmanned.

Before Jerald died, we had a long conversation. He was sure he had documentation that would prove the existence of the MJ-12 committee and our using crashed alien vehicles to reverse engineer their technology. I told him that I did not want any classified documents in my possession. I never found out what happened to them.

I also believe the recently deceased Colonel Corso, who discloses the government’s involvement with alien technology, was an honest and honorable man. I believe he was on the inside of administering alien artifact protocol for the Army, and he might have embellished the depth of his involvement.

I don’t have time to go into the two [ 2 ] unique MJ-12 documents that I acquired.

The characters in [ my book ] Alien Rapture are fictional, but the facts of the covert government agenda to suppress alien artifacts, reverse engineering of alien [ ExtraTerrestrial ] technology, and the details of Black Programs are absolutely true.

Part of our agreement was that every one of my close five friends would get a chance to look at the manuscript before I sent it to any Literary Agents.

Dale discussed the Alien Rapture manuscript with his father, who worked high up in the NSA for over 20-years. His father asked him how much he trusted me, and Dale told him ‘completely’.

Dale’s father provided him with two MJ-12 documents, and told him to retype them, and send them to me with the understanding that I would not ever reveal the source of them.

The documents that Dale retyped had most the names and dates blacked out. This is how I received these documents, and I was so naive that I didn’t even know what the histories of the MJ-12 documents were.

From as far back as the Viet Nam conflict [ war ], I knew Dale and that he was as close to his father as a son can get. I do not feel that his father used him for distributing disinformation. Whether his father was duped, I have no idea. From my personal opinion, I believe the MJ-12 committee was real, and still exists in some form.

The TR-3B’s anti-gravity physics is explained, insofar as the theory of general relativity can be considered as an explanation for anti-gravity flight anomalies.

Edgar Fouché describes (above) the TR-3B’s propulsion system as follows: “A circular, plasma filled accelerator ring called the Magnetic Field Disrupter, surrounds the rotatable crew compartment and is far ahead of any imaginable technology…

The plasma, mercury based, is pressurized at 250,000 atmospheres at a temperature of 150 degrees Kelvin, and accelerated to 50,000 rpm to create a super-conductive plasma with the resulting gravity disruption.

The MFD generates a magnetic vortex field, which disrupts or neutralizes the effects of gravity on mass within proximity, by 89%…

The current MFD in the TR-3B causes the effect of making the vehicle extremely light, and able to outperform and outmaneuver any craft yet … My sources say the performance is limited only by the stresses that the human pilots can endure. Which is a lot, really, considering along with the 89% reduction in mass, the G forces are also reduced by 89%.

The crew of the TR-3B should be able to comfortably take up to 40 Gs… Reduced by 89%, the occupants would feel about 4.2 Gs.

The TR-3B’s propulsion is provided by 3 multimode thrusters mounted at each bottom corner of the triangular platform. The TR-3 is claimed by some to be anything from a sub-Mach-9 vehicle up to a Mach 25+ lenticular-shaped vehicle once it reaches altitudes above 120,000 feet. No one really knows how fast it can go.

Many have been skeptical of Mr. Fouché’s claims; however, in an interesting scientific article another claim is made: that the charged particles of plasma don’t just spin uniformly around in a ring, but tend to take up a synchronized, tightly pitched, helical or screw-thread motion as they move around the ring. This can be understood in a general way, where the charged particles moving around the ring act as a current that in turn sets up a magnetic field around the ring.

It is a well-known fact that electrons ( or ions ) tend to move in a helical fashion around magnetic field lines. Although it is a highly complex interaction, it only requires a small leap of faith to believe that the end result of these interactions between the moving charged particles ( current ) and associated magnetic fields results in the helical motion described above. In other words, the charged particles end up moving in very much the same pattern as the current on a wire tightly wound around a toroidal core.

In an article entitled, “Guidelines to Antigravity” by Dr. Robert Forward, written in 1962 ( available at: http://www.whidbey.com/forward/pdf/tp007.pdf ) Dr. Forward describes several little known aspects – of Einstein’s general relativity theory – indicating how moving matter can create unusual gravitational effects. Figure 5, indicates how the moving matter pattern describes what’s necessary to generate a gravitational dipole which was exactly the same as the plasma ring pattern described in the physics article discussed above.

If Fouché’s description is even close to correct, then the TR-3B utilizes this little-known loophole in the General Relativity Theory to create its antigravity effects. Even though the TR-3B can only supposedly cancel 89% of gravity ( and inertia ) today, there is no reason why the technology can’t be improved to exceed 100% and achieve true antigravity capability.

In theory, this same moving matter pattern could be mechanically reproduced by mounting a bunch of small gyroscopes – all around the larger ring – with their axis on the larger ring, and then spinning both the gyroscopes and the ring at high speeds, however as Dr. Forward points out any such mechanical system would probably fly apart before any significant anti-gravity effects could be generated. However, as Dr. Forward states, “By using electromagnetic forces to contain rotating systems, it would be possible for the masses to reach relativistic velocities; thus a comparatively small amount of matter, if dense enough and moving fast enough, could produce usable gravitational effects.”

The requirement for a dense material moving at relativistic speeds would explain the use of Mercury plasma (heavy ions). If the plasma really spins at 50,000 RPM and the mercury ions are also moving in a tight pitched spiral, then the individual ions would be moving probably hundreds, perhaps thousands of times faster than the bulk plasma spin, in order to execute their “screw thread” motions. It is quite conceivable that the ions could be accelerated to relativistic speeds in this manner. I am guessing that you would probably want to strip the free electrons from the plasma, making a positively charged plasma, since the free electrons would tend to counter rotate and reduce the efficiency of the antigravity device.
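
[ NOTE: A rough number helps here. The ring radius below is a guess based on the quoted craft sizes ( roughly half the 600-foot span ); everything else follows from the 50,000 rpm figure. The bulk rim speed of such a ring would still be a tiny fraction of light speed, which is why the argument above leans on the much faster helical motion of the individual ions: ]

# Bulk rim speed of a plasma ring spinning at 50,000 rpm, for an assumed ring radius.
import math

C = 299_792_458.0      # speed of light, m/s
rpm = 50_000.0
radius_m = 90.0        # assumed ring radius ( order-of-magnitude guess, ~300 feet )

rim_speed = 2.0 * math.pi * radius_m * rpm / 60.0
print(f"Rim speed ~ {rim_speed:,.0f} m/s ( {rim_speed / C:.2e} of c )")
# ~470,000 m/s, i.e. about 0.16% of light speed, far from relativistic for the bulk spin alone.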

One of Einstein’s postulates of the Theory of General Relativity says that gravitational mass and inertial mass are equivalent. This is consistent with Mr. Fouche’s claim that inertial mass within the plasma ring is also reduced by 89%. This would also explain ‘why the vehicle is triangular’ shaped. Since it still requires conventional thrusters for propulsion, the thrusters would need to be located outside of the “mass reduction zone” or else the mass of the thruster’s reaction material would also be reduced, making them terribly inefficient. Since it requires a minimum of three [ 3 ] legs to have a stable stool, it follows that they would need a minimum of three [ 3 ] thrusters to have a stable aerospace platform. Three [ 3 ] thrusters – located outside of the plasma ring – plus appropriate structural support would naturally lead to a triangular shape for the vehicle.

Some remain skeptical of the claimed size for the TR-3B of approximately 500 to 600-feet across. Why would anyone build a tactical reconnaissance vehicle almost two ( 2 ) football fields long? However, the answer to this may also be found in Dr. Forward’s paper. As Dr. Forward puts it, “…even the most optimistic calculations indicate that very large devices will be required to create usable gravitational forces. Antigravity…like all modern sciences will require special projects involving large sums of money, men and energy.” Dr. Forward has also written a number of other articles, at: http://www.whidbey.com/forward/TechPubs.html

The TR3-B was spotted by U.S. military personnel as the “initial penetration bomb delivery vehicle” prior to the follow-up work done by the F-117 stealth fighters and B-2B bomber mop-up crews – not the aircraft Americans were told by the media were the initial strike penetration vehicles used during the Persian Gulf War air strikes inside Iraq airspace.

Recent sightings in December of 2000 has the TR3-B flying south from Utah across the mountain regions near a National Monument valley of Taos, New Mexico. For more information and images, See, e.g. ” UFO & ETT Pics ” on this X-CIA Files website.

Aerial vehicles, such as this ( below ), appears to have been around for decades

[ photo ( above ) VRIL VII Manned Combat Aerial Vehicle ( MCAV ) Circa: 1941 ( East Germany ) ]

November 9, 2001

The Old Technology

Reverse-engineered information production – although rarely used, due to the time it consumes – may be utilized to uncover data previously gathered but somehow overlooked; it is a methodology rarely used by even the most sophisticated intelligence gathering agencies in the World.

The concept is not ‘new’ but is the most feared form of information assimilation that tears down the walls of classification and censorship as to how things are or were previously perceived as fundamental beliefs. Little known and forgotten about practices and methodologies can easily circumvent new technologies by using the “Old Technology”.

An example of “Old Technology” was demonstrated in early-1970s era Soviet MIG defense aircraft using avionics that the West had last seen in 1950s-era television ( TV ) sets, i.e. electron vacuum tubes. During the 1970s, the West – sitting smug with transistorized miniature circuits in its fighter aircraft avionics – laughed at the Soviet use of electron vacuum tube technology in these 1970s era advanced fighter jet aircraft as utterly outdated technology.

Years later, however, the West discovered that Soviet defense aircraft avionics equipped with electron vacuum tube technology withstood the EMP ( electro-magnetic pulse ) radiation – produced during above-ground nuclear detonations – that disrupts transistor micro-circuit technology. EMP wreaked havoc on Western ‘new technology’ electronics at great distances ( sometimes hundreds of miles ) away from a ground zero nuclear blast, or from airburst electronic bombs, or e-bombs.

Old “TV tube” technology was actually “higher technology” than what the West used for military defense fighter aircraft during the 1970s. The West was forced to develop other technology strategies, one of which was actually based on an even older technology, the Faraday Cage, a defense used against e-bombs.
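
[ NOTE: For what it is worth, the physics behind a Faraday cage can be sketched with the standard electromagnetic skin-depth formula – textbook material, not anything specific to the programs described here. High-frequency fields penetrate only a very thin layer of a good conductor, so a conductive enclosure sharply attenuates an EMP transient: ]

# Electromagnetic skin depth in a conductor: delta = sqrt( 2 * rho / ( omega * mu ) ).
# Standard textbook formula; copper properties used purely for illustration.
import math

MU_0 = 4.0e-7 * math.pi    # permeability of free space, H/m
RHO_COPPER = 1.68e-8       # resistivity of copper, ohm-metres

def skin_depth_m(frequency_hz: float, rho: float = RHO_COPPER, mu_r: float = 1.0) -> float:
    omega = 2.0 * math.pi * frequency_hz
    return math.sqrt(2.0 * rho / (omega * MU_0 * mu_r))

for f in (1e4, 1e6, 1e8):  # 10 kHz, 1 MHz, 100 MHz
    print(f"{f:>11,.0f} Hz -> skin depth ~ {skin_depth_m(f) * 1e6:,.1f} micrometres")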

That hurdle and yet another were to be eventually conquered, even better once again, with an ‘old technology school-of-thought’. The West found that the age-old adage of ‘fighting fire with fire’ would apply to protecting communication and electronic devices by, simply bombarding its “pre-production material’s structures” with gamma particle radiation. Thus the term, “rad-hard” or, “radiation hardened”, was coined.

In order to prevent EMP telecommunication disruption, fiber-optic communication cables using light-waves to transmit communications were bombarded ( before installation ) with ‘gamma radiation particles’, providing a gamma-to-gamma resistance, or vaccination, against EMP telecommunication disruption – a “rad-hard technique” incorporated for years within U.S. defense Command, Control, Communication, and Computer Intelligence ( C4I ).

Old Technology – i.e. electron vacuum tubes, solid-state, analog, and arithmetic calculation – once thought to be obsolete, is still considered sensitive and continues to be classified by the U.S. government. On the other hand, Extra-Terrestrial Technology ( ETT ) has brought unusual products into the consumer marketplace, which many take for granted: “Smart Structures” such as “Hybrid Composites”, and “Smart Materials” such as “Shape Memory Alloys”, the latter of which is now in the public domain and found in new-technology ‘eyewear frames’ that will reshape back to their original form – after being bent out of shape – when water is poured over the frames.

Some believe that “cellphone technology” is a form of ETT, nevertheless more ETT products are coming our way, however what is “not” coming our way is the background information we should be focusing on, and for this reason, the following information is revealed.

Research and development on ETT materials was decompartmentalized into un-recognizable facilities after Area 51 began receiving so much publicity. Highly classified material and research projects began being conducted off government installations in universities and private firm research facilities around the world.

Initially, material pieces and sections of covertly held extra-terrestrial spacecraft began undergoing research studies at the Wright-Patterson Air Force Base General Electric Research and Development Division in Ohio, the Nellis Air Force Base test range sites at S-4 and Area 51 near Groom Lake in Nevada, and on a U.S. Army base in Dugway, Utah.

The U.S. Central Intelligence Agency (C.I.A.), U.S. National Security Agency (N.S.A.), and U.S. National Reconnaissance Office (N.R.O.) did not feel they had enough real estate at Nellis Air Force Base (A.F.B.) to sufficiently “test” alien technology or ETT spacecraft insofar as their reverse-engineering programs went. So they went about an ingenious way of slowly but surely expanding their real-estate coverage area, which served to keep sensitive information even further hidden from the prying eyes of the outside world.

Additional funds were also used to bunkerize ETT-developed U.S. defense air, sea, and even the new land-tunneling vehicular programs; most of the extremely sensitive and larger programs went literally underground, out of reach of the ever-watchful eyes not only of our own uncontrollable and subverted satellites but of friendly foreign and enemy-operated ones as well. With everything in place, the government could move forward.

– – – –

Area 51 Lawsuits – 1995 Through 2000

Lack of oversight creates opportunities for violations of environmental law to go undetected and unpunished. Some have charged that the Department of Defense, as recently as 1993, used secrecy as a cover for violations of environmental law. Recent lawsuits against the Department of Defense and the Environmental Protection Agency (EPA) allege that:

(1) Illegal open-air burning of toxic wastes took place at a secret Air Force facility near Groom Lake, Nevada; and,

(2) EPA has not exercised its required environmental oversight responsibilities for this facility.

Responding to the second (2nd) of these lawsuits, the EPA reported that in early 1995 it had seven (7) regulators on staff with Special Access clearance [access to “Black Programs”] who inspected the Groom Lake facility regarding “unknown health dangers” suffered by U.S. government contractors who worked at the U.S. Nellis Air Force Base (AFB) experimental testing range near Groom Lake, Nevada known as “Area 51”.

The U.S. government would not release information to the victims or their professional advisors (medical and legal) as to the exact nature of “what” those workers had been exposed to, and consequently lawsuits were filed against the U.S. government.

What the public wasn’t aware of was what occurred ‘before’ a U.S. Presidential directive in 2000.

What Happened In 1995:

In a federal 9th Circuit Court of Appeals case, these same injured government workers from Area 51 were battling for the release of U.S. government information. In one such instance, the U.S. Air Force (Nellis AFB, Nevada) asked the federal court to treat one (1) of its “manuals”, an “unclassified manual” in its entirety, as “classified”. [It is suspected that because the particular “unclassified” manual was simply found to be at Area 51, it automatically became “classified” by reason of its location. See, e.g., the Declaration of Sheila E. Widnall (below).] The 9th Circuit Court of Appeals ruled in favor of the government, prohibiting the government-worker Plaintiffs from pursuing their case further through any other court.

Interesting information on this was revealed by an official of the United States Air Force during the federal hearings. (See immediately below.)

UNCLASSIFIED DECLARATION AND CLAIM OF MILITARY AND STATE SECRETS PRIVILEGE OF:

SHEILA E. WIDNALL, SECRETARY OF THE AIR FORCE

I, SHEILA E. WIDNALL, HEREBY DECLARE THE FOLLOWING TO BE TRUE AND CORRECT:

1. Official Duties: I am the Secretary of the United States Air Force and the head of the Department of the Air Force. In that capacity, I exercise the statutory functions specified in section 8013 of Title 10, U.S. Code. I am responsible for the formulation of Air Force policies and programs that are fully consistent with the national security directives of the President and the Secretary of Defense, including those that protect national security information relating to the defense and foreign relations of the United States. As the Secretary of the Air Force, I exercise authority over the operating location near Groom Lake, Nevada, and the information associated with that operating location. As the head of an agency with control over the information associated with the operating location near Groom Lake, I am the proper person to assert the military and state secrets privilege with regard to that information. Under Executive Order 12356, I exercise original TOP SECRET classification authority, which permits me to determine the proper classification of national security information on behalf of the United States. Executive Order No. 12356, Sec. 1.2, 47 Fed. Reg. 20,105 (1982), reprinted in 50 U.S. Code Section 401 (1991); Presidential Order of May 7, 1982, Officials Designated to Classify National Security Information, 50 U.S. Code Section 401 (1991).

2. Purpose: This Declaration is made for the purpose of advising the court of the national security interests in and the security classification of information that may be relevant to the above captioned lawsuits. The statements made herein are based on (a) my personal consideration of the matter, (b) my personal knowledge; and (c) my evaluation of information made available to me in my official capacity. I have concluded that release of certain information relevant to these lawsuits would necessitate disclosure of properly classified information about the Air Force operating location near Groom Lake, Nevada. I am satisfied that the information described in the classified Declaration is properly classified. I have further determined that the information described in the classified Declaration, if released to the public, could reasonably be expected to cause exceptionally grave damage to the national security. It is not possible to discuss publicly the majority of information at issue without risking the very harm to the national security that protection of the information is intended to prevent.

3. Security Classification: Under Information Security Oversight Office guidance, “[C]ertain information that would otherwise be unclassified may require classification when combined or associated with other unclassified information.” (32 CFR 2001.3(a)) Protection through classification is required if the combination of unclassified items of information provides an added factor that warrants protection of the information taken as a whole. This theory of classification is commonly known as the mosaic or compilation theory. The mosaic theory of classification applies to some of the information associated with the operating location near Groom Lake. Although the operating location near Groom Lake has no official name, it is sometimes referred to by the name or names of programs that have been conducted there. The names of some programs are classified; all program names are classified when they are associated with the specific location or with other classified programs. Consequently, the release of any such names would disclose classified information.

4. National Security Information: As the head of the agency responsible for information regarding the operating location near Groom Lake, I have determined that information that concerns this operating location and that falls into any of the following categories, is validly classified:

a. Program(s) name(s);
b. Mission(s);
c. Capabilities;
d. Military plans, weapons, or operations;
e. Intelligence sources and methods;
f. Scientific or technological matters;
g. Certain physical characteristics;
h. Budget, finance, and contracting relationships;
i. Personnel matters; and,
j. Security sensitive environmental data.

The following are examples of why certain environmental data is sensitive to the national security. Collection of information regarding the air, water, and soil is a classic foreign intelligence practice, because analysis of these samples can result in the identification of military operations and capabilities. The presence of certain chemicals or chemical compounds, either alone or in conjunction with other chemicals and compounds, can reveal military operational capabilities or the nature and scope of classified operations. Similarly, the absence of certain chemicals or chemical compounds can be used to rule out operations and capabilities. Revealing the composition of the chemical waste stream provides the same kind of exploitable information as does publishing a list of the chemicals used and consumed. Analysis of waste material can provide critical information on the makeup as well as the vulnerabilities of the material analyzed. Disclosure of such information increases the risk to the lives of United States personnel and decreases the probability of successful mission accomplishment.

5. Role of State and Federal-Environmental Agencies: Since 1990, appropriately cleared representatives of Nevada’s Department of Conservation and Natural Resources have been authorized access to the operating location near Groom Lake. The state representative’s role is and has been to monitor and enforce compliance with environmental laws and regulations and to advise on remedial efforts, if required. Appropriately cleared officers of the U.S. Environmental Protection Agency were recently granted access to the operating location near Groom Lake for inspection and enforcement of environmental laws. Federal inspectors from the Environmental Protection Agency commenced an inspection pursuant to the Solid Waste Disposal Act, commonly referred to as a “RCRA inspection,” at the operating location near Groom Lake, Nevada on December 6, 1994.

[EDITOR’S NOTE: Groom Lake: On May 19, 1995, the Director of the FFEO and the Deputy Assistant Secretary of the U.S. Air Force signed a memorandum of agreement ensuring that EPA has continued access to the operating location near Groom Lake for administering environmental laws. Moreover, due to national security concerns, the Air Force agreed to provide reasonable logistical assistance to EPA. Finally, EPA agreed that any classified information obtained by EPA would be treated in accordance with applicable laws and executive orders regarding classified materials.]

The Air Force has taken these steps to ensure full compliance with all applicable environmental laws. At the same time that the operating location near Groom Lake is being inspected for environmental compliance, it is essential to the national security that steps also be taken to prevent the disclosure of classified information.

6. Invoking Military and State Secrets Privilege: It is my judgment, after personal consideration of the matter, that the national security information described in this Declaration and in the classified Declaration, concerning activities at the U.S. Air Force operating location near Groom Lake, Nevada, constitutes military and state secrets. As a result, disclosure of this information in documentary or testimonial evidence must be barred in the interests of national security of the United States. Pursuant to the authority vested in me as Secretary of the Air Force, I hereby invoke a formal claim of military and state secrets privilege with respect to the disclosure of the national security information listed in paragraph four of this Declaration and more fully discussed in the classified Declaration, whether through documentary or testimonial evidence.

7. Environmental Compliance: Although I have found it necessary to invoke the military and state secrets privilege, I believe it important to comment on the Air Force’s commitment to full compliance with the environmental laws of the United States. Our goal is to be the best possible environmental steward of the lands comprising the Nellis Range. To meet that goal we are cooperating and will continue to cooperate with both federal and state environmental agencies.

8. Under penalty of perjury, and pursuant to section 1746 of Title 28, U.S. Code, I certify and declare that the foregoing statements are true and correct.

Executed this 21st day of February 1995 at Arlington, Virginia.

Sheila E. Widnall Secretary of the Air Force

What Happened In 2000:

The outcome, five (5) years later, was that before President Clinton left office he signed a document sealing the lid on the secret once and for all, which also sealed the fate of the U.S. government workers. Interests of “national security” were cited.

Below is an exact copy of the document signed by the U.S. President.

THE WHITE HOUSE Office of the Press Secretary

For Immediate Release February 1, 2000

TO THE CONGRESS OF THE UNITED STATES:

Consistent with section 6001(a) of the Resource Conservation and Recovery Act (RCRA) (the “Act”), as amended, 42 U.S.C. 6961(a), notification is hereby given that on September 20, 1999, I issued Presidential Determination 99-37 (copy enclosed) and thereby exercised the authority to grant certain exemptions under section 6001(a) of the Act.

Presidential Determination 99-37 exempted the United States Air Force’s operating location near Groom Lake, Nevada from any Federal, State, interstate, or local hazardous or solid waste laws that might require the disclosure of classified information concerning that operating location to unauthorized persons.

Information concerning activities at the operating location near Groom Lake has been properly determined to be classified, and its disclosure would be harmful to national security. Continued protection of this information is, therefore, in the paramount interest of the United States.

The determination was not intended to imply that in the absence of a Presidential exemption, RCRA or any other provision of law permits or requires the disclosure of classified information to unauthorized persons. The determination also was not intended to limit the applicability or enforcement of any requirement of law applicable to the Air Force’s operating location near Groom Lake except those provisions, if any, that would require the disclosure of classified information.

WILLIAM J. CLINTON

THE WHITE HOUSE,

January 31, 2000

– –

Submitted for review and commentary by,

Concept Activity Research Vault ( CARV ), Host
E-MAIL: ConceptActivityResearchVault@Gmail.Com
WWW: http://ConceptActivityResearchVault.WordPress.Com