AntiMatter Technology Problems

by Paul Collin, host of Concept Activity Research Vault ( CARV ), on May 15, 2011 ( Originally Published: May 10, 2011 )

CALIFORNIA, Los Angeles – May 15, 2011 – The global scientific community is eyeing suspiciously the Conseil Européen pour la Recherche Nucléaire ( CERN ) [ also known as the European Organization for Nuclear Research ], an 'experimental projects' organization dating to 1952, whose Large Hadron Collider ( LHC ) – a huge high-energy particle collider roughly 17 miles ( 27 kilometers ) in circumference – is conducting some extremely serious experiments involving what scientists and physicists call "CP-violation," which deals with creating a variety of new subatomic particles believed never to have existed anywhere on Earth.

There is quite a bit of controversy concerning something called a "strangelet" ( strangelets ) and other particle creations within the CERN experiment, which – because of conjectures in scientific theories – some professionals fear could create a new subatomic particle that may upset the balance of Earth as we know it. What is even more frightening is that if something goes out of control, it may take anywhere between 1-year and 5-years before anyone notices that a chain reaction has already been created – one some have already identified as a 'micro-blackhole' that could theoretically begin consuming Earth from within its own magnetic iron core. Sounding like 'science fiction', CERN experiments are apparently 'definitely not' something to be taken lightly.

This serious and highly controversial subject amongst scientists and physicists around the world is touched on in this report, along with other related information, including video clips ( below ) for better understanding of the many aspects of public knowledge not being addressed by mainstream news broadcasts.

CERN went even further, though, by expanding its deep underground experiments to conduct related experiments in outer space with what it calls the Alpha Magnetic Spectrometer ( AMS / AMS-02 ), now scheduled for launch aboard the U.S. Space Shuttle Endeavour STS-134 mission set for May 16, 2011. The AMS-02 is to be delivered to the International Space Station ( ISS ), where it will continue CERN-designated experiments.

Interestingly, during July 2010 the Alpha Magnetic Spectrometer ( AMS / AMS-02 ) was 'not' launched as the video clip ( above ) depicted. The Alpha Magnetic Spectrometer ( AMS / AMS-02 ), often equated to the Hubble space telescope, actually holds far more technological advancements from CERN and is solely designed to focus on the subatomic particles surrounding antimatter issues.

U.S. Space Shuttle Endeavour mission STS-134 was scheduled to launch on April 14, 2011, but was delayed until the end of April 2011, and then delayed yet again until May 16, 2011. Why so many delays and reschedulings?

Earth anti-matter issues are rarely addressed with the public by the mainstream news media. However, in light of the recent NASA public warning that it is expecting a 'significant' solar flare to erupt and come bound for Earth – something "we all need to be concerned about" – the Alpha Magnetic Spectrometer ( AMS ), having just recently been placed onboard the U.S. Space Shuttle Endeavour mission scheduled to deliver the AMS to the International Space Station ( ISS ), is something the public really needs to take a closer look at.

AMS-02 onboard the ISS

[ PHOTO ( above ): Alpha Magnetic Spectrometer ( AMS / AMS-02 ) in U.S. Space Shuttle Endeavour cargo bay April 2011 ( click to enlarge ) ]

– –

Source: Nature.Com

AntiUniverse Here We Come, by Eugenie Samuel Reich

May 4, 2011

A controversial cosmic ray detector destined for the International Space Station will soon get to prove its worth.

The next space-shuttle launch will inaugurate a quest for a realm of the Universe that few believe exists.

Nothing in the laws of physics rules out the possibility that vast regions of the cosmos consist mainly of anti-matter, with anti-galaxies, anti-stars, even anti-planets populated with anti-life.

"If there's matter, there must be anti-matter. The question is, where's the Universe made of antimatter?" says Professor Samuel C.C. Ting, a Nobel Prize-winning physicist at the Massachusetts Institute of Technology ( MIT ) in Cambridge, Massachusetts. But most physicists reason that if such antimatter regions existed, we would have seen the light emitted when the particles annihilated each other along the boundaries between the antimatter and matter realms. No wonder Professor Ting's brainchild – a $2,000,000,000 ( two billion dollar ) space mission sold partly on the promise of looking for particles emanating from anti-galaxies – is fraught with controversy.

Professor Ting's project, however, has other, more mainstream scientific goals, so most critics held their tongues last week as the U.S. Space Shuttle Endeavour STS-134 mission – prepared to deliver the Alpha Magnetic Spectrometer ( AMS version known as the AMS-02 ) to the International Space Station ( ISS ) – was delayed ( because of problems ) until later this month ( May 2011 ).

Pushing The Boundaries –

Seventeen ( 17 ) years in the making, the Alpha Magnetic Spectrometer ( AMS ) is a product of former NASA administrator Dan Goldin's quest to find remarkable science projects for the International Space Station ( ISS ) and of Professor Ting's fascination with anti-matter.

Funded by NASA, the U.S. Department of Energy ( DOE ), plus a sixteen ( 16 ) country consortium of partners, the Alpha Magnetic Spectrometer ( AMS ) has prevailed – despite delays, technical problems, and the doubts of many high-energy and particle physicists.

“Physics is not about doubt,” says Roberto Battiston, deputy spokesman for the Alpha Magnetic Spectrometer ( AMS ) and physicist at the University of Perugia, Italy. “It is about precision measurement.”

As the Alpha Magnetic Spectrometer ( AMS ) experiment headed to the Space Shuttle Endeavour launch pad, Roberto Battiston and other scientists were keen to emphasize the instrument's unprecedented sensitivity to the gamut of cosmic rays raining down on Earth, which should allow the AMS to do two ( 2 ) things:

1. Measure Cosmic Ray High-Energy Charged ‘Particles’ and ‘Properties’ ( thereof ), sent from:

– Sun ( Earth's ); – Supernovae ( distant ); and, – Gamma ( γ ) Ray Bursts ( GRB ).

AND,

2. Detect AntiMatter ( errant chunks ), sent from the:

a. Universe ( far-away ).

On Earth, cosmic rays can only be detected indirectly, through the showers of 'secondary particles' produced when they slam into molecules of atmosphere in high regions above the Earth; the Alpha Magnetic Spectrometer ( AMS ), in space, will get an undistorted view.

"We'll be able to measure ( solar ) Cosmic Ray Flux very precisely," says collaboration member physicist Fernando Barão of the Laboratory of Instrumentation and Experimental Particle Physics ( in Lisbon, Portugal ). "The best place ( for detecting this ) is to be in 'space' because you don't have Earth's atmosphere that is going to destroy those cosmic rays."

No matter what happens with the more speculative search for antimatter, the Alpha Magnetic Spectrometer ( AMS ) should produce a definitive map of the cosmic-ray sky – helping to build a kind of 'astronomy not dependent on light'.

The Alpha Magnetic Spectrometer ( AMS ) consists of a powerful permanent magnet surrounded by a suite of particle detectors.

Over the 10-years ( or more ) that the Alpha Magnetic Spectrometer ( AMS ) experiment will run, the AMS magnet will bend the paths of cosmic rays by an amount that reveals their energy and charge – and thereby their identity.

Some will be 'heavy atomic nuclei', while others ( made from anti-matter ) will reveal themselves by 'bending in the opposite direction' from their 'matter' counterparts ( see, e.g. cosmic curveballs ).
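As a rough, hedged illustration of the bending principle just described – standard textbook physics, not a figure taken from the Nature article – the radius of curvature follows from the Lorentz force, and the sign of the charge sets the direction of the bend:

\[ r \;=\; \frac{p}{|q|\,B} \]

A cosmic ray of momentum p and charge q, moving through the AMS magnetic field B, curves with radius r; flipping the sign of q ( matter to antimatter ) flips the direction of the curve, which is how the detector tells the two apart.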

By ‘counting positrons’ ( i.e. antimatter ‘electrons’ ), the Alpha Magnetic Spectrometer ( AMS ) could also ‘chase a tentative signal of dark matter’, the so-far ‘undetected stuff’ thought to account for ‘much of the mass of the Universe’.

In 2009, Russian and Italian researchers – with the Payload for Antimatter Matter Exploration and Light-nuclei Astrophysics ( PAMELA ) onboard a Russian satellite – published evidence of an 'excess amount of positrons in the space environment surrounding Earth' ( O. Adriani et al. Nature 458, 607–609; 2009 ). One potential source of this is the annihilation of dark-matter particles within the halo enveloping our Galaxy.

Another speculative quest is to follow up on hints of 'strange matter', a hypothetical substance thought to be found in some collapsed stars, containing 'strange quarks' in addition to the 'up quarks' and 'down quarks' found within ordinary nuclei.

NASA Alpha Magnetic Spectrometer ( AMS ) program manager Mark Sistilli says hints of 'strange matter' were seen during a 1998 pilot flight of the Alpha Magnetic Spectrometer ( AMS-01 ) aboard the Space Shuttle; however, NASA determined the results were 'too tentative to publish'.

Because the Alpha Magnetic Spectrometer ( AMS / AMS-02 ) was given the status of an "exploration mission," the AMS did not need to undergo the "peer review" NASA would normally have required for a "science mission."

But Sistilli emphasizes the Alpha Magnetic Spectrometer ( AMS ) passed with flying colors the reviews of committees convened by the U.S. Department of Energy ( DOE ), which is supplying $50,000,000 ( fifty million dollars ) of the funding.

Now their ( DOE ) confidence will be put to the test.

Reference

http://www.nature.com/news/2011/110504/full/473013a.html

While for some it may appear that strangelet and subatomic antimatter particle research is for advancing our knowledge of unlocking the secrets of life in the Universe, others are still asking NASA what it really knows about why an expected 'significant' Solar Energetic Particle Event ( SEPE ) is something "we all need to be concerned about" on Earth.

With Solar Energetic Particle Event ( SEPE ) high-energy effects capable of disrupting Earth's ground-based and space-based electrical components and electricity grid infrastructure systems for up to 10-years, many wonder why billions upon billions of dollars were – and still are – being pumped into the CERN project studying 'strangelets'. People want to know why we need more immediate information detection capabilities for high-energy solar flare proton and electron ejections coming toward Earth soon – something NASA and other agencies know far more about than they are willing to tell the public.

How far have government authorities advanced beyond private-sector science and technology knowledge? The United States has already mapped internal plasma flows of the Sun.

How could the U.S. government possibly 'see inside the Sun' to know when a direct or near-direct Earth-facing Coronal Mass Ejection ( CME ) from a solar flare would occur in the future?

In layman's terms, for government it was like looking through a clear glass Pyrex bowl positioned atop a stove burner, watching as water starts to boil inside it, and then predicting – based on the flame heating the water – when bubbles will come to the surface. That becomes possible when one takes into account a government ground-based observatory telescope ( requiring no space-based placement ) equipped with a "super lens" used for imaging ( observing ) objects at great distances inside matter – a "superlens" that now even 'defies light-speed' and 'matter'. ( Read below )

– –

[ PHOTO ( above ): Antimatter photon 'optic' substrate structure material for 'subsurface solar imaging' of plasma flows inside the Sun, enabling plotting of Coronal Mass Ejections 'before solar surface eruptions' ( click to enlarge ) ]

Source: U.S. Department of Energy, Lawrence Berkeley National Laboratory, Operated by the University of California

Optical Antimatter Structure Shows The Way For New Super Lens, by Aditi Risbud

April 21, 2009

A device, made from alternating layers of ‘air’ and ‘silicon photonic crystal’, behaves like a ‘super lens’ – providing the first experimental demonstration of optical antimatter.

Scientists at Berkeley Lab ( Berkeley, California, USA ) and the Institute for Microelectronics and Microsystems ( CNR ) in Naples, Italy have experimentally demonstrated – for the first time – the ‘concept of optical antimatter’ by ‘light traveling through a material without being distorted’.

By engineering a material that focuses light through its own internal structure, a beam of light can enter and exit ( unperturbed ) after traveling through millimeters of the material.

For years, optics researchers have struggled to bypass the ‘diffraction limit’, a physical law restricting imaging resolution to about 1/2 the wavelength of light used to make the image.

If a material with a negative index of refraction ( a property describing how light bends as it enters or exits a material ) could be designed, this diffraction hurdle could be lowered.

Such a material could also behave as a superlens, allowing imaging equipment to observe objects with details finer than the diffraction limit allows.
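As a hedged aside – standard optics, not language from the Berkeley Lab release – the two ideas above can be written compactly: the diffraction limit ties resolution to the wavelength of the light, and Snell's law with a negative index bends refracted light to the 'wrong' side of the surface normal:

\[ d_{\min} \;\approx\; \frac{\lambda}{2}, \qquad n_1 \sin\theta_1 \;=\; n_2 \sin\theta_2 \;\;( n_2 < 0 \;\Rightarrow\; \theta_2 < 0 ) \]

A negative-index slab can therefore, in principle, recover detail finer than the λ/2 limit would otherwise allow – which is what makes the 'superlens' idea attractive.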

Despite the intriguing possibilities posed by a substance with a negative index of refraction, this property is inaccessible through naturally occurring ( positive index ) materials.

During the mid-1990s, English theoretical physicist Sir John Pendry proposed his clever 'sleight of light' using so-called metamaterials – engineered materials whose underlying structure can alter their overall responses to electric fields and magnetic fields.

Inspired by the Sir John Pendry proposal, scientists have made progress in scaling metamaterials from microwave to infrared wavelengths while illuminating the nuances of light-speed and direction-of-motion in such engineered structures.

“We’ve shown a ‘completely new way to control and manipulate light’, ‘using a silicon photonic crystal’ as a ‘real metamaterial’ – and it works,” said Stefano Cabrini, Facility Director of the Nanofabrication Facility in the Molecular Foundry, a U.S. Department of Energy ( DOE ) User Facility located at Lawrence Berkeley National Laboratory ( LBNL ) providing support to nanoscience researchers around the world.

“Our findings will open-up an easier way to make structures and use them effectively as a ‘super-lens’.”

Through the Molecular Foundry user program, Cabrini and post-doctoral researcher Allan Chang collaborated with Vito Mocella, a theoretical scientist at the Institute of Microelectronics and Microsystems ( CNR ) in Naples, Italy to fabricate a 2 X 2 millimeter device consisting of alternating layers of air and a silicon based photonic crystal containing air holes.

Using high precision nanofabrication processes, the team designed the spacing and thicknesses of each layer to behave like the metamaterial Sir John Pendry had envisioned.

This device was then used to focus a beam of near-infrared ( IR ) light, essentially 'annihilating' 2 millimeters of 'space'.
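One hedged way to read the 'annihilated space' claim – a sketch based on the paper's own 'quasi zero average index' framing, with assumed layer values rather than the published ones – is that alternating positive-index air layers and effectively negative-index photonic-crystal layers average to roughly zero optical path, so the beam exits as if the 2 millimeters were not there:

\[ \bar{n} \;=\; \frac{ n_{\mathrm{air}}\, d_{\mathrm{air}} + n_{\mathrm{eff}}\, d_{\mathrm{crystal}} }{ d_{\mathrm{air}} + d_{\mathrm{crystal}} } \;\approx\; 0 \quad \text{when } n_{\mathrm{eff}} \approx -1 \text{ and } d_{\mathrm{air}} \approx d_{\mathrm{crystal}} \]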

“Now that we have a prototype to demonstrate the concept, our next step will be to find the geometry and material that will work for visible light,” said Cabrini.

Along with possibilities in imaging, the researchers’ findings could also be used to develop hybrid negative-index and positive-index materials, Cabrini added, which may lead to novel ‘devices’ and ‘systems’ unachievable through either material alone.

"Self-collimation of light over millimeter-scale distance in a quasi zero average index metamaterial," by Vito Mocella, Stefano Cabrini, Allan S.P. Chang, P. Dardano, L. Moretti, I. Rendina, Deirdre Olynick, Bruce Harteneck and Scott Dhuey, appears in Physical Review Letters and is available online.

Portions of this work were supported by the U.S. Department of Energy ( DOE ) Office of Science, Office of Basic Energy Sciences under Contract No. DE-AC02-05CH11231.

The Molecular Foundry is one ( 1 ) of five ( 5 ) U.S. Department of Energy ( DOE ) Nanoscale Science Research Centers ( NSRC ) that are premier national user facilities for interdisciplinary research at the nanoscale. Together, the U.S. Department of Energy ( DOE ) Nanoscale Science Research Centers ( NSRC ) comprise a suite of complementary facilities providing researchers with state-of-the-art capabilities to fabricate, process, characterize and model nanoscale materials, which constitutes the ‘largest infrastructure investment’ of the National Nanotechnology Initiative ( NNI ).

U.S. Department of Energy ( DOE ) Nanoscale Science Research Centers ( NSRC ) are located at these six ( 6 ) locations:

– Argonne National Laboratory ( ANL ); – Brookhaven National Laboratory ( BNL ); – Lawrence Berkeley National Laboratory ( LBNL ); – Oak Ridge National Laboratory ( ORNL ); – Sandia National Laboratory ( SNL ); and, – Los Alamos National Laboratory ( LANL ).

For more information about the DOE NSRCs, please visit http://nano.energy.gov.

Berkeley Lab is a U.S. Department of Energy ( DOE ) National Laboratory located in Berkeley, California conducting ‘unclassified scientific research’ managed by the University of California.

References

http://www.lbl.gov
http://foundry.lbl.gov
http://newscenter.lbl.gov/feature-stories/2009/04/21/optical-antimatter

– –

If the public could keep its eyes open for one second, it would see what is coming before it hits as a surprise that only government knows anything about. Governments, however, continue conveying their double-speak vernacular to citizens over a very long period of decades; perhaps a fact known today may eventually come as no surprise to the many who would otherwise have been kept in the dark, while only a few know far more about what awaits the masses.

Perhaps people may begin asking more questions of their country's agencies, which are spending so much money so quickly for some apparently 'mysterious emergency purpose' – and if not for some 'mysterious emergency purpose', why is so much money being spent on science and space projects while the general public is told about 'serious government budget cutbacks' and so many people are left to suffer?

If there is no 'emergency', then people should know why they are suffering financially even more – just for the sake of growing science experiment budgets.

It might be a good idea for everyone to begin keeping their eyes open a little more often, and trained on something more than light-hearted mainstream media news and entertainment broadcasts.

If people get serious about 'what they know' as told on television news broadcasts, imagine how much more serious they will become when they learn what they 'were not told'.

Think about it. How Fast Is Technology Growing?

Just beginning to grasp something ‘new’?

Now think about something even newer than the Large Hadron Collider ( LHC ) at CERN.

Reference: https://web.archive.org/web/20120921064312/http://conceptactivityresearchvault.wordpress.com/2011/05/15/antimatter-technology-problems/

BNL Time Chamber

Running Time Backwards –

The Relativistic Heavy Ion Collider ( RHIC ) added its Solenoidal Tracker At RHIC ( STAR ), which is claimed to "run time backward": ultra-fast supercomputers reconstruct the sub-atomic particle interactions that produced the particles emerging from each collision, a process equated to examining the final products coming out of a factory when scientists and physicists have no idea what kinds of machines produced the products. Basically, they are producing things so fast they do not know how they were formed, much less what their capabilities are. The fact is, they could easily produce a monster and not know what it is until after they are eaten by it. Scary, really – like kids being given matches to play with.

They are being educated beyond their own intelligence, so much so – and to the point – that scientists and physicists cannot even grasp what 'it' is they are looking at, much less know what they are trying to manipulate to 'see what it does next'; nevertheless, they are conducting experiments like children playing with dynamite.

Think this is science fiction? Think they are mad scientists at play? Check the research reference links ( below ).

Think antimatter technology has advanced a lot since you began reading this report? Calculate 'more', because the public does not even know the half of it.

Reference: https://unwantedpublicityintel.wordpress.com/2015/09/22/time-foolery/ and, Research ( 26MAY05 ): https://web.archive.org/web/20120925015521/http://www.bnl.gov/rhic/news2/news.asp?a=2647&t=today

CERN has been operating since 2002, and the "SuperLens" was being worked on before 2002 – making both now 10-years old today.

Want newer ‘news’?

Superlenses – created from perovskite oxides – are simpler and easier to fabricate than ‘metamaterials’.

Superlenses are ideal for capturing light travelling in the mid-infrared ( IR ) spectrum range, opening even newer, highly sensitive imaging-device technologies; and this superlensing effect can be selectively turned 'on' and 'off', opening yet another technology of highly dense data-storage writing for far more advanced, capable computers.

Plasmonic whispering gallery microcavities, consisting of a silica interior coated with a thin layer of silver, improve quality by better than an order of magnitude over current plasmonic microcavities and pave the way for 'plasmonic nanolasers'.

Expand your knowledge; begin researching the six ( 6 ) reference links ( below ) so that the next time you watch the 'news' you will begin to realize just how much you are not being told about what is actually far more important – far more than you are used to imagining.

Submitted for review and commentary by,

Paul Collin ( Concept Activity Research Vault ) E-MAIL: UnwantedPublicity@GMAIL.com  WWW: http://ConceptActivityResearchVault.WordPress.Com

References

http://www.bnl.gov/rhic/
http://www.bnl.gov/rhic/STAR.asp
http://www.bnl.gov/bnlweb/pubaf/pr/PR_display.asp?prID=1075&template=Today
http://newscenter.lbl.gov/news-releases/2011/03/29/perovskite-based-superlens-for-the-infrared/
http://newscenter.lbl.gov/news-releases/2009/01/22/plasmonic-whispering-gallery/
http://www.nsf.gov/awardsearch/showAward.do?AwardNumber=1018060

Earth Event Alerts

[ IMAGE ( above ): IBM Stratus and Cirrus supercomputers analyze Global Environmental Intelligence ( click to enlarge ) ]

Earth Event Alerts

by Kentron Intellect Research Vault [ E-MAIL: KentronIntellectResearchVault@Gmail.Com ]

August 17, 2012 19:00:42 ( PST ) Updated ( Originally Published: March 23, 2011 )

MARYLAND, Fort George G. Meade – August 17, 2012 – IBM Stratus and IBM Cirrus supercomputers, as well as CRAY XK6m and CRAY XT5 ( Jaguar ) massively parallel supercomputers and vector supercomputers, are securely controlled via the U.S. National Security Agency ( NSA ) for analyzing Global Environmental Intelligence ( GEI ) data extracted from ground-based ( terrestrial ) monitoring stations and space-based ( extraterrestrial ) spaceborne platforms studying Earth Event ( Space Weather ) effects via High-Performance Computing ( HPC ), as well as for:

– Weather Forecasting ( including: Space Weather ); – U.S. Government Classified Projects; – Scientific Research; – Design Engineering; and, – Other Research.

[ IMAGE ( above ): CRAY XK6m supercomputers analyze Global Environmental Intelligence ( click to enlarge ) ]

CRAY INC.'s largest customers are U.S. government agencies, e.g. the U.S. Department of Defense ( DoD ) Defense Advanced Research Projects Agency ( DARPA ) and the U.S. Department of Energy ( DOE ) Oak Ridge National Laboratory ( ORNL ), which account for about 3/4 of CRAY INC. revenue; other CRAY supercomputers are used worldwide by academic institutions ( universities ) and industrial companies ( private-sector firms ).

CRAY INC. additionally provides maintenance and support services, and sells data storage products from partners ( e.g. BlueArc, LSI and Quantum ).

Supercomputer competitors, of CRAY INC., are:

– IBM; – HEWLETT-PACKARD; and, – DELL.

On May 24, 2011 CRAY INC. announced its new CRAY XK6 supercomputer, a hybrid supercomputing system combining its Gemini InterConnect, AMD Opteron™ 6200 Series processors ( code-named: InterLagos ) and NVIDIA Tesla 20 Series GPUs into a tightly integrated upgradeable supercomputing system capable of more than 50 petaflops ( i.e. ‘quadrillions of computing operations’ per ‘second’ ), a multi-purpose supercomputer designed for the next-generation of many-core High Performance Computing ( HPC ) applications.

The SWISS NATIONAL SUPERCOMPUTING CENTRE ( CSCS ) – located in Manno, Switzerland – is CRAY INC.'s first ( 1st ) customer for the new CRAY XK6 system. CSCS ( Manno, Switzerland ) promotes and develops technical and scientific services in the field of High-Performance Computing ( HPC ) for the Swiss research community, and is upgrading its CRAY XE6m system ( nick-named: Piz Palu ) into a multiple-cabinet new CRAY XK6 supercomputer. The SWISS NATIONAL SUPERCOMPUTING CENTRE ( CSCS ) supports scientists working in:

– Weather Forecasting; – Physics; – Climatology; – Geology; – Astronomy; – Mathematics; – Computer Sciences; – Material Sciences; – Chemistry; – Biology; – Genetics; and, – Experimental Medicine.

Data additionally analyzed by these supercomputers, include:

– Ultra-Deep Sea Volcanoes located in continental plate fracture zones several miles beneath ocean basin areas ( e.g. Asia-Pacific Rim also known as the “Pacific Ring of Fire” where a circum-Pacific seismic belt of earthquakes frequently impact areas far across the Pacific Ocean in the Americas ).

Global geoscience recognizes that Earth 'ground movement shaking' earthquakes hide a lot of what people are actually walking on top of: large geographic land mass areas known as 'continental shelves' or "continental plates" that move ( tectonics ) because of superheated, pressurized, extrasuperconducting magnetic energy properties released from within molten magma material violently exploding beneath the surface of the Earth, down in the ultra-deep seas.

[ IMAGE ( above ): Global Tectonic Plate Boundaries & Major Volcano HotSpots ( click to enlarge ) ]

Significant volcanoes are positioned like dots along this global 25,000-mile circular region known as the "Pacific Ring of Fire", extending from south of Australia up the entire eastcoast of Japan, China and the Kamchatka Peninsula of Russia, across the Aleutian Islands of Alaska, and then south down the entire westcoast of North America and Latin America.

[ IMAGE ( above ): Ultra-Deep Sea Pacific Ocean Basin ( click to enlarge ) ]

The March 11, 2011 Tohoku-chiho Taiheiyo-oki Japan 9.0 earthquake held several secrets, including U.S. government contractors simultaneously monitoring a significant "moment magnitude" ( Mw ) Earth Event occurring parallel to the eastcoast of Japan beneath the Western Pacific Ocean, where an entire suboceanic mountain range was being split in half ( south to north ) 310-miles long and split open 100-feet wide ( east to west ) – which the public was unaware of, nor were they told details about.

Interestingly, the March 11, 2011 Japan island earthquakes have not yet stopped, as the swarm of 4.0, 5.0, 6.0 and 7.0 Richter scale earthquakes continues as a direct and proximate result of erupting 'suboceanic volcanoes' moving these large "plates", which are beginning to force yet others to slam into one another thousands of miles away.

Japan's Western Pacific Ocean 'eastcoast' has a 'continental plate' slamming point that meets the 'westcoast' of North America near the Cascade mountain range 'plate', which reacts in one ( 1 ) of two ( 2 ) ways, i.e. 'seaward' ( plate thrusting toward Japan ) or 'landward' ( plate thrusting toward the Pacific Northwest of the United States and/or Canada ).

What The Public Never Knew

Government leadership, globally, is acutely familiar with these aforementioned types of major Earth Events, including 'monstrous plate tectonic pushing matches', which usually collapse one or more 'national infrastructures' and typically spell 'death' and 'serious injuries' for populations in developed areas.

Extremely familiar with the mass public panic resulting from Earth Event catastrophes, governments have 'contingency actions' pre-approved by 'governing bodies' and/or 'national leadership' Executive Order Directives – which, although not advertised, are a matter of public record – that immediately call upon all military forces to carry out "risk reduction" ( minimization of further damages and dangers ) through what is referred to as "mitigation" ( disaster management ) within "National Disaster Preparedness Planning" ( national contingency measures ), details citizens are unaware of. Government decision-makers know a "national emergency" can bring temporary suspension of Constitutional rights and a loss of freedoms – a volatile subject few care to discuss, because any significant natural disaster will result in government infringement on many civil liberties most populations are accustomed to enjoying.

Before 1-minute and 40-seconds had passed into the March 11, 2011 Tohoku, Japan earthquake ( Richter scale: M 9.0 ), key U.S. government decision-makers discussed the major Earth Event unfolding off Japan's eastcoast Plate-Boundary subduction zone beneath the ultra-deep sea of the Western Pacific Ocean, where Japan's monstrous volcanic mountain range had split open at least 100-feet wide and cracked 310-miles long in a northern direction, headed straight for the Aleutian Islands of Alaska in the United States.

U.S. Military Contingent Standby “Red Alert” Notification

The U.S. Air Force ( USAF ) 'subordinate organization' Air and Space Operations ( ASO ) Communications Directorate ( A6 ) – which provides support for daily operations, contingency actions and general Command, Control, Communication and Computer Intelligence ( C4I ) for the U.S. Air Force Weather Agency ( USAFWA ) – saw its 1st Weather Group ( 1ST WXG ) Directorate ready its 25th Operational Weather Squadron ( OWS ) at Davis-Monthan Air Force Base ( Tucson, Arizona ), responsible for conjunctive communication issuance of an Earth Event "Red Alert." That alert was immediately issued directly to U.S. Army Western Command ( WESTCOM ) with a "Standby-Ready" clause pausing Western Region ( CONUS ) mobilization of Active, Reserve and National Guard military forces at specific installations, based on the Japan Earth Event "moment magnitude" ( Mw ) Plate-Boundary rebound expected to strike the North America westcoast Plate-Boundary of the Cascadia Range, reactively triggering its subduction zone into a Cascadia 'great earthquake'.

CALTECH Public News Suppression Of Major Earth Event

Officials, attempting to diminish any clear public understanding of the facts – the public knowing only a Richter scale earthquake 'magnitude', never knowing or hearing what a major Earth Event "moment magnitude" ( Mw ) entailed – served up officially-designed double-speak psycho-babble terms unfamiliar to the public as creative attention distraction, announcing that the "Japan earthquake experienced," a:

– “Bilateral Rupture;” and,

– “Slip Distribution.”

The facts are that the Japan 'earthquake' would never have occurred unless:

1ST – "Bilateral Rupture" ( 'suboceanic subterranean tectonic plate split wide open' ) occurred; followed by,

2ND – “Slip Distribution” ( ‘tectonic plate movement’ ); then finally,

3RD – "Ground Shaking" ( 'earthquake' ) response.

Officials failed the public, giving no notification that a major Earth Event "moment magnitude" ( Mw ) on the "Pacific Ring of Fire" ( circum-Pacific seismic belt ) in the Western Pacific Ocean involved a huge:

1. Continental Plate Break Off;

2. Undersea Plate Mountain Range Crack Wide Open; plus,

3. Mountain Range Split Open 310-Miles Long.

There are some, lying at rest, who might not consider the aforementioned three ( 3 ) major Earth Event occurrences significant – except those still living on Earth.

Asia-Pacific Rim

This huge western Pacific Ocean 'undersea mountain range' moved 'east', crushing the smaller portion of its tectonic plate toward the continent of Asia, which commenced continuous streams of day-and-night significant earthquakes – still registering 5.0+ and 6.0+ Richter scale levels of magnitude now, and for over 12-days – throughout the area surrounding Japan, the point nearest where the tectonic plate meets the continent of Asia within the western Pacific Ocean, from where this monstrous undersea mountain range suddenly split, sending the 'eastern half' – with the tectonic plate broken beneath it – slamming into the continent of Asia.

Simultaneously, pushed outward with even greater force away from the Asian continent ( note: explosives – as from a cannon or a force-shaped charge – project their blast outward when the initial explosive blast is blunted by a back-stop ), the split-off ( 310-mile / 500-kilometer long ) western half of this monstrous undersea mountain range slammed west up against the Americas' western tectonic plates.

This 'is' the major "Earth Event" that will have consequential global impact repercussions, 'officially minimized' by focusing public attention on a surface Earth Event earthquake of 9.0 Richter scale magnitude ( once ), while even further diminishing the hundreds of significant earthquakes still occurring 12-days after the initial earthquake.

Asia-Pacific Rim “Ring Of Fire”

Many are unaware the "Asia-Pacific Rim" is also known as the "Ring of Fire", under whose ultra-deep sea Pacific Ocean exist numerous gigantic volatile volcanoes positioned in an incredibly large circle ( "ring" ) around a huge geographic land mass area comprised of tectonic plates that connect the 'Eastern Asias' to the 'Western Americas'.

Yellowstone National Park Super Volcano

Many people are still wondering why the Japan earthquakes have not yet stopped, and why Japan is being plagued by such a long swarm of significant earthquakes still, to this very day, nearly 60-days later. The multiple color video clips ( below ) provide information on unusual earthquake swarm patterns and reversals observed while studying the World's largest supervolcano in Wyoming ( USA ), located at Yellowstone National Park – a global public attraction for viewing natural underground volcano steam vents known as geyser eruptions:

[ PHOTO ( above ): Major HotSpot at Yellowstone displays Half Dome cap of granite rock above unerupted volcano magma. ]

Ultra-Deep Sea Volcanoes

When huge undersea volcanoes erupt they dynamically force incredibly large geographic land mass plates to move, whereupon simultaneous and consequential movement is experienced on surface land areas – what people know as 'earthquakes' – with their aftermath measurements provided as "Richter scale levels" that most do not understand. These Richter scale measurements are only officially provided estimates, as officials are never presented with totally accurate measurements; many are not obtained with any great precision for up to 2-years after the initial earthquake.

Rarely are precise measurements publicly provided, and at any time during that 2-year interim the public may hear their previously reported earthquake Richter scale measurement was either "officially upgraded" or "officially downgraded." Often this becomes apparent when one sees many other countries contradicting U.S. public news announcements about the magnitude of a particularly controversial earthquake. An example of this was seen surrounding the March 11, 2011 earthquake in Japan:

– Japan 1st public announcement: 9.2 Richter scale;

– United States 1st public announcement: 8.9 Richter scale;

– United States 2nd public announcement: 9.0 Richter scale; and,

– United States 3rd public announcement: 9.1 Richter scale.

What will the March 11, 2011 Japan earthquake be officially reported as in 2-years? Who knows?
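For readers wondering how much difference such revisions make, here is a hedged back-of-the-envelope using the standard moment magnitude energy scaling ( a textbook relation, not an official figure from any agency cited here ):

\[ E \;\propto\; 10^{\,1.5\,M_w} \;\;\Rightarrow\;\; \frac{E(9.1)}{E(8.9)} \;=\; 10^{\,1.5 \times 0.2} \;\approx\; 2 \]

In other words, a seemingly small revision from 8.9 to 9.1 roughly doubles the estimated energy released.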

Never publicly announced, however, are measurements of an earthquake's 'force strength pressure accumulation' transmitted through suboceanic tectonic plates grinding against one another – a major Earth Event 'geographic pushing process' – as seen by U.S. NSA supercomputers from global ground- and space-based monitoring analysis surrounding the "Asia-Pacific Rim Ring of Fire", stretching from the 'Eastern Asias' to the 'Western Americas' and beyond.

This 'domino plate tectonic principle' results from the combined amounts of 'volcanic magmatic eruptive force strength' and 'tectonic plate accumulative pressure build-up' against 'adjacent tectonic plates', causing suboceanic, subterranean and surface land to move; how significant those amounts are determines the strength of both the consequential 'earthquakes' and the resultant 'tsunamis'.

Waterway Tsunamis

When most of the public hears about a "tsunami", they think of high waves near "ocean" coastal regions bringing significant floods over residents of nearby cities. Few realize the vast majority of Earth's population lives predominantly along ocean coastal regions, and that one ( 1 ) gigantic tsunami could kill vast populations living near oceans in the wake of popular beaches – a tough trade-off for some, while logical others choose living further inland. Even away from the oceans, large bodies of water like large lakes – where tide levels are also affected by the gravitational pull of the Moon – can see a vast, deep lake body bring a tsunami, depending on which direction tectonic plates move: a force-directionalized earthquake can create a tsunami with significant inundating floods over residents living in cities near those large shoreline areas too.

What most of the public does not yet fully realize is that large river bodies of water, like the Mississippi River – a north-to-south directional river – could easily be affected if east-to-west directional tectonic plate movement shifts adjacent states along the New Madrid Fault zone with significant earthquakes; such tectonic plate movement is easily capable of squeezing the side banks of even the Mississippi River, forcing huge amounts of water out onto both its east and west sides and resulting in seriously significant inland flooding over residents living in the low-lying states of the Central Plains of the United States.

Japan “Pacific Ring Of Fire” Earthquakes To Americas Cascadia Fault Zone

Japanese accounts of a correlative tsunami suggest the Cascadia Fault rupture occurred as one ( 1 ) single earthquake, a 9-Mw Earth Event, on January 26, 1700. Geological evidence obtained from a large number of coastal sites from northern California ( USA ) up to southern Vancouver Island ( Canada ), plus historical records from Japan, show that the 1,100-kilometer length of the Cascadia Fault subduction zone ruptured ( split, causing that earthquake ) in a major Earth Event at that time. While the sizes of earlier Cascadia Fault earthquakes are unknown, some "ruptured adjacent segments" ( adjacent tectonic plates ) within the Cascadia Fault subduction zone over periods of time ranging from as little as hours to years – as has historically happened in Japan.

Over the past 20-years, scientific progress has been made in understanding Cascadia Fault subduction zone behavior; only 15-years ago, however, scientists were still debating whether 'great earthquakes' occurred at fault subduction zones. Today most scientists realize 'great earthquakes' actually do occur in fault subduction zone regions.

Now, scientific discussions focus on subjects of:

– Earth crust 'structural changes' when a "Plate Boundary" ruptures ( splits ); – Related tsunamis; and, – Seismogenic zone ( tectonic plate 'locations' and 'width sizes' ).

Japan America Earthquakes And Tsunamis Exchange

Great Cascadia earthquakes generate tsunamis; most recently, at least a 32-foot-high tidal wave struck the Pacific Ocean westcoast of Washington, Oregon, and northern California, and that Cascadia earthquake tsunami sent a consequential 16-foot-high tidal wave onto Japan.

These Cascadia Fault subduction zone earthquake tsunamis threaten coastal communities all around the Pacific Ocean "Ring of Fire", but have their greatest impact on the United States westcoast and Canada, which are struck within 15-minutes to 40-minutes after a Cascadia Fault subduction zone earthquake occurs.

Deposits from past Cascadia Fault earthquake tsunamis have been identified at numerous coastal sites in California, Oregon, Washington, and even as far north as British Columbia ( Canada ); the distribution of these deposits – combined with sophisticated computer software simulations of tsunamis – indicates many coastal communities in those same states and provinces are well within the flooding inundation zones of past Cascadia Fault earthquake tsunamis.

Westcoast communities in California, Oregon, Washington, and even as far north as British Columbia ( Canada ) are indeed threatened by future tsunamis from a Cascadia great earthquake event; tsunami arrival times depend on the distance from the 'point of rupture' ( the tectonic plate split causing the earthquake ) within the Cascadia Fault subduction zone to the westcoast "landward" side.
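A hedged worked example – standard shallow-water wave theory with assumed round numbers, not measurements from this report – shows why that arrival window is so short:

\[ v \;=\; \sqrt{g\,h} \;=\; \sqrt{ 9.8 \times 3000 }\;\text{m/s} \;\approx\; 170\;\text{m/s} \;\approx\; 600\;\text{km/h} \]

At that speed, a rupture roughly 150 to 400 kilometers offshore puts the first wave on the coast in roughly 15 to 40 minutes, consistent with the window quoted above.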

Cascadia Earthquake Stricken Damage Zone Data

Strong ground shaking from a "moment magnitude" ( Mw ) 9 Plate-Boundary earthquake will last 3-minutes or more, dominated by long-period shaking, and ground-shaking movement damage will occur as far inland as the cities of Portland, Oregon; Seattle, Washington; and Vancouver, British Columbia ( Canada ).

Tsunami Optional Wave Patterns “Following Sea”

Large cities within 62-miles to 93-miles of the nearest point of the inferred Cascadia Plate-Boundary zone rupture will not only experience significant ground shaking but also extreme-duration ground shaking lasting far longer, in addition to far more powerful tsunamis carrying far more seawater because of their lengthened wave periods ( lengthier distances between wave crests or wave curls ) – pushing inland akin to what fishermen describe as a deadly "following sea" ( swallowing everything within an even more powerful waterpath ). The inland result is far more significant damage to many tall buildings and lengthy structures where earthquake magnitude strength will be felt strongest, all along the United States Pacific Ocean westcoast regional areas.

Data Assessments Of Recurring Cascadia Earthquakes

Cascadia 'great earthquakes' have a "mean recurrence interval" ( the time period between one earthquake and the next ) – specific to the point of the Cascadia Plate-Boundary – of between 500-years and 600-years; however, past Cascadia Fault earthquakes have occurred within intervals of 300-years or even less – less time than has now passed since the Cascadia 'great earthquake' of the 1700s. Time intervals between successive great earthquakes of only a few centuries up to 1,000 years have little well-measured recurrence-interval data, because the number of recorded Cascadia earthquakes has rarely measured over five ( 5 ). The data additionally indicate Cascadia earthquake intermittency, with irregular intervals when they did occur, and the data lack the 'random distribution' ( tectonic plate shift or earth movement ) or 'clustering' of these Cascadia earthquakes over a lengthier period of time, so more accurate assessments are unavailable for knowing anything more about them. Hence, because the Cascadia earthquake recurrence pattern is so poorly known, the probability of the next Cascadia earthquake occurrence is unfortunately unclear, with extremely sparse interval information details.
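One common – and admittedly crude – way to turn a mean recurrence interval into a probability is a simple Poisson model; the numbers below are an illustrative assumption, not a published hazard estimate:

\[ P(\text{event within } t) \;=\; 1 - e^{-t/\tau}, \qquad \tau \approx 500\text{–}600\;\text{years},\; t = 50\;\text{years} \;\Rightarrow\; P \approx 8\text{–}10\% \]

Such a model assumes earthquakes strike at random, which – as noted above – the sparse and irregular Cascadia record neither confirms nor rules out.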

Cascadia Plate-Boundary “Locked” And “Not Locked”

The Cascadia Plate-Boundary zone is currently "locked" off the U.S. westcoast shoreline, where it has been accumulating plate tectonic pressure build-up from other tectonic plates pressing into it for over 300-years.

The Cascadia Fault subduction zone, at its widest point, is located northwest just off the coast of the State of Washington where the maximum area of seismogenic rupture is approximately 1,100 kilometers long and 50 kilometers up-to 150 kilometers wide. Cascadia Plate-Boundary seismogenic portion location and size data is key-critical for determining earthquake magnitude, tsunami size, and the strength of ground shaking.

The Cascadia Plate-Boundary "landward limit" – of only its "locked" portion, where no tectonic plate shift has yet occurred – is located between the Juan de Fuca tectonic plate and the North America tectonic plate, where it came to be "locked" between Cascadia earthquakes; however, this "locked" notion has only been delineated from geodetic measurements of surface land deformation observations. Unfortunately, its "seaward limit" has very few constraints – up to no constraints at all – on movement of its so-called "locked zone" portion, which could certainly move at any time.

Cascadia Plate Continues Sliding

The Cascadia transition zone, separating its "locked zone" from its "continuous sliding zone" headed east into the continent of North America, is poorly constrained ( held back ), so a Cascadia rupture may extend an unknown distance – from its now "locked zone" into its "continuously sliding transition zone."

On some Earth crust faults near coastal regions, there may also be additional Plate-Boundary earthquakes, increased tsunami tidal wave size, plus intensification of local area ground shaking.

Earth Event Mitigation Forces Global Money Flow

The primary 'government purpose' in establishing international "risk reduction" is solely to minimize global costs from damages associated with major-magnitude Earth Events similar to, but even greater than, what happened on March 11, 2011 all over Japan.

Historical earthquake damages assist in predictive projections; damage-loss studies suggest disastrous future losses will occur in the Pacific Northwest from a Cascadia Fault subduction 'great earthquake'. National loss-mitigation efforts – studying other seismically active regions – plus national cost-benefit studies indicate that earthquake damage-loss mitigation may effectively reduce losses and assist recovery efforts in the future. Accurate data acquisition, geological and geophysical research, and immediate technological information transfer to national key decision-makers were meant to reduce the additional risks the Pacific Northwest Cascadia Fault subduction zone poses to the Western North America coastal region.

Damage, injuries, and loss of life from the next great earthquake on the Cascadia Fault subduction zone will indeed be great and widespread, and will significantly impact national economies ( Canada and the United States ) for years to decades into the future – which has seen a globally concerted increase in:

– International Cooperative Research; – International Information Exchanges; – International Disaster Preparedness; – International Damage Loss Mitigation Planning; – International Technology Applications; and, – More.

Tectonics Observatory

The CALTECH Advanced Rapid Imaging and Analysis ( ARIA ) Project – a collaboration between the NASA Jet Propulsion Laboratory ( JPL ) and the California Institute of Technology ( Pasadena ) Tectonics Observatory – saw CALTECH scientist Shengji Wei and Anthony Sladen ( of GEOAZUR ) model the Japan Tohoku earthquake fault zone sub-surface ( below surface ) 'tectonic plate movement', derived from:

– TeleSeismic Body Waves ( long-distance observations ); and,

– Global Positioning Satellites ( GPS ) ( near-source observations ).

A 3D image of the fault moving can be viewed in Google Earth ( the internet website link to that KML file is found in the "References" at the bottom of this report ); it projects the fault rupture in three-dimensional images that can be viewed from any point of reference. That analysis depicts the rupture ( ground splitting open 100-feet ) that resulted in the earthquake itself as being triggered from 15-miles ( 24-kilometers ) beneath the ultra-deep sea of the Western Pacific Ocean, with the entire island of Japan being moved east by 16-feet ( 5 meters ) from its before-earthquake location.

[ IMAGE ( above ): NASA JPL Project ARIA Tectonic Plate Seismic Wave Direction Map ( click image to enlarge and read ) ]

The National Aeronautics and Space Administration ( NASA ) Jet Propulsion Laboratory ( JPL ) at the California Institute of Technology ( Pasadena ), through Project Advanced Rapid Imaging and Analysis ( ARIA ), used GEONET RINEX data with JPL GIPSY-OASIS software to obtain kinematic "precise point positioning solutions" from a single-station bias-fixing method matched up to JPL orbit and clock products, producing seismic displacement projection map details that carry an inherent '95% error-rating' – itself an estimate – which 'proves' that what these U.S. government organizations supposedly know ( after spending billions of dollars ), or at least what they are willing to publicly provide, may be only 5% accurate. So much for what these U.S. government organizations publicly announce as their "precise point positioning solutions."

Pay Any Price?

More double-speak and psycho-babble serves only to distract the public away from the truth as to precisely what U.S. taxpayer dollars are actually producing. Now knowing this: if those same officials ever worked for a small business they would either be arrested for fraud or fired for incompetence; however, since none of them will ever admit to their own incompetence, their leadership needs to see those responsible virtually swing from the end of an incredibly long U.S. Department of Justice rope.

Unfortunately, the facts surrounding all this only get worse.

[ IMAGE ( above ): Tectonic Plates ( brown color ) Sinking and Sunk On Earth Core. ]

Earthquake Prediction Fallacy

Earthquake prediction will never be an accomplished, finite science people can rely upon, even though huge amounts of money are being wasted on technology for detection sensors reading "Seismic Waveforms" ( also known as "S Waves" ) that can be detected and stored in computer databases. The reason is a significant fact that will never be learned no matter how much money or time is devoted to the unsolvable problem: the Earth's sub-crustal regions consist primarily of molten lake regions filled with floating tectonic plates that are moving while sinking, and that cannot be tested for rock density or accumulated pressures existing far beneath the land-surface tectonic plates.

The very best that technology can ever do for the public is record surface tectonic plates grinding against one another; only that action ( alone ) does in fact generate the upward acoustic waveform patterns named 'seismic waves' or 's-waves', which occur only when tectonic plates are moving.

While a public early warning might be helpful for curtailing vehicular traffic crossing an interstate bridge that might collapse, or for stopping train traffic – saving thousands of people's lives – it would fail to serve the millions more living in buildings that collapse.
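As a hedged sketch of why even a brief warning is physically possible – the wave speeds and distances below are assumed, textbook-style values, not figures from any agency named in this report – fast-but-weak P-waves outrun the damaging S-waves, and the gap between their arrivals is the usable warning time:

    # Illustrative earthquake early-warning arithmetic ( assumed values only ).
    # P-waves travel faster than the damaging S-waves, so the delay between their
    # arrivals at a given city is the maximum usable warning time.

    VP_KM_S = 6.0   # assumed average P-wave speed, km/s
    VS_KM_S = 3.5   # assumed average S-wave speed, km/s

    def warning_time_seconds(distance_km: float) -> float:
        """Seconds between P-wave and S-wave arrival at a given distance."""
        return distance_km / VS_KM_S - distance_km / VP_KM_S

    if __name__ == "__main__":
        for d in (50, 100, 200):  # hypothetical epicentral distances, km
            print(f"{d} km away: roughly {warning_time_seconds(d):.0f} seconds of warning")

That works out to roughly 6, 12 and 24 seconds respectively – enough to stop a train or close a bridge, but not to evacuate a collapsing building.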

Early Warning Exclusivity

Know that governments, using terms unfamiliar to the public, have statistically analyzed the "international economics" of "national infrastructure preparedness" ( early warning systems ) for two ( 2 ) audiences: "public" ( utility companies, via government meetings with industrial leadership ) and "private" ( residents, via television, radio, newspaper and internet – commercial advertisements only ). Between those two ( 2 ), "national disaster mitigation" designates its primary "early warning" provisions for "high density population centers" near "coastal or low-lying regions" ( large bodies of ocean, lake and river water ) to only one ( 1 ) – the "public" ( utility companies, via government meetings with industrial leadership ) – "in the interest of national security," limiting national economic burdens from any significant Earth Event impact aftermath.

In short, and without all the governmentese psycho-babble and double-speak: costs continue being spent on high-technology efforts to perfect a "seismic early warning" for "exclusive use" ( national government control ) that "provides" ( control over ) "all major utility company distribution points" ( the facilities where electrical power is generated ), able to "interrupt power" ( stop the flow of electricity nationwide from distribution stations ), thus "saving additional lives" from "disastrous other problems" ( aftermath loss of life and injuries caused by nuclear-fallout radiation, exploding electrical transformers, and fires associated with overloaded electrical circuits ).

Logically, much – but not all – of the aforementioned makes perfect sense, except to "John Doe" or "Jane Doe", exemplified anonymously ( herein ) as individuals who, if warned earlier, could have stopped their vehicle before crossing the bridge that collapsed, or simply stepped out of the way of a huge sign falling on them, instead of being killed or maimed. However, one might additionally consider how many more would otherwise be killed or maimed after an ensuing mass public mob panics upon receiving an "early warning." A tough call for many, but few.

Earth Data Publicly Minimized

Tohoku-oki earthquake seismic waveform data showing the Japan eastcoast tectonic plate "bilaterally ruptured" ( split in half for a distance of over 310-miles ) was obtained from the USArray seismic stations ( United States ), then analyzed and later modelled by CALTECH scientists Lingsen Meng and Jean-Paul Ampuero, who created a preliminary data animation demonstrating a 'super major' Earth Event simultaneously occurring when the 'major' earthquake struck Japan.

U.S. National Security Stations Technology Systems Projects

The United States Seismic Array ( USArray ) Data Management Plan notes that EarthScope is composed of three ( 3 ) Projects:

1. The "United States Seismic Array ( USArray )" Project, under which the Data Management Center ( DMC ) of the Incorporated Research Institutions for Seismology ( IRIS ) – a National Science Foundation ( NSF ) consortium of universities – is 'managed';

2. UNAVCO INC. ‘implemented’ “Plate-Boundary Observatory ( PBO )” Project; and,

3. The U.S. Geological Survey ( USGS ) 'operated' "San Andreas Fault Observatory at Depth ( SAFOD )" Project at Stanford University ( California ).

Simultaneous Earth Data Management

The EarthScope component USArray's data management plan is held by the USArray IRIS DMC.

USArray consists of four ( 4 ) data generating components:

Permanent Network

The Advanced National Seismic System ( ANSS ) BackBone ( BB ) is a joint effort – between IRIS, USArray and USGS – to establish a 'Permanent Network' of approximately one-hundred ( 100 ) Earth monitoring receiving stations located in the Continental United States ( CONUS ), or lower 48 states of America, in addition to other stations located in the State of Alaska.

Earth Data Multiple Other Monitors

USArray data contributions to the Advanced National Seismic System ( ANSS ) BackBone ( BB ) consist of:

Nine ( 9 ) new ‘international Earth data accumulation receiving stations’ akin to the Global Seismic Network ( GSN );

Four ( 4 ) “cooperative other stations” from “Southern Methodist University” and “AFTAC”;

Twenty-six ( 26 ) ‘other receiving stations’ from the Advanced National Seismic System ( ANSS ) with ‘upgrade funding’ taken out-of the USArray Project “EarthScope;” plus,

Sixty ( 60 ) additional stations of the Advanced National Seismic System ( ANSS ) BackBone ( BB ) network that currently exist, will be installed, or will be upgraded, so that data channel stream feeds can and will be made seamlessly available through the IRIS DMC, where data can be continuously recorded at forty ( 40 ) samples per second and where 1 sample per second can and will be continuously transmitted in real-time back to the IRIS DMC ( an illustrative sketch of this 40-to-1 arrangement follows below ); quality assurance is held at facilities located in both Albuquerque, New Mexico and Golden, Colorado, with the U.S. Geological Survey ( USGS ) handling some operational responsibilities thereof.
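A minimal sketch of the 40-samples-per-second versus 1-sample-per-second arrangement described in the item above ( the waveform is invented for illustration, and real GSN / ANSS telemetry formats such as miniSEED are not reproduced here ): the full-rate stream is archived locally, while a decimated 1 Hz copy is what travels in real-time.

    # Illustrative only: archive a 40 sps channel locally, forward a 1 sps copy.
    FULL_RATE_SPS = 40   # samples per second recorded at the station
    REALTIME_SPS = 1     # samples per second transmitted back in real-time

    def decimate(samples: list[float], factor: int) -> list[float]:
        """Keep every factor-th sample ( simple decimation, no anti-alias filter )."""
        return samples[::factor]

    def realtime_block(one_second_of_data: list[float]) -> list[float]:
        """Reduce one second of 40 sps data to the single value sent upstream."""
        return decimate(one_second_of_data, FULL_RATE_SPS // REALTIME_SPS)

    if __name__ == "__main__":
        fake_second = [0.01 * i for i in range(FULL_RATE_SPS)]  # invented waveform
        print("archived samples    :", len(fake_second))                      # 40
        print("transmitted samples :", len(realtime_block(fake_second)))      # 1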

Albuquerque Seismological Laboratory ( ASL ) –

Albuquerque Seismological Laboratory ( ASL ) supports operation and maintenance of seismic networks for the U.S. Geological Survey ( USGS ) portion of the Global Seismographic Network ( GSN ) and Advanced National Seismic System ( ANSS ) Backbone network.

ASL runs the Advanced National Seismic System ( ANSS ) depot facility supporting the Advanced National Seismic System ( ANSS ) networks.

ASL also maintains the PASSCAL Instrument Center ( PIC ) facility at the University of New Mexico Tech ( Socorro, New Mexico ) developing, testing, and evaluating seismology monitoring and recording equipment.

Albuquerque Seismological Laboratory ( ASL ) staff are based in ‘both’ Albuquerque, New Mexico and Golden, Colorado.

Top-Down Bottom-Up Data Building Slows Earthquake Notifications

Seismic waveform ( ‘seismic wave form frequency’ ) data received by the Global Seismic Network ( GSN ) and Advanced National Seismic System ( ANSS ) BackBone ( BB ) network is forwarded ‘slower than real-time’, by sending only ‘near-time data’ ( e.g. tape and compact disc recordings ) to the National Earthquake Information Center ( NEIC ) ‘station’ of the U.S. Geological Survey ( USGS ), which is ‘officially heralded’ for so-called “rapid earthquake response.”

Unbelievably, in addition to the aforementioned ‘slow Earth Event data delivery process’, a number of ‘data receiving stations’ have absolutely ‘no data streaming telemetry’ transmission capabilities whatsoever, so those station data recordings – on ‘tapes’ and ‘compact discs’ – are delivered by ‘other even more time consuming routes’ before that data can even reach the U.S. Geological Survey ( USGS ). In short, all the huge amounts of money being spent go to ‘increasing computer technologies’, ‘sensors’, ‘satellites’, ‘data stream channel networks’ and ‘secure facility building stations’ from the ‘top, down’ instead of building ‘monitoring stations’ and ‘recording stations’ from the ‘bottom, up’ until the entire earthquake monitoring and notification system is finally built properly. As it currently stands, the ‘apple cart stations continue being built more and more’ while the ‘apple tree stations are not receiving the proper technological nutrients’ to ‘deliver apples’ ( ‘data’ ) to be ‘fed into notification markets’ ( ‘public’ ) where all this could do some good.

U.S. National Security Reviews Delay Already Slow Earthquake Notifications

The IRIS Data Management Center ( DMC ) – after processing all incoming data streams from reporting stations around the world – then distributes seismic waveform data ‘back to’ both the Global Seismic Network ( GSN ) and Advanced National Seismic System ( ANSS ) BackBone ( BB ) network operations, but only ‘after’ the seismic waveform data has been ‘thoroughly screened’ by what U.S. national security government Project leadership has deemed its ‘need to control all data’ through “limiting requirements” ( ‘red tape’ ), because ‘all data must undergo’ a long, arduous ‘secure data clearing process’ before any data can be released. Amusingly to some, the U.S. government – in its race to create another ‘official acronym’ of ‘double-speak’ – named that national security requirement clearing process ever so aptly:

“Quality Assurance Framework” ( QUACK )

Enough said.

Let the public decide what to do with ‘those irresponsible officials’; after all, ‘only mass public lives’ are ‘swinging in the breeze’ at the very end of a now-currently endless ‘disinformation service rope’ being paid for by the tax-paying public.

In the meantime, while we are all ‘waiting for another Earth Event’ to take place, what ( besides this report ) might ‘slap the official horse’, spurring it to move quickly?

How about us? What should we do? Perhaps brushing-up on a little basic knowledge might help.

Inner Earth Deeper Structure Deep Focus Earthquakes Rays And Related Anomalies

There is no substitute for knowledge, seeing information technology ( IT ) at the focal point of many new discoveries aided by supercomputing, modelling and analytics, but common sense does pretty well too.

The following information, although an incredibly brief overview of such a wide variety of topics surrounding the ins and outs of planet Earth, scratches more than just the surface – reaching into deep structure and deep focus impacting a multitude of generations from as far back as 700 years before the birth of Christ ( B.C. ).

Clearly referenced “Encyclopaedia Britannica” general public access information is all second-hand observations of records from other worldwide information collection sources, such as:

– Archives ( e.g. governments, institutions, public and private );

– Symposiums ( e.g. white papers );

– Journals ( professional and technical publications );

– Other information collection sources; and,

– Other information publications.

Encyclopaedias, available in a wide variety of styles and formats, are ’portable catalogs containing a large amount of basic information on a wide variety of topics’ available worldwide to billions of people for increasing their knowledge.

Encyclopedia information formats vary, and ‘volume reading’ is available within:

– Paper ‘books’ with either ’printed ink’ ( sighted ) or ’embossed dots’ ( Braille );

– Plastic ‘tape cartridges’ ( ‘electromagnetic’ media ) or ‘compact discs’ ( ‘optical’ media ) with ‘electronic device display’; or,

– Electronic ‘internet’ ( ‘signal computing’ via ‘satellite’ or ‘telecomputing’ via ‘landline’ or ‘node’ networking ) with ‘electronic device display’.

After thoroughly reviewing the Encyclopedia Britannica ‘specific compilation’, independent review found a reasonable facsimile of the original, reformatted for easier public comprehension ( reproduced further below ).

Surprisingly, after that Encyclopedia Britannica ‘specific compilation’ information was reformatted for clearer reading comprehension, the inner Earth ‘deep-structure’ geophysical studies formed an amazing correlation with additional factual activities within an equally amazing date chronology of man-made nuclear fracturing reformations of Earth geology and geophysics – activities documented worldwide more than 1/2 century ago but somehow forgotten, either by chance or secret circumstance.

How could the Encyclopedia Britannica, or for that matter anyone else, have missed something on such a grand scale that is now so obvious?

… [ TEMPORARILY EDITED-OUT FOR REVISION PURPOSES ONLY –  ] …

For more details, about the aforementioned, Click: Here!

Or,

To understand how all this relates, ‘begin with a better basic understanding’ by continuing to read the researched information ( below ):

====

Circa: March 21, 2012

Source:  Encyclopaedia Britannica

Earthquakes

Definition, Earthquake: Sudden shaking of Earth ground caused by passage of seismic waves through Earth rocks.

Seismic waves are produced when some form of energy stored in the Earth’s crust is suddenly released, usually when masses of rock straining against one another suddenly fracture and “slip.” Earthquakes occur most often along geologic faults, narrow zones where rock masses move in relation to one another. Major fault lines of the world are located at the fringes of the huge tectonic plates that make up the Earth’s crust. ( see table of major earthquakes further below )

Little was understood about earthquakes until the emergence of seismology in the early 20th Century ( 1900s ), involving the scientific study of all aspects of earthquakes, which is now yielding answers to long-standing questions as to why and how earthquakes occur.

About 50,000 earthquakes, large enough to be noticed without the aid of instruments, occur every year over the entire Earth, and of these approximately one-hundred ( 100 ) are of sufficient size to produce substantial damage if their centers are near human habitation.

Very great earthquakes occur on average about once a year; however, over the centuries these earthquakes have been responsible for millions of human deaths and an incalculable amount of property damage.

Earthquakes A -Z

Earth’s major earthquakes occur primarily in belts coinciding with tectonic plate margins – apparent since the earliest earthquake catalogs ( dating to 700 B.C. ), and now more readily discernible on modern seismicity maps depicting instrumentally determined earthquake epicentres.

Most important, is the earthquake Circum-Pacific Belt affecting many populated coastal regions around the Pacific Ocean, namely:

– South America;

– North America & Alaska;

– Aleutian Islands;

– Japan;

– New Zealand; and,

– New Guinea.

An estimated 80% of the energy presently released in earthquakes comes from those whose epicentres are in the Circum-Pacific Belt.

Seismic activity is by no means uniform throughout the belt, and there are a number of branches at various points. Because at many places the Circum-Pacific Belt is associated with volcanic activity, it has been popularly dubbed the “Pacific Ring of Fire.”

A second ( 2nd ) belt, known as the Alpide Belt, passes through the Mediterranean region eastward through Asia and joins the Circum-Pacific Belt in the East Indies; energy released in earthquakes from the Alpide Belt is about 15% of the world total.

There are also striking connected belts of seismic activity, primarily along oceanic ridges, including those in the:

– Arctic Ocean;

– Atlantic Ocean;

– Indian Ocean ( western ); and along,

– East Africa rift valleys.

This global seismicity distribution is best understood in terms of its plate tectonic setting.

Forces

Earthquakes are caused by sudden releases of energy within a limited region of Earth rocks; the energy can be released by:

– Elastic strain;

– Gravity;

– Chemical Reactions; and / or,

– Massive rock body motion.

Of all these, release of elastic rock strain is most important because this form of energy is the only kind that can be stored in sufficient quantities within the Earth to produce major ground disturbances.

Earthquakes, associated with this type of energy release, are called: Tectonic Earthquakes.

Tectonics

Tectonic plate earthquakes are explained by the so-called elastic rebound theory, formulated by the American geologist Harry Fielding Reid after the San Andreas Fault ruptured in 1906, generating the great San Francisco earthquake.

According to Reid’s theory of elastic rebound, a tectonic earthquake occurs when energy strains in rock masses have accumulated ( built-up ) to a point where the resulting stresses exceed the strength of the rocks, at which point sudden fracturing results.

Fractures propagate ( travel ) rapidly ( see speeds further below ) through the rock, usually tending in the same direction and sometimes extending many kilometres along a local zone of weakness.

In 1906, for instance, the San Andreas Fault slipped along a plane 270-miles ( 430 kilometers ) long, a line along which the ground was displaced horizontally as much as 20-feet ( 6 meters ).

As a fault rupture progresses along or up the fault, rock masses are flung in opposite directions, and thus spring back to a position where there is less strain.

At any one point this movement may take place not all at once but rather in irregular steps; these sudden slowings and restartings give rise to vibrations that propagate as seismic waves.

Such irregular properties of fault rupture are now included in ‘physical modeling’ and ‘mathematical modeling’ of earthquake sources.

Earthquake Focus ( Foci )

Roughnesses along the fault are referred to as asperities, and places where the rupture slows or stops are said to be fault barriers. Fault rupture starts at the earthquake focus ( foci ), a spot that ( in many cases ) lies roughly 5 kilometers to 15 kilometers under the surface, from which the rupture propagates ( travels ) in one ( 1 ) or both directions over the fault plane until stopped ( or slowed ) at a barrier ( boundary ).

Sometimes, instead of being stopped at the barrier, the fault rupture recommences on the far side; at other times the stresses in the rocks break the barrier, and the rupture continues.

Earthquakes have different properties depending on the type of fault slip that causes them.

The usual ‘fault model’ has a “strike” ( i.e., direction, from north, taken by a horizontal line in the fault plane ) and a “dip” ( i.e. angle from the horizontal shown by the steepest slope in the fault ).

Movement parallel to the dip is called dip-slip faulting.

In dip-slip faults, if the hanging-wall block moves downward relative to the footwall block, it is called “normal” faulting; the opposite motion, with the hanging wall moving upward relative to the footwall, produces reverse or thrust faulting. The lower wall ( of an inclined fault ) is the ‘footwall’, and lying over the footwall is the hanging wall.

When rock masses slip past each other ( parallel to the strike area ) movement is known as strike-slip faulting.

Strike-slip faults are right lateral or left lateral, depending on whether the block on the opposite side of the fault from an observer has moved to the right or left.

All known faults are assumed to have been the seat of one or more earthquakes in the past, though tectonic movements along faults are often slow, and most geologically ancient faults are now aseismic ( i.e., they no longer cause earthquakes ).

Actual faulting associated with an earthquake may be complex, and it is often unclear whether, in one ( 1 ) particular earthquake, the total energy is being released from a single ( 1 ) fault plane.

Observed geologic faults sometimes show relative displacements on the order of hundreds of kilometres over geologic time, whereas the sudden slip offsets that produce seismic waves may range from only several centimetres to tens of metres.

During the 1976 Tangshan earthquake ( for example ), a surface strike-slip of about 1 meter was observed along the causative fault east of Beijing, China, and later ( as another example ) during the 1999 Taiwan earthquake the Chelung-pu fault slipped vertically up to 8 meters.

Volcanism & Earthquake Movement

A separate type of earthquake is associated with volcano activity known as a volcanic earthquake.

Even in such cases, the disturbance is officially believed to result from a sudden slip of rock masses adjacent to the volcano and the consequent release of elastic rock strain energy; however, the stored energy may be partially of hydrodynamic origin, due to heat provided by magma moving ( tidally ) throughout underground reservoirs beneath volcanoes, or to the release of gas under pressure. There is certainly a clear correspondence between the geographic distribution of volcanoes and major earthquakes, particularly within the Circum-Pacific Belt and along ocean ridges.

Volcano vents, however, are generally several hundred kilometres from epicentres of most ‘major shallow earthquakes’, and it is believed ’many earthquake sources’ occur ‘nowhere near active volcanoes’.

Even in cases where an earthquake focus occurs directly below volcanic structures ( vents ), officially there is probably no immediate causal connection between the two ( 2 ) activities; more likely, both are the result of the same tectonic processes.

Earth Fracturing

Artificially Created Inductions

Earthquakes are sometimes caused by human activities, including:

– Nuclear Explosion ( large megaton yield ) detonations underground;

– Oil & Gas wells ( deep Earth fluid injections )

– Mining ( deep Earth excavations );

– Reservoirs ( deep Earth voids filled with incredibly heavy large bodies of water ).

In the case of deep mining, the removal of rock produces changes in the strain around the tunnels.

Slip on adjacent, preexisting faults, or an outward shattering of rock into the new cavities, may occur.

In fluid injection, the slip is thought to be induced by premature release of elastic rock strain, as in the case of tectonic earthquakes after fault surfaces are lubricated by the liquid.

Large underground nuclear explosions have been known to produce slip on already strained faults in the vicinity of test devices.

Reservoir Induction

Of the various earthquake-causing activities cited above, the filling of large reservoirs ( see China ) is most prominent.

More than 20 significant cases have been documented in which local seismicity has increased following the impounding of water behind high dams. Often, causality cannot be substantiated, because no data exists to allow comparison of earthquake occurrence before and after the reservoir was filled.

Reservoir-induction effects are most marked for reservoirs exceeding 100 metres ( 330 feet ) in depth and 1 cubic km ( 0.24 cubic mile ) in volume. Three ( 3 ) sites where such connections have very probably occurred, are the:

– Hoover Dam in the United States;

– Aswan High Dam in Egypt; and,

– Kariba Dam on the border between Zimbabwe and Zambia in Africa.

The most generally accepted explanation for earthquake occurrence in such cases assumes that rocks near the reservoir are already strained from regional tectonic forces to a point where nearby faults are almost ready to slip. Water in the reservoir adds a pressure perturbation that triggers the fault rupture. The pressure effect is perhaps enhanced by the fact that the rocks along the fault have lower strength because of increased water-pore pressure. These factors notwithstanding, the filling of most large reservoirs has not produced earthquakes large enough to be a hazard.

Specific seismic source mechanisms associated with reservoir induction have been established in a few cases. For the main shock at the Koyna Dam and Reservoir in India ( 1967 ), the evidence favours strike-slip faulting motion. At both the Kremasta Dam in Greece ( 1965 ) and the Kariba Dam in Zimbabwe-Zambia ( 1961 ), the generating mechanism was dip-slip on normal faults.

By contrast, thrust mechanisms have been determined for sources of earthquakes at the lake behind Nurek Dam in Tajikistan. More than 1,800 earthquakes occurred during the first 9-years after water was impounded in this 317 meter deep reservoir in 1972, a rate amounting to four ( 4 ) times the average number of shocks in the region prior to filling.

Nuclear Explosion Measurement Seismology Instruments

By 1958, representatives from several countries – including the United States and the Soviet Union – met to discuss the technical basis for a nuclear test-ban treaty; among the matters considered was the feasibility of developing effective means to detect underground nuclear explosions and to distinguish them seismically from earthquakes.

After that conference, much special research was directed to seismology, leading to major advances in seismic signal detection and analysis.

Recent seismological work on treaty verification has involved using high-resolution seismographs in a worldwide network, estimating the yield of explosions, studying wave attenuation in the Earth, determining wave amplitude and frequency spectra discriminants, and applying seismic arrays. The findings of such research have shown that underground nuclear explosions, compared with natural earthquakes, usually generate seismic waves through the body of the Earth that are of much larger amplitude than the surface waves. This telltale difference, along with other types of seismic evidence, suggests that an international monitoring network of two-hundred and seventy ( 270 ) seismographic stations could detect and locate all seismic events over the globe of magnitude 4.0 and above ( corresponding to an explosive yield of about 100 tons of TNT ).

Earthquake Effects

Earthquakes have varied effects, including changes in geologic features, damage to man-made structures, and impact on human and animal life. Most of these effects occur on solid ground, but, since most earthquake foci are actually located under the ocean bottom, severe effects are often observed along the margins of oceans.

Surface Phenomena

Earthquakes often cause dramatic geomorphological changes, including ground movements – either vertical or horizontal – along geologic fault traces; rising, dropping, and tilting of the ground surface; changes in the flow of groundwater; liquefaction of sandy ground; landslides; and mudflows. The investigation of topographic changes is aided by geodetic measurements, which are made systematically in a number of countries seriously affected by earthquakes.

Earthquakes can do significant damage to buildings, bridges, pipelines, railways, embankments, and other structures. The type and extent of damage inflicted are related to the strength of the ground motions and to the behaviour of the foundation soils. In the most intensely damaged region, called the meizoseismal area, the effects of a severe earthquake are usually complicated and depend on the topography and the nature of the surface materials. They are often more severe on soft alluvium and unconsolidated sediments than on hard rock. At distances of more than 100 km (60 miles) from the source, the main damage is caused by seismic waves traveling along the surface. In mines there is frequently little damage below depths of a few hundred metres even though the ground surface immediately above is considerably affected.

Earthquakes are frequently associated with reports of distinctive sounds and lights. The sounds are generally low-pitched and have been likened to the noise of an underground train passing through a station. The occurrence of such sounds is consistent with the passage of high-frequency seismic waves through the ground. Occasionally, luminous flashes, streamers, and bright balls have been reported in the night sky during earthquakes. These lights have been attributed to electric induction in the air along the earthquake source.

Tsunamis

Following certain earthquakes, very long-wavelength water waves in oceans or seas sweep inshore. More properly called seismic sea waves or tsunamis ( tsunami is a Japanese word for “harbour wave” ), they are commonly referred to as tidal waves, although the attractions of the Moon and Sun play no role in their formation. They sometimes come ashore to great heights—tens of metres above mean tide level—and may be extremely destructive.

The usual immediate cause of a tsunami is sudden displacement in a seabed sufficient to cause the sudden raising or lowering of a large body of water. This deformation may be the fault source of an earthquake, or it may be a submarine landslide arising from an earthquake.

Large volcanic eruptions along shorelines, such as those of Thera ( c. 1580 BC ) and Krakatoa ( AD 1883 ), have also produced notable tsunamis. The most destructive tsunami ever recorded occurred on December 26, 2004, after an earthquake displaced the seabed off the coast of Sumatra, Indonesia. More than 200,000 people were killed by a series of waves that flooded coasts from Indonesia to Sri Lanka and even washed ashore on the Horn of Africa.

Following the initial disturbance to the sea surface, water waves spread in all directions. Their speed of travel in deep water is given by the formula √(gh), where h is the sea depth and g is the acceleration of gravity.

This speed may be considerable—100 metres per second ( 225 miles per hour ) when h is 1,000 metres ( 3,300 feet ). However, the amplitude ( i.e., the height of disturbance ) at the water surface does not exceed a few metres in deep water, and the principal wavelength may be on the order of hundreds of kilometres; correspondingly, the principal wave period—that is, the time interval between arrival of successive crests—may be on the order of tens of minutes. Because of these features, tsunami waves are not noticed by ships far out at sea.
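As a worked example of the √(gh) formula quoted above ( a minimal sketch in Python; the depths are illustrative values only ):

# Worked example of the deep-water tsunami speed formula quoted above: speed = sqrt( g * h ).
import math

g = 9.8                                    # acceleration of gravity, metres per second squared
for h in (1000.0, 4000.0):                 # illustrative sea depths in metres
    speed = math.sqrt(g * h)               # metres per second
    print(f"depth {h:6.0f} m -> {speed:5.1f} m/s ( {speed * 2.23694:5.1f} mph )")
# depth 1000 m -> roughly 99 m/s ( about 221 mph ), consistent with the figure cited above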

When tsunamis approach shallow water, however, the wave amplitude increases. The waves may occasionally reach a height of 20 to 30 metres above mean sea level in U- and V-shaped harbours and inlets. They characteristically do a great deal of damage in low-lying ground around such inlets. Frequently, the wave front in the inlet is nearly vertical, as in a tidal bore, and the speed of onrush may be on the order of 10 metres per second. In some cases there are several great waves separated by intervals of several minutes or more. The first of these waves is often preceded by an extraordinary recession of water from the shore, which may commence several minutes or even half an hour beforehand.

Organizations, notably in Japan, Siberia, Alaska, and Hawaii, have been set up to provide tsunami warnings. A key development is the Seismic Sea Wave Warning System, an internationally supported system designed to reduce loss of life in the Pacific Ocean. Centred in Honolulu, it issues alerts based on reports of earthquakes from circum-Pacific seismographic stations.

Seiches

Seiches are rhythmic motions of water in nearly landlocked bays or lakes that are sometimes induced by earthquakes and tsunamis. Oscillations of this sort may last for hours or even for 1-day or 2-days.

The great Lisbon earthquake of 1755 caused the waters of canals and lakes in regions as far away as Scotland and Sweden to go into observable oscillations. Seiche surges in lakes in Texas, in the southwestern United States, commenced between 30 and 40 minutes after the 1964 Alaska earthquake, produced by seismic surface waves passing through the area.

A related effect is the result of seismic waves from an earthquake passing through the seawater following their refraction through the seafloor. The speed of these waves is about 1.5 km (0.9 mile) per second, the speed of sound in water. If such waves meet a ship with sufficient intensity, they give the impression that the ship has struck a submerged object. This phenomenon is called a seaquake.

Earthquake Intensity and Magnitude Scales

The violence of seismic shaking varies considerably over a single affected area. Because the entire range of observed effects is not capable of simple quantitative definition, the strength of the shaking is commonly estimated by reference to intensity scales that describe the effects in qualitative terms. Intensity scales date from the late 19th and early 20th centuries, before seismographs capable of accurate measurement of ground motion were developed. Since that time, the divisions in these scales have been associated with measurable accelerations of the local ground shaking. Intensity depends, however, in a complicated way not only on ground accelerations but also on the periods and other features of seismic waves, the distance of the measuring point from the source, and the local geologic structure. Furthermore, earthquake intensity, or strength, is distinct from earthquake magnitude, which is a measure of the amplitude, or size, of seismic waves as specified by a seismograph reading ( see below Earthquake magnitude )

A number of different intensity scales have been set up during the past century and applied to both current and ancient destructive earthquakes. For many years the most widely used was a 10-point scale devised in 1878 by Michele Stefano de Rossi and François-Alphonse Forel. The scale now generally employed in North America is the Mercalli scale, as modified by Harry O. Wood and Frank Neumann in 1931, in which intensity is considered to be more suitably graded.

A 12-point abridged form of the modified Mercalli scale is provided below. Modified Mercalli intensity VIII is roughly correlated with peak accelerations of about one-quarter that of gravity ( g = 9.8 metres, or 32.2 feet, per second squared ) and ground velocities of 20 cm (8 inches) per second. Alternative scales have been developed in both Japan and Europe for local conditions.

The European ( MSK ) scale of 12 grades is similar to the abridged version of the Mercalli.

Modified Mercalli scale of earthquake intensity

  I. Not felt. Marginal and long-period effects of large earthquakes.

  II. Felt by persons at rest, on upper floors, or otherwise favourably placed to sense tremors.

  III. Felt indoors. Hanging objects swing. Vibrations are similar to those caused by the passing of light trucks. Duration can be estimated.

  IV. Vibrations are similar to those caused by the passing of heavy trucks (or a jolt similar to that caused by a heavy ball striking the walls). Standing automobiles rock. Windows, dishes, doors rattle. Glasses clink, crockery clashes. In the upper range of grade IV, wooden walls and frames creak.

  V. Felt outdoors; direction may be estimated. Sleepers awaken. Liquids are disturbed, some spilled. Small objects are displaced or upset. Doors swing, open, close. Pendulum clocks stop, start, change rate.

  VI. Felt by all; many are frightened and run outdoors. Persons walk unsteadily. Pictures fall off walls. Furniture moves or overturns. Weak plaster and masonry cracks. Small bells ring (church, school). Trees, bushes shake.

  VII. Difficult to stand. Noticed by drivers of automobiles. Hanging objects quivering. Furniture broken. Damage to weak masonry. Weak chimneys broken at roof line. Fall of plaster, loose bricks, stones, tiles, cornices. Waves on ponds; water turbid with mud. Small slides and caving along sand or gravel banks. Large bells ringing. Concrete irrigation ditches damaged.

  VIII. Steering of automobiles affected. Damage to masonry; partial collapse. Some damage to reinforced masonry; none to reinforced masonry designed to resist lateral forces. Fall of stucco and some masonry walls. Twisting, fall of chimneys, factory stacks, monuments, towers, elevated tanks. Frame houses moved on foundations if not bolted down; loose panel walls thrown out. Decayed pilings broken off. Branches broken from trees. Changes in flow or temperature of springs and wells. Cracks in wet ground and on steep slopes.

  IX. General panic. Weak masonry destroyed; ordinary masonry heavily damaged, sometimes with complete collapse; reinforced masonry seriously damaged. Serious damage to reservoirs. Underground pipes broken. Conspicuous cracks in ground. In alluvial areas, sand and mud ejected; earthquake fountains, sand craters.

  X. Most masonry and frame structures destroyed with their foundations. Some well-built wooden structures and bridges destroyed. Serious damage to dams, dikes, embankments. Large landslides. Water thrown on banks of canals, rivers, lakes, and so on. Sand and mud shifted horizontally on beaches and flat land. Railway rails bent slightly.

  XI. Rails bent greatly. Underground pipelines completely out of service.

  XII. Damage nearly total. Large rock masses displaced. Lines of sight and level distorted. Objects thrown into air.

With the use of an intensity scale, it is possible to summarize such data for an earthquake by constructing isoseismal curves, which are lines that connect points of equal intensity. If there were complete symmetry about the vertical through the earthquake’s focus, isoseismals would be circles with the epicentre (the point at the surface of the Earth immediately above where the earthquake originated) as the centre. However, because of the many unsymmetrical geologic factors influencing intensity, the curves are often far from circular. The most probable position of the epicentre is often assumed to be at a point inside the area of highest intensity. In some cases, instrumental data verify this calculation, but not infrequently the true epicentre lies outside the area of greatest intensity.

Earthquake Magnitude

Earthquake magnitude is a measure of the “size” or amplitude of the seismic waves generated by an earthquake source and recorded by seismographs.

Types and nature of these waves are described in Seismic waves ( further below ).

Because the size of earthquakes varies enormously, it is necessary for purposes of comparison to compress the range of wave amplitudes measured on seismograms by means of a mathematical device.

In 1935, American seismologist Charles F. Richter set up a magnitude scale of earthquakes as the logarithm to base 10 of the maximum seismic wave amplitude ( in thousandths of a millimetre ) recorded on a standard seismograph ( the Wood-Anderson torsion pendulum seismograph ) at a distance of 60-miles ( 100 kilometers ) from the earthquake epicentre.

Reduction of amplitudes observed at various distances to the amplitudes expected at the standard distance of 100 kilometers ( 60-miles ) is made on the basis of empirical tables.

Richter magnitudes ML are computed on the assumption the ratio of the maximum wave amplitudes at two ( 2 ) given distances is the same for all earthquakes and is independent of azimuth.

Richter first applied his magnitude scale to shallow-focus earthquakes recorded within 600 km of the epicentre in the southern California region. Later, additional empirical tables were set up, whereby observations made at distant stations and on seismographs other than the standard type could be used. Empirical tables were extended to cover earthquakes of all significant focal depths and to enable independent magnitude estimates to be made from body- and surface-wave observations.
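A minimal sketch of the logarithmic compression in Richter's definition ( illustrative only; real magnitude determinations rely on the empirical distance-correction tables mentioned above, which the flat correction term below merely stands in for ):

# Illustrative sketch only: the logarithmic compression in Richter's definition.
# The 'distance_correction' term is a stand-in for the empirical tables, not their values.
import math

def local_magnitude(amplitude_mm: float, distance_correction: float = 0.0) -> float:
    """log10 of the maximum trace amplitude, in thousandths of a millimetre, as if
    recorded on a standard seismograph 100 km from the epicentre."""
    return math.log10(amplitude_mm * 1000.0) + distance_correction

print(local_magnitude(1.0))     # a 1 mm trace amplitude at the standard distance -> 3.0
print(local_magnitude(100.0))   # 100 mm -> 5.0 : every tenfold amplitude gain adds one unit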

A current form of the Richter scale is shown in the table.

Richter scale of earthquake magnitude

magnitude level

category

effects

earthquakes per year

less than 1.0 to 2.9

micro

generally not felt by people, though recorded on local instruments

more than 100,000

3.0-3.9

minor

felt by many people; no damage

12,000-100,000

4.0-4.9

light

felt by all; minor breakage of objects

2,000-12,000

5.0-5.9

moderate

some damage to weak structures

200-2,000

6.0-6.9

strong

moderate damage in populated areas

20-200

7.0-7.9

major

serious damage over large areas; loss of life

3-20

8.0 and higher

great

severe destruction and loss of life over large areas

fewer than 3

At the present time a number of different magnitude scales are used by scientists and engineers as a measure of the relative size of an earthquake. The P-wave magnitude (Mb), for one, is defined in terms of the amplitude of the P wave recorded on a standard seismograph. Similarly, the surface-wave magnitude (Ms) is defined in terms of the logarithm of the maximum amplitude of ground motion for surface waves with a wave period of 20 seconds.

As defined, an earthquake magnitude scale has no lower or upper limit. Sensitive seismographs can record earthquakes with magnitudes of negative value and have ‘recorded magnitudes up to’ about ‘9.0’ ( 1906 San Francisco earthquake, for example, had a Richter magnitude of 8.25 ).

A scientific weakness is that there is no direct mechanical basis for magnitude as defined above. Rather, it is an empirical parameter analogous to stellar magnitude assessed by astronomers. In modern practice a more soundly based mechanical measure of earthquake size is used – namely, the seismic moment (M0). Such a parameter is related to the angular leverage of the forces that produce the slip on the causative fault. It can be calculated both from recorded seismic waves and from field measurements of the size of the fault rupture. Consequently, seismic moment provides a more uniform scale of earthquake size based on classical mechanics. This measure allows a more scientific magnitude to be used called moment magnitude (Mw). It is proportional to the logarithm of the seismic moment; values do not differ greatly from Ms values for moderate earthquakes. Given the above definitions, the great Alaska earthquake of 1964, with a Richter magnitude (ML) of 8.3, also had the values Ms = 8.4, M0 = 820 × 10^27 dyne centimetres, and Mw = 9.2.
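The text above does not spell out the exact moment-magnitude formula; assuming the standard Hanks-Kanamori convention, Mw = ( 2/3 ) log10( M0 ) - 10.7 with M0 in dyne centimetres, the 1964 Alaska figures quoted above are reproduced by this minimal sketch:

# Sketch of the moment-magnitude relation Mw = ( 2/3 ) * log10( M0 ) - 10.7, with the
# seismic moment M0 in dyne centimetres ( the Hanks-Kanamori convention, assumed here ).
import math

def moment_magnitude(m0_dyne_cm: float) -> float:
    return (2.0 / 3.0) * math.log10(m0_dyne_cm) - 10.7

alaska_1964 = 820e27            # dyne centimetres, the 1964 Alaska seismic moment quoted above
print(round(moment_magnitude(alaska_1964), 1))   # -> 9.2, matching the Mw value cited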

Earthquake Energy

Energy in an earthquake passing a particular surface site can be calculated directly from the recordings of seismic ground motion, given, for example, as ground velocity. Such recordings indicate an energy rate of 10^5 watts per square metre (9,300 watts per square foot) near a moderate-size earthquake source. The total power output of a rupturing fault in a shallow earthquake is on the order of 10^14 watts, compared with the 10^5 watts generated in rocket motors.

The surface-wave magnitude Ms has also been connected with the surface energy Es of an earthquake by empirical formulas. These give Es = 6.3 × 10^11 and 1.4 × 10^25 ergs for earthquakes of Ms = 0 and 8.9, respectively. A unit increase in Ms corresponds to approximately a 32-fold increase in energy. Negative magnitudes Ms correspond to the smallest instrumentally recorded earthquakes, a magnitude of 1.5 to the smallest felt earthquakes, and one of 3.0 to any shock felt at a distance of up to 20 km ( 12 miles ). Earthquakes of magnitude 5.0 cause light damage near the epicentre; those of 6.0 are destructive over a restricted area; and those of 7.5 are at the lower limit of major earthquakes.
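The empirical formula implied by the two figures above appears to be the Gutenberg-Richter energy relation, log10( Es ) = 11.8 + 1.5 Ms ( Es in ergs ); that exact form is an assumption here, but it reproduces both quoted values and the roughly 32-fold increase per unit of magnitude:

# Assumed Gutenberg-Richter energy relation: log10( Es ) = 11.8 + 1.5 * Ms, Es in ergs.
def surface_wave_energy_ergs(ms: float) -> float:
    return 10.0 ** (11.8 + 1.5 * ms)

print(f"{surface_wave_energy_ergs(0.0):.1e}")    # -> 6.3e+11 ergs, as quoted for Ms = 0
print(f"{surface_wave_energy_ergs(8.9):.1e}")    # -> 1.4e+25 ergs, as quoted for Ms = 8.9
print(f"{surface_wave_energy_ergs(6.0) / surface_wave_energy_ergs(5.0):.1f}")   # ~31.6-fold per unit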

The total annual energy released in all earthquakes is about 10^25 ergs, corresponding to a rate of work between 10 million and 100 million kilowatts. This is approximately one one-thousandth ( 1/1,000th ) of the annual amount of heat escaping from the Earth’s interior.

90% of the total seismic energy comes from earthquakes of ‘magnitude 7.0 and higher’ – that is, those whose energy is on the order of 10^23 ergs or more.

Frequency

There also are empirical relations for the frequencies of earthquakes of various magnitudes. Suppose N to be the average number of shocks per year for which the magnitude lies in a range about Ms. Then log10 N = a - bMs fits the data well both globally and for particular regions; for example, for shallow earthquakes worldwide, a = 6.7 and b = 0.9 when Ms > 6.0. The frequency for larger earthquakes therefore increases by a factor of about 10 when the magnitude is diminished by one unit. The increase in frequency with reduction in Ms falls short, however, of matching the decrease in the energy E. Thus, larger earthquakes are overwhelmingly responsible for most of the total seismic energy release. The number of earthquakes per year with Mb > 4.0 reaches 50,000.
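A short worked example of the frequency relation above, using the quoted worldwide shallow-earthquake constants ( a = 6.7, b = 0.9 for Ms > 6.0 ):

# Worked example of log10( N ) = a - b * Ms with the worldwide shallow-earthquake
# constants quoted above ( a = 6.7, b = 0.9, valid for Ms > 6.0 ).
def shocks_per_year(ms: float, a: float = 6.7, b: float = 0.9) -> float:
    return 10.0 ** (a - b * ms)

for ms in (6.0, 7.0, 8.0):
    print(f"Ms {ms:.1f}: about {shocks_per_year(ms):.1f} shocks per year")
# Ms 6.0 -> ~20 per year, Ms 7.0 -> ~2.5, Ms 8.0 -> ~0.3 : roughly tenfold fewer per unit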

Earthquake Occurrences & Plate Tectonic associations

Global seismicity patterns had no strong theoretical explanation until the dynamic model called plate tectonics was developed during the late 1960s. This theory holds that the Earth’s upper shell, or lithosphere, consists of nearly a dozen large, quasi-stable slabs called plates. The thickness of each of these plates is roughly 50-miles ( 80 km ). Plates move horizontally relative to neighbouring plates at a rate of 0.4 to 4 inches ( 1-cm to 10-cm ) per year over a shell of lesser strength called the asthenosphere. At the plate edges where there is contact between adjoining plates, boundary tectonic forces operate on the rocks, causing physical and chemical changes in them. New lithosphere is created at oceanic ridges by the upwelling and cooling of magma from the Earth’s mantle. The horizontally moving plates are believed to be absorbed at the ocean trenches, where a subduction process carries the lithosphere downward into the Earth’s interior. The total amount of lithospheric material destroyed at these subduction zones equals that generated at the ridges. Seismological evidence ( e.g. location of major earthquake belts ) is everywhere in agreement with this tectonic model.

Earthquake Types:

– Shallow Earthquakes;

– Intermediate Earthquakes;

– Deep Focus ( Deep-Foci ) Earthquakes; and

– Deeper Focus ( Deeper-Foci ) Earthquakes.

Earthquake sources are concentrated along oceanic ridges, corresponding to divergent plate boundaries.

At subduction zones, associated with convergent plate boundaries, deep-focus earthquakes and intermediate focus earthquakes mark locations of the upper part of a dipping lithosphere slab.

Focal ( foci ) mechanisms indicate that the stresses are aligned with the dip of the lithosphere underneath the adjacent continent or island arc.

IntraPlate Seismic Event Anomalies

Some earthquakes associated with oceanic ridges are confined to strike-slip faults, called transform faults, that offset the ridge crests. The majority of earthquakes occurring along such horizontal shear faults are characterized by strike-slip motions.

Also in agreement with plate tectonics theory is the high seismicity encountered along the edges of plates where they slide past each other. Plate boundaries of this kind, sometimes called fracture zones, include the:

– San Andreas Fault system in California; and,

– North Anatolian fault system in Turkey.

Such plate boundaries are the site of interplate earthquakes of shallow focus.

Low seismicity within plates is consistent with plate tectonic description. Small to large earthquakes do occur in limited regions well within the boundaries of plates, however such ‘intraplate seismic events’ can be explained by tectonic mechanisms other than plate boundary motions and their associated phenomena.

Most parts of the world experience at least occasional shallow earthquakes – those that originate within 60 km ( 40 miles ) of the Earth’s outer surface. In fact, the great ‘majority of earthquake foci ( focus ) are shallow’. It should be noted, however, that the geographic distribution of smaller earthquakes is less completely determined than that of more severe quakes, partly because the ‘availability of relevant data is dependent on the distribution of observatories’.

Of the total energy released in earthquakes, 12% comes from intermediate earthquakes—that is, quakes with a focal depth ranging from about 60 to 300 km. About 3 percent of total energy comes from deeper earthquakes. The frequency of occurrence falls off rapidly with increasing focal depth in the intermediate range. Below intermediate depth the distribution is fairly uniform until the greatest focal depths, of about 700 km (430 miles), are approached.

Deeper-Focus Earthquakes

Deeper-focus earthquakes commonly occur in patterns called Benioff zones dipping into the Earth, indicating presence of a subducting slab where dip angles of these slabs average about 45° – with some shallower – and others nearly vertical.

Benioff zones coincide with tectonically active island arcs, such as:

– Aleutian islands;

– Japan islands;

– Vanuatu islands; and

– Tonga.

Island arcs are normally ( but not always ) associated with:

– Ultra-Deep Sea Ocean Trenches, such as the one alongside:

– South America ( Andes mountain system ).

Exceptions to this rule include the:

– Romania ( East Europe ) mountain system; and,

– Hindu Kush mountain system.

In most Benioff zones, deep-earthquake foci and intermediate-earthquake foci are usually found within a narrow layer; however, recent more precise hypocentral locations – in Japan and elsewhere – indicate two ( 2 ) distinct parallel bands of foci only 12 miles ( 20 kilometers ) apart.

Aftershocks, Swarms and Foreshocks

A major or even moderate earthquake of shallow focus is followed by many lesser-size earthquakes close to the original source region. This is to be expected if the fault rupture producing a major earthquake does not relieve all the accumulated strain energy at once. In fact, this dislocation is liable to cause an increase in the stress and strain at a number of places in the vicinity of the focal region, bringing crustal rocks at certain points close to the stress at which fracture occurs. In some cases an earthquake may be followed by 1,000 or more aftershocks a day.

Sometimes a large earthquake is followed by a similar one along the same fault source within an hour or perhaps a day. An extreme case of this is multiple earthquakes. In most instances, however, the first principal earthquake of a series is much more severe than the aftershocks. In general, the number of aftershocks per day decreases with time.

Aftershock frequency is ( roughly ):

Inversely proportional to time since occurrence of largest earthquake in series.

Most major earthquakes occur without detectable warning, but some principal earthquakes are preceded by foreshocks.

Japan 2-Years of Hundreds of Thousands Of Earthquakes

In another common pattern, large numbers of small earthquakes may occur in a region for months without a major earthquake.

In the Matsushiro region of Japan, for instance, there occurred ( between August 1965 and August 1967 ) a ‘series of earthquakes’ numbering in the hundreds of thousands – some sufficiently strong ( up to Richter magnitude 5.0 ) causing property damage but no casualties.

Maximum frequency? 6,780 small earthquakes on just April 17, 1966.

Such series of earthquakes are called earthquake swarms.

Earthquakes, associated with volcanic activity often occur in swarms – though swarms also have been observed in many nonvolcanic regions.

Study of Earthquakes

Seismic waves

Principal types of seismic waves

Seismic waves generated by an earthquake source are commonly classified into three ( 3 ) ‘leading types’.

The first two ( 2 ) leading types, which propagate ( travel ) within the body of the Earth, are known as:

P ( Primary ) Seismic Waves; and,

S ( Secondary ) Seismic Waves.

The third ( 3rd ) leading type, which propagates ( travels ) along the surface of the Earth, comprises:

L ( Love ) Seismic Waves; and,

R ( Rayleigh ) Seismic Waves.

During the 19th Century, the existence of these types of seismic waves was mathematically predicted, and modern comparisons show close correspondence between such ‘theoretical calculations’ and ‘actual measurements’ of seismic waves.

P seismic waves travel as elastic motions at the highest speeds, and are longitudinal waves transmitted by both solid and liquid materials within inner Earth.

With P waves, particles of the medium vibrate in a manner ‘similar to sound waves’, with the transmitting media alternately compressed and expanded.

The slower type of body wave, the S wave, travels only through solid material. With S waves, the particle motion is transverse to the direction of travel and involves a shearing of the transmitting rock.

Focus ( Foci )

Because of their greater speed, P waves are the first ( 1st ) to reach any point on the Earth’s surface. The first ( 1st ) P-wave onset ‘starts from the spot where an earthquake originates’. This point, usually at some depth within the Earth, is called the focus ( also known as the hypocentre ).

Epicenter

The point ‘at the surface’ ( immediately ‘above the focus / foci’ ) is known as the ‘epicenter’.

Love waves and Rayleigh waves, guided by the free surface of the Earth, trail after P and S waves have passed through the body of planet Earth.

Rayleigh waves ( R Waves ) and Love waves ( L Waves ) both involve ‘horizontal particle motion’, however only Rayleigh waves also exhibit ‘vertical ground displacements’.

When occurring away from ‘alluvial basin sources’, Rayleigh waves ( R waves ) and Love waves ( L waves ) travel ( propagate ) and disperse into long wave trains that, at substantial distances, cause much of the Earth surface ground shaking felt during earthquakes.

Seismic Wave Focus ( Foci ) Properties

At all distances from the focus ( foci ), the mechanical properties of rocks – such as incompressibility, rigidity and density – play roles in the:

– Speed of ‘wave travel’;

– Duration of ‘wave trains’; and,

– Shape of ‘wave trains’.

Layering of the rocks and the physical properties of surface soil also affect wave characteristics.

In most cases, elastic behaviors occur in earthquakes; however, strong shaking ( of surface soils from the incident seismic waves ) sometimes results in ‘nonelastic behavior’, including slumping ( i.e., downward and outward movement of unconsolidated material ) and liquefaction of sandy soil.

A seismic wave that encounters a boundary separating ‘rocks of different elastic properties’ undergoes reflection and refraction. A special complication exists because conversion between wave types usually also occurs at such a boundary: an incident P or S wave can yield reflected P and S waves and refracted P and S waves.

Between Earth structural layers, boundaries give rise to diffracted and scattered waves, and these additional waves are partially responsible for complications observed in ground motion during earthquakes.

Modern research is concerned with computing ‘synthetic records of ground motion’ that are realistic in comparison with observed actual ground shaking, using wave theory in complex structures.

Grave Duration Long-Periods, Audible Earthquake Frequencies, and Other Earth Anomalies

The frequency range of seismic waves is wide, from High Frequency ( HF ) in the audible range ( i.e. greater than 20 hertz ) down to Low Frequency ( LF ) as subtle as the ‘free oscillations of planet Earth’ – with grave Long-Periods of up to 54-minutes ( see below Long-Period oscillations of the globe ).

Seismic wave attenuation in rock imposes High-Frequency ( HF ) limits, and in small to moderate earthquakes the dominant frequencies in Surface Waves extend from about 1.0 Hz to 0.1 Hz.

Seismic wave amplitude range is also great in most earthquakes.

Displacement of the ground ranges from about 10^-10 to 10^-1 metre ( about 4 × 10^-9 to 4 inches ).

Great Earthquake Speed

Great Earthquake Ground Acceleration Can Exceed 32.2-Feet Per Second, Squared ( 9.8 Metres Per Second, Squared ) –

In the greatest earthquakes, the ground amplitude of predominant P waves may be several centimetres at periods of 2-seconds to 5-seconds; very close to the seismic sources of ‘great earthquakes’, investigators have measured ‘large wave amplitudes’ with accelerations of the ground exceeding the acceleration of gravity, 32.2 feet per second squared ( 9.8 meters per second, squared ), at High Frequencies ( HF ), and ground displacements of 1 metre at Low Frequencies ( LF ).

Seismic Wave Measurement

Seismographs and Accelerometers

Seismographs ‘measure ground motion’ in both ‘earthquakes’ and ‘microseisms’ ( small oscillations described below ).

Most of these instruments are of the pendulum type. Early mechanical seismographs had a pendulum of large mass ( up to several tons ) and produced seismograms by scratching a line on smoked paper on a rotating drum.

In later instruments, seismograms were recorded via ‘rays of light bounced off a mirror’ within a galvanometer, using electric current produced by electromagnetic induction ‘when the pendulum of the seismograph moved’.

Technological developments in electronics have given rise to ‘higher-precision pendulum seismometers’ and ‘sensors of ground motion’.

In these instruments, electric voltages produced by motions of the pendulum ( or its equivalent ) are passed through electronic circuitry to ‘amplify’ and ‘digitize’ the ground motion signal for more exact readings.

Seismometer Nomenclature Meanings

Seismographs are divided into three ( 3 ) basic types of instruments, commonly confused by the public because of their varied ( and overlapping ) names:

– Short-Period;

– Intermediate-Period ( also known as Long-Period );

– Long-Period ( also known as Intermediate-Period );

– Ultra-Long-Period ( also known as Broadband or Broad-Band ); or,

– Broadband ( also known as Ultra Long-Period or UltraLong-Period ).

Short-Period instruments are used to record P and S body waves with high magnification of the ground motion. For this purpose, the seismograph response is shaped to peak at a period of about 1-second or less.

Intermediate-period instruments, the type used by the World-Wide Standardized Seismographic Network ( WWSSN ) – described in the section Earthquake observatories – had about a 20-second ( maximum ) response.

Recently, in order to provide as much flexibility as possible for research work, the trend has been toward the operation of ‘very broadband seismographs’ with digital representation of signals. This is usually accomplished with ‘very long-period pendulums’ and ‘electronic amplifiers’ passing signals in the band between 0.005 Hz and 50 Hertz.
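As a rough illustration of such a passband ( a sketch only, assuming an arbitrary 200 samples-per-second rate and a simple Butterworth filter in SciPy; this is not a real broadband instrument response ):

# Rough illustration only: emulating a 0.005 Hz - 50 Hz passband with a digital
# Butterworth band-pass filter ( SciPy ). Sampling rate and filter order are assumptions.
import numpy as np
from scipy.signal import butter, sosfiltfilt

fs = 200.0                                          # samples per second ( assumed )
sos = butter(4, [0.005, 50.0], btype="bandpass", fs=fs, output="sos")

t = np.arange(0.0, 600.0, 1.0 / fs)                 # ten minutes of synthetic ground motion
signal = np.sin(2 * np.pi * 1.0 * t) + 0.5 * np.sin(2 * np.pi * 0.001 * t)
broadband_view = sosfiltfilt(sos, signal)           # keeps the 1 Hz wave, rejects the slow drift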

When seismic waves close to their source are to be recorded, special design criteria are needed. Instrument sensitivity must ensure that the largest ground movements can be recorded without exceeding the upper scale limit of the device. For most seismological and engineering purposes the wave frequencies that must be recorded are higher than 1 hertz, and so the pendulum or its equivalent can be small. For this reason accelerometers that measure the rate at which the ground velocity is changing have an advantage for strong-motion recording. Integration is then performed to estimate ground velocity and displacement. The ground accelerations to be registered range up to two times that of gravity. Recording such accelerations can be accomplished mechanically with short torsion suspensions or force-balance mass-spring systems.
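A minimal sketch of the integration step described above ( illustrative only; real strong-motion processing also applies baseline and instrument corrections ):

# Illustrative sketch only: converting a recorded acceleration trace to estimated ground
# velocity and displacement by cumulative trapezoidal integration.
import numpy as np

def integrate_acceleration(acc: np.ndarray, dt: float):
    """Return ( velocity, displacement ) time series from acceleration samples."""
    velocity = np.concatenate(([0.0], np.cumsum((acc[:-1] + acc[1:]) / 2.0) * dt))
    displacement = np.concatenate(([0.0], np.cumsum((velocity[:-1] + velocity[1:]) / 2.0) * dt))
    return velocity, displacement

dt = 0.005                                          # 200 samples per second ( assumed )
t = np.arange(0.0, 10.0, dt)
acc = 0.25 * 9.8 * np.sin(2 * np.pi * 1.0 * t)      # a quarter-g, 1 Hz test motion
vel, disp = integrate_acceleration(acc, dt)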

Because many strong-motion instruments need to be placed at unattended sites in ordinary buildings for periods of months or years before a strong earthquake occurs, they usually record only when a trigger mechanism is actuated with the onset of ground motion. Solid-state memories are now used, particularly with digital recording instruments, making it possible to preserve the first few seconds before the trigger starts the permanent recording and to store digitized signals on magnetic cassette tape or on a memory chip. In past design absolute timing was not provided on strong-motion records but only accurate relative time marks; the present trend, however, is to provide Universal Time ( the local mean time of the prime meridian ) by means of special radio receivers, small crystal clocks, or GPS ( Global Positioning System ) receivers from satellite clocks.

Prediction of strong ground motion and response of engineered structures in earthquakes depends critically on measurements of the spatial variability of earthquake intensities near the seismic wave source. In an effort to secure such measurements, special arrays of strong-motion seismographs have been installed in areas of high seismicity around the world.

Large-aperture seismic arrays of strong-motion accelerometers – with linear dimensions on the order of about 0.6 miles to 6 miles ( 1 kilometer to 10 kilometers ) – are now used to improve estimations of the speed, direction of propagation, and types of seismic wave components.
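As a simple illustration of how such an array yields propagation speed ( an assumed example, not the actual array-processing software ), consider two stations a known distance apart and the delay between wave arrivals:

# Assumed example: two stations a known distance apart give the apparent horizontal
# speed of a wavefront from its arrival delay.
station_separation_km = 10.0      # within the 1 to 10 kilometer aperture range described above
arrival_delay_s = 1.7             # P-wave onset at station B minus onset at station A ( assumed )

apparent_speed_km_s = station_separation_km / arrival_delay_s
print(f"apparent horizontal speed ~ {apparent_speed_km_s:.1f} km/s")   # ~ 5.9 km/s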

Particularly important for a full understanding of seismic wave patterns at the ground surface is measurement of the variation of wave motion with depth; to aid in this effort, special digitally recording seismometers have been ‘installed in deep boreholes’.

Ocean-Bottom Measurements

70% of the Earth’s surface is covered by water, so ocean-bottom seismometers augment ( add to ) the global land-based system of recording stations.

Field tests have established the feasibility of extensive long-term recording by instruments on the seafloor.

Japan has a ‘semi-permanent seismograph’ system of this type, placed on the seafloor off the Pacific Ocean east coast of central Honshu, Japan, in 1978 by means of a ‘cable’.

Because of mechanical difficulties maintaining ‘permanent ocean-bottom instrumentation’, different systems have been considered.

They ‘all involve placement of instruments on the ocean bottom’, though they employ various mechanisms for data transmission.

Signals may be transmitted to the ocean surface for retransmission by auxiliary apparatus or transmitted via cable to a shore-based station. Another system is designed to release its recording device automatically, allowing it to float to the surface for later recovery.

Ocean bottom seismograph use should yield much-improved global coverage of seismic waves and provide new information on the seismicity of oceanic regions.

Ocean-bottom seismographs will enable investigators to determine the details of the crustal structure of the seafloor and, because of the relative ‘thinness of the oceanic crust‘, should make possible collection of clear seismic information about Earth’s upper mantle.

Ocean bottom seismograph systems are also expected to provide new data, on Earth:

– Continental Shelf Plate Boundaries;

– MicroSeisms ( origins and propagations ); and,

– Ocean to Continent behavior margins.

MicroSeisms Measurements

MicroSeisms ( also known as ‘small ground motions’ ) are commonly recorded by seismographs. These small, weak seismic wave motions are ‘not generated by earthquakes’ but, in some instances, can complicate accurate earthquake measurement recording. MicroSeisms are of scientific interest because their form relates to Earth surface structure.

Some microseisms have a ‘local cause’, for example:

– Microseisms due to traffic ( or machinery ), local wind effects, storms, and rough surf against an extended steep coastline.

Another class of microseisms exhibits features that are very similar on records traced at earthquake observatories that are widely separated, including approximately simultaneous occurrence of maximum amplitudes and similar wave frequencies. These microseisms may persist for many hours and have more or less regular periods of about five to eight seconds.

The largest amplitudes of such microseisms are on the order of 10⁻³ cm ( 0.0004 inch ) and occur in coastal regions. Amplitudes also depend to some extent on local geologic structure.

Some microseisms are produced when large standing water waves are formed far out at sea. The period of this type of microseism is half that of the standing wave.

Observations of Earthquakes

Earthquake Observatories

During the late 1950s there were only about seven-hundred ( 700 ) seismographic stations worldwide, equipped with seismographs of various types and frequency responses – few of which were calibrated; actual ground motions could not be measured, and timing errors of several seconds were common.

The World-Wide Standardized Seismographic Network ( WWSSN ) became the first modern worldwide standardized system established to remedy that situation.

Each WWSSN station had six ( 6 ) seismographs – three ( 3 ) short-period and three ( 3 ) long-period – with timing and accuracy maintained by quartz crystal clocks and a calibration pulse placed daily on each record.

By 1967, the WWSSN consisted of about one-hundred twenty ( 120 ) stations throughout sixty ( 60 ) countries, and its data provided the basis for significant advances in research, on:

– Earthquakes ( mechanisms );

– Plate Tectonics ( global ); and,

– Deep-Structure Earth ( interior ).

By the 1980s a further upgrading of permanent seismographic stations began with the installation of digital equipment by a number of organizations.

Global digital seismograph station networks, now in operation, consist of:

– Seismic Research Observatories ( SRO ), with seismometers emplaced in boreholes drilled 330 feet ( 100 metres ) into the ground; and,

– Modified high-gain, long-period earthquake observatories located at the ground surface.

The Global Digital Seismographic Network in particular has remarkable capability, recording all motions from Earth ocean tides to microscopic ground motions at the level of local ground noise.

At present there are about 128 sites. With this system, the long-term seismological goal of equipping global observatories with seismographs able to record any small earthquake anywhere over a broad band of frequencies will have been accomplished.

Epicentre Earthquakes Located

Many observatories make provisional estimates of the epicentres of important earthquakes. These estimates provide preliminary information locally about particular earthquakes and serve as first approximations for the calculations subsequently made by large coordinating centres.

If an earthquake’s epicentre is less than 105° away from an earthquake observatory, the epicentre position can often be estimated from the readings of three ( 3 ) seismograms recording perpendicular components of the ground motion.

For a shallow earthquake, the epicentral distance is indicated by the interval between the arrival times of the P and S waves; the azimuth and angle of wave emergence at the surface are indicated by comparing the sizes and directions of the first ( 1st ) movements shown on the seismograms and the relative sizes of later waves – particularly surface waves.
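
As a rough worked example of the distance estimate just described, the epicentral distance for a shallow, nearby earthquake is often approximated from the S-minus-P interval alone. The crustal P and S velocities below are common textbook values assumed purely for illustration.

```python
def epicentral_distance_km(s_minus_p_seconds, vp_km_s=6.0, vs_km_s=3.5):
    """Estimate epicentral distance from the S-P arrival-time interval.

    Distance d satisfies d/vs - d/vp = (tS - tP), so
    d = (tS - tP) * vp * vs / (vp - vs).
    """
    return s_minus_p_seconds * vp_km_s * vs_km_s / (vp_km_s - vs_km_s)

# Example: an S-P interval of 10 s gives roughly 84 km with these velocities.
print(round(epicentral_distance_km(10.0), 1))
```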

Anomaly

It should be noted, however, that in certain regions the first ( 1st ) wave movement at a station arrives from a direction differing from the azimuth toward the epicentre. This ‘anomaly is usually explained’ by ‘strong variations in geologic structures’.

When data from more than one observatory are available, an earthquake’s epicentre may be estimated from the times of travel of the P and S waves from source to recorder. In many seismically active regions, networks of seismographs with telemetry transmission and centralized timing and recording are common. Whether analog or digital recording is used, such integrated systems greatly simplify observatory work: multichannel signal displays make identification and timing of phase onsets easier and more reliable. Moreover, online microprocessors can be programmed to pick automatically, with some degree of confidence, the onset of a significant common phase, such as P, by correlation of waveforms from parallel network channels. With the aid of specially designed computer programs, seismologists can then locate distant earthquakes to within about 10 km (6 miles) and the epicentre of a local earthquake to within a few kilometres.
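
A hedged sketch of the multi-station location idea in the preceding paragraph: given P arrival times at several stations, a simple grid search can find the epicentre and origin time that best fit the observations. The uniform velocity, flat geometry and station layout here are illustrative simplifications; real observatory programs use travel-time tables, hypocentral depth and statistical weighting.

```python
import math

P_VELOCITY_KM_S = 6.0  # illustrative uniform crustal P velocity

def locate(stations, arrivals, grid_step_km=1.0, half_width_km=100.0):
    """Grid-search for the (x, y) epicentre and origin time that minimize
    squared misfit between predicted and observed P arrival times.

    stations : list of (x_km, y_km) station coordinates
    arrivals : list of observed P arrival times (s), same order as stations
    """
    best = (float("inf"), None, None)
    steps = int(2 * half_width_km / grid_step_km)
    for i in range(steps + 1):
        for j in range(steps + 1):
            x = -half_width_km + i * grid_step_km
            y = -half_width_km + j * grid_step_km
            travel = [math.hypot(x - sx, y - sy) / P_VELOCITY_KM_S
                      for sx, sy in stations]
            # Best-fitting origin time is the mean residual.
            t0 = sum(t - tt for t, tt in zip(arrivals, travel)) / len(arrivals)
            misfit = sum((t - (t0 + tt)) ** 2
                         for t, tt in zip(arrivals, travel))
            if misfit < best[0]:
                best = (misfit, (x, y), t0)
    return best[1], best[2]
```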

Catalogs of earthquakes felt by humans and of earthquake observations have appeared intermittently for many centuries. The earliest known list of instrumentally recorded earthquakes with computed times of origin and epicentres is for the period 1899 – 1903, after which the cataloging of earthquakes became more uniform and complete.

Especially valuable is the service provided by the International Seismological Centre ( ISC ) in Newbury, UK, which each month receives more than 1,000,000 seismic readings from more than 2,000 seismic monitoring stations worldwide, together with preliminary location estimates of approximately 1,600 earthquakes from national and regional agencies and observatories.

The ISC publishes a monthly bulletin – with a delay of roughly two ( 2 ) years – providing all available information on each of more than 5,000 earthquakes.

Various national and regional centres control networks of stations and act as intermediaries between individual stations and the international organizations.

Examples of long-standing national centres include the:

– Japan Meteorological Agency; and,

– U.S. National Earthquake Information Center ( NEIC ), a subdivision of the U.S. Geological Survey ( USGS ).

Centres such as these normally make local earthquake estimates, of:

– Magnitude;

– Epicentre;

– Time origin; and,

– Focal depth.

Global seismicity data are continually accessible via the Incorporated Research Institutions for Seismology ( IRIS ) website.

An important research technique infers the character of faulting ( in an earthquake ) from recorded seismograms.

For example, observed distributions ( of the directions of the first onsets in waves arriving at the Earth’s surface ) have been effectively used.

Onsets are called “compressional” or “dilatational,” according to whether the direction is ‘away from’ or ‘toward’ the focus, respectively.

A polarity pattern becomes recognizable when the directions of the P-wave onsets are plotted on a map – there are broad areas in which the first onsets are predominantly compressions, separated from predominantly dilatational areas by nodal curves near which the P-wave amplitudes are abnormally small.

In 1926 the American geophysicist Perry E. Byerly used patterns of P onsets over the entire globe to infer the orientation of the fault plane in a large earthquake. The polarity method yields two P-nodal curves at the Earth’s surface; one curve is in the plane containing the assumed fault, and the other is in the plane ( called the auxiliary plane ) that passes through the focus perpendicular to the fault plane.

The recent availability of worldwide broadband digital recording has enabled computer programs to be written that estimate the fault mechanism and seismic moment from the complete pattern of seismic wave arrivals.

Given a well-determined pattern at a number of earthquake observatories, it is possible to locate two ( 2 ) planes, one ( 1 ) of which is the plane containing the fault.

Earthquake Prediction

Earthquake Observations & Interpretations

Statistical studies of earthquake occurrence have searched for periodic cycles in time and space, but no such periodicities for major and great earthquakes are widely accepted. Catalogs of destructive earthquakes reach back as far as 700 B.C., with China holding the world’s most extensive catalog – approximately one-thousand ( 1,000 ) destructive earthquakes – in which magnitude ( size ) was assessed from damage reports, the duration and severity of shaking, and other observations determining the intensity of each earthquake.

Earthquake Attributions to Postulation

Precursor-based prediction approaches involve what some believe is sheer postulation about the initial trigger mechanisms that force Earth ruptures; where this becomes bizarre, however, is that such trigger forces have been attributed, to:

– Weather Severity;

– Volcano Activity; and,

– Ocean Tide Force ( Moon ).

EXAMPLE: Correlations between such physical phenomena and earthquake occurrence have been assumed to provide trigger mechanisms for earthquake repetition.

Professionals believe such correlations must always be examined to discover whether a causative link is actually present, and they further believe that, to date, no case involving moderate to large earthquakes has unequivocally satisfied the various criteria necessary to establish any trigger mechanism.

Statistical methods have also been tried with populations of regional earthquakes. It has been suggested, but never generally established, that the slope b of the regression line between the logarithm of the number of earthquakes and the magnitude for a region may change characteristically with time.

Specifically, the claim is that the b value for the population of ‘foreshocks of a major earthquake’ may be ‘significantly smaller’ than the mean b value for the region averaged ‘over a long interval of time’.
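
For readers who want to see how a regional b value is actually estimated, the sketch below uses the maximum-likelihood form b = log10( e ) / ( mean( M ) - Mc ), a standard alternative to fitting the regression line mentioned above; the sample magnitudes are invented for illustration only.

```python
import math

def b_value(magnitudes, completeness_magnitude):
    """Maximum-likelihood Gutenberg-Richter b value (Aki's formula).

    magnitudes             : iterable of event magnitudes
    completeness_magnitude : Mc, the minimum magnitude reliably catalogued
    """
    mags = [m for m in magnitudes if m >= completeness_magnitude]
    mean_mag = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_mag - completeness_magnitude)

# Invented example catalogue: mean magnitude 3.43 above Mc = 3.0 gives b ~ 1.0.
print(round(b_value([3.1, 3.2, 3.4, 3.3, 3.8, 4.0, 3.2, 3.4], 3.0), 2))
```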

Elastic rebound theory of earthquake sources allows rough prediction of the occurrence of large shallow earthquakes. Harry F. Reid, for example, gave a crude forecast of the next great earthquake near San Francisco ( the theory also predicted, of course, that the place would be along the San Andreas Fault or an associated fault ). Geodetic data indicated that during an interval of 50 years relative displacements of 3.2 metres ( 10-1/2 feet ) had occurred at distant points across the fault. The maximum elastic-rebound offset along the fault in the 1906 earthquake was 6.5 metres. Therefore, ( 6.5 ÷ 3.2 ) × 50, or about 100 years, would elapse before sufficient strain again accumulated for the occurrence of an earthquake comparable to that of 1906. The premises are that regional strain grows uniformly and that the various constraints have not been altered by the great 1906 rupture itself ( such as by the onset of slow fault slip ).
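
The recurrence arithmetic in the preceding paragraph can be reduced to a one-line calculation: the waiting time is the coseismic slip divided by the average rate at which slip deficit accumulates. A minimal sketch using the figures quoted above:

```python
def recurrence_years(coseismic_slip_m, accumulated_slip_m, accumulation_years):
    """Elastic-rebound recurrence estimate: slip released in the earthquake
    divided by the average rate at which slip deficit accumulates."""
    rate_m_per_year = accumulated_slip_m / accumulation_years
    return coseismic_slip_m / rate_m_per_year

# 6.5 m released in 1906; 3.2 m accumulated over 50 years -> about 102 years.
print(round(recurrence_years(6.5, 3.2, 50)))
```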

Such strain rates are now, however, being more adequately measured along a number of active faults ( e.g. the San Andreas Fault ) using networks of GPS sensors.

Earthquake Prediction Research

For many years prediction research has been influenced by the basic argument that strain accumulates in rock masses in the vicinity of a fault, resulting in crustal deformation.

Deformations have been measured in ‘horizontal directions’ along active faults via ‘trilateration’ and ‘triangulation’ and in ‘vertical directions’ via ‘precise leveling and tiltmeters’.

Some investigators believe ground-water level changes occur prior to earthquakes; reports of such variations have come chiefly from China.

Groundwater levels respond to an array of complex factors ( e.g. rainfall ), whose effects would have to be removed if changes in water level were to be studied in relation to earthquakes.

Premonitory Precursor Phenomena

Dilatancy theory ( i.e., the volume increase of rock prior to rupture ) once occupied a central position in discussions of premonitory phenomena of earthquakes, but it now receives less support; the theory was based on observations that many solids exhibit dilatancy during deformation. For earthquake prediction, the significance of dilatancy, if real, lies in its effects on various measurable quantities of the Earth’s crust, i.e. seismic velocity, electrical resistivity, and ground and water levels. Consequences of dilatancy for earthquake prediction are summarized in the table ( below ):

The best-studied consequence is the effect on seismic velocities. The influence of internal cracks and pores on the elastic properties of rocks can be clearly demonstrated in laboratory measurements of those properties as a function of hydrostatic pressure. In the case of saturated rocks, experiments predict – for shallow earthquakes – that dilatancy occurs as a portion of the crust is stressed to failure, causing a decrease in the velocities of seismic waves. Recovery of velocity is brought about by subsequent rise of the pore pressure of water, which also has the effect of weakening the rock and enhancing fault slip.

Strain buildup in the focal region may have measurable effects on other observable properties, including electrical conductivity and gas concentration. Because the electrical conductivity of rocks depends largely on interconnected water channels within the rocks, resistivity may increase before the cracks become saturated. As pore fluid is expelled from the closing cracks, the local water table would rise and concentrations of gases such as radioactive radon would increase. No unequivocal confirming measurements have yet been published.

Geologic methods of extending the seismicity record back from the present also are being explored. Field studies indicate that the sequence of surface ruptures along major active faults associated with large earthquakes can sometimes be constructed.

An example is the series of large earthquakes in Turkey in the 20th century, which were caused mainly by successive westward ruptures of the North Anatolian Fault.

Liquefaction effects preserved in beds of sand and peat have provided evidence ( using radiometric dating methods ) for large paleoearthquakes extending back more than 1,000 years in many seismically active zones, including the Pacific Northwest coastal region of the United States.

Less well-grounded precursory phenomena, particularly earthquake lights and animal behaviour, sometimes draw more public attention than the precursors discussed above.

Reports of unusual lights in the sky and of abnormal animal behaviour preceding earthquakes are known to seismologists – mostly in anecdotal form.

Both phenomena are usually explained away in terms of there being, prior to earthquakes:

– Gaseous emissions from the ground;

– Electric stimuli ( various ), e.g. HAARP, etcetera, from the ground; and,

– Acoustic stimuli ( various ), e.g. subsonic seismic wave emissions from the ground.

At the present time, there is no definitive experimental evidence supporting reported claims of animals sometimes sensing an approaching earthquake.

… [ CENSORED-OUT ] …

Earthquake Hazard Reduction Methods

Considerable work has been done in seismology to explain the characteristics of the recorded ground motions in earthquakes. Such knowledge is needed to predict ground motions in future earthquakes so that earthquake-resistant structures can be designed.

Although earthquakes cause death and destruction through secondary effects ( i.e. landslides, tsunamis, fires and fault rupture ), the greatest losses – in both human lives and property – result from the collapse of man-made structures amid violent ground shaking.

The most effective way to mitigate ( minimize ) damage from earthquakes – from an engineering standpoint – is to design and construct structures capable of withstanding ‘strong ground motions’.

Interpreting recorded ground motions

Most ‘elastic waves’ recorded ( close to an extended fault source ) are complicated and difficult to interpret uniquely.

Understanding such near-source motion can be viewed as a three ( 3 ) part problem.

The first ( 1st ) part stems from the generation of elastic waves radiating from the slipping fault as the moving rupture sweeps out an area of slip along the fault plane within a given time.

The wave pattern produced depends on several parameters, such as fault dimension and rupture velocity.

Elastic waves ( various types ) radiate, from the vicinity of the moving rupture, in all directions.

Geometric and frictional properties of the fault critically affect the pattern of waves radiated from it.

The second ( 2nd ) part of the problem concerns the passage of the waves through the intervening rocks to the site and the effect of geologic conditions.

The third ( 3rd ) part involves the conditions at the recording site itself, such as topography and highly attenuating soils. All these questions must be considered when estimating likely earthquake effects at a site of any proposed structure.

Experience has shown that strong ground-motion recordings have a variable pattern in detail but predictable regular shapes in general ( except in the case of strong multiple earthquakes ).

EXAMPLE: Actual ground shaking ( acceleration, velocity and displacement ) recorded during an earthquake ( see figure below ).

In a strong horizontal shaking of the ground near the fault source, there is an initial segment of motion made up mainly of P waves, which frequently manifest themselves strongly in the vertical motion. This is followed by the onset of S waves, often associated with a longer-period pulse of ground velocity and displacement related to the near-site fault slip or fling. This pulse is often enhanced in the direction of the fault rupture and normal to it. After the S onset there is shaking that consists of a mixture of S and P waves, but the S motions become dominant as the duration increases. Later, in the horizontal component, surface waves dominate, mixed with some S body waves. Depending on the distance of the site from the fault and the structure of the intervening rocks and soils, surface waves are spread out into long trains.

Expectant Seismic Hazard Maps Constructed

In many regions, seismic expectancy maps or hazard maps are now available for planning purposes. The anticipated intensity of ground shaking is represented by a number called the peak acceleration or the peak velocity.

To avoid weaknesses found in earlier earthquake hazard maps, the following general principles are usually adopted today:

The map should take into account not only the size but also the frequency of earthquakes.

The broad regionalization pattern should use historical seismicity as a database, including the following factors: major tectonic trends, acceleration attenuation curves, and intensity reports.

Regionalization should be defined by means of contour lines with design parameters referred to ordered numbers on neighbouring contour lines ( this procedure minimizes sensitivity concerning the exact location of boundary lines between separate zones ).

The map should be simple and not attempt to microzone the region.

The mapped contoured surface should not contain discontinuities, so that the level of hazard progresses gradually and in order across any profile drawn on the map.

Developing resistant structures

Developing engineered structural designs that are able to resist the forces generated by seismic waves can be achieved either by following building codes based on hazard maps or by appropriate methods of analysis. Many countries reserve theoretical structural analyses for the larger, more costly, or critical buildings to be constructed in the most seismically active regions, while simply requiring that ordinary structures conform to local building codes. Economic realities usually determine the goal, not of preventing all damage in all earthquakes but of minimizing damage in moderate, more common earthquakes and ensuring no major collapse at the strongest intensities. An essential part of what goes into engineering decisions on design and into the development and revision of earthquake-resistant design codes is therefore seismological, involving measurement of strong seismic waves, field studies of intensity and damage, and the probability of earthquake occurrence.

Earthquake risk can also be reduced by rapid post-earthquake response. Strong-motion accelerographs have been connected in some urban areas, such as Los Angeles, Tokyo, and Mexico City, to interactive computers.

Recorded waves are correlated with seismic intensity scales and rapidly displayed graphically on regional maps via the World Wide Web.

Exploration of the Earth’s interior with seismic waves

Seismological Tomography

Seismological data on the Earth’s deep structure come from several sources, including:

– P waves and S waves from nuclear explosions;

– P waves and S waves from earthquakes;

– Surface-wave dispersion from distant earthquakes; and,

– Vibrations of the whole planet from great earthquakes.

One of the major aims of seismology has been to infer a minimum set of properties of the planet’s interior that can explain recorded seismic wave trains in detail.

Exploration of the Earth’s deep structure made tremendous progress during the first half of the 20th century ( 1900s – 1950s ), yet realization of this goal was severely limited until the 1960s because of the laborious effort required just to evaluate theoretical models and to process large amounts of recorded earthquake data.

Today, high-speed supercomputer processing of enormous quantities of stored data, combined with modern information retrieval capabilities, has opened information technology ( IT ) pathways to major advances in data handling for advanced theoretical modeling, research analytics and developmental prototyping.

Since the mid-1970s, researchers have been able to study realistic models of Earth structure that include continental and oceanic boundaries, mountains and river valleys, rather than simple structures such as those involving variation only with depth; various technical developments have also benefited observational seismology.

EXAMPLE: Significant exploration of the Earth’s deep structure now employs 3D ( three-dimensional ) imaging with equally impressive display ( monitor ) equipment, made possible by redesigned microprocessor architectures, newly discovered materials and new concepts; seismic exploration techniques developed by the petroleum industry ( e.g. seismic reflection ) have become widely adopted procedures.

The major method for determining the structure of the planet’s interior is detailed analysis of seismograms of seismic waves; earthquake readings additionally provide estimates of the Earth’s internal:

– Wave velocities;

– Density; and,

– Parameters of elasticity and inelasticity.

Earthquake Travel Time

The primary procedure is to measure the travel times of various wave types, such as P and S, from their source to the recording seismograph. First, however, each wave type must be identified with its ray path through the Earth.

Seismic rays for many paths of P and S waves leaving the earthquake focus F are shown in the figure.

Deep-Focus Deep-Structure Earth Coremetrics

Rays corresponding to waves that have been reflected at the Earth’s outer surface (or possibly at one of the interior discontinuity surfaces) are denoted as PP, PS, SP, PSS, and so on. For example, PS corresponds to a wave that is of P type before surface reflection and of S type afterward. In addition, there are rays such as pPP, sPP, and sPS, the symbols p and s corresponding to an initial ascent to the outer surface as P or S waves, respectively, from a deep focus.

An especially important class of rays is associated with a discontinuity surface separating the central core of the Earth from the mantle at a depth of about 2,900 km (1,800 miles) below the outer surface. The symbol c is used to indicate an upward reflection at this discontinuity. Thus, if a P wave travels down from a focus to the discontinuity surface in question, the upward reflection into an S wave is recorded at an observing station as the ray PcS and similarly with PcP, ScS, and ScP. The symbol K is used to denote the part (of P type) of the path of a wave that passes through the liquid central core. Thus, the ray SKS corresponds to a wave that starts as an S wave, is refracted into the central core as a P wave, and is refracted back into the mantle, wherein it finally emerges as an S wave. Such rays as SKKS correspond to waves that have suffered an internal reflection at the boundary of the central core.

The discovery of the existence of an inner core in 1936 by the Danish seismologist Inge Lehmann made it necessary to introduce additional basic symbols. For paths of waves inside the central core, the symbols i and I are used analogously to c and K for the whole Earth; therefore, i indicates reflection upward at the boundary between the outer and inner portions of the central core, and I corresponds to the part (of P type) of the path of a wave that lies inside the inner portion. Thus, for instance, discrimination needs to be made between the rays PKP, PKiKP, and PKIKP. The first of these corresponds to a wave that has entered the outer part of the central core but has not reached the inner core, the second to one that has been reflected upward at the inner core boundary, and the third to one that has penetrated into the inner portion.

By combining the symbols p, s, P, S, c, K, i, and I in various ways, notation is developed for all the main rays associated with body earthquake waves.
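
As an illustration of the ray-path notation just described, the sketch below spells out a phase name symbol by symbol. The symbol meanings follow the conventions explained above; the function itself is purely illustrative and does not belong to any standard seismological library.

```python
LEG_MEANINGS = {
    "p": "P-type ascent from a deep focus to the surface",
    "s": "S-type ascent from a deep focus to the surface",
    "P": "P wave in the mantle (or crust)",
    "S": "S wave in the mantle (or crust)",
    "c": "reflection upward at the core-mantle boundary",
    "K": "P-type leg through the liquid outer core",
    "i": "reflection upward at the inner-core boundary",
    "I": "P-type leg through the solid inner core",
}

def describe_phase(name):
    """Return a list of descriptions, one per symbol of a phase name
    such as 'PcP', 'SKS', or 'PKIKP'."""
    return [f"{symbol}: {LEG_MEANINGS[symbol]}" for symbol in name]

for leg in describe_phase("PKIKP"):
    print(leg)
```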

Hidden Inner Earth Deep Structure Anomalies

The symbol J has been introduced to correspond to S waves traveling within the Earth’s inner core, should evidence for such waves ever be found. The use of times of travel along rays to infer a hidden structure is analogous to the use of X-rays in medical tomography. The method involves reconstructing an image of internal anomalies from measurements made at the outer surface. Nowadays, hundreds of thousands of travel times of P and S waves are available in earthquake catalogs for the tomographic imaging of the Earth’s interior and the mapping of internal structure.
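
In its simplest linearized form, the tomography analogy above reduces to solving a system of equations: each travel-time residual is the sum, over the cells a ray crosses, of the ray length in that cell multiplied by the slowness perturbation of that cell. The two-cell, two-ray example below is a toy illustration of that idea with invented numbers, not a real imaging code.

```python
# Toy linearized travel-time tomography: solve G * ds = dt for slowness
# perturbations ds, where G holds ray path lengths (km) in each cell and
# dt holds observed travel-time residuals (s). All values are invented.
G = [[40.0, 10.0],   # ray 1: 40 km in cell 1, 10 km in cell 2
     [15.0, 35.0]]   # ray 2: 15 km in cell 1, 35 km in cell 2
dt = [0.50, 0.20]    # travel-time residuals in seconds

# Direct 2x2 solve by Cramer's rule.
det = G[0][0] * G[1][1] - G[0][1] * G[1][0]
ds1 = (dt[0] * G[1][1] - G[0][1] * dt[1]) / det
ds2 = (G[0][0] * dt[1] - dt[0] * G[1][0]) / det
print(ds1, ds2)  # slowness perturbations (s/km) in each cell
```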

Inner Earth Deep Structure

Thinnest & Thickest Parts of Earth’s Crust

Based on earthquake records and imaging studies, the Earth’s interior is officially represented, as:

A solid but slowly flowing mantle, at its thickest point about 1,800 miles ( 2,900 kilometres ) thick, overlain by a crust that at its thinnest point is less than 6 miles ( 10 kilometres ) thick beneath the seafloor of the ultra-deep ocean.

The thin surface rock layer surrounding the mantle is the crust, whose lower boundary is called the Mohorovičić discontinuity. In normal continental regions the crust is about 30 to 40 km thick; there is usually a superficial low-velocity sedimentary layer underlain by a zone in which seismic velocity increases with depth. Beneath this zone there is a layer in which P-wave velocities in some places fall from 6 to 5.6 km per second. The middle part of the crust is characterized by a heterogeneous zone with P velocities of nearly 6 to 6.3 km per second. The lowest layer of the crust ( about 10 km thick ) has significantly higher P velocities, ranging up to nearly 7 km per second.

In the deep ocean there is a sedimentary layer that is about 1 km thick. Underneath is the lower layer of the oceanic crust, which is about 4 km thick. This layer is inferred to consist of basalt that formed where extrusions of basaltic magma at oceanic ridges have been added to the upper part of lithospheric plates as they spread away from the ridge crests. This crustal layer cools as it moves away from the ridge crest, and its seismic velocities increase correspondingly.
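
To make the velocity figures above concrete, the short sketch below sums one-way vertical P travel time through a simple stack of crustal layers. The layer thicknesses and velocities are rounded, illustrative numbers broadly consistent with the ranges quoted in the text, not a published crustal model.

```python
def vertical_travel_time_s(layers):
    """Sum one-way vertical travel time through (thickness_km, vp_km_s) layers."""
    return sum(thickness / vp for thickness, vp in layers)

# Illustrative continental crust: sediments, upper, middle, and lower crust.
continental = [(2.0, 3.0), (10.0, 6.0), (18.0, 6.2), (10.0, 7.0)]
print(round(vertical_travel_time_s(continental), 2))  # ~ 6.67 s one way
```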

Below the mantle lies a shell about 1,400 miles ( 2,255 km ) thick which seismic waves show to behave as a liquid ( the outer core ).

At the very centre of the planet is a separate solid core with a radius of 1,216 km. Recent work with observed seismic waves has revealed three-dimensional structural details inside the Earth, especially in the crust and lithosphere, under the subduction zones, at the base of the mantle, and in the inner core. These regional variations are important in explaining the dynamic history of the planet.

Long-Period Global Oscillations

Sometimes earthquakes are so great that the entire planet vibrates like a ringing bell. The deepest, or gravest, tone of vibration ever recorded has a period – the length of time between the arrival of successive crests in a wave train – of 54 minutes.

Knowledge of these vibrations has come from a remarkable extension of the range of periods of ground movements that can now be recorded by modern digital long-period seismographs, which span the entire spectrum of earthquake wave periods: from ordinary P waves, with periods of tenths of seconds, to vibrations with periods on the order of 12 and 24 hours, i.e. those movements occurring in the Earth’s ocean tides.

The measurements of vibrations of the whole Earth provide important information on the properties of the interior of the planet. It should be emphasized that these free vibrations are set up by the energy release of the earthquake source but continue for many hours and sometimes even days. For an elastic sphere such as the Earth, two types of vibrations are known to be possible. In one type, called S modes, or spheroidal vibrations, the motions of the elements of the sphere have components along the radius as well as along the tangent. In the second [ 2nd ] type, which are designated as T modes or torsional vibrations, there is shear but no radial displacements. The nomenclature is nSl and nTl, where the letters n and l are related to the surfaces in the vibration at which there is zero motion. Four ( 4 ) examples are illustrated in the figure. The subscript n gives a count of the number of internal zero-motion ( nodal ) surfaces, and l indicates the number of surface nodal lines.

Several hundred types of S and T vibrations have been identified and the associated periods measured. The amplitudes of the ground motion in the vibrations have been determined for particular earthquakes, and, more important, the attenuation of each component vibration has been measured. The dimensionless measure of this decay constant is called the quality factor Q. The greater the value of Q, the less the wave or vibration damping. Typically, for oS10 and oT10, the Q values are about 250.
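
The quality factor Q quoted above fixes how quickly a free oscillation dies away: the amplitude decays roughly as exp( -πt / ( QT ) ) for a mode of period T. The sketch below applies that standard relation; Q = 250 is taken from the text, while the 20-minute period is assumed purely for illustration.

```python
import math

def mode_amplitude(t_hours, q_factor, period_minutes, initial_amplitude=1.0):
    """Amplitude of a free oscillation after t_hours, given quality factor Q
    and period T, using A(t) = A0 * exp(-pi * t / (Q * T))."""
    t_seconds = t_hours * 3600.0
    period_seconds = period_minutes * 60.0
    return initial_amplitude * math.exp(-math.pi * t_seconds /
                                        (q_factor * period_seconds))

# Example: Q = 250 and an assumed 20-minute period leave ~47% after 20 hours.
print(round(mode_amplitude(20.0, 250, 20.0), 2))
```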

The rate of decay of the vibrations of the whole Earth with the passage of time can be seen in the figure, where they appear superimposed, over 20 hours, on the 12-hour tidal deformations of the Earth. At the bottom of the figure these vibrations have been split up into a series of peaks, each with a definite frequency, similar to that of the spectrum of light.

Such a spectrum indicates the relative amplitude of each harmonic present in the free oscillations. If the physical properties of the Earth’s interior were known, all these individual peaks could be calculated directly. Instead, the internal structure must be estimated from the observed peaks.

Recent research has shown that observations of long-period oscillations of the Earth discriminate fairly finely between different Earth models. In applying the observations to improve the resolution and precision of such representations of the planet’s internal structure, a considerable number of Earth models are set up, and all the periods of their free oscillations are computed and checked against the observations. Models can then be successively eliminated until only a small range remains. In practice, the work starts with existing models; efforts are made to amend them by sequential steps until full compatibility with the observations is achieved, within the uncertainties of the observations. Even so, the resulting computed Earth structure is not a unique solution to the problem.

Extraterrestrial Seismic Phenomena

Space vehicles have carried seismic recording equipment to the surfaces of the Moon and Mars, and seismologists on Earth receive telemetry signals from seismic events on both.

By 1969, seismographs had been placed at six sites on the Moon during the U.S. Apollo missions. Recording of seismic data ceased in September 1977. The instruments detected between 600 and 3,000 moonquakes during each year of their operation, though most of these seismic events were very small. The ground noise on the lunar surface is low compared with that of the Earth, so that the seismographs could be operated at very high magnifications. Because there was more than one station on the Moon, it was possible to use the arrival times of P and S waves at the lunar stations from the moonquakes to determine foci in the same way as is done on the Earth.

Moonquakes are of three types. First, there are the events caused by the impact of lunar modules, booster rockets, and meteorites. The lunar seismograph stations were able to detect meteorites hitting the Moon’s surface more than 1,000 km (600 miles) away. The two other types of moonquakes had natural sources in the Moon’s interior: they presumably resulted from rock fracturing, as on Earth. The most common type of natural moonquake had deep foci, at depths of 600 to 1,000 km; the less common variety had shallow focal depths.

Seismological research on Mars has been less successful. Only one of the seismometers carried to the Martian surface by the U.S. Viking landers during the mid-1970s remained operational, and only one potential marsquake was detected in 546 Martian days.

Historical Major Earthquakes

Major historical earthquakes are listed chronologically in the table ( below ).

 

Major Earthquake History

Year | Region / Area Affected | Mag.* | Intensity | Deaths ( approx. ) | Remarks
c. 1500 BCE | Knossos, Crete ( Greece ) | n/a | X | n/a | One of several events that leveled the capital of Minoan civilization; this quake accompanied the explosion of the nearby volcanic island of Thera.
27 BCE | Thebes ( Egypt ) | n/a | n/a | n/a | This quake cracked one of the statues known as the Colossi of Memnon, and for almost two centuries the “singing Memnon” emitted musical tones on certain mornings as it was warmed by the Sun’s rays.
62 CE | Pompeii and Herculaneum ( Italy ) | n/a | X | n/a | These two prosperous Roman cities had not yet recovered from the quake of 62 when they were buried by the eruption of Mount Vesuvius in 79.
115 | Antioch ( Antakya, Turkey ) | n/a | XI | n/a | A centre of Hellenistic and early Christian culture, Antioch suffered many devastating quakes; this one almost killed the visiting Roman emperor Trajan.
1556 | Shaanxi province, China | n/a | IX | 830,000 | Possibly the deadliest earthquake ever recorded.
1650 | Cuzco ( Peru ) | 8.1 | VIII | n/a | Many of Cuzco’s Baroque monuments date to the rebuilding of the city after this quake.
1692 | Port Royal ( Jamaica ) | n/a | n/a | 2,000 | Much of this British West Indies port, a notorious haven for buccaneers and slave traders, sank beneath the sea following the quake.
1693 | Southeastern Sicily ( Italy ) | n/a | XI | 93,000 | Syracuse, Catania, and Ragusa were almost completely destroyed but were rebuilt with a Baroque splendour that still attracts tourists.
1755 | Lisbon ( Portugal ) | n/a | XI | 62,000 | The Lisbon earthquake of 1755 was felt as far away as Algiers and caused a tsunami that reached the Caribbean.
1780 | Tabriz ( Iran ) | 7.7 | n/a | 200,000 | This ancient highland city was destroyed and rebuilt, as it had been in 791, 858, 1041, and 1721 and would be again in 1927.
1811 – 1812 | New Madrid, Missouri ( USA ) | 7.5 – 7.7 | XII | n/a | A series of quakes at the New Madrid Fault caused few deaths, but the New Madrid earthquake of 1811 – 1812 rerouted portions of the Mississippi River and was felt from Canada to the Gulf of Mexico.
1812 | Caracas ( Venezuela ) | 9.6 | X | 26,000 | A provincial town in 1812, Caracas recovered and eventually became Venezuela’s capital.
1835 | Concepción ( Chile ) | 8.5 | n/a | 35 | British naturalist Charles Darwin, witnessing this quake, marveled at the power of the Earth to destroy cities and alter landscapes.
1886 | Charleston, South Carolina ( USA ) | n/a | IX | 60 | This was one of the largest quakes ever to hit the eastern United States.
1895 | Ljubljana ( Slovenia ) | 6.1 | VIII | n/a | Modern Ljubljana is said to have been born in the rebuilding after this quake.
1906 | San Francisco, California ( USA ) | 7.9 | XI | 700 | San Francisco still dates its modern development from the San Francisco earthquake of 1906 and the resulting fires.
1908 | Messina and Reggio di Calabria ( Italy ) | 7.5 | XII | 110,000 | These two cities on the Strait of Messina were almost completely destroyed in what is said to be Europe’s worst earthquake ever.
1920 | Gansu province, China | 8.5 | n/a | 200,000 | Many of the deaths in this quake-prone province were caused by huge landslides.
1923 | Tokyo-Yokohama ( Japan ) | 7.9 | n/a | 142,800 | Japan’s capital and its principal port, located on soft alluvial ground, suffered severely from the Tokyo-Yokohama earthquake of 1923.
1931 | Hawke Bay ( New Zealand ) | 7.9 | n/a | 256 | The bayside towns of Napier and Hastings were rebuilt in an Art Deco style that is now a great tourist attraction.
1935 | Quetta ( Pakistan ) | 7.5 | X | 20,000 | The capital of Balochistan province was severely damaged in the most destructive quake to hit South Asia in the 20th century.
1948 | Ashgabat ( Turkmenistan ) | 7.3 | X | 176,000 | Every year, Turkmenistan commemorates the utter destruction of its capital in this quake.
1950 | Assam ( India ) | 8.7 | X | 574 | The largest quake ever recorded in South Asia killed relatively few people in a lightly populated region along the Indo-Chinese border.
1960 | Valdivia and Puerto Montt ( Chile ) | 9.5 | XI | 5,700 | The Chile earthquake of 1960, the largest quake ever recorded in the world, produced a tsunami that crossed the Pacific Ocean to Japan, where it killed more than 100 people.
1963 | Skopje ( Macedonia ) | 6.9 | X | 1,070 | The capital of Macedonia had to be rebuilt almost completely following this quake.
1964 | Prince William Sound, Alaska ( USA ) | 9.2 | n/a | 131 | Anchorage, Seward, and Valdez were damaged, but most deaths in the Alaska earthquake of 1964 were caused by tsunamis in Alaska and as far away as California.
1970 | Chimbote ( Peru ) | 7.9 | n/a | 70,000 | Most of the damage and loss of life resulting from the Ancash earthquake of 1970 was caused by landslides and the collapse of poorly constructed buildings.
1972 | Managua ( Nicaragua ) | 6.2 | n/a | 10,000 | The centre of the capital of Nicaragua was almost completely destroyed; the business section was later rebuilt some 6 miles ( 10 km ) away.
1976 | Guatemala City ( Guatemala ) | 7.5 | IX | 23,000 | Rebuilt following a series of devastating quakes in 1917 – 18, the capital of Guatemala again suffered great destruction.
1976 | Tangshan ( China ) | 7.5 | X | 242,000 | In the Tangshan earthquake of 1976, this industrial city was almost completely destroyed in the worst earthquake disaster in modern history.
1985 | Michoacán state and Mexico City ( Mexico ) | 8.1 | IX | 10,000 | The centre of Mexico City, built largely on the soft subsoil of an ancient lake, suffered great damage in the Mexico City earthquake of 1985.
1988 | Spitak and Gyumri ( Armenia ) | 6.8 | X | 25,000 | This quake destroyed nearly one-third of Armenia’s industrial capacity.
1989 | Loma Prieta, California ( USA ) | 7.1 | IX | 62 | The San Francisco–Oakland earthquake of 1989, the first sizable movement of the San Andreas Fault since 1906, collapsed a section of the San Francisco–Oakland Bay Bridge.
1994 | Northridge, California ( USA ) | 6.8 | IX | 60 | Centred in the urbanized San Fernando Valley, the Northridge earthquake of 1994 collapsed freeways and some buildings, but damage was limited by earthquake-resistant construction.
1995 | Kobe ( Japan ) | 6.9 | XI | 5,502 | The Great Hanshin Earthquake destroyed or damaged 200,000 buildings and left 300,000 people homeless.
1999 | Izmit ( Turkey ) | 7.4 | X | 17,000 | The Izmit earthquake of 1999 heavily damaged the industrial city of Izmit and the naval base at Golcuk.
1999 | Nan-t’ou county ( Taiwan ) | 7.7 | X | 2,400 | The Taiwan earthquake of 1999, the worst to hit Taiwan since 1935, provided a wealth of digitized data for seismic and engineering studies.
2001 | Bhuj, Gujarat state ( India ) | 8.0 | X | 20,000 | The Bhuj earthquake of 2001, possibly the deadliest ever to hit India, was felt across India and Pakistan.
2003 | Bam ( Iran ) | 6.6 | IX | 26,000 | This ancient Silk Road fortress city, built mostly of mud brick, was almost completely destroyed.
2004 | Aceh province, Sumatra ( Indonesia ) | 9.1 | n/a | 200,000 | The deaths resulting from this offshore quake actually were caused by a tsunami originating in the Indian Ocean that, in addition to killing more than 150,000 in Indonesia, killed people as far away as Sri Lanka and Somalia.
2005 | Azad Kashmir ( Pakistan-administered Kashmir ) | 7.6 | VIII | 80,000 | The Kashmir earthquake of 2005, perhaps the deadliest shock ever to strike South Asia, left hundreds of thousands of people exposed to the coming winter weather.
2008 | Sichuan province ( China ) | 7.9 | IX | 69,000 | The Sichuan earthquake of 2008 left over 5 million people homeless across the region, and over half of Beichuan city was destroyed by the initial seismic event and the release of water from a lake formed by nearby landslides.
2009 | L’Aquila ( Italy ) | 6.3 | VIII | 300 | The L’Aquila earthquake of 2009 left more than 60,000 people homeless and damaged many of the city’s medieval buildings.
2010 | Port-au-Prince ( Haiti ) | 7.0 | IX | 316,000 | The Haiti earthquake of 2010 devastated the metropolitan area of Port-au-Prince and left an estimated 1.5 million survivors homeless.
2010 | Maule ( Chile ) | 8.8 | VIII | 521 | The Chile earthquake of 2010 produced widespread damage in Chile’s central region and triggered tsunami warnings throughout the Pacific basin.
2010 | Christchurch ( New Zealand ) | 7.0 | VIII | 180 | Most of the devastation associated with the Christchurch earthquakes of 2010 – 11 resulted from a magnitude-6.3 aftershock that struck on February 22, 2011.
2011 | Honshu ( Japan ) | 9.0 | VIII | 20,000 | The powerful Japan earthquake and tsunami of 2011, which sent tsunami waves across the Pacific basin, caused widespread damage throughout eastern Honshu.
2011 | Erciş and Van ( Turkey ) | 7.2 | IX | n/a | The Erciş-Van earthquake of 2011 destroyed several apartment complexes and shattered mud-brick homes throughout the region.

Data Sources: National Oceanic and Atmospheric Administration ( NOAA ), National Geophysical Data Center ( NGDC ), Significant Earthquake Database ( SED ) – a searchable online database using the Catalog of Significant Earthquakes 2150 B.C. – 1991 A.D. ( with Addenda ) – and U.S. Geological Survey ( USGS ), Earthquake Hazards Program.
* Measures of magnitude may differ from other sources.


Additional Reading

Earthquakes are covered mainly in books on seismology.

Recommended introductory texts are:

Bruce A. Bolt, Earthquakes, 4th ed. (1999), and Earthquakes and Geological Discovery (1993); and,

Jack Oliver, Shocks and Rocks: Seismology and the Plate Tectonics Revolution (1996).

Comprehensive books on key aspects of seismic hazards are:

Leon Reiter, Earthquake Hazard Analysis – Issues and Insights (1990); and,

Robert S. Yeats, Kerry Sieh, and Clarence R. Allen, The Geology of Earthquakes (1997).

A history of the discrimination between underground nuclear explosions and natural earthquakes is given by:

Bruce A. Bolt, “Nuclear Explosions and Earthquakes: The Parted Veil” ( 1976 ).

More advanced texts that treat the theory of earthquake waves in detail are:

Agustín Udías, Principles of Seismology (1999);

Thorne Lay and Terry C. Wallace, Modern Global Seismology (1995);

Peter M. Shearer, Introduction to Seismology (1999); and,

K.E. Bullen and Bruce A. Bolt, An Introduction to the Theory of Seismology, 4th ed. (1985).


Citations


MLA Style: “earthquake.” Encyclopædia Britannica. Encyclopædia Britannica Online. Encyclopædia Britannica Inc., 2012. Web. 21 Mar. 2012. http://www.britannica.com/EBchecked/topic/176199/earthquake

Reference

http://www.britannica.com/EBchecked/topic/176199/earthquake/247989/Shallow-intermediate-and-deep-foci?anchor=ref105456

– – – –

Feeling ‘educated’? Think you’re out of the water on the earthquake and tsunami subject?

News from March 23, 2012, however, contradicts decades of professional scientific knowledge and studies, so if you were just feeling ‘overly educated’ about earthquakes and tsunamis – don’t be. You’re now lost at sea, in the same proverbial ‘boat’, with all those global government scientific and technical ( S&T ) professionals who thought they understood previous information surrounding earthquakes and tsunamis.

After comparing the Japan 9.0 earthquake directional arrows depicted on the charts ( further above ) with ocean currents, tidal charts and trade winds from the global jet stream, there is a problem that cannot be explained: on March 23, 2012 British Columbia, Canada reported that its Pacific Ocean coastal waters held a 100-foot fishing boat still afloat – more than 1 year after the tsunami generated by Japan’s 9.0 earthquake of March 11, 2011.

[ IMAGE ( above ): 11MAR11 Japan 9.0 earthquake tsunami victim fishing boat ( 50-metre ) found more than 1-year later still adrift in the Pacific Ocean – but thousands of miles away – off the North America Pacific Ocean west coastal territory of Haida Gwaii, British Columbia, Canada ( Click on image to enlarge ) ]

– – – –

Source: CBS News – British Columbia ( Canada )

Tsunami Linked Fishing Boat Adrift Off B.C.

Nobody Believed Aboard 50-Metre Vessel Swept Away In 2011 Japanese Disaster – CBC News

March 23, 2012 21:35 ( PST ) Updated from: 23MAR12 18:59 ( PST )

A Japanese fishing boat that was washed out to sea in the March 2011 Japanese tsunami has been located adrift off the coast of British Columbia ( B.C. ), according to the federal Transport Ministry.

The 50-metre vessel was spotted by the crew of an aircraft on routine patrol about 275 kilometres off Haida Gwaii, formerly known as the Queen Charlotte Islands, ministry spokeswoman Sau Sau Liu said Friday.

“Close visual aerial inspection and hails to the ship indicate there is no one on board,” Liu said. “The owner of the vessel has been contacted and made aware of its location.”

U.S. Senator Maria Cantwell, of Washington, said in a release that the boat was expected to drift slowly southeast.

“On its current trajectory and speed, the vessel would not [ yet ] make landfall for approximately 50-days,” Cantwell said. Cantwell did not specify where landfall was expected to be.

First large debris

The boat is the first large piece of debris found following the earthquake and tsunami that struck Japan one year ago.

Scientists at the University of Hawaii say a field of about 18,000,000 ( 18 million ) tonnes of debris is slowly being carried by ocean currents toward North America. The field is estimated to be about 3,200 kilometres long and 1,600 kilometres wide.

Scientists have estimated some of the debris would hit B.C. shores by 2014.

Some people on the west coast of Vancouver Island believe ‘smaller pieces of debris have already washed ashore there’.

The March 11, 2011, tsunami was generated after a magnitude 9.0 earthquake struck off the coast of northern Japan. The huge waves and swells of the tsunami moved inland and then retreated back into the Pacific Ocean, carrying human beings, wreckage of buildings, cars and boats.

Nearly 19,000 people were killed.

Reference

http://www.cbc.ca/news/canada/british-columbia/story/2012/03/23/bc-fishing-boat-tsunami-debris.html?cmp=rss

– – – –

Submitted for review and commentary by,

Kentron Intellect Research Vault

E-MAIL: KentronIntellectResearchVault@Gmail.Com

WWW: http://KentronIntellectResearchVault.WordPress.Com

References

http://earthquake.usgs.gov/earthquakes/world/japan/031111_M9.0prelim_geodetic_slip.php
http://en.wikipedia.org/wiki/Moment_magnitude_scale
http://www.gsi.go.jp/cais/topic110315.2-index-e.html
http://www.seismolab.caltech.edu
http://www.tectonics.caltech.edu/slip_history/2011_taiheiyo-oki
http://supersites.earthobservations.org/ARIA_japan_co_postseismic.pdf
ftp://sideshow.jpl.nasa.gov/pub/usrs/ARIA/README.txt
http://speclib.jpl.nasa.gov/documents/jhu_desc
http://earthquake.usgs.gov/regional/pacnw/paleo/greateq/conf.php
http://www.passcal.nmt.edu/content/array-arrays-elusive-ets-cascadia-subduction-zone
http://wcda.pgc.nrcan.gc.ca:8080/wcda/tams_e.php
http://www.pnsn.org/tremor
http://earthquake.usgs.gov/earthquakes/recenteqscanv/Quakes/quakes_all.html
http://nthmp.tsunami.gov
http://wcatwc.arh.noaa.gov
http://www.pnsn.org/NEWS/PRESS_RELEASES/CAFE/CAFE_intro.html
http://www.pnsn.org/WEBICORDER/DEEPTREM/summer2009.html
http://earthquake.usgs.gov/prepare
http://www.passcal.nmt.edu/content/usarray
http://www.iris.washington.edu/hq
http://www.iris.edu/dms/dmc
http://www.iris.edu/dhi/clients.htm
http://www.iris.edu/hq/middle_america/docs/presentations/1026/MORENO.pdf
http://www.unavco.org/aboutus/history.html
http://earthquake.usgs.gov/monitoring/anss
http://earthquake.usgs.gov/regional/asl
http://earthquake.usgs.gov/regional/asl/data
http://www.usarray.org/files/docs/pubs/US_Data_Plan_Final-V7.pdf
http://neic.usgs.gov/neis/gis/station_comma_list.asc
http://earthquake.usgs.gov/research/physics/lab
http://earthquake.usgs.gov/regional/asl/data
http://pubs.usgs.gov/gip/dynamic/Pangaea.html
http://coaps.fsu.edu/scatterometry/meeting/docs/2009_august/intro/shimoda.pdf
http://coaps.fsu.edu/scatterometry/meeting/past.php
http://eqinfo.ucsd.edu/dbrecenteqs/anza
http://www.ceri.memphis.edu/seismic
http://conceptactivityresearchvault.wordpress.com/2011/03/28/global-environmental-intelligence-gei
http://www.af.mil/information/factsheets/factsheet_print.asp?fsID=157&page=1
https://login.afwa.af.mil/register/
https://login.afwa.af.mil/amserver/UI/Login?goto=https%3A%2F%2Fweather.afwa.af.mil%3A443%2FHOST_HOME%2FDNXM%2FWRF%2Findex.html
http://www.globalsecurity.org/space/library/news/2011/space-110225-afns01.htm
http://www.wrf-model.org/plots/wrfrealtime.php

Global Environmental Intelligence ( GEI )
by, Kentron Intellect Research Vault ( KIRV )

TUCSON, Arizona – March 7, 2012 – Nebraska is part of America’s heartland where less than fifty ( 50 ) people are ‘officially designated’ – by the U.S. Air Force Weather Agency ( AFWA ) – to analyze the pulsebeat of ‘doom and gloom’ through ‘scientific terms’ calculated within a new ‘strategic center’ using the latest ultra high-tech computer system, software application program modules and networking capabilities for what ‘global government national infrastructures’ expect, based on what the National Aeronautic Space Administration ( NASA ) warned would hit Earth during the current ”Solar Maximum” ( Cycle 24 ) sending a ‘significant’ “Solar Energetic Particle Event” ( SEPE ) to Earth that “we all need to be concerned about.”

What possibly could go wrong?

We’re all protected, aren’t we?

We are ‘not protected against natural events’ that we have ‘no control over’, but governments have been doing their best to monitor activities, which is one ( 1 ) reason ‘why’ the United States Air Force ( USAF ) ‘subordinate organization’ known as its Air Force Weather Agency ( AFWA ) analyzes significant scientific data from horizons above and below the Earth, searching for clues as to what ‘natural significant event’ ( i.e. a Solar Energetic Particle Event ( SEPE ), a tectonic plate shift ( continental earthquakes ), a magnetic pole shift ( climate change ), or something else equally dramatic ) will occur.

The sole “Mission” of the U.S. Air Force Weather Agency ( USAFWA ) is tasked as the only U.S. government Agency ‘officially designated to analyze’ incredible amounts of scientific doom and gloom information the public might think came from ‘doomsday prophets’ concerned about Earth calamities going to happen soon.

Space Weather Earth Forecasting

Official information is currently received ( 24-hours a day, 365-days a year ) by the U.S. Air Force Weather Agency ( USAFWA ) and its host Directorate of Operational Weather Squadrons ( OWS ) from a group of ‘solar observatories’, ‘satellites’, ‘spacecraft’, ‘interstellar lightwave spectral and radio-frequency ( RF ) imaging sensors’ and ‘telescopes’ with a wide array of ‘cameras’, as well as from ‘human observers’; these sources continue to report enough worrisome information to scare many of us, but – ‘officially’ – the public will never know ‘what’ is about to befall them or ‘when’ that will occur.

After a considerable amount of research into this subject, the U.S. Air Force was caught ‘publicly admitting’ it is able to “know within minutes” ‘what significant Earth Event is coming’ and just ‘when it will arrive’ – anywhere from a few hours up-to days in-advance – but the public will never be informed. Why?

Doom & Gloom or Fantasy Island?

Undoubtedly, classified ( in the ‘interest of national security’ ) would be the official reason given to the public. Some might ask, “But, for ‘whose’ security?” People, for the most part, are ‘not secure’ – especially when facing a pending injurious consequence resulting in ruination of life as they may have come to know theirs – and most gravitate toward wanting to “be with family” ( or “loved ones” ) as any pending cataclysmic Earth Event draws nearer to them all. The U.S. Air Force Weather Agency ( AFWA ), however, realizes that being occupants of planet Earth is ‘not the same thing as’ “Fantasy Island” or a trip to “McDonaldland” as the rest of the public at-large has come to know ‘their version’ of what planet Earth is for ‘them’ and their children.

Preparedness

What may even be more interesting to the public is to review official U.S. data surrounding what U.S. Air Force commands ‘are already preparing for’, what ‘they already have set in-place’ for ‘everyone living in the Continental United States ( CONUS )’, and ‘how they are going to accomplish’ their secret-sensitive “Mission” on ‘people’ throughout the United States. This is ‘definitely not’ ‘science fiction’ or ‘Orwellian Theory’; it is strictly what the U.S. Air Force Weather Agency and another U.S. Air Force ‘subordinate organization’ – the Air and Space Operations ( ASO ), Director of Weather ( USAF Deputy Chief of Staff ) – has ‘recently laid-out publicly’. But when the ‘public’ tried to decipher ‘all of what was so officially presented’, many were lost in the government psychobabble, resulting in no impact on the public as to the ‘seriousness’ of what they need to ‘begin preparing for’.

Television Commercial ‘Official Advertisement’ Already Warned Public!

Recently ( 2010 and 2011 ), the U.S. Federal Emergency Management Agency ( FEMA ) began running its own advertising commercial on American television. The FEMA TV commercial ’warns the public to prepare’ for an “unexpected Event” that “can suddenly turn your life – and those around you – ‘upside down’.”

FEMA’s TV commercial advertisement ( see video clip – below ) is easily recalled by those who watched it – depicting a ‘family inside a home’ where, all of a sudden, ‘everything inside the home’ begins floating up in the air, as though ‘gravity was somehow suddenly lost’. This ‘professionally produced television commercial advertisement’ from FEMA ‘stylishly warns people to prepare’ for any unexpected natural disaster Earth Event; however, FEMA did not describe ‘what type of event strike people should expect’.

The saddest part about this is that the U.S. government actually considers this television commercial advertisement an “official public warning to prepare for a national disaster”, while simultaneously the U.S. government hopes the commercial will ‘not’ create ‘panic’ or ‘chaos’ in the streets that would disrupt ‘skyrocketing profits’ received by the ‘global elite’ and/or ‘big business’ through their stock market portfolios. In short, this alluding to a national disaster is nothing short of the ‘Biggest Show On Earth’, continuing profits from basic living expenses saddling the ‘little people’, ‘worthless eaters’, and consumers of “blue gold” ( water ) that the global elite predict will rise – more-so than precious metals – after such an Earth Disaster Event ( EDE ).

Survival By Chance or Circumstance

One of the most worrisome concerns, after reviewing this report ( filled with ‘official U.S. government information’ ), is ‘what’, ‘when’ and ‘how’ U.S. government ‘decision-makers’ will react to their ‘official alert’. Many will undoubtedly and instantly gather with ‘their families’, however ‘what about the rest of us’ gathering with ‘our families’? We will not be provided the same ‘early alert’ to even know – until after it ( a significant Earth Event ) already takes place! Is this ‘fair’? Is this ‘just’? Is this ‘humane’? Or, is this pure and unadulterated ‘elitist selfishness’ that government affords only a ‘select few’ lucky souls fortunate enough to be provided sufficient warning in-advance and additionally told where to go and what to do in order to escape the wrath of such a cataclysmic Earth Event? Few people realize that those ‘decision-makers’ have already been instructed in-advance as to ‘what’ they need to do and ‘when’ they need to do it.

AFWA Notifies 350 Military Installations About Earth Disaster Event

Even more frightening is ‘what’ the U.S. military has ‘already been commanded’ as to ‘how to accomplish’ its “Mission” based on a highly-classified U.S. Presidential Executive Order ( EO ) turning the U.S. population over to U.S. military control. The U.S. Air Force Weather Agency ( AFWA ) was also ‘officially designated’ as the “Agency” ‘controlling internet ( cyberspace ) shutdown’, according to the report ( below ) and further research into the capabilities of its un-named ‘subordinate organization’, the “Strategic Center.”

Current Reporting Revelations

This report contains ‘no old news’ but ‘new official U.S. government revelations’, which should cause the public to re-think where they are and what is coming. The only information this report does ‘not’ provide publicly are the ‘names’, ‘addresses’, ‘phone numbers’, and ‘photographs’ of those ‘key individuals’ who are ‘already designated’ as those ‘ready’, ‘willing’ and ‘able’ to ‘perform as ordered on-command’ – even if that command is ‘unable to be given’ after a cataclysmic Earth Event!

Reviewers Need To Know

There are only two ( 2 ) ‘official U.S. government reports’ provided ( below ) in this report, which contain internet links to more ‘official information’ on this subject.

Knowing the public at-large, and how it predominantly enjoys ‘entertaining flowery written articles’ – which they never bother to ‘research’ any further – this report was purposely kept ‘short and sweet’ ( no flowers – no frills ). It includes an ‘official government article’ that was converted into a ‘very simple outline’, and that is followed-up with yet another ‘official government article’ left as-is. Hopefully, some may consume this report far more easily than others at this website so its point on this subject may be even better publicly received.

The first ( 1st ) report ( included immediately below ) was ‘thoroughly analyzed’ in its ‘original format’, and then – based on ‘even more detailed official information’ discovered – was ‘re-formatted into an even-more proper perspective’ and, for the sake of making this report far more easily comprehensible for public review, finally ‘converted into a simple outline format’ – but only because of the fashion by-which government complicated the ‘original public information release’; saying everything but not saying it very well for the public to easily understand.

The second ( 2nd ) report ( further below ) was left in its ‘original format’ because anyone reviewing it can quickly understand the nature of the ‘true problems’ surrounding what we all are going to be facing soon enough.

You be the judge, as to ‘what’ you will do, to ‘prepare’ for the ‘officially expected’ Earth Event. Enjoy the report ( below ) and be sure to click on the embedded links to learn more about ‘what effects’ are ‘coming’ and ‘what’ the U.S. Department of Homeland Security ( DHS ) is preparing for.

Tell a friend about what you review ( below ) and the reply may be, “You can’t trust what you read on the internet.” If anyone does ‘not trust’ this report ( below ) – or anything else on this website – they should click on the ‘official government website links’ ( provided at the bottom of this report, and other links found herein ); if they are equipped with an ‘attention-span’ ( longer than a bug’s ) and are ‘not easily distracted’ by something as simple as a ‘horn honk’, then they can ‘research it all’ for themselves and try to prove this information – and you – incorrect.

Many will be amazed at how much these official government revelations will help them to prepare for more than ‘entertainment’.

– – – –

Source: United States Air Force ( USAF ) Weather Agency ( AFWA )

Air Force Weather Agency ( AFWA ) 106 Peacekeeper Drive, Suite 2 Offutt Air Force Base, Nebraska 68113-4039 USA TEL: +1 (402) 232-8166 ( Public Affairs ) / DSN: 272-8166

HISTORY –

United States Air Force Weather Agency ( USAFWA ) – aka – Air Force Weather Agency ( AFWA ) claims a heritage extending back to World War I, when the U.S. Army ( USA ) ‘subordinate organization’ Signal Corps ( ASC ) ‘subordinate organization’ “Meteorological Service” ( SCMS ) provided ‘weather support’ for U.S. Army defense aircraft pilots.

Amidst the legacy of the U.S. Secretary of War – ( known ‘today’ as ) the U.S. Department of Defense ( DoD ) – and having realized its ‘subordinate organization’ U.S. Army had ‘increasing military personnel numbers’ due to an ‘increasing defense stockpile inventory’ of U.S. Army ‘militarized aircraft’, a ‘then-new subordinate organization’ was created, known as the U.S. Army ( USA ) Air Corps ( AAC ) and well-known as the U.S. Army Air Corps ( USAAC ).

On July 1, 1937 the U.S. Secretary of War ‘reorganized’ the U.S. Army ‘subordinate organization’ Signal Corps ( ASC ) ‘subordinate organization’ Meteorological Service ( MS ) ‘out-of’ the Signal Corps ( SC ) and ‘in-to’ the U.S. Army ( USA ) ‘subordinate organization’ Air Corps ( AC ) as its ‘then-new subordinate organization’ Meteorological Service ( ACMS ).

On April 14, 1943 the U.S. Army Air Corps ( USAAC ) ‘subordinate organization’ Meteorological Service ( MS ) was ‘renamed’ “Weather Wing” ( WW ) and ‘physically moved’ to an Asheville, North Carolina location – where ‘today’ the U.S. Air Force Weather Agency ( USAFWA ) has its 14th Operational Weather Squadron ( OWS ) located.

In 1945, the U.S. Army Air Corps ( USAAC ) Weather Wing ( ACWW ) was ‘renamed’ the U.S. Army Air Corps “Weather Service” ( ACWS ).

In early 1946, the U.S. Army Air Corps Weather Service ( ACWS ) was ‘physically moved’ from Asheville, North Carolina to the U.S. Army Air Corps Langley Field Station – where ‘today’ Langley Air Force Base ( LAFB ) is located.

On March 13, 1946 the U.S. Army Air Corps Weather Service ( ACWS ) was ‘reorganized’ under the U.S. Army Air Corps ( USAAC ) ‘new subordinate organization’ Air Transport Command ( ATC ) where its ‘then-new subordinate organization’ “Weather Service” ( ACWS ) was ‘renamed’ “Air Weather Service” ( AWS ).

Later-on, in 1946, the U.S. Army Air Corps ( USAAC ) Air Transport Command ( ATC ) Air Weather Service ( AWS ) was ‘physically moved’ to Gravelly Point, Virginia.

In 1947, the U.S. Army Air Corps ( USAAC ) was ‘renamed’ United States Air Force ( USAF ) whereupon the “Air Weather Service” ( AWS ) became ‘directly under it’ and assumed ‘sole responsibility’ over ‘all global weather reporting’ and ‘all global weather forecasting’ for ‘both’ the U.S. Air Force and U.S. Army.

In 1948, the USAF ‘newly activated’ Military Air Transport Service ( MATS ) – ( known ‘today’ as ) Military Airlift Command ( MAC ) – then-received its ‘newly-transferred subordinate organization’ Air Weather Service ( AWS ) that was ‘removed from being directly under’ USAF ‘headquarters’. USAF Military Air Transport Service ( MATS ) ‘subordinate organization’ Air Weather Service ( AWS ) was then ‘physically moved’ to Andrews Air Force Base ( AAFB ) in Maryland.

In 1958, the USAF Military Airlift Command ( MAC ) ‘subordinate organization’ Air Weather Service ( AWS ) was then ‘physically moved’ to Scott Air Force Base ( SAFB ) in Illinois where – for almost 40-years – it remained.

In 1991, the USAF Military Airlift Command ( MAC ) ‘subordinate organization’ Air Weather Service ( AWS ) was ‘redesignated’ as a “field operating agency” and then ‘reorganized’ directly back under USAF ‘headquarters’.

On October 15, 1997 the USAF Air Weather Service ( AWS ) was ‘redesignated’ as the “Air Force Weather Agency” ( AFWA ) and ‘physically moved’ to Offutt Air Force Base ( OAFB ) in Nebraska where it remains ‘today’ – at least for the moment.

– –

United States Air Force ( USAF )

USAF Weather Agency ( USAFWA )

BUDGET –

$183 million U.S. dollars ( $183,000,000 annually ), including $98 million for ‘Operations’ and ‘Maintenance’.

ESTABLISHED –

October 15, 1997.

LOCATION –

Offutt Air Force Base ( OAFB ) in Nebraska ( USA ).

REPORTING –

USAF Air and Space Operations ( ASO ), Director of Weather ( DOW ), Deputy Chief of Staff ( DCS ).

MISSION –

Enable U.S. decision-makers’ exploitation of all ‘resources’ based-on ‘relevant’ Global Environmental Intelligence ( GEI ) for a ‘global spectrum of military warfare’ by ‘maximizing U.S. power’, over:

– Land; – Air; – Space; and, – Internet ( cyberspace ).

PERSONNEL –

1,400 + personnel, manned by:

– Military ( active-duty and reserve ); – Contractors ( government contracts ); and, – Civilians.

ORGANIZATION –

– ? ( # ) Agencies ( staff ); – Two ( 2 ) Weather Groups ( WXG – Coordinators ); – Fourteen + ( 14 + ) Weather Squadrons ( OWS – Directorates ); – Five ( 5 ) Observatories ( solar ); and, – One ( 1 ) Strategic Operation Systems Center ( Unknown – ‘subordinate organization’ ).

– –

USAFWA

Global Environmental Intelligence ( GEI )

DATA –

Global Environmental Intelligence ( GEI ) data is ‘collected’ from a variety of ‘information monitoring sources’ ( Earth-based and space-based locations ) providing a variety of ‘data feed streams’ into a computer-operated system network out-of a ‘Strategic Center’ that processes indications detecting almost any ‘significant natural negative impact’ that ‘could occur’ on ‘people’ and ‘national infrastructures’ throughout ‘global geographic regions’ and in ‘outerspace’.

SOURCES –

GEI data is provided by many sources, amongst-which is its ( subordinate ) Strategic Center that processes data through its Computer Operation Systems Network, which receives ‘channeled data streams’ of ‘encoded data telemetry’ sent from ‘ground-based sourced’ and ‘space-based sourced’ detection monitors comprised, of:

– Ground Sensors ( infra-red, spectral and sonic ); – Submerged Sensors ( infra-red, spectral and sonic ); – Space-based Sensors ( lightwave, spectral and sonic ); – Long-Range Telescopes ( radio-frequency wave, optical lightwave and sonic ); – Imaging Cameras ( lightwave, spectral and sonic ); and, – Human Observers ( various means ).

SPECIAL NOTE:

USAF ASO A6 –

The U.S. Air Force ( USAF ) Air and Space Operations ( ASO ) Communications Directorate ( ” A6 ” ) ‘provides’ the ‘policy oversight’ and ‘planning  oversight’ of Command, Control, and Communication Intelligence ( C3I ) for the U.S. Air Force Weather Agency ( USAFWA ).

USAF Air and Space Operations ( ASO ) Communications Directorate ( A6 ) also ‘provides support’ over ‘daily operations’, ‘contingency actions’, and ‘general’ Command, Control, Communication and Computer Intelligence ( C4I ) for the U.S. Air Force Weather Agency ( USAFWA ).

– –

USAFWA

Weather Groups ( WXG ) –

Weather Groups ( WXG ) are specifically designated to coordinate:

– Global Environmental Intelligence ( GEI ); and, – Operational Weather Squadrons ( OWS ).

– –

USAFWA

Operational Weather Squadrons ( OWS ) –

STATIONS –

A ‘minimum’ of fourteen ( 14 ) Operational Weather Squadrons ( OWS ) are on-station ( active ) 24-hours per day, 7-days per week, 365-days per year to mitigate ( handle ) Global Environmental Intelligence ( GEI ) issues.

Operational Weather Squadrons ( OWS ) – also known as – Weather Squadrons ( WS ) are ‘designated’ to:

– ‘Process’ Global Environmental Intelligence ( GEI ) data; and, – ‘Provisionally Distribute’ Global Environmental Intelligence ( GEI ) data.

Operational Weather Squadron Global Environmental Intelligence ( OWS GEI )

– –

2nd Weather Group ( WXG )

2ND WXG

OVERSIGHT –

– 2nd ( Strategic Computing Network Center ) Systems Operations Squadron ( SOS ) – Offutt Air Force Base, Nebraska; – 2nd Operational Weather Squadron ( OWS ) – Offutt Air Force Base, Nebraska; and, – 14th Operational Weather Squadron ( OWS ) – Asheville, North Carolina.

2ND WXG Global Environmental Intelligence ( GEI )

DISTRIBUTION –

– U.S. Agency ‘decision-makers’; – U.S. Department of Defense ( DoD ) ‘decision-makers’; – U.S. Allied Foreign Nation ‘decision-makers’; and, – U.S. Joint Operations ‘Warfighters’.

SPECIAL NOTE:

USAF ASO A3 and USAF ASO A5 –

USAF Air and Space Operations ( ASO ) Plans and Requirements Directorate ( PRD ) – also known as – ( ” A3 ” and ” A5 ” ) ‘develops’ and ‘maintains’ the ‘concepts of operations’ for how U.S. Air Force Weather Agency ( USAFWA ) – also known as – Air Force Weather ( AFW ) supports “most weather sensitive” areas of ‘joint capabilities’.

REPORTS –

– Timely; – Relevant; and, – Specialized.

SOURCES –

Solar Observatories ( Detachments – Det. ), four ( 4 ):

Det. 1 ( 2ND WXG SO Detachment 1 – Learmonth, Australia ); Det. 2 ( 2ND WXG SO Detachment 2 – Sagamore Hill, Massachusetts ); Det. 4 ( 2ND WXG SO Detachment 4 – Holloman Air Force Base, New Mexico ); and, Det. 5 ( 2ND WXG SO Detachment 5 – Palehua, Hawaii ).

SPECIAL NOTE:

USAF ASO A3 and USAF ASO A5 –

USAF Air and Space Operations ( ASO ) Plans and Requirements Directorate ( PRD ) – also known as – ( ” A3 ” and ” A5 ” ) ‘works in conjunction with’ MAJCOM ‘functional counterpart users’ of products, data and services supplied from U.S. Air Force Weather Agency ( USAFWA ) – also known as – Air Force Weather ( AFW ).

USAF Air and Space Operations ( ASO ) Plans and Requirements Directorate ( PRD ) – also known as – ( ” A3 ” and ” A5 ” ) is ‘one ( 1 ) agent’ of the ‘lead Command’ for ‘gathering operational requirements’.

USAGES –

– Planning – Military Operation Missions ( global spectrum ); and, – Executing – Military Operation Missions ( global spectrum ).

SPECIAL NOTE:

USAF ASO A3 and USAF ASO A5 –

USAF Air and Space Operations ( ASO ) Plans and Requirements Directorate ( PRD ) – also known as – ( ” A3 ” and ” A5 ” ) ‘coordinates’ U.S. Air Force Weather Agency ( USAFWA ) – also known as – Air Force Weather ( AFW ) policy issues.

TASKS –

Four ( 4 ) Operational Weather Squadrons ( OWS ) ‘specifically conduct’ Global Environmental Intelligence ( GEI ):

– Warnings ( 24-hours / 7-days a week / 365-days a year ); – Military Mission Briefings ( 24-hours / 7-days a week / 365-days a year ); – Forecasts ( 24-hours / 7-days a week / 365-days a year ); and, – Analysis ( 24-hours / 7-days a week / 365-days a year ).

PROCESSING –

Strategic Center ( cost: $277 million dollars ) ‘subordinate organization’, provides:

– Computer System Complex; – Computer System Network Productions; – Computer System Applications; – Computer System Operations; – Computer System Sustainment; and, – Computer System Maintenance.

Air Force Weather Enterprise ( AFWE ) is a ‘computer system’ of the U.S. Department of Defense ( DoD ). AFWE computer system access is ‘restricted’ to members of the United States military ( Active Duty, National Guard, or Reserve Forces ), the U.S. Government, or U.S. government contractors that do business with the U.S. government and require ‘weather information’.

SPECIAL NOTE(S):

USAF ASO A8 –

The U.S. Air Force ( USAF ) Air and Space Operations ( ASO ) Strategic Plans and Programs Directorate ( ” A8 ” ) ‘directs’ the ‘planning’, ‘programming’, ‘budgeting’, ‘development’, ‘acquisition’, ‘engineering’, ‘configuration management’, ‘modification’, ‘installation’, ‘integration’, ‘logistics’ and ‘life cycle maintenance support’ over ‘all’ of the ‘computer processing equipment’ and over ‘all’ of the ‘standard weather systems’.

– –

USAFWA

1st Weather Group ( 1ST WXG ) Directorate

1ST WXG

Global Environmental Intelligence ( GEI )

GEI DISTRIBUTION –

The USAF Air Force Weather Agency ( AFWA ) 1st Weather Group ( 1ST WXG ) Directorate uses four ( 4 ) Operational Weather Squadrons ( OWS ) – ready 24-hours a day, 7-days a week, 365-days a year – to provide its Global Environmental Intelligence ( GEI ) notification distributions to U.S. ‘military forces’ ( see military branches below ) at 350 ‘specific installations’ located in five ( 5 ) Continental United States ( CONUS ) Regions ( summarized in a short lookup sketch following the regional listing below ).

OWS GEI NOTIFICATION REGIONS –

SOUTHEASTERN ( CONUS )

9th Operational Weather Squadron ( OWS ) Shaw Air Force Base South Carolina

9TH OWS – Notifies:

– U.S. Air Force; – U.S. Air Force Reserve; – U.S. Air Force National Guard; – U.S. Army; – U.S. Army Reserve; and, – U.S. Army National Guard.

NORTHERN ( CONUS ) AND, NORTHEASTERN ( CONUS )

15th Operational Weather Squadron ( OWS ) Scott Air Force Base Illinois

15TH OWS – Notifies ( Both Regions ):

– U.S. Air Force; – U.S. Air Force Reserve; – U.S. Air Force National Guard; – U.S. Army; – U.S. Army Reserve; and, – U.S. Army National Guard.

WESTERN ( CONUS )

25th Operational Weather Squadron ( OWS ) Davis-Monthan Air Force Base Tucson, Arizona

25TH OWS – Notifies:

– U.S. Air Force; – U.S. Air Force Reserve; – U.S. Air Force National Guard; – U.S. Army; – U.S. Army Reserve; and, – U.S. Army National Guard.

SOUTHERN ( CONUS )

26th Operational Weather Squadron ( OWS ) Barksdale Air Force Base Louisiana

26TH OWS – Notifies:

– U.S. Air Force; – U.S. Air Force Reserve; – U.S. Air Force National Guard; – U.S. Army; – U.S. Army Reserve; and, – U.S. Army National Guard.
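For readers trying to keep the notification routing straight, the CONUS region-to-squadron assignments listed above can be restated as a simple lookup table. The short sketch below ( written in Python purely for illustration ) only repeats the assignments given in this report; it is a hypothetical summary, not any official Air Force system or software:

# Illustrative lookup of the CONUS GEI notification routing listed above:
# each region maps to the Operational Weather Squadron ( OWS ) and base that
# notifies the Air Force / Army active-duty, Reserve and National Guard components.

NOTIFIED_BRANCHES = [
    "U.S. Air Force", "U.S. Air Force Reserve", "U.S. Air Force National Guard",
    "U.S. Army", "U.S. Army Reserve", "U.S. Army National Guard",
]

OWS_BY_CONUS_REGION = {
    "Southeastern": ("9th OWS", "Shaw Air Force Base, South Carolina"),
    "Northern": ("15th OWS", "Scott Air Force Base, Illinois"),
    "Northeastern": ("15th OWS", "Scott Air Force Base, Illinois"),
    "Western": ("25th OWS", "Davis-Monthan Air Force Base, Tucson, Arizona"),
    "Southern": ("26th OWS", "Barksdale Air Force Base, Louisiana"),
}

def who_notifies(region):
    """Return the squadron, base and notified branches for a CONUS region."""
    squadron, base = OWS_BY_CONUS_REGION[region]
    return f"{squadron} ( {base} ) notifies: " + ", ".join(NOTIFIED_BRANCHES)

print(who_notifies("Western"))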

SPECIAL NOTE:

USAF ASO A3 and USAF ASO A5 –

USAF Air and Space Operations ( ASO ) Plans and Requirements Directorate ( PRD ) – also known as – ( ” A3 ” and ” A5 ” ) ‘works with’ USAF Air and Space Operations ( ASO ) staff to integrate U.S. Air Force Weather Agency ( USAFWA ) – also known as – Air Force Weather ( AFW ) Concepts of Operations ( CONOPS ) with U.S. Air Force ‘plans’ and ‘programs’.

USAF Air and Space Operations ( ASO ) Plans and Requirements Directorate ( PRD ) – also known as – ( ” A3 ” and ” A5 ” ) ‘assists’ in the ‘exploitation of weather information’ for ‘warfighting operations’.

– –

The aforementioned four ( 4 ) Operational Weather Squadrons ( OWS ) additionally, provide:

– USAF Weather Agency officer ‘upgrade’ training; and, – Apprentice forecaster ‘initial’ qualification.

– –

USAFWA Manpower & Personnel Directorate ( A1 ) ‘provides’ the global spectrum, of:

– Manpower; – Organization; – Personnel; and, – Training.

SPECIAL NOTE:

USAF ASO A3 and USAF ASO A5 –

USAF Air and Space Operations ( ASO ) Plans and Requirements Directorate ( PRD ) – also known as – ( ” A3 ” and ” A5 ” ) ‘oversees’ and ‘executes’ U.S. Air Force Weather Agency ( USAFWA ) – also known as – Air Force Weather ( AFW ) Standardization and Evaluation Program for ‘Weather Operations’.

U.S. Air Force ( USAF ) Air and Space Operations ( ASO ) Plans and Requirements Directorate ( PRD ) – also known as – ( ” A3 ” and ” A5 ” ) ‘assists’ the USAF Air and Space Operations ( ASO ) staff with managing the U.S. Air Force Weather Agency ( USAFWA ) – also known as – Air Force Weather ( AFW ) ‘process’ of ‘career field-training’ by ‘obtaining training’ and ‘implementing training’ to meet ‘career field-training requirements’.

– –

USAF

Air Force Combat Weather Center ( USAFCWC – aka – AFCWC )

USAF Combat Weather Center ( Hurlburt Field, Florida ) ‘develops’, ‘evaluates’, ‘exploits’ and ‘implements’ a variety of ‘new tactics’, ‘new techniques’, ‘new procedures’ and ‘new technologies’ across U.S. Air Force Weather Agency ( USAFWA ) – also known as – Air Force Weather ( AFW ) ‘enhancing effectiveness’ for U.S.  military force branches, of the:

U.S. Special Forces; U.S. Joint Operations; U.S. Combined Operations; U.S. Air Force; and, U.S. Army.

– – – –

Source: Global Security

Space

AFNS

Space Weather Team Readies For Upcoming Solar Max by, Ryan Hansen ( 55th Wing Public Affairs )

February 25, 2011

NEBRASKA, Offutt Air Force Base – February 25, 2011 – Solar Maximum may sound like the name of a super hero, but it’s certainly no comic book or 3-D movie. Solar Maximum is actually the name for the Sun’s most active period in the solar cycle, consistently producing solar emissions, solar flares and sun spots.

For a little background on the sun’s activities, the star goes through roughly 11-year cycles in which it is very active and then relatively calm. The Sun’s last Solar Maximum occurred in 2000, and it is expected to awaken from its current solar minimum and become more active this year.
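As a rough illustration of that cycle arithmetic – a minimal sketch only, assuming the roughly 11-year period and the year-2000 maximum stated above ( real cycle lengths vary by a couple of years ) – the next few maxima can be projected like this:

# Minimal sketch: projecting approximate solar-maximum years from the roughly
# 11-year cycle and the last maximum ( 2000 ) cited in the article above.
# Real cycle lengths vary ( roughly 9 to 13 years ), so this is illustrative only.

LAST_SOLAR_MAXIMUM = 2000   # last maximum stated in the article
CYCLE_LENGTH_YEARS = 11     # approximate cycle period stated in the article

def projected_maxima(count=3):
    """Return the next `count` approximate solar-maximum years."""
    return [LAST_SOLAR_MAXIMUM + CYCLE_LENGTH_YEARS * n for n in range(1, count + 1)]

print(projected_maxima())   # -> [2011, 2022, 2033] under these assumptions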

According to the members of the 2nd Weather Squadron ( WS ), an active sun can cause all sorts of problems for us. “Solar weather plays a huge part in the warfighter’s mission,” said Staff Sgt. Matthew Money, a forecaster with the space weather flight. “Impacts from solar weather can cause radio blackouts, satellite communication failure, satellite orbit changes, satellite surface charging, or short circuits, and radar clutter.”

That is why the squadron’s worldwide space weather team of roughly 50 active-duty members, civilians and contractors continually analyzes, forecasts and provides alert notifications for the entire U.S. Department of Defense ( DoD ), as well as a slew of other government agencies.

“When ‘space weather’ causes impacts to earth that meet or exceed warning thresholds, our end users are informed within minutes,” said Staff Sgt. Jonathan Lash, space weather flight forecaster. “We send out warning bulletins through a computerized distribution system, [ and ] we have other graphical products that show what happened in the past 6-hours around the globe as well as what we expect to happen in the upcoming 6-hours,” he said.

Members of the 2nd Weather Squadron [ OWS ] rely on five ( 5 ) ground-based Solar Observatories, as well as a network of satellites orbiting the earth, to accomplish their mission.

“There aren’t too many opportunities to be the Air Force’s sole provider of something,” said Lt. Colonel Jim Jones, 2nd Weather Squadron [ OWS ] Commander. “In this case, the mission is unique to the entire DoD.”

Solar Observatories are ‘strategically placed’ around the globe in such places, as:

– Australia; – Hawaii; – Italy; – Massachusetts; and, – New Mexico.

They include both optical telescopes and radio telescopes and ensure the Weather Squadron always has one eye, or ear, on the sun.

“The optical telescope network monitors solar surface features,” said Master Sgt. Shane McIntire, the space weather flight chief. “It automatically tracks the sun and directs light to the instruments, which collect data and are controlled by computers. It scans specific regions at a rate of at least twice per minute.”

Through filtered lenses space weather analysts are able to perform flare patrol and view sunspots to determine the magnetic complexity of the region.

“The telescope has special filters that isolate a single optical wavelength,” said Master Sgt. Shane Siebert, who leads the Detachment 4 solar observatory for the 2nd Weather Squadron at Holloman Air Force Base, New Mexico. “This wavelength ( 6563 angstroms ) is called ‘hydrogen alpha’ ( H-Alpha ), where the majority of solar activity occurs,” he said. “Analysts monitor this wavelength from sunrise to sunset, and are looking for specific signatures that may lead to solar flares and other adverse activity.”
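For readers unfamiliar with the units quoted above, the 6563-angstrom H-Alpha line converts to ordinary wavelength and frequency figures with the basic wave relation ν = c / λ. The short sketch below is just that arithmetic ( an illustrative calculation, not any Air Force tool ):

# Illustrative conversion of the H-Alpha wavelength ( 6563 angstroms ) quoted
# above into metres, nanometres and frequency, using nu = c / lambda.

SPEED_OF_LIGHT_M_S = 2.998e8         # speed of light, metres per second
H_ALPHA_ANGSTROMS = 6563             # wavelength cited by the 2nd Weather Squadron

wavelength_m = H_ALPHA_ANGSTROMS * 1e-10           # 1 angstrom = 1e-10 metres
wavelength_nm = H_ALPHA_ANGSTROMS * 0.1            # 1 angstrom = 0.1 nanometres
frequency_hz = SPEED_OF_LIGHT_M_S / wavelength_m   # nu = c / lambda

print(f"H-Alpha: {wavelength_nm:.1f} nm, about {frequency_hz:.3e} Hz")
# -> H-Alpha: 656.3 nm, about 4.568e+14 Hz ( deep-red visible light )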

But not all of the sun’s activities can be captured using optical telescopes.

Some events have a unique radio-frequency signature that can also be measured.

Using a mixture of technology from the 1970s to the present day, radio observatories are able to monitor frequencies in the 25 MHz to 180 MHz range, as well as eight ( 8 ) other discrete frequencies. Their digitized output is collected by a computer and then processed and analyzed for solar activity.

“We actually are able to detect the specific strength at a given radio frequency,” said Major Bradley Harbaugh, who commands the Detachment 5 solar observatory for the 2nd Weather Squadron [ OWS ] at Palehua, Hawaii. “What we detect are energetic solar emissions in [ specific ] frequency bands or ranges. When detected, we [ are able to describe ] the start time, duration, intensity and type of solar emission. This helps describe the potential impacts by identifying the characteristics of what may impact earth,” he said.
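The process Major Harbaugh describes – scanning a digitized radio-flux stream and reporting the start time, duration and intensity of any solar emission – can be illustrated with a very small thresholding sketch. This is a hypothetical example of the general technique ( comparing samples against a quiet-Sun baseline ), not the actual observatory software:

# Hypothetical sketch of burst detection on a digitized radio-flux stream:
# flag contiguous runs of samples exceeding a quiet-Sun baseline by some factor,
# and report each run's start time, duration and peak intensity.

def detect_bursts(times, samples, baseline, threshold_factor=2.0):
    """Return ( start_time, duration, peak ) tuples for runs above threshold."""
    bursts, start, peak = [], None, 0.0
    for t, flux in zip(times, samples):
        if flux >= baseline * threshold_factor:
            if start is None:
                start, peak = t, flux
            peak = max(peak, flux)
        elif start is not None:
            bursts.append((start, t - start, peak))
            start = None
    if start is not None:                      # burst still in progress at the end
        bursts.append((start, times[-1] - start, peak))
    return bursts

# Toy data: one flux sample per second, quiet-Sun baseline of 100 flux units.
times = list(range(10))
samples = [101, 99, 250, 400, 390, 120, 98, 102, 300, 310]
print(detect_bursts(times, samples, baseline=100.0))
# -> [(2, 3, 400), (8, 1, 310)]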

Identifying these solar emissions is crucial to warfighter communication abilities.

“If there is solar energy that increases on your frequency, you can try to talk into your radio, but the noise from the sun will be stronger than your transmission, therefore drowning-out what you are saying,” Major Harbaugh said, “As an operator, you can increase your radio power to try and ‘out-broadcast’ the sun but you are also now broadcasting over a much larger area, making your transmission more susceptible to enemy detection. Therefore, the Sun’s impact must be a consideration when planning a mission.”

The Weather Squadron’s network of satellites includes those owned and operated by the U.S. Department of Defense ( DoD ), the National Aeronautics and Space Administration ( NASA ) and the National Oceanic and Atmospheric Administration ( NOAA ) – a combination of systems dedicated solely to space weather as well as a few utilizing space weather sensors.

“We gather a significant amount of data from satellites,” Sergeant McIntire said. “Imagery from [ satellites ] can augment the ground-based network, providing real-time monitoring of solar features at wavelengths that can’t be seen from the ground.”

Data from all of these sources combined are continually pushed to the space weather operations center at the U.S. Air Force Weather Agency ( USAFWA ) here.

With this information in hand, the Weather Squadron can produce the most reliable space weather forecast possible. However, even with all of this data, producing a space weather forecast is still much more difficult than creating one for terrestrial weather.

“Space weather is a terribly difficult science and it takes a lot of training and experience,” Colonel Jones said.

“Space weather forecasting is very reactive,” Sergeant Money said. “The ‘knowledge and tools are not quite up to par in order to do accurate forecasting’ like we do here on Earth.”

It is also important to note that today the world is much more reliant on space-based assets than it was during the last Solar Maximum, officials said. With cellphones, portable navigation devices and satellite television receivers all part of our daily lives, a huge solar weather event could wreak havoc on quite a few different platforms.

“The impact of a solar storm in 2000 was probably not as great, due to the lower density of space technology, and the limited number of consumers utilizing the data,” Major Harbaugh said, “However, the ripple from a ‘major solar event now will more likely be felt across a much broader consumer base’ [ the public ] since there are many more assets and many more users of space data.”

However, with improved technology and an increased knowledge of the sun’s activities, the Weather Squadron ( WS ) is more prepared than ever for the upcoming Solar Maximum, Colonel Jones said. “Since the last solar maximum, we’ve upgraded most of our numerical models in terms of both their basic science and the data they ingest,” he said. “That’s a direct result of the advances in sensors and the technology that enables rapid data transfer. We can react faster and see farther than ever before.”

“We already have members within the unit developing forecast techniques based on signatures we see on the sensors,” Sergeant Money said.

So it’s a safe bet that ‘the next 2-years’ will be ‘hectic’ for the 2nd Weather Squadron [ OWS ].

Their mission – to provide ‘situational awareness to key decision-makers’ – will certainly keep ‘them’ busy.

“In the last 30-days alone ( February 2011 ), we’ve had [ more than 30 ] reportable [ solar ] energy events,” Major Harbaugh said. “The workload has ‘already increased’ and ‘will continue to do so’ for probably ‘the next 1-year’ or ’2-years’.”

“About 1-year ago, it was not uncommon for an analyst to only have one ( 1 ) very small Solar Region of the Sun to monitor,” Sergeant Siebert said. “Today, it is normal for analysts to keep fairly busy monitoring 4 Solar Regions to 6 Solar Regions. Studies, of the last Solar Maximum, show that typically 1-day included twenty-two ( 22 ) active Solar Regions – almost 4 times our current workload,” he added.

Regardless, Weather Squadron [ OWS ] ‘space weather’ analysts, forecasters and technicians globally are ready for the upcoming solar fury [ see, e.g. Solar Energetic Particle Event ], Colonel Jones said.

– – – –

The U.S. Air Force obviously needs ‘far-more’:

– More personnel for U.S. Air Force Weather Agency ( USAFWA ) Weather Group ( WXG ) Operational Weather Squadron ( OWS ) efforts;

– Independent ’governing board’ or ‘blue ribbon watchdog committee’ overseeing an ’independent monitoring agency’ placing the U.S. Air Force Weather Agency ( AFWA ) and its USAF Air and Space Operations ( ASO ) under an independent ‘microscope’; and,

– Public Transparency notifications, would also be nice, but then…

– – – –

Source: MSNBC.COM

Huge Solar Flare’s Magnetic Storm May Disrupt Satellites, Power Grids

March 7, 2012 1:19 p.m. Eastern Standard Time ( EST )

A massive solar flare that erupted from the Sun late Tuesday ( March 6, 2012 ) is unleashing one of the most powerful solar storms in more than 5-years, ‘a solar tempest that may potentially interfere with satellites in orbit and power grids when it reaches Earth’.

“Space weather has gotten very interesting over the last 24 hours,” Joseph Kunches, a space weather scientist at the National Oceanic and Atmospheric Administration ( NOAA ), told reporters today ( March 7, 2012 ). “This was quite the Super Tuesday — you bet.”

Several NASA spacecraft caught videos of the solar flare as it hurled a wave of solar plasma and charged particles, called a Coronal Mass Ejection ( CME ), into space. The CME is not expected to hit Earth directly, but the cloud of charged particles could deliver a glancing blow to the planet.

Early predictions estimate that the Coronal Mass Ejection ( CME ) will reach Earth tomorrow ( March 8, 2012 ) at 07:00 a.m. Eastern Standard Time ( EST ), with the ‘effects likely lasting for 24-hours and possibly lingering into Friday ( March 9, 2012 )’, Kunches said.

The solar eruptions occurred late Tuesday night ( March 6, 2012 ) when the sun let loose two ( 2 ) huge X-Class solar flares that ‘ranked among the strongest type’ of sun storms. The bigger of those 2 flares registered as an X Class Category 5.4 solar flare on the space weather scale, making it ‘the strongest sun eruption so far this year’.

Typically, Coronal Mass Ejections ( CMEs ) contain 10 billion tons of solar plasma and material, and the CME triggered by last night’s ( March 6, 2012 ) X-Class Category 5.4 solar flare is ‘the one’ that could disrupt satellite operations, Kunches said.

“When the shock arrives, the expectation is for heightened geomagnetic storm activity and the potential for heightened solar radiation,” Kunches said.

This heightened geomagnetic activity and increase in solar radiation could impact satellites in space and ‘power grids on the ground’.

Some high-precision GPS ( Global Positioning Satellite ) users could also be affected, he said.

“There is the potential for ‘induced currents in power grids’,” Kunches said. “‘Power grid operators have all been alerted’. It could start to ’cause some unwanted induced currents’.”

Airplanes that fly over the polar caps could also experience communications issues during this time, and some commercial airliners have already taken precautionary actions, Kunches said.

Powerful solar storms can also be hazardous to astronauts in space, and NOAA is working closely with NASA’s Johnson Space Center to determine if the six ( 6 ) residents of the International Space Station ( ISS ) need to take shelter in more protected areas of the orbiting laboratory, he added.

The flurry of recent space weather events could also supercharge aurora displays ( also known as the Northern Lights and Southern Lights ) for sky-watchers at high latitudes.

“Auroras are probably the treat that we get when the sun erupts,” Kunches said.

Over the next couple days, Kunches estimates that brightened auroras could potentially be seen as far south as the southern Great Lakes region, provided the skies are clear.

Yesterday’s ( March 6, 2012 ) solar flares erupted from the giant active sunspot AR1429, which spewed an earlier X Class Category 1.1 solar flare on Sunday ( March 4, 2012 ). The CME from that one ( 1 ) outburst mostly missed Earth, passing by last night ( March 6, 2012 ) at around 11 p.m. EST, according to the Space Weather Prediction Center ( SWPC ), which is jointly managed by NOAA and the National Weather Service ( NWS ).

This means that the planet ( Earth ) is ‘already experiencing heightened geomagnetic and radiation effects in-advance’ of the next oncoming ( March 8, 2012 thru March 9, 2012 ) Coronal Mass Ejection ( CME ).

“We’ve got ‘a whole series of things going off’, and ‘they take different times to arrive’, so they’re ‘all piling on top of each other’,” Harlan Spence, an astrophysicist at the University of New Hampshire, told SPACE.com. “It ‘complicates the forecasting and predicting’ because ‘there are always inherent uncertainties with any single event’ but now ‘with multiple events piling on top of one another’, that ‘uncertainty grows’.”

Scientists are closely monitoring the situation, particularly because ‘the AR1429 sunspot region remains potent’. “We think ‘there will be more coming’,” Kunches said. “The ‘potential for more activity’ still looms.”

As the Sun rotates, ‘the AR1429 region is shifting closer to the central meridian of the solar disk’, where flares and associated Coronal Mass Ejections ( CMEs ) may ‘pack more of a punch’ because ‘they are more directly pointed at Earth’.

“The Sun is waking up at a time in the month when ‘Earth is coming into harm’s way’,” Spence said. “Think of these ‘CMEs somewhat like a bullet that is shot from the sun in more or less a straight line’. ‘When the sunspot is right in the middle of the sun’, something ‘launched from there is more or less directed right at Earth’. It’s kind of like how getting sideswiped by a car is different than ‘a head-on collision’. Even still, being ‘sideswiped by a big CME can be quite dramatic’.” Spence estimates that ‘sunspot region AR 1429 will rotate past the central meridian in about 1-week’.

The sun’s activity ebbs and flows on an 11-year cycle. The sun is in the midst of Solar Cycle 24, and activity is expected to ramp up toward the height of the Solar Maximum in 2013.

Reference

http://www.msnbc.msn.com/id/46655901/

– – – –

Cordially submitted for review and commentary by,

 

Kentron Intellect Research Vault ( KIRV )
E-MAIL: KentronIntellectResearchVault@Gmail.Com
WWW: http://KentronIntellectResearchVault.WordPress.Com


Research References

http://www.af.mil/information/factsheets/factsheet_print.asp?fsID=157&page=1
https://login.afwa.af.mil/register/
https://login.afwa.af.mil/amserver/UI/Login?goto=https%3A%2F%2Fweather.afwa.af.mil%3A443%2FHOST_HOME%2FDNXM%2FWRF%2Findex.html
http://conceptactivityresearchvault.wordpress.com/2011/01/02/solar-energetic-particle-event-effects
http://www.globalsecurity.org/space/library/news/2011/space-110225-afns01.htm
http://www.wrf-model.org/plots/wrfrealtime.php

 

Solar Energetic Particle Event ( SEPE ) Effects

Solar Dynamic Thermal Magnetics

[ IMAGE ( above ): Click on image to enlarge. ]

Solar Energetic Particle Event ( SEPE ) Effects
by, Paul Collin ( Author for Concept Activity Research Vault – CARV )

LOS ANGELES – January 13, 2016 ( Updated ) [ Originally Published: January 2, 2011 ] – On the evening of October 25, 2006 the National Aeronautics and Space Administration ( NASA ) launched ( from Kennedy Space Center in Florida ) two ( 2 ) unmanned spacecraft ( i.e. STEREO-A and STEREO-B ) on an unprecedented space mission to record and warn officials about our Sun erupting any ‘Earth-directed’ “Coronal Mass Ejection” ( CME ) ‘solar flare’ creating a seriously significant “geomagnetic storm” impacting Earth with a “Solar Energetic Particle ( SEP ) Event” – something NASA officials claim “we all need to be concerned about” because it can ‘instantly shutdown all electrically operated equipment’ on the ‘ground’ and ‘sea’ as well as in the ‘air’ and ‘space’.

What exactly, on Earth, is a ‘Solar Energetic Particle Event’ ( SEPE )?

How ‘important’ is ‘preparing’ for a ‘significant’ Solar Energetic Particle Event ( SEPE )?

Many governments around the world have key individuals ( e.g. officials, military adjunct specialists, public utility electricity company representatives and other technical professionals ) who will know when a ‘seriously significant’ Solar Energetic Particle Event ( SEPE ) from a solar flare will cause the shutdown of ‘national electricity infrastructure grids’ – and if only for a period of “1-week”, such an event could throw population centers of people back into a period of the Dark Ages, ‘living without electricity’.

U.S. Air Force Weather Agency, Office of Naval Research ( ONR ), Naval Research Laboratory ( NRL ), NASA, EUROPEAN SPACE AGENCY ( ESA ), JAPAN SPACE AGENCY ( JSA ), NOAA, and a ‘few others’ have developed extremely good Global Environmental Intelligence ( GEI ) predictives based on mathematical formulas and astrophysical, geothermal and geochemical scientific calculations; however, absolutely ‘no’ United States ( USA ) federal government organization states that ‘any alert may even be provided’ as to ‘when’ the “Solar Maximum SEPE” will likely occur, so populations cannot react until ‘very shortly before’ the Solar Energetic Particle Event ( SEPE ) ‘strikes Earth’.

Serious U.S. Department of Homeland Security Business

Solar Energetic Particle Event ( SEPE ) effects on Earth and mankind are understood by professionals familiar with the damages they cause, however very few citizens know about this, and even fewer know that ‘what to be preparing for’ is coming very soon and will impact their lives.

While a Solar Energetic Particle Event on Earth is ‘not doomsday with a crash and bang’, many will soon realize it comes with ‘no warning’ and falls ‘extremely silently’ on ‘large sections of populations’ who never even knew – let alone expected – what a Solar Energetic Particle Event ( SEPE ) really means for those living in modern-day society.

Masses of people will be ‘immediately confused’, ’not fully understanding’ what has happened to them all when nothing electrical operates for a long period of time.

Experiencing ‘no electricity’ will be ‘totally foreign’ for many, so it becomes ‘vitally important’ to begin learning today ‘how to prepare to survive’ within a ‘not too distant future society’ that has been instantly separated from all the modern-day conveniences it has lived with for decades.

Sound impossible or ridiculous? Try telling the U.S. Department of Homeland Security, who knows what is coming soon, that it is impossible and ridiculous and see what they tell you!

Global governments are preparing to ‘try to minimize societal impacts’ for a significant Solar Energetic Particle Event ( SEPE ) coming that will ‘initially affect regional populations’ and ‘secondarily impact global populations’ immediately dependent on ‘consumable food’ and ‘fresh drinking water’ worldwide.

Basic Understandings

FIRST ( 1st ) – “Coronal Mass Ejections ( CME )” or “solar flares” throw off ’high-energy electron particles’, but those thrown ‘directly at Earth’ – in ‘massive quantities’ and at ‘very high speed’ – create ‘serious impacts on Earth societies’; and,

SECOND ( 2nd ) – “Solar Energetic Particle Events ( SEPE )” are ‘invisible electrical disturbances striking Earth’ from the Sun during its “11-year cycle” but primarily during its most volatile cycle period known as its “Solar Maximum” when the Sun’s ‘inner magnetic pole’ shifts ‘upside down’ in a 180-degree flip.

Nothing can be done to ‘stop’ the Solar Energetic Particle Event ( SEPE ) from happening on Earth, but plenty can be done to ‘reduce societal effects’ so long as people are taught ‘how to survive without electricity’ for an ‘extended period of time’.

Many would like to know just ‘how long’ an ’extended period of time’ without electricity they will have to endure. Unfortunately, that information and more has not been provided by government officials or by mainstream news media broadcasts either, so many people still remain in the dark as to ‘how to even prepare to survive’.

Solar Energetic Particle Event Effects To Understand –

FIRST – Understand only ‘factual details’;

SECOND – Heighten ‘other people’s awareness levels’ by ‘openly discussing SEPE effects’;

THIRD – Create ’your own individual plan’ to follow ‘no matter where you are located’ ( ‘stationary’ or ‘traveling’ ) during ‘any time’ ( ‘day’ or ‘night’ ).

Unfortunately, ‘not even one’ ( 1 ) – of the aforementioned three ( 3 ) basics – was suggested by any ‘government official’ nor was any discussed on any ‘news broadcast’.

What Is Going To Inform People About Solar Energetic Particle Event ( SEPE ) Effects?

The only report providing ‘advanced public notification details’ for ‘what will be encountered’, ‘when to plan’, and ‘how to survive without electricity’ is ‘this report’, because ’no report has been prepared or distributed for the public’ by either the U.S. Department of Homeland Security ( DHS ) or the U.S. Federal Emergency Management Agency ( FEMA ).

How Do We Know It’s So Important?

Global Environmental Intelligence ( GEI ) is processed through the U.S. Air Force Weather Agency ( AFWA ) Air Force Weather Enterprise ( AFWE ) computer network system controlled by the U.S. National Security Agency ( NSA ) under the U.S. Department of Defense ( DOD ).

Does this stuff have anything to do with the ancient Mayan calendar ending in 2012 or any other ‘doomsday predictions’ circulating on the internet? NASA says “No; a Solar Maximum Solar Energetic Particle Event ( SEPE ) is something we all need to be concerned about.” NASA officials add, the “Mayan calendar ‘end date’ of ’2012′ is nothing to be concerned about.”

Prominent astrophysicists worldwide claim this particular Solar Energetic Particle Event ( SEPE ) will bring ‘negative impact effects’ to Earth that will ‘definitely’ be something “we all need to be very concerned about,” because if ‘enough high-energy electron particles strike Earth all at once’, as during the 1859 SEPE incident known as the “Carrington Flare,” all ‘electricity supply points’ relied upon for everything will ‘overload’, ‘burn out’ and ‘shut down’ a sizeable portion of ‘national infrastructure electricity grids’.

U.S. Department of Homeland Security ( see charts below ) experts and advisors have already confirmed that the incredibly burdensome task of rebuilding our modern-day electrical infrastructure could take up-to 10-years, dependent on the ‘remaining numbers of specialists’ available to rebuild it – those surviving the aftermath of population panic and mob mentality crimes against humanity.

Rebuilding the ‘national infrastructure electrical grid’ will not be easy for a variety of other reasons too since ‘availability of certain transformers’ – like those from ABB ( Switzerland ) – may not quickly be replaced as more than just transportation complications may ensue.

National dependence on ‘foreign commerce transportation’, sees ‘international businesses’ lacking ‘large sailing ships’ to carry ‘large quantities of any cargo’ across ‘long distance seas’ today because ‘modern ships’ depend on ‘electronic controls’ to ‘operate properly’.

National dependence on ‘domestic commerce transportation’, sees ‘local businesses’ lacking ‘horse-driven or mule-driven wagons and dog-driven sleds’ to carry ‘large quantities of any cargo’ across ‘long distance lands and mountains’ today because both ‘modern trucks’ and ‘modern trains’ depend on ‘electronic controls’ to ‘operate properly’.

Interstate highways may not remain ‘safe passageways’ with ‘millions of people fleeing major population centers’ – both ‘downtown and urban panicked areas’ where mob crimes flourish – only to end-up ‘stranded in rural areas’ where yet ‘other more improvised crimes’ include ‘attacks / raids on highways’, where theft and worse are likely to occur.

If losing the ‘national infrastructure electrical grid’ becomes a reality, as NASA and astrophysicists predict – based on advanced high technology monitoring stations in space – then in order for people not to become ‘stranded’ – since ‘emergency help will not be instantly available’ – the most important question everyone needs to know is, “When should we all ‘stop traveling’?”

Will this large solar flare SEPE happen in 2011, 2012 or 2013?

Could experts at least narrow down the time-window for some of us, so we could at least know “1-week” or “1-day” ‘before’ an SEPE is expected to strike Earth? The immediate problem is that such questions are not being answered or even addressed by anyone in government.

What are ‘we the people’ supposed to do?

How can people prepare for the effects of a ‘seriously significant’ Solar Maximum SEPE?

Unfortunately, today ( 30MAY11 ) the public still receives the old ( 29JAN09 ) NASA basic storyline warning, “Did you know that a solar flare can make your toilet stop working?” ( immediately below ):

– – – –

Source: National Aeronautics and Space Administration ( NASA )

Science News

Severe Space Weather – Social and Economic Impacts

January 29, 2009

Did you know a solar flare can make your toilet stop working?

That’s the surprising conclusion of a NASA funded study by the National Academy of Sciences entitled “Severe Space Weather Events – Understanding Societal and Economic Impacts.” In the 132-page report, experts detailed what might happen to our modern, high-tech society in the event of a “super solar flare” followed by an extreme geomagnetic storm. They found that almost nothing is immune from space weather – not even the water in your bathroom.

Reference

http://science.nasa.gov/science-news/science-at-nasa/2009/21jan_severespaceweather/

– – – –

People must awaken to ‘at least the very basic caution and warning’ NASA has already provided officially, as well as the myriad of global astrophysicists’ warnings about a significant Solar Energetic Particle Event ( SEPE ) coming soon in everyone’s direction here on Earth. People ‘need to know’ basically ‘how to prepare’ for a significant ‘solar electron particle effect’ on Earth.

Well in advance, government considers a lot of issues, including those decidedly kept secret in cases where the shocking truth would create panic. This is easy to detect simply by watching aloof official government answers skirt the facts, while the public is typically provided no decent information that could otherwise prove to be of some genuine assistance.

People need to immediately consider ‘how to continue basic living’ during the immediate moment of the SEPE, the day after the SEPE, and in its aftermath for a period of up-to 2-years. Why? The government cannot assist everyone. Understanding how severely an SEPE negatively impacts government, infrastructure and people will become a harsh reality for many; better to begin understanding it now.

A Solar Energetic Particle Event ( SEPE ) will take nations up-to 10-years ( other nations even longer ) to rebuild their infrastructure, during which time ‘population patience will expire’ amongst the masses long before governments accomplish that, which is precisely why ’even further consequential damages’ from mob mentalities and roving gangs will lengthen the years necessary for the national recovery process. Man’s greatest threat is selfishness, unwillingness to understand, and disregard for most of his fellow man – who carries the same genetic trigger mechanisms – where only upon realizing a significant threat do the two clash, turning loose ancient animalistic behavior capable of destroying each other regardless of all else. Man is equipped with his own doomsday button pushing against his age of modernization, with an SEPE as all the trigger necessary to set everything else haywire.

Restoring law and order will not be quick, nor will it be just, amidst man’s selfish inhumanity towards his fellow man when an SEPE impacts an ‘already emotionally fragile society’. Such an event will quickly flip any peaceful community into an absolute ’deadly nightmare’ for many who are unaware now of what faces them soon.

Man’s free will decides whether or not to prepare; it is each person’s own choice as to how they wish to deal with a Solar Energetic Particle Event ( SEPE ).

If one wishes their new environment to be amongst the wild embroiled masses and street gang pandemonium, then that is ‘their choice to not prepare now’. Nobody is forcing anyone to do anything ‘yet’, but if you think ‘stashed ammunition’ can hold-off a ‘hungry crowd of hundreds’, good luck. If one thinks they can ride a horse near an angry crowd and remain on that horse, good luck! Most really need to ‘think’ ahead.

NASA has been expecting a significant Solar Energetic Particle Event to occur during the Solar Maximum ‘cycle’ ( between 2011 – 2013 ), however other well-respected scientists and astrophysicists with access to much of the same data NASA uses indicate the SEPE could hit Earth as early as 2011. Who is correct? Should you wait to find out? What if by that time it’s too late for you to prepare? Are you ready, willing and able to face those consequences? How about your family? Did you bother to stop and ‘think’ about ‘their chances of survivability’ during such an SEPE without you able to reach them? Do they know you are reviewing this? Shouldn’t they be reviewing it as well? Will you remember to tell them about this? For most, they will remain unaware until it’s too late.

Some professionals believe NASA only releases data from ‘unclassified instruments’ aboard the STEREO spacecraft; however, other highly classified instrumentation exists to provide far more information, received constantly in real-time via a direct blue-green laser link receiver fed down into a ‘black vault’ room located in Hawthorne, California, where ‘unevaluated scientific intelligence’ data is fed onward to a ‘scientific data analysis center’ in Fort George G. Meade, Maryland that processes and routes ’evaluated intelligence’ data for ’actionable processing’ through another government information dissemination directorate routing to branches holding ‘directorate decision orders’ ( DDO ).

While the ‘date’ may be a guessing game amongst some professionals, one ( 1 ) thing they all agree upon: the SEPE is coming, and it will occur soon.

Earth history indicates several devastating Solar Energetic Particle Events ( SEPE ) that have surprised everyone, especially in 1859 when, immediately through the atmosphere, the SEPE aurora delivered high-energy electrons directly into the molecular structure of all types of metal wire. Because of this electron excitation, both telegraph system wire and even wire fences became so ‘hot’ that wild fires were experienced throughout America and England.

Although the 1859 SEPE was ‘before’ the modern age of electrically dependent devices and systems, recent history recorded other devastating SEPE surprises for millions of people on Earth.

It is extremely important for everyone to ‘begin understanding what is actually coming’ to Earth and ‘how to prepare’ for the serious warning NASA gave.

If anyone missed the ‘numerous mainstream news media broadcasts’ about the “Solar Maximum” they can now view all those same broadcasts within this report ( further below ).

Unfortunately, the mainstream news media did ‘not detail’ the Solar Maximum creation of what is called a Solar Energetic Particle Event ( SEPE ) on Earth that many will now be able to learn much more about in this exclusive report.

Many want to ‘know the precise date’. This report will point everyone where they can ‘learn how to monitor that information’, and they will know, but only if they bother to ‘review this report thoroughly’.

Many will discover from this report what they ‘cannot rely on’, which is actually a very big first step toward ‘preparing alternate means by which they can personally tailor their own needs to survive this SEPE’.

Many expect to hear all about the SEPE coming in-advance by simply listening to their vehicle radio, watching their home television, listening for their wireless communication devices to signal them, or catching the news from Bill the barber who lives next door to them. That type of expectation is not only unrealistic but extremely foolish as this exclusive report will certainly prove to everyone who bothers reviewing it thoroughly.

The most difficult part to convey to almost everyone is that ’all electronic devices and systems’ will ‘not provide sufficient warning time’ for them to prepare a plan for themselves and their family, and this will consequently result in many being stranded immediately, right alongside millions of other people who chose to do the exact same thing. And, they will likely be stranded for a very long and unexpected time period with no way out of their own SEPE situation.

This exclusive report does not subscribe to ’doomsday prophecies’, but to what U.S. federal officials already announced and research has proven.

Many seem to be simply ignoring the NASA news warning because ‘most do not believe it will ever adversely affect them’. This exclusive report will likely be ’their last warning’ before the SEPE hits Earth.

All information contained in this report will prepare many for what is about to occur. People want to know what the SEPE ‘effects’ have been like ‘before’ on Earth so, a specially prepared ’historical information timeline’ can now be reviewed ( immediately below ).

Solar Energetic Particle Event Problem History

The 1859 Solar Maximum SEPE proved that ‘all types of metal wires’ instantly absorb ( invisibly through the atmosphere ) huge surges of high-energy particles, e.g. X-rays, protons ( secondary protons, neutrons, deuterons, tritons ) and electrons, making wire ‘too hot to touch’ and thereby, causing:

1. Wire attached to ‘wood posts’ in the ground, providing ‘perimeter protection’ quite suddenly served to ‘ignite widespread grass fires’ destroying farms, farmlands, agricultural, homes, and more all across America and England; and,

2. Wire attached to ‘wood poles’ in the ground strung from ‘telegraph office structures’, providing ‘business and personal communication links’ quite suddenly served to ‘ignite widespread structure fires’ destroying telegraph offices and adjacent businesses and buildings ( e.g. general stores, banks, other businesses, hotels and residences ) across America and England.

Imagine what an 1859 SEPE would do today to everything based on ‘electricity’, ’electrical’ equipment and/or ‘electronics’ containing ‘wire filaments’, ’wire cables’, ‘small wires’ or ‘microchip wires’, where anything containing any type of ’wire’ instantly overloads, burns out and consequently stops everything, from water being pumped up into faucets to transportation. NASA has officially stated ‘recently’ that a Solar Maximum SEPE is coming and will strike Earth sometime between 2012 and 2013. Some astrophysicists believe it will strike in May 2013, while others ’admit they cannot be sure’ and state, “it could even happen tomorrow.”

The Sun, according to NASA and other officials, already shows a pattern indicative of a significant Coronal Mass Ejection ( CME ) due to occur. NASA has already alerted people through several mainstream media news channels that it ’will’:

1. Happen; and,

2. Happen soon.

A recent National Academy of Sciences report indicates that a Solar Maximum Solar Energetic Particle Event ( SEPE ) like the 1859 SEPE striking Earth might take as little as 4-years, but more likely 10-years, to recover from, where ‘initial damages’ ( alone ) to just the high technology infrastructure ( alone ) will initially see costs of up to $2 trillion ( $2,000,000,000,000 ) ( Source: NASA Science News – 28JAN11 ).

Earth’s magnetic field usually protects the Earth’s surface by diminishing the effects of highly accelerated ( near lightspeed and invisible ) high-energy particle bombardments but ‘not in all cases’ where the Earth experiences a Solar Energetic Particle Event ( SEPE ).

Solar Maximum Solar Energetic Particle Event ( SEPE ) Effects Chronology –

In September 1859, Earth’s magnetic shield defenses were overwhelmed by an incredibly powerful Solar Maximum SEPE blast of electrons, protons and X-rays, causing ‘telegraph wires’, ‘other wires’ and even ‘batteries’ to suddenly become too hot to touch, short-circuit, ignite fires and disrupt telegraph communications between the United States and Europe, with simultaneous emergency events occurring at once, as the Sun’s:

1. – Solar plasma ( extremely hot gas ) on the surface ( corona ) of the Sun ejected an ‘extremely intense’ electromagnetic field in a Coronal Mass Ejection ( CME );

2. – Coronal Mass Ejection ( CME ) of electrons, protons, and X-ray particles became accelerated to nearly lightspeed, traveling 93,000,000 miles ( 150,000,000 kilometers ) in only 17-hours and 40-minutes – less than 1-day – unlike typical solar storms travelling the same distance in 2-days to 4-days ( see the speed sketch just after this list ).

3. – Coronal Mass Ejection ( CME ) contained ‘highly excited magnetic field electron’ ( high-energy ) particles ‘charged in diametrical opposition’ that slammed into the ’magnetic field directions of electrons’ on Earth.
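The figures quoted in item 2 above make for a simple back-of-the-envelope speed check. The minimal Python sketch below uses only the distance and travel times cited in this report ( not an official dataset ) to show roughly how much faster the 1859 ejection moved than a typical solar storm.

# Back-of-the-envelope speed check using the figures cited above:
# ~150,000,000 km covered in 17 hours 40 minutes, versus 2 to 4 days
# for a typical solar storm.  Numbers come from this report, not from
# an official dataset.

SUN_EARTH_KM = 150e6  # approximate Sun-to-Earth distance ( ~1 AU )

def average_speed_km_s(hours):
    """Average Sun-to-Earth speed ( km/s ) for a given travel time in hours."""
    return SUN_EARTH_KM / (hours * 3600.0)

carrington_hours = 17 + 40 / 60.0        # 1859 event: 17 hours 40 minutes
typical_hours = (2 * 24.0, 4 * 24.0)     # typical storms: 2 to 4 days

print("1859 CME:    ~%.0f km/s" % average_speed_km_s(carrington_hours))
print("typical CME: ~%.0f to ~%.0f km/s" % (
    average_speed_km_s(typical_hours[1]),
    average_speed_km_s(typical_hours[0])))

Run as written, the sketch gives roughly 2,400 km/s for the 1859 ejection against roughly 430 to 870 km/s for a typical storm – about three to five times faster.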

[ SEPE 1859 Impact ‘if’ during 2008 – Collapse of “365” EHV Transformers for Years ( click to read by enlarging image ) ]

In 1921, a Solar Maximum SEPE struck Earth that, had it occurred in 2008, would have removed a minimum of 365 EHV transformers throughout the United States ( see risk projection map above ).

In 1940, a Solar Maximum SEPE penetrated the Atlantic Ocean, where the Canada ( Newfoundland ) to Scotland underwater telecommunication cable wires were slammed with electrons, spiking voltages to a scientist-recorded 2,600 volts.

In 1972, a Solar Maximum SEPE spiked Canadian electricity current, overloading and exploding a 230,000-volt transformer of the British Columbia Hydroelectric Authority.

In March 1989, 130-years later, a Solar Maximum SEPE ‘stopped all electrical power’ to more than 6,000,000 people throughout the entire province of Quebec, Canada, triggered a gas pipeline explosion that decimated a large section of the Trans-Siberian Railway, engulfed two ( 2 ) passenger trains in flames and killed 500 people, and affected areas of the northern United States of America ( USA ) – and the 1859 Solar Maximum SEPE was ‘three ( 3 ) times more powerful’ than this. Interestingly, within a few months of this Solar Energetic Particle Event ( SEPE ), ‘officially’ being discussed in the United States was the following ( immediately below ):

====

Source: North American Electric Reliability Corporation ( NERC ) Washington, D.C.

OPERATING COMMITTEE

Adam’s Mark Hotel St. Louis, Missouri

June 6, 1989

MINUTES OF THE MEETING –

A regular meeting of the North American Electric Reliability Council [ NERC ] Operating Committee was convened by Chairman Larry C. Kinard at 8 a.m. on Tuesday June 6, 1989.

A copy of the meeting notice, agenda, and list of attendees are attached as Exhibits A, B, and C, respectively.

Introductions –

Chairman – Kinard, welcomed:

– James Jackson – Manager of System Operation for the Southern California Edison Company ( SCE ), as a new member representing WSCC ( California / Nevada ); and,

– Terry Bruck – Vice President of Electric Operations for Cincinnati Gas & Electric Company ( CGE ), as a new member representing ECAR.

These meeting attendees then introduced themselves around the table.

Approval of Minutes of February 28, 1989 thru March 1, 1989 Meeting –

It was on motion by Mr. Hardage, seconded by Mr. Bulley, that the minutes were approved as distributed.

Organization –

Chairman – Kinard introduced Mr. William C. Phillips, Director of System Operations for Middle South Services ( MSS ), being the new Chairman of the Performance Subcommittee.

Vice Chairman – Uggerud reported Mr. Jackson was assigned to the Operating Information Subcommittee ( OIS ), and Mr. Bruck was assigned to the Disturbance Analysis Working Group ( DAWG ).

Board of Trustees Meeting, April 3, 1989 thru April 4, 1989 –

Chairman Kinard reviewed the highlights of the Board meeting, as follows:

– The Board accepted the Operating Committee’s strategic plan recommendations, but modified Recommendation #10 dealing with development of a white paper on Open Transmission Access ( OTA ).  Chairman Kinard noted Mr. Bulley was assigned to scope this Project.

– Don Benjamin discussed the Solar Magnetic Storm of March 13 [ 1989 ], and Hydro-Québec outage that resulted.

– Mr. Stout – Chairman of the Reliability Assessment Subcommittee, discussed concern about ‘load forecast uncertainty’.

NERC Board of Director new officers, are:

– Chairman – William Clagett, Administrator of the Western Area Power Administration ( WAPA ); – Vice Chairman – Earl Dille, President of Union Electric Company ( UEC ); and, – Secretary-Treasurer – George Edwards, Chairman of the Board & CEO of The United Illuminating Company ( TUIC ).

Technical Steering (TSC) Committee Meeting, April 28, 1989 –

– The Technical Steering (TSC) Committee discussed the March 13, 1989 solar magnetic storm.

– NESC activities were rolled into the Engineering Committee and Operating Committee.

– Engineering Committee representatives Pasternack and Mercier were added to the Interconnection Dynamics Task Force ( IDTF ). The Interconnection Dynamics Tutorial ( IDT ), will be distributed to utility attendees at the General Meeting.

– Vice Chairman Uggerud reported on the Technical Steering Committee ( TSC ) meeting, and assigned five ( 5 ) of the NERC strategies to the Engineering or Operating Committee.

– The Transmission Wheeling and Transfers ( TWT ) reference document, approved by the Engineering Committee and Operating Committee during their meeting, will be presented to the Board of Directors during their July 1989 meeting.

Operating Committee Executive Committee ( OCEC ) –

Chairman Kinard reviewed the Operating Committee Executive Committee ( OCEC ) meeting held on June 5, 1989, discussing:

– How-to mitigate Solar Magnetic Storm effects on Bulk Electric Systems ( BES ).

– Electro-Magnetic Pulse ( EMP ) events requiring restoration of utility companies, part of a U.S. Department of Energy ( DOE ) contract, identified the biggest impact falling on ‘communications systems’ of utility companies, which Vince Kruse of ASEA BROWN-BOVERI ( ABB ) [ ASEA est. 1890 ( Stockholm, Sweden ) and BROWN BOVERI et Cie. est. 1891 ( Baden, Switzerland ) ] and consultant Edward Taylor discussed with the Operating Committee Executive Committee ( OCEC ).

– Operating Committee Executive Committee ( OCEC ) strategies status.

– Operating Guide review status.

– 1988 Control performance compliance.

NERC Staff Report –

Don Benjamin discussed the Telecommunications Service Priority ( TSP ) system status and U.S. Department of Energy ( DOE ) proposed Electric Service Priority restoration system ( ESP ) for National Security Emergency Preparedness facilities.

NERC offered to assist the U.S. Department of Energy with learning about electricity customer restoration procedures.

Updates were distributed by NERC staff for several sections of 1,200 Operating Manuals in circulation.

Procedures for Notifying Utilities About Solar Magnetic Activity ( SMA )

Don Benjamin reported on new procedures for Solar Magnetic Disturbance ( SMD ) notification.

Following the Hydro-Québec blackout – March 13, 1989 – Don Benjamin, WSCC staff, Bonneville Power Administration ( BPA ) and American Electric Power Co. Inc. ( AEP ) representatives held a ‘conference call’ with the staff of the National Oceanic and Atmospheric Administration ( NOAA ) Space Environment Services Center ( SESC ), asking NOAA to provide forecasts of Solar Magnetic Disturbance ( SMD ) activity to help utility companies prepare for the event. NOAA said it could provide a forecast, agreeing to supply AEP and BPA with this information, beginning April 1, 1989.

Solar Magnetic Storm of March 13, 1989 and Hydro-Québec Outage –

Ray Elsliger, of Hydro-Québec, discussed Hydro-Québec ( HQ ) outage details, and explained:

– Solar Wind effects on Earth’s ‘magnetic field‘;

[ EDITED INSERT NOTE: Eruptive solar flares bombard Earth with Solar Energetic Particles ( SEP ), creating geomagnetic storms. Earth’s own defense is driven by a somewhat organized ‘geomagnetic core generator’ comprised of ‘molten magma’ ( lava ) consisting of ‘multiple types’ of ‘mineral materials’ containing ‘multiple types’ of ‘High Field Strength Elements’ ( HFSE ) [ e.g. niobium, kaersutite, titanomagnetite, etc. ]. Under ‘extreme’ ‘high-pressure‘ and ‘heat’, these element ‘properties’ ‘react’ in ‘multiple ways’, assuming ‘astrophysical phenomena characteristics’, only one ( 1 ) of which is ‘ExtraSuperConductivity’ ( ESC ), forcing ‘finite properties’ of subatomic particles ( electron neutrinos, tauon neutrinos, muon neutrinos, leptons, etc. ) to exit ( faster than lightspeed, invisible to the naked eye ) away from Earth until phenomenally stopping to form an ‘Extra-Planetary Core Supplied Field’ ( EPCSF ), or ‘magnetic property Earth shield’, known as the ‘Magnetosphere’ surrounding Earth, which wards off ‘high-intensity cosmic rays’ ( hazardous to Earth-living things ) ];

and,

– Solar-Induced Currents ( SIC ).

[ EDITED INSERT NOTE: Also known as “Auroral Currents” ( namesake taken from the “Aurora Borealis”, also known as the “Northern Lights” ). Dependent upon ‘solar high field strength’ and ‘directional focus’ ( aimed at Earth ), they invariably ‘penetrate’ Earth’s Magnetosphere, beneath which they become ‘atmospheric high-energy electrons’ arriving on Earth. En masse, their effects can ‘penetrate water’ and ‘accumulate’ within ‘all types of wire’ ( including ‘fence wire’ ), exhibiting no regard for the Earth Field Strength ( EFS ) electrons already flowing within electricity transmission system lines ( wires / cables ). This quasi form of ‘Solar-Induced Direct Current’ ( SIDC ) ‘travels by its own wild propulsion’, especially when reaching ‘high ground-impedance’ in ‘geographic regions’ consisting of ‘igneous rock’, and ‘inherently carries’ – along electrical transmission system lines – ‘second-harmonic currents‘ that can ‘totally saturate’ ( overfill ) electricity temporary storage capacitor ( electron storage container ) banks, exceeding over-current ( overload ) limits, while ‘high static’ VAR compensators unsuccessfully handle high VAR demands, causing relays to trip off / shut down transformers. ]

A copy of his presentation is attached, as: “ Exhibit D “

Mr. Cucchi noted PJM [ utility company ] experienced the effects of the Solar Storm, with tripped-off capacitor banks and damaged step-up transformers, and while PJM is seeking ‘operating procedures’ to assist them in mitigating future problems, he suggested a way to reveal the presence of Solar-Induced Currents ( SIC ) by summing VAR flows around transformers.

Solar Magnetic Disturbance ( SMD ) forecasting sees more work ahead.

Mr. Rodie noted that in Canada Ontario Hydro is monitoring Solar Magnetic Activity ( SMA ) forecasts from, both the:

– National Oceanic and Atmospheric Administration ( NOAA ); and, – Canadian Energy Mines and Resources ( CEMR ).

Chairman Kinard said the Solar Magnetic Disturbance ( SMD ) issue must be coordinated with the NERC Engineering Committee.

He asked the Disturbance Analysis Working Group to work on the Solar Magnetic Disturbance ( SMD ) forecasting and alerting procedures for further discussion at the August 1989 meeting.

Strategic Planning Task Force –

Mr. Wolak reported nine ( 9 ) out-of twenty ( 20 ) strategic recommendations were assigned to the standing subcommittees and the Disturbance Analysis Working Group. The Operating Committee Executive Committee is now working on the staffing requirements and scheduling.  He expects to have timetables and other details at the August 1989 Operating Committee meeting. Recommendations will most likely need personnel support from the utilities.

Regional Reserve Requirements –

As part of its strategies, the Committee has agreed to review regional operating guides for comparison to the NERC Guides.

In November 1988, the Committee selected operating reserve as its first topic, and asked Mr. Ellis to request each Region’s operating guide on this matter.  Since then, Mr. Ellis has been summarizing the guides for review and at this meeting discussed his summary of the Regions’ operating reserve requirements.  The Committee discussed each and found ‘the guides differed greatly’ from Region to Region.  Chairman Kinard will write the Committee members to ask they review their Region’s guides for agreement with NERC Operating Guides. Chairman Kinard asked Mr. McEacharn to select another operating guide topic for the August 1989 Operating Committee meeting.

Pre-Seasonal Assessments and Post-Seasonal Assessments –

President Gent reviewed the history of pre-seasonal assessment.  It began with the Electric Utility Act of 1967 to which the utilities agreed to conduct seasonal assessments as well as 10-year assessments.

Both tasks were originally handled by the NERC Reliability Assessment Subcommittee but in 1988 the Operating Committee was assigned the seasonal assessments because these reports deal with near-term system conditions that are within the period of system operations.

President Gent said the seasonal assessments need to be more candid in their description of events and asked the Operating Committee to carefully consider this in the future.

Chairman Kinard emphasized the need for the Operating Committee’s involvement in the pre- and post-seasonal assessments.  He asked Mr. Benjamin to send the seasonal assessment schedule and assessment guidelines to the Operating Committee.

Interchange Scheduling –

Mr. McEacharn, Chairman of the Interconnected Operations Subcommittee ( IOSC ), discussed the history of the Operating Guide on scheduled interchange.  This used to be Guide 3, and became Guide I.D. after the Guides were reformatted in 1987.  Many believe the Guide does not address parallel flows adequately, and the IOSC has been working on that aspect for many months.

Now, the Interconnected Operations Subcommittee ( IOSC ) needs direction from the Operating Committee on how to proceed.

Mr. McEacharn reviewed a questionnaire the Interconnected Operations Subcommittee ( IOSC ) compiled that addressed several aspects of scheduling.  After discussing the issue, the Committee members agreed to complete the survey on behalf of the utilities in their Regions.

Mr. McEacharn will send the questionnaire to the Operating Committee by July 1, 1989.  One response from each Region will be due back to Mr. McEacharn by October 1, 1989 and results will be discussed at the November 1989 meeting.

The answers should include the Regions’ opinions and background information behind their answers.

Mr. Nassief will address the impact of phase angle regulators in the WSCC response.

DC Control

Mr. Powell has discussed the need with several utilities for a Guide to address DC control.

He will have a suggested Guide revision for the Operating Committee to review at its August 1989 meeting.

1988 System Disturbances Report

Mr. Cucchi, Chairman of the Disturbance Analysis Working Group, discussed the draft report.  He noted the report’s emphasis of the problems associated with solid-state relay equipment and its sensitivity to system conditions.

It was on motion by Mr. Ham, seconded by Mr. Bulley, that the 1988 System Disturbances Report was unanimously approved for publication.

After discussion, the Committee did not believe the report warranted any changes to the Operating Guides.

MAP / TOP Task Force –

Mr. Paul Emmerich, who recently resigned from the chairmanship of the MAP/TOP Task Force, discussed the status of the MAP/TOP effort.  He emphasized the need for electric utility input to the standard.

Eastern Interconnection Hotline –

Mr. Phillips said he would like to test the Hotline more often during off-business hours.  The Committee agreed.  He will distribute the results of the first ( 1st ) Hotline test at the August 1989 Operating Committee meeting.

Don Benjamin noted that Southern Company was recording Hotline conversations.  Mr. Erickson said MAPP was too.  The Operating Committee discussed this issue and generally believed it was not a problem, however Mr. McEacharn was concerned about recording conversations by parties that were not part of the conversation. Messrs. Erickson and Campbell will determine whether MAPP and Southern Company are able to record conversations of which they are not part.

1989 Reliability Assessment –

Mr. Stout, Chairman of the Reliability Assessment Subcommittee, reviewed the capacity and demand forecasts, as well as other highlights, in the 1989 Reliability Assessment.  He also discussed a proposed questionnaire dealing with post-assessment statistics.  The Operating Committee expressed its concern about the appropriateness of the information, which would have to be collected on a non-coincident basis.

Performance Subcommittee –

Mr. Phillips’ Subcommittee report is attached as Exhibit E.

Adjourn

There being no further business, before the Operating Committee, the meeting was adjourned at 5 p.m. on June 6, 1989. F. Rich Nassief Secretary

NERC

Reference

http://www.nerc.com/docs/docs/oc/oc-8906m.wpd

====

[ U.S. Infrastructure Electricity Grid Disruption Areas – Map ( click to enlarge and read image ) ]

Not all Solar Maximums are as severe as the 1859, 1940,  1972 and 1989 SEPE effects on Earth.

In 2003, a Solar Maximum SEPE lasted 2-weeks, crippling two ( 2 ) satellites plus instruments aboard a Mars orbiter spacecraft.

NASA, indicates the Solar Maximum SEPE effects to hit Earth in 2012 or 2013 are something “we all need to be concerned about.”

Solar Maximum Solar Energetic Particle Event ( SEPE ) Effects –

Solar geomagnetic storms – also known as ‘Space Weather’, ‘Solar Wind’ or ‘Sunspots’ – can, depending on the level of severity, affect ‘all types of metal wire’, causing serious side effects to property as well as posing serious dangers to human lives.

Wire Products –

Fence wire; Light bulb filament wires; Microelectronic computer components; Batteries holding electrons; Electrical wire ( straight, curved, bent and coiled ); plus, More…

Products containing wire possess ‘relatively organized’ electron particles, but when these get hit ( at near lightspeed, invisibly to the naked eye ) by Sun-accelerated, highly energetic ( high-energy ) electrons, protons and X-ray particles, they will confuse normally operating electrical devices and systems through an ‘initial, severe and sudden disruption’ followed by a ’subtle deactivating’, and these detrimental effects are experienced in two ( 2 ) long stages:

FIRST ( INITIAL EFFECT ) STAGE – Voltage spike overload creates instant transformer explosions, electric wire cable circuit burnouts, and wire fence heat igniting grass fires, etc.; and,

LAST ( RESULTANT EFFECT ) STAGE – loss of electrification by ‘electron particle relaxation’ and ‘electrolyte cessation’ ( stopped / halted ).

Once a Solar Maximum hits Earth it becomes known as a “Solar Energetic Particle Event” ( SEPE ) having already occurred, and it affects many things in a variety of locations and on different levels, such as:

ABOVE GROUND: Fence wire, electrical power lines and cables, wire in motors, generators, batteries, microelectronic components, etc.;

UNDER WATER: Sea cable communication wire and wiring;

ABOVE WATER: Ocean going vessel ship microelectronic components and wiring;

AIR: Airplane avionic microelectronic components and wiring; and,

SPACE: Satellite microelectronic components and wiring.

On January 21, 2009 NASA released a report claiming Solar Maximum storms that exhibit geomagnetic effects – resembling Electro-Magnetic Pulse ( EMP ) radiation ( harmless to humans, but causing the severe electrical black-outs seen with nuclear bomb detonations ) – would produce severe electron overloads, disabling anything electrical.

NASA report co-author John Kappenman, of METATECH CORPORATION, indicated that ‘even more powerful’ than the aforementioned 1989 Solar Maximum was the much earlier May 1921 Solar Maximum, which, if it hit today, would see modern society feel the effects of at least 350 major electrical transformers being destroyed, leaving over 130,000,000 people without electricity in the United States alone.

[ click to read by enlarging image ( above ) ]

The loss of electricity would ripple across the social infrastructure of the United States and other countries, whose citizens would see ”water distribution affected within several hours, perishable foods and medications lost in 12-hours to 24-hours, loss of heating, air conditioning, sewage disposal, phone service, fuel re-supply and so on,” indicated Kappenman in the NASA 2009 report.

Now, the May 1921 Solar Energetic Particle Event ( SEPE ) was not nearly as powerful as the earlier September 1, 1859 through September 2, 1859 Solar Energetic Particle Event, where just one ( 1 ) of the many ‘negative effects’ was that even barbed wire fences became so ‘hot to the touch’ that the wire strung on wooden fence posts set the posts on fire.

What more should we be expecting from a Solar Maximum that NASA warns “we all need to be concerned about?”

NASA Jet Propulsion Laboratory ( Pasadena, California ) research led by Bruce Tsurutani indicated, “In 1859, the technology was quite low in comparison to today’s technology, however the technology that we rely on today is much more vulnerable.”

Society back then ( 1859 ) was not adversely affected by that Solar Maximum storm in any way approaching how it would affect everyone on Earth today. In 1859, the telegraph system was only 15-years old. There were no Earth-orbit satellites supplying broadcast news feeds to cable television, no cellphones or automated teller machines, no electric power grids, and no computer chips and computer controls within automobiles, trucks, trains, airplanes, ships, dams, water supplies, refrigeration, and more.

European Space Agency project scientist Bernhard Fleck, for the SOHO spacecraft watching the Sun, indicates the next Solar Maximum super solar storm will be detectable, however that is only half the story because we really will not know what the size of the effects will be until after it hits Earth again. According to ‘previous NASA reports’ the Solar Maximum is due to hit Earth between 2010 and 2011.

SOHO deputy scientist Paal Brekke indicates the 1859 Solar Maximum flare caught observers logging almost 1-minute of sunlight ‘doubling’ in the flare region of the Sun’s corona. “Such a strong White Light ( WL ) ( now known as “White-light emission” ) flare has never been seen since.”

“So if this type of flare happened, yes we would know right away,” adding that Earth’s ‘magnetic field orientation would not be known’.

Communication disruption will be imminent as soon as another Solar Maximum reaches Earth.

[ click to enlarge image ( above ) ]

There is, however, another historical observation from September 1859, after the Solar Maximum hit Earth. Amazingly, batteries were being disconnected because they simply were ( temporarily ) no longer needed to operate many types of electrical equipment, which automatically kept running on what people in 1859 attributed to a most unusual “atmospheric current” or ”aurora current”, also referred to as an “auroral current.”

Who is really in danger of losing out economically – the utility companies? Electric metro rail commuter trains operate on electricity, so if ’no electricity is required’ does that mean public passengers will be charged less for ticket fare than they are today? What other ‘positive effects’ could occur? Electric vehicles and gasoline-electric hybrid cars – will they continue to operate without any electricity, or simply ‘fail to operate at all’ in the wake of the Solar Maximum aftermath?

Solar Maximum SEPE ‘aftermath effects’ should see us preparing now for what also comes ‘after’ the “Auroral current” experience dissipates.

Will all electrical devices and systems be permanently affected and go dormant until old burned-out parts can be replaced with new replacement parts? How long will these types of major jobs take to complete before we may return to a life of normalcy?

Who will experience greater losses? People? Companies?

Food suppliers ( e.g. ARCHER DANIELS MIDLAND, BARILLA, etc. ) will see ‘no way to deliver fresh food to customers’, especially in metropolitan and urban areas, and while ‘farms’ and ‘farmer markets’ will become popularly widespread it is equally important to realize the ‘time it takes to deliver fresh food to customers’.

What about ‘Winter’ blizzards when roads become inaccessible, do people stop eating?

What about ‘drought’, as national farm belt regions encounter ‘periods of hot temperatures’, or ‘no water pumped electrically’ from ‘water utility companies’?

Even more issues to consider about everyone’s destiny, is why governments may secretly be expecting ‘difficult access’ to ‘water’, ‘food’, or ‘both’. Sound impossible?

Touching on one ( 1 ) ‘secret’ reveals ‘national native tribal agriculture seeds’ ( ‘purest food form seeds’ ) are being purchased at an ‘incredibly alarming rate’ and at equally alarming ‘higher prices’ than usual. Why purchase these seeds, and who is buying them all up? Governments are doing this, and worldwide too.

For many decades, Russia has continued to maintain its ‘top secret’ government “State Seed Protectorate,” and while the U.S. government recently began buying up all the seeds it could from Native American Indians, other governments in Latin America, Australia, Africa and elsewhere are buying up all they can too. Interesting as that may seem, even more incredible is ‘where governments are storing these incredibly huge quantities of seeds’: an incredibly huge ‘seed bank’ located in – of all places – a ‘northern’ Arctic region. Why the sudden rush?

While ‘utility suppliers’ – without electricity – see ‘no way to distribute electricity’, ‘natural gas’ to heat customer homes, or ‘home heating oil’ to heat homes, and ‘telephone’ customers are left without any telephone service, it is an equally economic fact that ‘governments will lose those company customer tax incomes’. However, ‘coal companies’ will take over, for the United States has the world’s largest natural resources of ‘coal reserves’ that may be used in the future to heat not only homes but also to once again power trains, trucks and even automobiles.

Might we be forced to ‘invent new means’ by which we can go forward operating under an entirely new magnetic electrical principle on Earth?

Will all spacecraft satellites and the International Space Station ( ISS ) experience ‘orbit decay’ consequently falling back to Earth? If so, what effects can Earth residents and businesses globally expect from such falling objects from space?

Worldwide employment is about to become busier than ever before in the history of mankind.

Global unbalance from Solar Maximum SEPE

As governments amass mankind to rebuild government infrastructures, what will be done about ‘not enough people to fill so many new job openings’?

Will everyone so focused on such jobs that war might become a thing of the past? If so, what will governments do about an incredibly consequential population explosion?

How will governments ‘manage supplying even more water’ and ‘even more food’ to ‘even more children being born and growing-up during a 10-year ‘infrastructure recovery period’?

While speculation continues to abound, surrounding a variety of problems answered by new concepts, NASA believes it has provided everyone with all they need to know – “Toilets won’t flush” – because of the Solar Maximum SEPE due, which is “something we all should be concerned about.”

What does ‘that’ NASA statement allow anyone to really understand? Nothing perhaps, except that the word ”concerned” is a ‘comforting word’ used to ‘allay fear and panic’ by eliminating the ‘reality word’; “worried.”

What precisely is ‘it’ – or is there actually ‘more than one issue’ – ”we all need to be worried about?”

Unfortunately, NASA is ‘not saying anything more’ other than it’s ‘something’ “we all need to be worried about.” NASA’s “something,” is it a ‘major’ “something” that will change everyone’s life?

No ‘clear official government definition’ is provided anywhere, which throws open the doors of speculation about ‘what’ all the ‘ill effects’ will be, and what ‘do’ ”we all need to be ‘concerned’ about?”

Speculation on an event of this magnitude, with no answers coming from anyone in an official position, simply fails to direct anyone to do anything to prepare themselves or their families. Hence, ‘this’ is precisely what leaves the door open for wild worry and panic to enter many lives – so, is ‘sheer panic’ what governments want to take hold?

Does a Solar Maximum SEPE scatter our brain waves, rendering everyone unable to function, much in the same fashion by which government left everyone in a fog to guess about ‘what’, ‘when’, and ‘how much’ will happen that is “something we all need to be concerned about”? Will ‘mental health’ issues increase? Why, then, is there ‘no information guidance’ as to what the effects will be of the Solar Maximum SEPE event that, according to NASA, is due?

It becomes real scary when NASA says, “We don’t know to what extent the effects will be yet, and will not know until after it hits Earth.” Gee? That “waiting to see what happens” is ‘real comforting’, right?

What does the U.S. Federal Emergency Management Agency ( FEMA ) – now appearing on yet another U.S. government ‘website’ named “Ready.Gov” – have to announce to the public about a Solar Maximum SEPE that will disrupt electricity supply? ( see immediately below )

Will a Solar Maximum SEPE allow everyone to ‘float through the air’ ( joking ) in the electrically excited auroral current of ‘green atmospheric electrons’ ( no joke )?

There ’must be a reasonable limit’ set as to both the ‘negative effects’ and ‘positive effects’ that all citizens should be made aware of by their government leadership.

Today we rely on more than just ‘ground-based’ telephone and telegraph wire cables to communicate across vast distances, also using ‘space-based’ satellites and ‘sea-based’ oceanic communication cables. Is this ‘major solar event’ something we really need to be concerned about?

Can NASA get ‘their’ story straight?

NASA leadership sees other NASA officials saying “nothing significant” is going to occur. Whoa, wait a minute. Then ‘what U.S. government agency’ is the ‘public supposed to be listening to’?

Can the U.S. government get ‘their’ story straight?

Wondering about all this, I decided to call the U.S. Federal Emergency Management Agency ( FEMA ), who said to check with NASA, who said check with the U.S. Department of Homeland Security ( DHS ), who said check with your local government, who did ‘not even know what a SEPE was’ let alone ‘how to prepare for one’. Several local government leaders said, “Is this a joke?”

Who knows about the ‘effects’ from a Solar Maximum SEPE on Earth?

If the public will not wake up to the ‘most current’ NASA announcement on ‘several television news broadcasts’, should they pay closer attention to someone else? Forget about asking the “Men In Black” ( MIB ), who were taken off the airwaves in the late 1960s when Quinn Martin shut down his black-and-white television series, “The Invaders,” where ‘phone lines’ and ‘people’ often turned up ‘dead’.

How about FOX News’ Glenn Beck? He says he has his “FOX Team” of “researchers” who discover things for him all the time!

Perhaps, ‘Jay Leno’ would get off his motorcycle and ‘ask the man on the street’?

How about your ‘local psychic’ who claims to ‘know all’ and ‘see all’ in-to ‘your future’!

I’m sure Tina, the gal standing on the corner with the pink spiked hair-do, could find out for you if the price was right.

I’m leaning, however toward the man on the street holding the sign, “The End Is Near!”

You decide ‘who will be the best judge over information’ for you and your family!

The fact is we ‘all’ – at least – need to be ‘aware’ of the Solar Maximum SEPE as a coming ‘major event’ that will likely change how we live here on Earth sometime in 2011, 2012, or 2013, according to ‘some brave government officials’, ‘professionals’ and ‘researchers’ around the world who probably ‘know better than most of us’ at the current time.

Everyone’s life today is ‘dictated by matters of convenience’

Today ( 2011 ) ‘we have all become as children’, ‘extremely dependent’ on certain things that are ‘important to us very personally’, and ‘electronics have enabled us’ to ‘accomplish many things for ourselves’ on a ‘daily basis’ now for many decades. Unfortunately, many people take for granted items like ‘gasoline’ provided at ‘gas stations’ ( ‘pumps’ all ‘controlled by electricity’ ), where we know our ‘vehicles’, ‘buses’, and transport trucks ( ‘engines’ all ‘controlled by electricity’ ) bring us ‘near to everything’ we ‘need’. Our foods, we know, are ‘kept fresh’ by ‘cool temperatures’ ( ‘refrigerated containers’ all ‘controlled by electricity’ ). Other conveniences we take for granted are Automatic Teller Machines ( ‘ATMs’ all ‘controlled by electricity’ ) for cash, banks ( ‘computers’ all ‘controlled by electricity’ ) for direct deposits and cash, Point Of Sale ( ‘POS’ machines all ‘controlled by electricity’ ) for retail counter purchase transactions ( cash, debit, or credit ), ‘postal mail delivery’ ( ‘trucks’ and ‘airplanes’ all ‘controlled by electricity’ ) and – ‘last but not least’, as even NASA warned us about in 2009 – “toilets that will not flush” ( ‘water company pumps’ all ‘controlled by electricity’ ), though ‘even more important’ than “toilets” is ‘how water reaches us to drink’ – from faucets!

Many will discover more answers from this report, its video clips and references that will now at least provide a few helpful hints for people to begin considering and planning for when the Solar Maximum SEPE arrives tomorrow – and ‘it’ will be coming to Earth ‘very soon’.

Predicting a Solar Maximum SEPE on Earth

Although one ( 1 ) of many NASA missions began over 4-years, 4-months and 10-days ago, it is important to know that ‘invisible energy particles’ come from a Coronal Mass Ejection ( CME ), which most people call a “solar flare.”

New terms ‘distancing public awareness’ controlling ‘public complacency’

But what’s in a name? Government ‘official politically correct terms’ lead many into a state of ‘false complacency’ ( resting at ease because “nothing’s going to happen” ) by now rarely using the familiar term “solar flare,” preferring “solar wind.”

Gee? On the surface it appears as “nothing” to the public but a nice “solar wind” washing across everyone’s face as they bask in the Sun at the beach, right? Wrong!

A “solar wind” – by the time it strikes Earth – will ‘not blow’ or ‘even move’ so much as one ( 1 ) single ‘feather’, but the term “solar wind” ‘allays public fear’ of any ‘emerging ray’ while ‘dumbing-down public knowledge’ as to what a “solar flare” really is, what it consists of, and the ‘damage it can and usually does do’ to many people’s lives on Earth. Hey, but do ‘not’ hang the person who coined the term “solar wind” – at least not yet.

A “coronal mass ejection” ( CME ) is, a “solar flare” that becomes, a “solar wind” when it leaves the surface of the Sun, and by the time it strikes Earth, it becomes known as a “Solar Energetic Particle Event” ( SEPE ) consisting of ‘invisible high-energy particles’ of which ‘electrons’ negatively affect ‘electricity’.

Remember the aforementioned, and while everyone else is lost, not knowing what to do – thanks to ‘government officials’ and ‘television news’ making everything in life sound so rosy – you’ll ‘know precisely’ what ‘it’ is ‘that just struck you’ and ‘all around you’, so you can ‘quickly begin dealing with the aftermath’ that will definitely follow.

Who knows? You might even end-up becoming a ‘surviving leader’ of a ‘whole new tribe’ of ‘poor unfortunate souls’. Those wishing to be ‘followers’ – please skip reading the rest of this report <grin>.

Coronal Mass Ejections ( CME ) pack one heck of a powerful wallop throughout our Solar System on several planets so they are fairly complicated, as follows:

– Sun-surface ‘high-energy magnetic eruptions’ are like ‘reversed electrical lightning storms with a thermonuclear magnetic boost’ from the ‘highly magnetic field layers of liquid magma’ – from the ‘core’ of the Sun – circulating up to the surface, where ‘extreme high field strength element ( EHFSE ) niobitic magnetic loops’ seep through the surface ( corona ), reaching ‘extremely high altitudes’ where they ‘sway back and forth’, collecting ‘cosmic ray particles’ ( CRP ) and causing a ‘cluster magnetic convergence’ explosion that cuts the ‘solar magnetic umbilical cords’, skyrocketing them into outer space. Those CME “solar flares” – exploding in certain regions of the Sun – head toward Earth, while other “solar flares” go elsewhere throughout our planetary Solar System.

“Solar Flare” ‘high-energy magnetic electron particles’ ( SEMEP ), upon reaching Earth, only become a “Solar Energetic Particle Event” ( SEPE ) creating ‘various strengths’, of:

– Geomagnetic storms.

When the Sun’s Coronal Mass Ejection ( CME ) discharges highly energetic ( out of control ) electrons ( invisible to the naked eye ), a Solar Energetic Particle Event ( SEPE ), or Solar Energetic Particle ( SEP ) Event, occurs. Dependent upon the quantity of High Field Strength Element ( HFSE ) niobium-based magnetic electrons able to ‘penetrate’ Earth’s “Magnetosphere” ( Earth’s ‘core-generated magnetic shield’ extending outside Earth into outer space ), the ‘electrical amplitude disruption’ will either be ‘minor’ ( ‘radio frequency’ RF signal disruption ), ‘major’ ( electrical equipment burn-out failures ), or ‘significantly serious’ enough to cause Earth’s ‘inner magnetic pole to shift in degrees’ – one way or the other ( see the severity sketch immediately below ).
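For readers who want a quantitative handle on ‘minor’ versus ‘major’ versus ‘significantly serious’, the short Python sketch below relates those informal tiers to NOAA’s published geomagnetic storm scale, which grades storms G1 through G5 from the planetary Kp index ( Kp 5 corresponds to G1, up through Kp 9 corresponding to G5 ). The grouping of G-levels into this report’s three tiers is an assumption made purely for illustration, not an official classification.

# NOAA grades geomagnetic storms G1..G5 from the planetary Kp index
# ( Kp 5 -> G1 ... Kp 9 -> G5 ).  Grouping those grades into this report's
# 'minor' / 'major' / 'significant' tiers is an assumption for illustration.

def noaa_g_scale(kp):
    """Return the NOAA G-scale label for a planetary Kp index value."""
    if kp < 5:
        return "below storm level"
    return "G%d" % min(int(kp) - 4, 5)      # Kp 5 -> G1 ... Kp 9 -> G5

def report_tier(kp):
    """Assumed mapping onto this report's informal severity tiers."""
    if kp < 5:
        return "quiet"
    if kp <= 6:
        return "minor ( radio-frequency signal disruption )"
    if kp <= 8:
        return "major ( electrical equipment burn-out failures )"
    return "significant ( the worst-case effects described above )"

for kp in (3, 5, 7, 9):
    print(kp, noaa_g_scale(kp), "-", report_tier(kp))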

Important to understand is, that from the very instant these “Solar Maximum” SEPE highly excited ( no orderly flow ) electrons strike Earth, ‘electrons flowing through conductive materials’ ( e.g. wires, batteries, electronic components, etc. ) will temporarily cease ‘operating electrical devices and systems around the globe’ until the invisible ( to the naked eye ) electron storm event ceases to be a problem on Earth.

NASA claims U.S. Air Force Weather Agency ( AFWA ) “Global Environmental Intelligence” ( GEI ) predictives currently indicate that a:

– Solar Energetic Particle Event ( SEPE ) ‘effects can occur’ between ’2010′ and ’2013′;

– Solar Energetic Particle Event ( SEPE ) ‘effects can occur’ in ‘less than 18-hours’ from the largest Coronal Mass Ejection ( CME ) recorded in 1859;

– Solar Energetic Particle Event ( SEPE ) ‘effects will last’ anywhere from a ‘few months’ to “1-year” or “2-years”; and,

– Solar Energetic Particle Event ( SEPE ) ‘effects may be deflected’ by ‘establishing a conductor array’ to ‘absorb’ ambient ‘high-energy electron particles’.

“Solar Maximum” affects ‘all electronic devices and systems relying on a normal flow of electrons in order to operate’, which means ‘all electrical wire filaments with electrical current flowing through them’ will stop operating properly. The ‘brief list’ of which, includes:

– Batteries: Lithium ion – LiIo, Nickel Cadmium – NiCad, Rechargeables, Dielectrics ( automotive, trucks, buses, locomotive trains, airplanes, boats, motorcycles, etc. ), others;

– Photo voltaic cells;

– Electrical motors – Generators ( turbine ), Transformers, Coils, Alternators, Starters, Voltage Regulators, Electric Motors ( drills / grinders, etc. – with copper wire windings within );

– Light bulbs ( filament and neon with electricity ballasts will NOT operate – ‘liquid light’ sources WILL operate! );

– Water well sources ( ‘electrically pumped’ will NOT operate – ‘manual gravity pumps’ WILL operate )

– Telephones ( switching centers, distribution boxes, handsets, headsets, microcircuits, transformers, coils, etc. – containing electrical wire or cable );

– Televisions, video cameras, charged couple devices ( CCD ), digital cameras, radios, and flash lights ( electrical, battery-operated, solar-powered, or hand-cranked will ‘NOT’ operate – ‘liquid light’ sources WILL operate! );

– Computer driven devices ( ATM cash machines, computers, facsimiles, telephones, cellphones, satellite phones, memory thumb-sized drives, internal hard drives, external hard drives, printers, facsimile machines ( FAX ), electric typewriters, etc. – ‘manual typewriters’ WILL operate! );

– Satellites, refrigerators, freezers, gasoline station pumps, charged coupled devices ( CCD ) in digital cameras and other CCD dependent equipment, memory chips, memory thumb-sized drives, computer driven internal and external hard drives, plus an incredibly long list of many other items relied upon to operate normally and ‘retain important information’ for us all.

Should we begin practicing our writing skills?

Should we begin ‘printing everything onto paper’ from computers?

What about the U.S. government ‘Paperwork Reduction Act’?

What about ‘innocent trees’ being ‘butchered into pulp’ for ‘paper’?

Top 7 Global Profiteers from a ‘significant’ Solar Maximum SEPE?

1 – Water ( fresh drinking ) suppliers;

2 – Farms – a. fish b. nut c. agriculture d. poultry e. pig f. mule g. horse h. dairy;

3 – Salt producers ( food preservative );

4 – Coal companies ( furnace heat );

5 – Wood & Paper manufacturers ( supports 1, 2, 3, & 4 above );

6 – Pharmaceutical manufacturers ( medicine ); and,

7 – Chemical companies ( supports 2, 3, 4, 5 & 6 above ).

Top Global Loser from a ‘significant’ Solar Maximum SEPE?

1 – Public ( in general ).

Before considering, preparing or instituting anything, there is a ‘basic matter of survival’, given the fact that mass reliance on what is about to be extinguished for an indeterminate period of time has not been addressed to date by governments to citizens.

Supplying a few answers, to the following questions, would provide a better chance of survival:

1. Where does one go to understand?

2. What must one consider after understanding?

3. How does one prepare plans after considering?

4. When does one institute plans after considering?

As for the answers to the aforementioned four ( 4 ) questions, if left to work them out for themselves, most people would not know them and consequently would not survive the mass public panic brought on by such an event.

Understanding what one is facing, and exercising common sense and good judgment, is the most important thing anyone can do to properly survive. A ’few’ will figure it out – the hard way – but while many have absolutely no clue as to what they are about to face, a few confident souls may only ’think they know how to get through it’. The harsh reality is that less than 1% will survive. Those who ‘do survive’ will either ‘know how to properly survive’ or ‘learn how to survive’ by ‘other means’ of ‘chance’ or ‘circumstance’. Many will unfortunately fail to ‘exercise good judgment’ at the one ( 1 ) ’critical moment’ when a little ‘common sense’ and ‘understanding’ of a few ‘key critical’ elements – about ’people’ and ‘science’ – could have pulled them through.

Unfortunately, ‘socio-economics’ and the ‘science of surviving without common comforts’ are not required studies in ‘basic elementary school’, so one must set out on one’s own crash elementary course in these subjects, learning something about psychology, finance, government, and how to forage for oneself. Subjects such as these – typically deemed ‘boring to most’ – will, in ‘this SEPE event’, determine ‘survivability’.

– Understand ‘precisely what will occur’ – Do not rely on anyone;

– Consider an ‘initial survival plan’ – Do not rely on anyone;

– Consider a ‘contingency survival plan’ – Do not rely on anyone;

– Prepare ‘both plans’ – Do not inform anyone;

– Patience – Do not hurry – Do not waste time; and,

– Institute ‘Plan 1′ ( initial ) – Wait to go to ‘Plan 2′ ( contingency );

– Consider ‘alternate transportation’ – Prepare for ‘mule’ or ‘feet’;

– Consider ‘foods and liquid’ – Prepare for ‘nuts’ and ‘condensation’; and,

– Consider ‘remote living’ – Prepare for a ‘natural or makeshift shelter’.

[ click to enlarge image ( above ) FEMA COP ( Mount Weather, Virginia ) ]

The alternative for those with an ‘inability’ or ‘unwillingness’ to ‘self-survive’ leaves hope flapping in the breeze while waiting for government replies through its ‘application process’ for ’distressed people’ who believe they will be going to a ‘continuity community center’ ( CCC ) or, better yet, dragged away to what ‘conspiracy theorists term’ a “Deep Underground Military Base ( DUMB )” – but in either event only ‘after a cataclysmic event’ has taken place.

It might only be for a month ( 30-days ), but in the event of more permanency it ties in remarkably to yet other ‘conspiracy theorist’ claims as to ’why’ ‘internet addiction’ was allowed to ‘publicly fester’.

Internet preoccupation has ‘distinct self-programming qualities’ supporting human isolationism theories, and may very well be an ‘advanced emergency preparedness tool’ – a socially destructive ‘creative self-indoctrination programming’ methodology scheme that has certainly hooked millions rather uniquely.

Internet junkies spending entirely too many hours online, day and night, are certainly prime candidates who will adjust more rapidly to ‘socially isolated environmental living’ ( e.g. under ground, beneath the sea, or in outer space ) than those continuing semblances of normalcy by exercising extended periods of daily interpersonal socializing. There are plenty willing to argue those issues, but none willing to alert anyone as to the Solar Maximum SEPE effects coming and what they should prepare for.

Major news organizations worldwide note how ‘interpersonal communication skills’ between individuals meeting face-to-face are on a ‘quality landslide downward’, because so many millions of people no longer maintain frequent interpersonal communication with each other and are now consequently ’unable to successfully function in what was previously known as a normal daily life’ without their current ‘interference addiction’ of grand-scale internet usage.

Today, many ‘survive’ on ‘social security benefits’ and ‘disability benefits’ from the government so, what if you are in a DUMB for a 2-year stay – until everything is ready?

Believing government will work ‘very fast’ – using horses, mules, covered wagons, buggies, and/or bicycles – to speed your recovery back to some semblance of normalcy is intelligent thinking, no? Government wouldn’t just forget about ‘you’, would it? There are ‘laws’ protecting citizens from this sort of event, aren’t there?

What “roadmap” has government already ‘strategically planned’?

Remember how important ‘understanding’ is going to be for survival?

The National Aeronautics and Space Administration ( NASA ) third ( 3rd ) ‘strategic element’ Solar Terrestrial Probe ( STP ) for the NASA Sun Earth Connections Roadmap ( SECR ) mission of the NASA Solar Terrestrial Energetic Relations Observatory ( STEREO ) involves Sun Earth Connection Coronal and Heliospheric Investigation ( SECCHI / EUVI / COR ) telescopic camera imaging sensors that tell Earth what is headed our way, using five ( 5 ) scientific telescopes programmed to observe Coronal Mass Ejections ( CME ) from the Sun to the Earth.

Current up-to-the-moment color photographs of Coronal Mass Ejection ( CME ) solar flares – including those coming toward Earth – can be instantly viewed at the official SECCHI website link. Additionally, the National Oceanic and Atmospheric Administration ( NOAA ) ‘Earth time and date impact chart’ can be viewed on its Space Weather Alerts and Warnings Timeline ( SWAWT ) ( a minimal monitoring sketch follows below ).
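As a practical aid for anyone wanting to ‘learn how to monitor that information’, the minimal Python sketch below polls a NOAA Space Weather Prediction Center alert feed using only the standard library. The specific JSON URL is an assumption based on SWPC’s publicly documented product feeds; confirm the current address on swpc.noaa.gov before relying on it.

# Minimal sketch for polling NOAA SWPC space weather alerts from a script.
# Only the Python standard library is used.  The JSON endpoint below is an
# assumption -- verify the current feed address on swpc.noaa.gov first.

import json
import urllib.request

ALERTS_URL = "https://services.swpc.noaa.gov/products/alerts.json"  # assumed

def fetch_alerts(url=ALERTS_URL):
    """Download and decode the alert feed ( expected: a list of records )."""
    with urllib.request.urlopen(url, timeout=30) as response:
        return json.loads(response.read().decode("utf-8"))

if __name__ == "__main__":
    for alert in fetch_alerts()[:5]:                 # most recent few entries
        print(alert.get("issue_datetime"), "-",
              str(alert.get("message", ""))[:80])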

[ click to enlarge image ( above ) ]

NASA Solar Terrestrial Energetic Relations Observatory ( STEREO ) primary goal is an in-advance three-dimensional ( 3-D ) study of Coronal Mass Ejection ( CME ) dynamics on Earth interplanetary evolution.

NASA Solar Terrestrial Energetic Relations Observatory ( STEREO ) SECCHI and SWAVES provide imaging observation of the Sun’s heliosphere ( varies temporally ) with unprecedented ( record setting ) use of combined ‘imaging experiments’ and ‘in-situ experiments’ ( IMPACT and PLASTIC ).

NASA SECR ( Sun Earth Connections Roadmap ) STEREO ( Solar Terrestrial Energetic Relations Observatory ) SECCHI suite of five ( 5 ) cameras consist of three ( 3 ) primary components:

– Sun Centered Imaging Package ( SCIP ) consisting of three ( 3 ) telescopes; – Heliospheric Imager ( HI ) consisting of two ( 2 ) telescopes; and, – SECCHI Electronics Box ( SEB ) or ( COR ) consists of command, control, communications, and computer intelligence ( C4I )  imaging electronics.

[ PHOTO ( above ): NASA STEREO spacecraft ( click to enlarge image ) ]

NASA Solar Terrestrial Energetic Relations Observatory ( STEREO ) consisted of two ( 2 ) spacecraft, nicknamed:

STEREO AHEAD spacecraft ( STEREO A ); and,

STEREO BEHIND spacecraft ( STEREO B ).

SECCHI telescopes, mounted on both spacecraft ( STEREO-A and STEREO-B ) flanked Earth orbits of the Sun.

The two ( 2 ) [ STEREO AHEAD ( STEREO-A ) and STEREO BEHIND ( STEREO-B ) ] spacecraft launched, used ‘lunar gravity assist’ – slingshot effect from travelling around the moon several times – placing both into a heliocentric orbit.

The first ( 1st ) to enter heliocentric orbit was the STEREO AHEAD ( STEREO-A ) spacecraft, followed 2-weeks later by the STEREO BEHIND ( STEREO-B ) spacecraft. Both drifted away from Earth ( at an average rate of about 22.5 degrees per year ), whereupon – after their 2-year nominal operations phase ( 2008 ) – each spacecraft was positioned about 45 degrees from Earth and about 90 degrees apart ( the simple arithmetic is sketched below ).
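The separation figures quoted above follow from simple arithmetic on the drift rate; the small Python sketch below ( assuming nothing beyond the ~22.5 degrees per year cited in this report ) reproduces them.

# Check of the STEREO separation geometry quoted above: each spacecraft
# drifts away from Earth at roughly 22.5 degrees per year.

DRIFT_DEG_PER_YEAR = 22.5        # average drift rate cited in this report

def separation_after(years):
    """Return ( degrees of each craft from Earth, degrees between the two )."""
    per_craft = DRIFT_DEG_PER_YEAR * years
    return per_craft, 2 * per_craft

each, between = separation_after(2)          # end of the 2-year nominal phase
print("each craft ~%.0f deg from Earth, ~%.0f deg apart" % (each, between))
# -> each craft ~45 deg from Earth, ~90 deg apart, matching the text above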

STEREO AHEAD spacecraft drifted ‘ahead’ of Earth.

STEREO-B spacecraft drifted ‘behind’ Earth.

To accomplish drift for each craft:

STEREO AHEAD ( STEREO-A ) spacecraft traveled ‘faster’ – than the solar orbit of Earth;

STEREO AHEAD ( STEREO-A ) spacecraft orbited ‘closer distance’ to the Sun – than the solar orbit of Earth;

STEREO BEHIND ( STEREO-B ) spacecraft traveled ‘slower’ – than the solar orbit of Earth; and,

STEREO BEHIND ( STEREO-B ) spacecraft orbited ‘further distance’ from the Sun – than the solar orbit of Earth.

[ click to enlarge image ( above ) ]

On January 7, 2011 a solar Coronal Mass Ejection ( CME ) – detected earlier by the STEREO-A spacecraft – was just then ( 09JAN11 ) arriving at Earth from a solar flare that is only the beginning of what is ‘officially claimed not to be a serious threat to Earth’; it is, however, expected to ‘increase in intensity’. This is ‘not’ the Solar Maximum Solar Energetic Particle Event ( SEPE ) expected from June 2011 but by no later than the end of July 2011.

[ click to enlarge image ( above ) ]

NASA has ‘at least’ two ( 2 ) spacecraft solar programs going on in outer space that will hopefully be able to catch a glimpse ( for only a little while ) of a Solar Maximum SEPE; however, ‘all’ such spacecraft share one ( 1 ) primary and extremely similar problem, best outlined by learning more about NASA’s antiquated solar spacecraft still floating around in space and hoping to continue providing information to Earth. For ‘how much longer’ is about to be answered.

NASA? We Have A Problem!

Early Warning Data Black Outs

[ click to enlarge image ( above ) ]

In 1997, NASA launched its Earth solar space weather 15-minute to 45-minute ’early warning’ probe, the Advanced Composition Explorer ( ACE ), which detects inbound solar geomagnetic storm streams of ‘highly excited and disorderly flowing electron’ particle solar plasma ( invisible to the human eye ) – until it becomes overloaded by a Solar Maximum, because even the ACE onboard sensing camera detectors rely on the ‘orderly flow of electrons’ for their technologies to function.

An ACE black-out will take NASA longer than 15-minutes to determine ‘why’ ACE even blacked out.

NASA Advanced Composition Explorer ( ACE ) was only built to endure under ‘average solar wind conditions’ but nothing as intense as a Solar Maximum plasma flare.

NASA Advanced Composition Explorer ( ACE ) sensing detectors are now 13-years old and no longer as sensitive, plus ACE has already exceeded its NASA-calculated life expectancy. NASA knows the 13-year old ACE sensors will ‘cease to function before’ a Solar Maximum plasma flare ‘passes by’ ACE in space.

Upon a Solar Maximum event, predicted by NASA between 2010 and 2013, the NASA Advanced Composition Explorer ( ACE ) probe will become overloaded, burn out from solar electron particle disruption and immediately go dead in space ‘before’ the Solar Maximum flare even passes its sensors.

No replacement for the NASA Advanced Composition Explorer ( ACE ) could be launched into place any time soon.

There is no intelligence in relying on ACE because, in addition to all of the aforementioned, ACE is positioned 92,000,000 miles away from the Sun and only about 900,000 miles away from Earth ( which is 93,000,000 miles away from the Sun ), meaning that within only minutes after a Solar Maximum flare passes the detection sensors onboard ACE, the largest geomagnetic storm will already be hitting Earth ( see the lead-time sketch below ).
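How little lead time that leaves can be estimated directly from the distance quoted above. The rough Python sketch below divides the ~900,000-mile ACE-to-Earth distance by two illustrative plasma speeds – an ordinary solar-wind speed and the 1859-class average worked out earlier in this report; it is an estimate, not a forecast.

# Rough lead-time estimate for an L1 monitor such as ACE, using the
# ~900,000-mile upstream distance quoted in this report.  Speeds are
# illustrative only.

MILES_TO_KM = 1.609344
ACE_DISTANCE_KM = 900000 * MILES_TO_KM       # roughly 1.45 million km

def warning_minutes(speed_km_s):
    """Minutes for plasma at speed_km_s to cover the ACE-to-Earth distance."""
    return ACE_DISTANCE_KM / speed_km_s / 60.0

for label, speed_km_s in (("ordinary solar wind ( ~400 km/s )", 400.0),
                          ("1859-class CME ( ~2,400 km/s )", 2400.0)):
    print("%s: ~%.0f minutes of warning" % (label, warning_minutes(speed_km_s)))

That works out to roughly an hour of warning for ordinary solar wind but only about ten minutes for an 1859-class ejection – consistent with the 15-minute to 45-minute window cited earlier shrinking dramatically for the fastest storms.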

Other solar observation satellites, such as Solar and Heliospheric Observatory ( SOHO ) will only provide ‘some warning’ but with ‘far less detailed information’; especially in the wake of the NASA ‘predicted’ Solar Maximum SEPE effects in 2010 – 2013 on Earth; depending on ‘which NASA report’ one reviews.

NASA STEREO mission spacecraft ( STEREO AHEAD and STEREO BEHIND ) – in heliocentric Earth orbits – provide unique observation vantage points where ‘both enable’ a ‘stereoscopic view’ of ‘solar activity’, ‘reactions’ and ‘effects’ on Earth environments.

NASA STEREO four ( 4 ) instrument suites, supplying data measurements ( later assessed on Earth ), are named:

IMPACT – Energetic Particles and Magnetic Field Vectors; PLASTIC – Solar Wind Plasma; SECCHI – EUV Imager, Coronagraphs and Heliospheric Imagers [ U.S. Office of Naval Research ( ONR ) Naval Research Laboratory ( NRL ) ]; and, SWAVES – Radio Burst Tracker.

Space Weather Beacon ( SWB )

Space Weather Beacon ( SWB ) feeds compressed, binned, subset data continually broadcast from NASA STEREO mission spacecraft instrumentation through SECCHI ( Sun Earth Connection Coronal and Heliospheric Investigation ), where downlinked ’image data’ ( not easily accessible via the internet ) is ‘made available only to forecasters’.

The National Oceanic and Atmospheric Administration ( NOAA ) Space Weather Prediction Center ( SWPC ) ‘only provides the public’ with ‘near-real-time’ ( ‘daily update still frame images’ ) from the two ( 2 ) NASA STEREO mission spacecraft sensor imaging camera telescopes, retrieved through the STEREO project by using the STEREO Science Center ( SSC ).

– NOAA is ‘not the general provider’ of STEREO data.

In the future, NOAA expects viewing of ‘real-time image data’.

– NASA STEREO project ‘is the general provider’ of STEREO data.

Space Weather Beacon Shutdown

On January 8, 2011 ‘unexpected high current problems disrupting data flow’ forced the shutdown of two ( 2 ) instruments aboard the NASA STEREO B ( STEREO BEHIND ) spacecraft:

– IMPACT [ In-situ Measurements of Particles and Coronal Mass Ejection ( CME ) Transients ] – measuring ‘solar energetic proton particles’ and ‘magnetic field vectors’; and,

– PLASTIC [ PLAsma and SupraThermal Ion Composition ] measuring ‘solar wind geomagnetic storm plasma’.

Measurements are of utmost significance for NASA and NOAA for space weather forecasts.

Late this week, NASA reported ‘both instruments’ ( IMPACT and PLASTIC ) recovered after having been shut down.

These two ( 2 ) instruments ( IMPACT AND PLASTIC ) provide significant space weather forecast data measurements, of:

– Solar wind ( geomagnetic storm ) plasma; – Solar energetic ( electrons, protons, etc. ) particles; and, – Magnetic ( field strength ) vectors.

While the populations of the world yawn at solar storms ‘now’ – until a Solar Maximum SEPE hits Earth in 2013 – thinking the day is too far away for them to be immediately concerned, many people do not even realize ‘what Solar Maximum SEPE effects are’ or ‘how a Solar Maximum SEPE affects them’. People simply ‘do not comprehend the impact on their life’ a Solar Maximum SEPE solar storm carries.

The sad part for people is they are ‘not even preparing’ for a Solar Maximum SEPE – even after millions of people were warned by NASA and many major news organizations. Are people too preoccupied, living in their own little worlds? If not, then ‘why’ are people ‘not planning to do anything’? Why are people ’not prepared’ for a Solar Maximum SEPE?

People were ‘warned in 2010’ by both ‘government agencies’ and ‘news organizations’ about the Solar Maximum SEPE coming in 2012 or 2013, so that only ‘registered a distant date’ for most – giving them a false sense of security about what could happen in 2011, even though NASA had previously warned people in 2006 of a Solar Maximum SEPE coming in “2010 – 2011.”

What will it take to wake populations up?

News broadcasts already warned people of the Solar Maximum SEPE coming by 2012.

News broadcasts even warned people of the ‘effects’ a Solar Maximum SEPE will have on their lives.

What will it take to wake people up?

This report provides people with detailed information about ‘aftermath effects’ of a Solar Maximum SEPE, provides people with ‘official government background information’ on ‘official government problems’ facing a Solar Maximum SEPE, and provides people with ‘official government website internet links’ where people can begin ’learning to understand’ how ‘important emergency preparedness’ is with a Solar Maximum SEPE in 2012.

What will it take to wake people up? Solar Maximum SEPE will hit Earth ‘soon’.

Only a few know what NASA did and what other branches of government have done, but ask yourself what ‘you’ are ‘doing today’ to ‘educate yourself’ about ‘your basic survival’?

Forget about asking yourself where you’ll be by the end of the next 5-years. NASA already answered ‘that’ for ‘you’. Government is ingenious: it figured out a way to ‘get masses of people ( young and old ) educated’ in ‘socio-economics’ as ‘cataclysmic events’ occur – without using any taxpayer money to accomplish that feat.

On March 11, 2011 a motion picture film will be released at theatres across America.

Many will not even realize how much they are going to receive from buying their own ticket to such an ‘event’.

Here ( below ) is an ‘initial compilation’ of ‘factual video clips only’ surrounding the “Solar Maximum” so ‘you can really understand what is important for you’.

[ click on each video ( below ) then click on next video ( below ) ]

After watching the aforementioned videos, which official organization do you think is telling the truth about the Solar Maximum arrival date? What about the lone ( 1 ) NASA ‘astrobiologist’ ( David Morrison – video above ) who – contrary to NASA Administrative Leadership’s claim that this is something “we all need to be concerned about” – somehow feels there is “nothing to be worried about” regarding a “Solar Maximum” ‘event’, and who claims there is ‘not going to be any’ “Pole Shift” either?

Still convinced “internet doomsday prophets” and “internet conspiracy theorists” talking about ‘these same particular subjects’ are ‘totally wrong’?

One may wish to ‘re-think’ a few things because ‘now’ ( below ) there is ‘even-more factual evidence’ proving ‘more specifically’ what NASA now says, “we all need to be concerned about,” as the ‘absolute truth’.

1. NATIONAL AERONAUTICS AND SPACE ADMINISTRATION ( NASA ) video indicates the Solar Maximum ‘peaking’ by ‘2012’ – ‘2013’;

2. NATIONAL SCIENCE FOUNDATION ( NSF ) funded NATIONAL CENTER FOR ATMOSPHERIC RESEARCH ( NCAR ) video indicates the Solar Maximum ‘earlier’ by ’2011′;

3. Government embedded news reports ( 1 above ), news guest reports ( 1 above ), and historical documentary ( 1 above ) indicate the Solar Maximum, according only to “NASA,” ‘peaking’ by ‘2012’ – ‘2013’;

4. Private organizations indicate a Solar Maximum SEPE ’starting-up’ in ’2011′;

5. NASA AMES RESEARCH CENTER ( ARC ) astrobiologist David Morrison video indicates nothing significant will occur in ‘2012’;

6. Loosely wrapped, unstable, maligned and misunderstood individuals have babbled about “Doomsday” arriving ‘every year’ since ‘1920’; and,

7. Juveniles don’t stick to one ( 1 ) story, after lying, when cornered by an adult.

Why can’t ‘government officials’ get their ‘official storyline’ straight?

Why are officials bobbing and weaving around questions by providing only ‘remarks of uncertainty’? That keeps everyone ‘guessing’ about their own ‘life’ and the ‘lives’ of their ‘loved ones’ too. How can this be tolerated by the public? Easy, the ‘public actually believes the government has no clue’ as to ‘when’ the Solar Maximum SEPE will likely strike Earth. Gee? How did the public receive ‘that impression’?

The facts ( below ) prove how ‘wrong’ that ‘public perception’ really is.

Official comments such as, ”We’re not sure when” and “We’re not sure what the effects will be,” have ‘no place in the public ear’ because if the public knew more about the technology government was using – the public would know that officials are blatantly lying in ‘both’ of the ‘aforementioned comments’.

The fact is, officials ‘know’ they are currently – and have been for several years now – using technologies in space and on the ground that can determine – within a few days – ‘when’ the Sun will create a ‘significant solar flare’ that will explode Solar Particle Elements ( SPE ) and ‘what mapped region number’ of the Sun will erupt. Officials are now doing this with special optical and heat sensors that map the Sun’s ‘internal magnetic plasma flows’ to ‘easily determine’ high-energy build-ups ‘within the Sun’ – ‘before’ a ‘significant solar flare erupts’ on the ‘surface of the Sun’. Sound unbelievable? Review the official NASA website links ( below ) in “References” provided at the bottom of this report.

Now, which ‘official’ is ’in charge’ of ‘telling the public’ the ‘complete truth’ about all this?

The public should have wised-up to the oldest government cover-up tactic where the public is ‘fed the official reason why an official did not tell the truth’ because, “The official didn’t want to appear ‘instantly stupid’ to a public audience.”

Does the public actually think government justified trillions of taxpayer dollars spent since 1980 ( 31-years ) to ‘research the Sun for no major reason’ except ‘curiosity’? The solar research program continues to this day and beyond. There is even a U.S. Presidential Directive ordering NASA programs.

This U.S. Presidential Program Directive is named, “Living With A Star” ( LWS ). Sounds real nice, huh? But, what if the U.S. Presidential Directive Program was otherwise known as, “Dying With A Star” ( DWS )?

This U.S. Presidential Program Directive also orders ‘space missions to find other planets for inhabitation by Earth individuals’. Sound unbelievable? Review the official NASA website links ( below ) in “References” provided at the bottom of this report.

What prominent U.S. government official announced ( 2010 ) to the world that an Executive Office ( Office of the President of the United States of America ) decision was made to ‘disband’ the U.S. National Aeronautics and Space Administration ( NASA )? This fact is now a matter of public record and was reported by many U.S. news organizations around the world. Are ‘they’ lying about this?

Why has NASA so suddenly become ‘non-important’? What event took place or is about to take place? The official storyline is that NASA will – for a while – team-up with the European Space Agency ( ESA ) and that later-on NASA will step-aside to let “private-sector” contractors take-over all U.S. space missions. The official storyline claims that with the U.S. economy in such a slump it cannot afford the huge taxpayer budget of NASA. Does anyone in their right-mind believe any size group of private government contract companies can ‘out-fund’ the U.S. taxpayer budget for NASA space missions?

Knowing this, it is easy to see why even conspiracy theorists already tossed “plausible deniability” onto their back-burners, because ‘this major Earth event’ has ‘something far more going on than what is officially being announced’.

They are definitely correct about at least one ( 1 ) thing and that is government is not saying anything further than, “we all need to be concerned” about a Solar Maximum ‘significant solar flare’ coming to Earth very soon.

Conspiracy theorists believe some ’other upcoming event’ in 2011 may actually be brewing that makes the Solar Maximum flare event appear insignificant.

Why is so very little being told, to the public, by officials?


Some theorists are now confident about where huge amounts of taxpayer dollars disappeared. Funding, they say, for black budget operations built secret ‘escape spaceships’ or ‘underground luxury safe haven cities’ – but ‘only for select members of the global elite’.

How will masses of people survive a Critical Infrastructure Key Resource ( CIKR ) breakdown during a Solar Maximum SEPE? Will masses be treated no differently than during legacy eras of war? Decades ago, during World War II in Germany, in order to even travel, make certain purchases, and gain access to certain facilities ’all people’ had to produce “papers” ( ‘official identification’ ) proving ‘who they were’ and ‘showing how they had been rated’ by government priorities. Has anything changed since the 1940s?

Well, we see government authority officials utilizing ‘computer screens’ to ‘verify information’ as to who we are plus more.

The book entitled “1984,” written in 1948 by George Orwell, envisioned a future in which the public en masse is led like sheep to slaughter by a wolf of a government that broadcasts – across ‘flat screen television sets in walls at home’ ( remember, this book was written in 1948 ) – subtle clichés and short phrases Orwell identified as ‘doublespeak’. Coming from government ( “big brother” ), these sounded peaceful and logical but meant something entirely different; entire populations were being programmed to understand something that really wasn’t true, designed surreptitiously to throw the entire citizenry off the track as to what their government was actually doing to them behind their back.

It might become important for people to begin recognizing what government really means when referring to ‘screens’, ‘levels’, ‘prioritizing assets’ and ‘sectors’ on the topic of major national emergency security preparedness responses.

Government is undoubtedly only referring to ‘critical infrastructure facilities’ as being “assets” in “sectors,” and to how it intends to prioritize them. But how difficult would it be for government to apply the same measures to ‘people’ as “assets” in “sectors” that must be prioritized as well?

Is the vast majority of the public being kept out of the information loop? Who is considered a “key asset” to government? Who are the “key assets” that government considers “key”? Are those being labelled “conspiracy theorists” and “doomsday prophets” correct on anything “we all need to be concerned about”? Enquiring minds want to know, but more importantly, those wishing to survive want to know whether they should begin preparing ‘now’, and if so, for ‘what’.

[ click to enlarge image ( above ) Consequence-Based Top-Screen ( CTS ) Prioritization of Assets ( circa: 2010 ). Not from television series “24” ]

Why are so many people being kept in the dark by government, and just how does government plan on going about sorting out all the ‘little people’ – as to who ‘lives’ and who ‘dies’ in the aftermath of a ‘significant emergency’?

Who does government consider rated, at varying levels, as “asset priorities,” and just how was that already sorted out? Sorry, but I didn’t get that government memo either so, don’t feel too left out of that particular survival planning loop.

Many already believe they know the ‘answer in a nutshell’, but are either unwilling to openly admit where the masses will end-up in the aftermath or, due to controlling intimidation factors, would just rather not say anything publicly.

Will our citizenry be left to flounder? Will you and your family be left standing – or lying alongside everyone who has been purposely kept ignorant of the facts of the Solar Maximum SEPE?

Which official government entity is in-charge of providing people with correct information?

The U.S. National Laboratory at Los Alamos, New Mexico ( LANL ), where the hydrogen bomb ( H-Bomb ) was developed, indicates the U.S. National Oceanic and Atmospheric Administration ( NOAA ) has the job of informing the public about severe space weather headed to Earth.

The U.S. Department of Homeland Security ( DHS ) website link proves that Homeland Security ( HS ) coordinates national emergency ‘aftermath’, ‘preparedness’ and ‘response’ efforts for ‘government’ ( ‘organizations’ and ‘staff support’ ), ‘national critical key business and other organization’ security, plus ‘threat interdiction’.

Homeland Security ( HS ) is ‘not tasked with warning the public en masse’ but ‘coordinating security responses’ to protect the public.

Those who climbed onto the back of DHS – pounding wild rumors across the internet, posting public announcements on neighborhood retail store bulletin boards, and even on vehicle windows and doors in parking lots – have been advised to climb off the back of DHS so HS can do its job for everyone; a good piece of advice to have provided them.

Homeland Security Protection & Intelligence sees and handles far more, in-advance, than just horses and mules pulling covered wagons to establish delivery supply lines to people left in the wake of a national disaster – which is what the Solar Maximum SEPE in 2012 is feared to carry for everyone.

People need to understand how ‘notification of national emergencies’ works. Notification warnings to the public en masse ‘do not come directly from DHS’, and to-date Homeland Security has not been ‘mandated by the U.S. Congress’ to provide ‘public en masse warnings’.

U.S. Department of Homeland Security (aka) Homeland Security ( HS ) ’is mandated to perform only functions accorded it by U.S. Congress’ in relationship to security, amongst a few other things, a few of which are for DHS to ‘notify’ and ‘maintain contact’ with ‘national critical key infrastructures’ involving ‘industries’, ‘utilities’, ‘highways’, ‘transportation’, ‘laboratories’, ‘health centers’, ‘agriculture’, ‘water’, ‘local governments’, ‘law enforcement community’, ‘intelligence agencies’ and much more, however DHS is ‘not assigned the task of notifying or warning people en masse’.

Notifications and warnings to people en masse are assigned, under U.S. Congressional mandate, to another ‘official government organization’ and its ‘official leadership’ as the ‘sole official responsible’ for ‘issuing the order to people en masse’.

Trying to ‘positively identify the sole official responsible party’ is impossible due to the ‘official government bureaucracy labyrinth’ and its ‘answer-avoidance cloak’ over knowledge kept away from the public en masse.

Ask ‘any official’ to ‘precisely answer’ the question, “Who is the ‘sole responsible individual’ assigned to ‘order notification of the public en masse’ about a ‘pending national disaster’?” You will not receive a ‘precise answer’, perhaps, but an ‘official knee-jerk reaction’ of, “Why, the President of the United States! Who else could give such an order to the entire U.S. population?” Not to say ‘there wouldn’t be a President’, but what if the public en masse were unable to quickly recognize, from within the U.S. government labyrinth chain of command, ‘who that person really is’?

According to U.S. Congress mandate, the U.S. Federal Emergency Management Agency ( FEMA ) has a long list of ’secretly held individuals already qualified’ to fill even the U.S. Executive Branch command post who can give such an order to the public en masse.

First, there are books or manuals to be read, even by that person, should such a national emergency take place.

Not reported here is what is officially referred to as an “Extinction Level Event” ( ELE ); this report only covers the “Solar Energetic Particle Event” ( SEPE ) predicted by U.S. government officials since 2006 to occur sometime between ‘2011’ – ‘2013’. So, what should “everyone need to be concerned about” ( NASA statement ) when ‘all electricity ceases to function’?

Even the ‘President of the United States of America’ is ‘advised by someone else’ as to ‘whether people en masse should be notified’ and ‘how they may be notified or warned’ of a pending disaster such as the Solar Maximum SEPE so, whoever FEMA has lined-up will also be advised – unless the advisor(s) cannot reach that ‘person in command of the nation’.

Who orders NASA? The President of the United States issues Executive Orders. NASA recently notified people en masse that in “2012 or 2013 Earth will experience effects of a Solar Maximum we all need to be concerned about.”

Did the “President of the United States of America” say ‘that’? No, “NASA” said it, but NASA only ‘announces’ what Executive Orders ( E.O. ) and Presidential Directives contain – decisions that actually originate from within the “Office of Central Intelligence White House” ( OCIWH ), which sees the U.S. President make the ‘announcement’ after ‘signing’ the Office of the White House Executive Branch Order ( EO ).

Okay so, ‘who’ is actually pulling the strings of the U.S. leadership – besides a ‘collective of decision-makers’ based on what they are fed from a ‘directorate committee’ of advisors? That information is in another report – not discussed here.

What ‘more notification or warning’ is the public ‘waiting for’? Something like, “Better locate the nearest fresh water supply source to drink from!”? Might be a good idea because ‘most household tap water is supplied by pumps operated on electricity’.

Locating a ‘freshwater well’ equipped with a ‘hand pump’ or ‘bucket with a long rope’ ( to pull water from a nearby fresh water lake or mountain stream ) is a very good ‘basic plan’ for everyone.

People en masse are ‘not likely to be herded’ via ‘horse drawn covered wagon trains’ across the open plains of America by U.S. government federal officials presenting them with ‘new residential living facilities’ as any type of ‘public emergency contingent’ shelter, or sending them to ‘public dining halls’ for ‘feeding’, or to ‘medical clinics’ and ‘surgical centers’ for ‘treating’ various and sundry physical maladies – and probably ‘not to mental illness treatment centers’ either, for which there will certainly be an ‘outbreak’ of need.

One can bet that just as soon as transport supply lines encounter problems, even simple conveniences like medicines – even ‘aspirin’ – will soon be in short supply.

Hand-held can openers may become a convenient luxury to some when pop-top cans begin disappearing.

Heading for the farms where the crops are sounds nice, but many others will have already figured that out.

Boiling farm produce ( one might want to think about a renewable source for creating a flame ) requires being able to consistently light a fire to heat water – another good reason to have a good supply of fresh water nearby!

People en masse will be left to remain where they are to deal with all of the socio-economic aftermath effects of a Solar Maximum SEPE. Governments will simply not be able to do anything ‘immediately’ for people, and although many should be able to comprehend such a burden on government, few will remain patient for very long.

Functional reductions in police and military ability to protect and serve will immediately spur the local “Neighborhood Watch” into becoming ‘grass roots law and order’; unfortunately, “Bill the barber” ( from down the street ) may not keep his leadership position very long once a ‘mob mentality’ decides he’s wrong and takes over the neighborhood.

Unpredictability factors can, and usually do, shift lives rather swiftly: only a small variety of interferences or distractions can turn a once seemingly well-organized group of level-headed people on the Neighborhood Watch Committee – who thought they had everything figured out and going well for everyone – into an absolute nightmare once the ‘gang mentality’ enters the picture. After a few key gunshots, an entirely ‘new set of rules’ becomes ‘even more difficult to stomach’. Maybe “Bill the barber’s” leadership ideas weren’t so bad after all.

Fewer people will comprehend why leaving ‘where they are’ – post aftermath – may likely ‘not be such a good idea’. Those venturing out into once familiar environments may see how quickly those environments have changed, as mobs or gangs have a tendency to follow people to new sources of supply – and there may be nothing left after a mob is finished with a place.

While there is still time left, ‘before’ the Solar Maximum SEPE effects are felt by people en masse, there ‘are current official government websites’ providing information, but unfortunately only as to ‘how national emergency preparedness’ will be ‘conducted’ for ‘general types of emergencies’ ( Weapons of Mass Destruction – WMD ) that ‘most are somewhat familiar with’ – not for something entirely ‘different’.

A Solar Maximum SEPE disaster leaves ‘millions of people alive’ but ‘walking around’ until ‘quickly running out of supplies’, which then becomes the ‘ultimate socio-economic disaster’ – one this Earth has ‘never experienced’.

Governments, to be safe, are ‘not’ prepared to deal with the public en masse during such an event, and will likely remain conveniently out-of-touch with the people – until after most everyone has settled down. Governments will then mount ‘counter-assaults’ – on any ‘mobs and gangs’ – easily taking over those few that remain. As for the remaining public? Such will likely ‘not’ be very pleasant to experience.

NOAA is currently notifying only people whom know ‘specifically where to look’ on the NOAA ’internet website’ about the Solar Maximum SEPE, but NOAA is ‘not advising anyone’ on ‘how to prepare’ for the Solar Maximum SEPE. Real nice, huh?

Everyone has already been ‘officially notified by NASA’ that the Solar Maximum SEPE is ‘coming’ and ‘when’ – by this incredibly vague statement: “sometime in 2012 or 2013.”

NASA has ‘not warned’ anyone about ‘all electronic devices and systems stopping’, but NASA experts already realize ‘all electrically operated devices and systems will stop functioning normally’ when ‘bombarded by wildly excited electrons’ coming from the Solar Maximum, which NASA has already termed SEPE ( Solar Energetic Particle Event ) effects.

Solar Maximum SEPE wildly charged-up electrons will affect ‘all forms of electrical current’, which means all ‘alternating current’ ( AC ) power supplies, ‘direct current’ ( DC ) batteries, ‘solar collector’ power supplies, ‘coil storage devices’, charge-coupled devices ( CCDs ) for camera and television displays, computers, cellphones, and even radios will no longer function – leaving people en masse unable even to learn what is affecting everyone or where to go for emergency assistance.

One must realize the government has already considered Solar Maximum SEPE ‘effects on people en masse’; having already discussed it secretly behind closed doors, it sees no need to waste any further breath on the subject, because there is literally ‘nothing anyone can do’ to stop it – although there ‘are all kinds of things that can be done to prepare for and survive’ a Solar Maximum SEPE.

The problem is, government won’t provide that type of information in-advance for the public en masse because ‘that announcement alone’ will ’trigger panic’.

This report is undoubtedly considered by government as ‘not likely to be believed by the public en masse’ therefore perhaps, ’not an immediate threat to public panic en masse’. Some may ’at least be able to understand what that means for them’.

Whether everyone will be happy about knowing that – once they realize what it really means – may later see everything change. Even this report has been changed more than ‘30 times’ since it was first published on January 2, 2011, to provide more current news updates herein.

Sociology experts, ‘knowing public reactions en masse’, frequently advise U.S. government leadership, agencies of the U.S. government and even private-sector companies – including industrials – on a variety of subjects that may negatively affect global foreign security in addition to U.S. domestic national security.

Mass public notification from ‘only a remote government entity’ mentioning a ‘publicly unfamiliar impact’ ( e.g. Solar Energetic Particle Event – SEPE ), with ‘government refusing to state an approximation of the extent of an unknown national emergency future impact’, does ‘absolutely nothing to reduce public panic in the future’, because that type of government behavior does ‘nothing to prepare the public for whatever impact comes’. Government claiming ‘no knowledge as to future impact effects’ is ‘prevalent in history’.

Using a ‘remote government entity, historically trusted by the public’, to release that ‘initial official dual statement’ – the ‘initial public announcement’ – reduces, but does not curtail, future civil unrest and public uprisings against authorities, including emergency first responders; the statement is purposefully designed to ‘carry a little more social understanding’ from the public ( en masse ) after the Solar Maximum SEPE impact strikes everyone with its effects.

That ‘initial official dual statement’ will buy everyone a ’little more peacetime’ than had the public en masse received no warning at all so, government ( run by people who have families also ) is doing its best – given what everyone will experience from the Solar Maximum SEPE NASA indicates is ”coming to Earth in 2012″ when “we all should be concerned.”

The National Oceanic and Atmospheric Administration ( NOAA ) ‘currently allows public viewing’ of its Space Weather Prediction Center ( SWPC ) information on the subject of solar geomagnetic storms – a ‘7-day forecast’ provided by its Space Weather Alerts and Warnings Timeline ( SWAWT ) chart – at: http://www.swpc.noaa.gov/alerts/warnings_timeline.html

NOAA claims its space weather ‘prediction’, alerts and warning ‘vision’ is for, “A nation prepared to mitigate the effects of space weather through the understanding and use of actionable alerts, forecasts, and data products.” [ http://www.swpc.noaa.gov/AboutUs/index.html ] Understanding? First, one requires ‘information’. When? A ‘7-day forecast’ window is of little use when a Solar Maximum flare can travel from the Sun to the Earth in roughly 18-hours – less than 1-day. What good will it do to know about it 6-days after it hits Earth?
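
Here ( below ) is a quick arithmetic sketch of that travel time – illustrative only – assuming a straight-line Sun-to-Earth distance of about 150,000,000 kilometers ( 93,000,000 miles ) and a range of ‘assumed’ CME front speeds:

    # Sun-to-Earth travel time for a coronal mass ejection ( CME ) front,
    # assuming a straight-line distance of ~150 million km ( ~93 million miles ).
    SUN_EARTH_KM = 1.5e8

    def travel_hours(cme_speed_km_s):
        """Hours for a CME front moving at a constant speed to reach Earth."""
        return SUN_EARTH_KM / cme_speed_km_s / 3600.0

    for speed in (450, 1000, 2300):  # average CME, fast CME, extreme event ( assumed )
        print(f"{speed:>5} km/s -> {travel_hours(speed):6.1f} hours")

    # Typical output:
    #   450 km/s ->   92.6 hours ( roughly 4 days )
    #  1000 km/s ->   41.7 hours
    #  2300 km/s ->   18.1 hours ( the ~18-hour figure cited above )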

NOAA indicates it can ‘predict’ space weather based on scientific data if ‘all tell-tale signs are in line with their assessment’, but for anyone to use the word “predict” is a misnomer when the proper term is ‘forecast based on existing factual assessments’.

The Solar Maximum ‘does not even exist yet’ so how is it ‘predicted’ already by NASA?

No one can ‘predict’ the future, unless ‘they somehow know’ or ‘have been told by a higher authority’ in-advance of ’something critical going to happen’.

Since we’re on the subject of NOAA ‘predicting space weather’, let’s not forget our friendly meteorologist weathermen who ‘forecast the weather’ but have been doing an extremely poor job for decades now at ‘predicting weather’ right here on Earth – where ‘little can be predicted’ but gravity and a few other ‘factual understandings’ about ‘science’ and ‘life’ as we’ve come to know it.

Weather, on Earth or in space, ‘can only be assessed from facts, and only forecast’. All those brilliant professionals need to stop trying to fool the public being fed the ‘word’ “prediction,” because they cannot ‘predict when or how much’ of a Solar Maximum SEPE will take place – unless, of course, they ‘know in-advance a catastrophe will definitely happen’.

What is it that ‘they know’ that ‘we do not know’ yet? Why are so many officials so shy about saying anything more? No one hears any officials saying, “Prepare now for a catastrophic event coming!” Officials are only heard by a few of the public. And ‘only those few officials’ are ‘saying the same thing over and over again’, “The Solar Maximum is coming in 2012 and we all need to be concerned.” Isn’t that ‘enough’ of a public hint?

Unfortunately, people need more than a ’7-day warning’ from NOAA.

People also need more than a ’1-year warning’ with a smoke and mirrors vague statement from NASA.

In preparing this report for the public, and after considerable research, it was learned that only 4-months ago NASA released its previously secret plan – but hid it in an obscure area of a remote but official internet webpage – and ‘that information’ had to be ‘even further researched’ before it could be understood well enough to just begin realizing what is actually going on. What’s NASA’s plan?

Remember what was said about “predictions” and how they’re really “assessments?” The public is now going to be introduced to the government’s “Godfather of all assessments” hidden deep within an entity known as “SOMA.”


In a secret joint effort between NASA and the European Space Agency ( ESA ), plans are now underway to launch what is called the ”Solar Probe +” ( also known as “Solar Probe Plus” ) spacecraft directed into the Sun’s environment – like a ‘neutrino particle’ – where it is expected to withstand unbelievable temperatures, plus far more than most could imagine, while at the same time providing scientists and physicists here on Earth with new types of information never captured before from such a very dangerous new frontier in outer space.

Solar Probe Plus will facilitate what is now a cosmic-sensitive investigation into measuring Coronal Mass Ejection ( CME ) material in-advance, to understand far more about how to protect the Earth from Solar Maximum SEPE ‘solar geomagnetic storms’ filled with ‘high-energy’ electrons, helium ions, protons and alpha particles – analyzed through highly sophisticated three-dimensional ( 3-D ) color imaging sensors and more.

Remember the aforementioned two ( 2 ) NASA STEREO spacecraft SECCHI 3-D sensing camera telescopes now watching the Sun? Well, think of ‘that old technology’ pumped-up on ‘advanced technology steroids’ the likes of which is unimaginable by even most current-day technologically savvy people.

Solar Probe Plus technology, in just ‘advanced materials science’ alone, is amazing and reads like something out-of a science-fiction book about distant future capabilities, except for one thing. This new technology is already ‘here’, ‘very much real’, and very highly classified.

The Sun, also being an ‘extremely loud and noisy environment’ as well as an explosive place to be, will have Solar Probe + transmit measurements of ‘intense solar shock waves’ captured by devices aboard Solar Probe Plus, under a program not ‘led by’ but ‘overseen by’ the NASA Langley Research Center ( LaRC ) organizational element Science Mission Directorate ( SMD ) Heliophysics Division – which is being controlled by ”SOMA.” Now we’ve arrived at learning what ‘secret organization’ is ‘driving this Earth mission behind closed doors’.

The National Aeronautics and Space Administration ( NASA ) headquarters Science Office For Mission Assessments ( SOMA ) acquires, investigates, conducts secret studies of, and reviews ‘all outgoing announcement of opportunity ( AO ) solicitations’ and what they call ‘missions of opportunity’ ( MO ) for the NASA Langley Research Center ( LaRC ) secret element organization vaguely termed only the “Science Mission Directorate ( SMD ).” The ‘term’ “directorate” is exclusively used by ‘only the leadership office’ of the U.S. Central Intelligence Agency ( CIA ).

To understand where all this secrecy is coming from, one must have already conducted a dangerous undertaking ( Enterprise ) – from a distance – just to look at what all is behind the CIA In-Q-Tel Corporation secret IT directorate over highly advanced technologies, hidden beneath the CIA Office of Science & Technology ( S&T ) ‘branch leadership’ over the In-Q-Tel Corp. ‘front’ of a ‘know nothing’ Board of Advisors. Now, while all this may only be as clear as mud, this is why the above two ( 2 ) webpage links provide exclusive coverage.

To keep ‘highly advanced technologies’ and the ‘related information’ about them an extremely well-guarded secret, the CIA even removed the ‘element of human beings guarding over it’ by utilizing yet another little-known application called Quantum Cryptographic control measure keys, which few are able to even manipulate because even that technology is so well buried. One must keep in-mind that, contrary to popular belief, the CIA is very well experienced and quite adept at keeping secrets – extremely secret – unless releasing information in a fashion that would also be quite advantageous. Common folks are not aware of any of this, nor are they aware of what a Solar Maximum SEPE is, and they are certainly ‘not being told what to expect from it either’.

NASA LaRC SMD Science Office For Mission Assessments ( SOMA ) acquires Earth and Space science instruments and develops Earth and Space missions from those solicitations, enables ‘technical, management and cost’ ( TMC ) evaluations performed on ‘proposals’ submitted in response thereto, plus handles what is referred to as ”Phase A concept studies.”

NASA LaRC SMD Science Office For Mission Assessments ( SOMA ) leads secret ‘research, assessments and analysis’, and it performs ‘acquisitions and investigations’ for the Science Mission Directorate ( SMD ), a ‘secret element’ of the NASA Langley Research Center ( LaRC ).

The Science Office For Mission Assessments ( SOMA ) is the principal interface for the NASA Academy of Program, Project and Engineering Leadership ( APPEL ), developing and implementing special training programs for principal investigator teams ( PIT ) as well as employees of NASA.

Solar Probe Plus is a spacecraft designed to go where no Earth spacecraft has ever gone before so we can touch, taste and smell particles of our Sun. Solar Probe Plus mission is part of the NASA “Living with a Star Program,” a program designed to assist Earth’s understanding as to how people may survive future Solar Maximum events.

There’s just one problem with Solar Probe Plus: it won’t be ready until 2018.

People want to ‘understand facts’, and they want to know ‘now’, because NASA ‘predicts’ this Solar Maximum will be a ‘major global event’ – ‘not’ just some science-fiction motion picture show we can all leisurely sit around watching while eating popcorn and candy and drinking soda pop.

Now people can pick up a TV Guide magazine and know what is forecasted for broadcast on TV, but with a Solar Maximum major event people are unable to learn anything more about what NASA and others know. Why?

NASA, NOAA, other federal government organizations, satellites, sensors, scientists, and some of the smartest people on Earth ‘know something’ that most people do not know ‘yet’ – and likely may never know until it is too late to prepare a plan for this major event and institute it ahead of time. “Let’s see now, Mable, how much food and water do we have stored up at our vacation cabin in the mountains? Think we can get by with what we have for a couple years if need-be?”

One would think that with this Solar Maximum SEPE major global event coming, that news about it from NASA, NOAA and other government organizations would be continually plastered over every major news broadcast around the world. Well, it already ‘was’, however most people never saw it, did not listen to it, forgot it, or never heard about it from those they know.

While this particular report alerts many – at least those reading it thoroughly – as to an upcoming ‘real life major global event’ pertaining to everyone’s life, most are nevertheless carrying on their daily lives – preoccupied in their own tiny world of distraction serving events – in total blackout from comprehending what is about to occur ‘very soon’.

What will most do when the time comes, and it becomes too late to prepare a survival plan? Oh yes, let us not forget about all the survival fanatics. Even the famous religious organization of The Church of Latter Day Saints ( Mormon ) teaches members all about ‘storing survival supplies’. Many are ready, however ‘how long are they prepared to survive for’? One ( 1 ) week? One ( 1 ) month? One ( 1 ) year? What if they need to survive for 2-years? Do they have enough stored?

Will people really need more than one ( 1 ) form of transportation or multiple properties? How’s their supply of aspirin, medicine and toilet paper? No basic can really be overlooked when it comes to all the basic comforts we’ve taken for granted for so long.

Knowledge as to what the effects of the impact will soon be, from this Solar Maximum SEPE on us all, is simply ‘not being focused on’ by any more than just a few people.

What most are going to do after looking at the pretty pictures in this report is get themselves something to eat and drink, watch some television, doze off to sleep and then awake to rush right back into their daily life – never giving this report a second thought about what could otherwise have been learned about survival.

Most will not even mention this report to anyone. Some will ‘plan to wait’. Some will ‘wait and then do absolutely nothing to prepare themselves’.

Most people, after nose-picking ( for a real long time ), will still be ‘waiting’ and ‘picking’ when the Solar Maximum SEPE finally comes. These sharp-as-a-marble ( civic-minded ) individuals belong to the “Should-a Could-a Would-a” group of folks who are card-carrying members of the world’s most distracted people, too busy ( doing nothing ) to make any intelligent decision, much less anything else for their own good. They are, however, quite excellent at dying while waiting for something to happen.

While some may possess a master’s degree in knowledge about sporting events and fashion, they flunk the test of simply asking a librarian to show them how to research the Solar Maximum ( Solar Energetic Particle Event ) – plus ‘how to prepare’ – using the internet ‘now’ at the local public library.

The real disaster “we all should be concerned about” is ‘people not prepared for the Solar Maximum SEPE effects on Earth’.

NASA is ‘preparing protection’ but for only a few.

April 2011 was the period scheduled – though this changed – by NASA for the ‘final flight of U.S. Space Shuttle Endeavour’ ( mission STS-134 ), with spacecraft commander U.S. Navy Captain Mark E. Kelly delivering to his twin brother Scott Kelly – commander of the International Space Station ( ISS ) – ‘advanced technology protection shields’, supplies and more.

– –

Source: Thomson Reuters News Agency

Shuttle Leaves Space Station To Begin Trip Home by, Irene Klotz ( Reuters )

May 30, 2011 15:28:36 GMT ( Updated ) May 30, 2011 08:28 a.m. ( PST )

CAPE CANAVERAL, Florida — Space shuttle Endeavour has departed the International Space Station, clearing the way for a final cargo run to the outpost before NASA retires its three ( 3 ) ship fleet.

As the spacecraft sailed 215-miles above Bolivia, pilot Greg Johnson gently pulsed the Endeavour steering jets on Sunday May 29, 2011 at 11:55 p.m. EDT ( Monday, May 30, 2011 at 03:55 GMT ) to back-away from the docking port that has anchored the shuttle since its arrival on May 18, 2011.

Endeavour delivered the International Space Station ( ISS ) premier science experiment — the $2,000,000,000 Alpha Magnetic Spectrometer ( AMS-02 ) particle detector, and a pallet of spare parts – intended to tide over the orbital outpost after the U.S. Space Shuttle program ends.

“Endeavour departing,” radioed International Space Station ( ISS ) flight engineer Ron Garan. “Fair winds and ‘following seas’, guys.”

“Thanks Ron,” replied Endeavour commander Mark Kelly, “Appreciate all the help.”

Afterward, Endeavour maneuvered to within about 950-feet of the International Space Station ( ISS ) to test the ‘new’ Automated Rendezvous System ( ARS ), which is being developed for NASA’s next spaceship, known as the Multi-Purpose Crew Vehicle ( MPCV ), that is intended to fly astronauts to the Moon, asteroids, and Mars – eventually.

“We’re now separating. That’s the closest we’re going to get,” Mark Kelly radioed International Space Station ( ISS ) flight engineer Ron Garan.

Then Mark Kelly fired U.S. Space Shuttle Endeavour thrusters, and pulled away from the International Space Station ( ISS ) for the last time.

NASA plans to decommission Space Shuttle Endeavour, the youngest Space Shuttle ( with 25 voyages ), sending Endeavour to a Los Angeles, California museum for display.

One ( 1 ) final U.S. Space Shuttle mission is planned before the United States ends its 30-year-old Space Shuttle program: the U.S. Space Shuttle Atlantis is due to launch ( July 8, 2011 ) with a 1-year quantity of supplies for the International Space Station ( ISS ) as a ‘contingency plan’ in-case ‘commercial companies’ ( hired to take over supply-runs to the International Space Station ) ‘encounter delays’ with ‘their new spacecraft’.

The U.S. Space Shuttles are being retired – to save $4,000,000,000 in annual operating expenses – as NASA ‘develops new spacecraft’ to travel further beyond the International Space Station ( ISS ) positioned in a 220-mile altitude ( 355-kilometer ) Earth orbit.

During its past 12-days at the International Space Station ( ISS ), the U.S. Space Shuttle Endeavour crew conducted four ( 4 ) spacewalks to complete construction of the U.S. portion of the International Space Station ( ISS ) outpost – a $100,000,000,000 project of sixteen ( 16 ) nations – being assembled since 1998 in Earth orbit.

The STS-134 Endeavour Space Shuttle is due back on Earth at Kennedy Space Center ( Florida, USA ) on Wednesday June 1, 2011 at 2:35 a.m. EDT ( 06:35 GMT ) – the same day its sister spaceship Atlantis ( STS-135 ) is scheduled to reach the launch pad for NASA’s 135th and final flight.

Reference

http://www.msnbc.msn.com/id/43216270?GT1=43001

– –

U.S. Navy Capt. Mark Kelly’s wife is Gabrielle Giffords, who chaired the United States House of Representatives Subcommittee on Space and Aeronautics and additionally held seats on the U.S. House of Representatives Armed Services Committee and the Committee on Science and Technology.

On February 9, 2011 NASA provided this video ( below ) about its STEREO spacecraft news event, but provided no update about anything else:

March 2011 official solar interpolated magnetogram forecast displays the following ( below ):

This report might begin to move people – those with leads and assets – to obtain some straight answers to some very difficult questions. Failing that, with bad news ahead, people should ‘prepare for impact’ because NASA says this is something “we all need to be concerned about.”

Interestingly, the only zeroing-in on a date came years ago ( 2006 ) when the NASA Goddard Space Flight Center Heliospheric Division began mapping magnetic gas plasma flows ‘beneath’ the surface of the Sun’s corona. From the Solar and Heliospheric Observatory ( SOHO ) Michelson Doppler Imager ( MDI ) instrumentation magnetic data and over 80-years of solar historical background data, officials ‘discovered’ the Sun possesses a “magnetic memory” reaching decades back that it always returns to – and what official scientists and astrophysicists uncovered, based on that, will shock you ( below ).

– –

Source: NASA Goddard Space Flight Center

Scientists Gaze Inside Sun, Predict Strength of the Next Solar Cycle by, Bill Steigerwald ( NASA Goddard Space Flight Center )

March 6, 2006

The next solar activity cycle will be 30% to 50% stronger than the previous one, and up to 1-year late in arriving, according to a breakthrough forecast by Dr. Mausumi Dikpati and colleagues at the National Center for Atmospheric Research ( NCAR ) in Boulder, Colorado.

The scientists made the first [ 1st ] “Solar Climate Forecast” using a combination of groundbreaking ‘observations of the solar interior’ from space, and computer simulation.

NASA Living With a Star ( LWS ) Program and the National Science Foundation ( NSF ) funded the research.

The Sun goes through an approximate 11-year cycle of activity, from stormy to quiet and back again.

Predicting the Sun’s cycles accurately – years in advance – will help societies plan for active bouts of solar storms, which can disrupt satellite orbits and electronics, interfere with radio communication, and damage power systems.

The forecast is important for NASA long-term Vision for Space Exploration plans, since solar storms can be hazardous to unprotected astronauts as well.

Solar storms begin with tangled magnetic fields ( generated by the Sun’s churning electrically charged gas – plasma ). Like a rubber band that has been twisted too far, solar magnetic fields can suddenly snap into a new shape – releasing tremendous energy as a ‘solar flare’ or Coronal Mass Ejection ( CME ).

Solar flares are explosions ( in the Sun’s atmosphere ) with the largest ‘equaling billions of one-megaton nuclear bombs’.

Solar magnetic energy can also blast billions of tons of plasma into space – at millions of miles ( kilometers ) per hour as a Coronal Mass Ejection ( CME ).
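
As a quick aside ( not part of the NASA release above ), both figures can be roughly sanity-checked; the flare energy and CME speed used below are order-of-magnitude ‘assumptions’ for illustration only:

    # Order-of-magnitude check of the two figures quoted above.
    MEGATON_J = 4.18e15       # energy of a one-megaton explosion, in joules
    LARGE_FLARE_J = 1e25      # assumed energy of a very large solar flare, in joules

    print(f"~{LARGE_FLARE_J / MEGATON_J:.1e} one-megaton bombs")   # ~2.4e+09 -> 'billions'

    CME_SPEED_KM_S = 1000     # assumed speed of a fast CME front
    mph = CME_SPEED_KM_S * 3600 / 1.609
    print(f"~{mph:,.0f} miles per hour")                           # ~2.2 million -> 'millions of mph'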

This violent solar activity often occurs near SunSpots ( dark regions on the sun caused by concentrated magnetic fields ).

SunSpots and stormy solar weather follow the eleven-year cycle, from few sunspots and calm to many sunspots and active, and back again.

Key to predicting the solar activity cycle is an understanding of the ‘flows of plasma in the Sun’s interior’.

Magnetic fields are ‘frozen’ into the solar plasma, so plasma currents ( within the Sun ) transport, concentrate and help dissipate solar magnetic fields.

“We understood these flows – in a general way – but the details were unclear, so we could not use them to make predictions before,” said Dikpati, who published a paper on this research in the online version of Geophysical Review Letters March 3, 2006.

The new technique of “helioseismology” revealed these details by allowing researchers to ‘see inside the Sun’.

Helioseismology traces sound waves ( reverberating inside the Sun ) to build up a ‘picture of the interior’ – similar to the way an ‘ultrasound scan’ is used to create a picture of an unborn baby.

Two [ 2 ] ‘major plasma flows’ govern the solar cycle, the:

FIRST ( major plasma flow ) acts like a conveyor belt ( deep beneath the surface ), plasma flowing from the poles to the equator where the plasma rises and flows back to the poles, where it sinks and repeats.

SECOND ( major plasma flow ) acts like a taffy pull ( in the surface layer of the Sun ), which rotates faster at the equator than it does near the poles. Since the large-scale solar magnetic field crosses the equator ( as it goes from pole to pole ) it gets wrapped around the equator ( over and over again ) by the faster rotation there. This is what periodically ‘concentrates the solar magnetic field’, leading to ‘peaks’ in ‘solar storm activity’; solar fireworks ( in the form of flares and Coronal Mass Ejections ) dissipate some of the magnetic field, and remnants are carried ( by the conveyor belt flow ) toward the poles. That material then becomes the ‘input for the next cycle’.

“Precise helioseismic observations of the conveyor belt ‘plasma flow speed’ by the Michelson Doppler Imager ( MDI ) instrument on the Solar and Heliospheric Observatory ( SOHO ) gave us a breakthrough,” said Dikpati. SOHO is a project of international collaboration between NASA and the European Space Agency ( ESA ).

“We now know it takes two [ 2 ] cycles to fill half [ 1/2 ] the conveyor belt with magnetic field ( the part where it sinks at the poles and flows toward the equator, reaching mid-latitudes ), plus another two [ 2 ] cycles to fill the other half [ 1/2 ] – from the bottom at mid-latitudes, then rising at the equator and flowing toward the poles again.

Because of this, ‘the next solar cycle depends on characteristics’ from ‘as far back as 40-years previously’ – the Sun has a magnetic ‘memory’.”
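
As a quick aside ( not part of the quoted article ), the ‘roughly 40-years’ of memory follows from simple arithmetic, taking the approximate 11-year cycle length as given:

    # Rough arithmetic behind the conveyor-belt 'memory' figure quoted above.
    CYCLE_YEARS = 11              # approximate length of one solar activity cycle
    CYCLES_TO_FILL_BELT = 2 + 2   # two cycles per half of the conveyor belt, per Dikpati

    print(CYCLES_TO_FILL_BELT * CYCLE_YEARS)  # 44 -> on the order of 40 years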

The magnetic data input comes from the Solar and Heliospheric Observatory ( SOHO ) Michelson Doppler Imager ( MDI ) instrument, plus ‘historical records’.

To test the model, researchers took ‘magnetic data from the past eight [ 8 ] solar cycles’, fed it into the computer, and ‘results matched actual observations over the last 80-years’.

The team then ‘added current magnetic data’ and ran the model ahead [ projected ] 10-years [ into the future ] to get their ‘prediction for the ‘next solar cycle’.
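
The ‘hindcast-then-forecast’ procedure described above can be illustrated with a toy sketch ( below ). This is emphatically ‘not’ the NCAR flux-transport model; the cycle amplitudes are made-up placeholder numbers used only to show the workflow of checking a predictor against past cycles before projecting it forward:

    # Toy illustration of the validate-then-project workflow described above.
    # The amplitudes are hypothetical placeholders, not real SunSpot counts.
    past_amplitudes = [110, 150, 190, 105, 155, 165, 160, 120]  # eight hypothetical cycles

    def predict_next(history):
        """Toy predictor: average of the last two observed cycle amplitudes."""
        return sum(history[-2:]) / 2

    # Hindcast: predict each past cycle from the cycles before it, then compare.
    for i in range(2, len(past_amplitudes)):
        predicted = predict_next(past_amplitudes[:i])
        print(f"cycle {i + 1}: predicted {predicted:6.1f}, observed {past_amplitudes[i]}")

    # Forecast: apply the same rule one step beyond the observed record.
    print("next cycle ( projected ):", predict_next(past_amplitudes))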

We are currently back in the quiet period for the current cycle ( cycle 23 ).

The next ‘solar cycle’ will ‘begin with a rise in solar activity’ in late 2007 or early 2008 ( according to the team ), and there will be 30% to 50% more SunSpots, flares and Coronal Mass Ejections ( CMEs ) in cycle 24 – about ‘1-year later than the prediction’ for the ‘next cycle’ made using previous methods ( which relied on statistics like the strength of the large-scale solar magnetic field and the number of SunSpots to make estimates ).

Dikpati’s team includes Dr. Giuliana De Toma and Dr. Peter A. Gilman, all scientists of the NCAR High Altitude Observatory ( Boulder, Colorado ) whose primary sponsor is the National Science Foundation ( NSF ).

Reference

http://www.nasa.gov/centers/goddard/news/topstory/2006/solar_cycle.html

– –

Source: NASA

Science News

Solar Storm Warning by, Dr. James Anthony Phillips [ James.A.Phillips@EarthLink.Net ] – NASA Science News Production Editor

March 10, 2006

It’s official:

Solar ‘Minimum’ has arrived. SunSpots have all but vanished. Solar flares are nonexistent. The Sun is utterly quiet.

Like the quiet before a storm, this week researchers announced that a solar storm is coming, ‘the most intense Solar Maximum in 50-years’.

The prediction comes from a team led by Ms. Mausumi Dikpati of the National Center for Atmospheric Research ( NCAR ).

“The next SunSpot cycle will be 30% to 50% stronger than the previous one,” she says.

If correct, the years ahead could produce a burst of solar activity second only to the historic Solar Maximum of 1958. That was a Solar Maximum!

The space age was just beginning:

– Sputnik ( the first Russian satellite ) was launched in October 1957; and,

– Explorer 1 ( the first U.S. satellite ) in January 1958.

In 1958 you could not know a solar storm was underway, just by looking at bars on your cellphone, because cellphones didn’t exist!

Even so, people knew something big was happening when “Northern Lights” [ Aurora Borealis ] were sighted three [ 3 ] times as far ‘south’ as Mexico.

A similar Solar Maximum today, ‘would be noticed by its effect’ on:

– Cellphones; – GPS; – Satellites ( weather ); and, – Many other modern technologies.

Dikpati’s ‘prediction is unprecedented’. In ‘nearly 200-years’ since the 11-year SunSpot cycle was discovered, scientists have struggled to predict the size of ‘future Solar Maximums’ but failed.

Solar Maximum can be intense, as in 1958, or barely detectable as in 1805 – ‘obeying no obvious pattern’.

The key to the mystery, Dikpati realized years ago, is a ‘conveyor belt’ on the Sun.

We have something similar here on Earth with the “Great Ocean Conveyor Belt” – popularized in the science fiction movie “The Day After Tomorrow.”

On Earth, it is a ‘network of currents’ carrying ‘water’ and ‘heat’ – from ocean to ocean.

In the science fiction movie, the Earth’s ‘ocean conveyor belt’ suddenly stopped, throwing Earth’s weather into chaos.

The Sun’s conveyor belt is a ‘current’ ( not of water ) of ‘electrical conducting gas’ ( plasma ) that flows in a loop, from the Sun’s Equator to the Poles and back again.

Just as the Earth’s Great Ocean Conveyor Belt ocean currents control weather on Earth, the Sun’s ‘solar conveyor belt’ controls ‘solar weather’ on the Sun.

Specifically, it controls the “SunSpot cycle.”

Solar physicist David Hathaway of the National Space Science & Technology Center ( NSSTC ) explains:

“First, remember what SunSpots are – tangled knots of magnetism ( generated by the Sun’s inner dynamo ). A ‘typical’ SunSpot ‘exists for just a few weeks’, then it decays – leaving behind a ‘corpse’ of weak magnetic fields.”

Enter the conveyor belt.

“The top of the ( Sun’s ) conveyor belt skims the surface of the Sun, sweeping ‘dead’ magnetic field knots ( from old SunSpots ) down to the Sun’s Poles, where – at a depth of 200,000 kilometers – the Sun’s magnetic dynamo recharges / regenerates these old magnetic knots until they become powerfully buoyant, rising to the Sun’s surface where they reappear as new SunSpots.”

All this happens with massive slowness.

“It takes about 40-years for the Sun’s conveyor belt to complete just one [ 1 ] loop,” says Hathaway.

The speed varies “anywhere from a 50-year pace ( slow ) to a 30-year pace ( fast ).”

When the Sun’s conveyor belt turns “fast,” it means ‘a lot of magnetic fields are being swept-up’, and that a ‘future SunSpot cycle’ is going to be ‘intense’. This is a basis for forecasting.

“The Sun’s conveyor belt was turning ‘fast’ in 1986 – 1996,” says Hathaway.

“Old magnetic fields swept-up then ( from 1986 thru 1996 ) should ‘re-appear as huge SunSpots’ in 2010 – 2011.”
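
As a quick aside ( not part of the quoted article ), Hathaway’s timing is roughly consistent with the loop times given above if the interval from sweep-up to re-emergence is treated as about half of one conveyor-belt loop – an assumption made here purely for illustration:

    # Rough arithmetic behind the re-emergence window quoted above, assuming the
    # sweep-up-to-re-emergence interval is about half of one conveyor-belt loop.
    SWEEP_UP_YEARS = (1986, 1996)     # period when the old fields were swept up

    for loop_years in (30, 40, 50):   # 'fast', typical and 'slow' belt paces
        half_loop = loop_years / 2
        start = SWEEP_UP_YEARS[0] + half_loop
        end = SWEEP_UP_YEARS[1] + half_loop
        print(f"{loop_years}-year loop -> re-emergence roughly {start:.0f} - {end:.0f}")

    # 30-year loop -> re-emergence roughly 2001 - 2011
    # 40-year loop -> re-emergence roughly 2006 - 2016
    # 50-year loop -> re-emergence roughly 2011 - 2021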

Like most experts in the field, Hathaway has confidence in the Sun’s ‘conveyor belt model’, agreeing with Dikpati that the next Solar Maximum should be a doozy, but he disagrees on one [ 1 ] point:

Dikpati’s forecast puts Solar Maximum at 2012.

National Space Science & Technology Center ( NSSTC ) solar physicist David Hathaway believes it will arrive sooner, in 2010 or 2011.

“History shows that ‘big SunSpot cycles’ ramp-up faster, than small ones,” Hathaway says.

“I expect to see the first [ 1st ] SunSpots of the ‘next solar cycle’ [ Solar Minimum ] appear in late 2006 or 2007, and Solar Maximum to be underway by 2010 or ’2011′.”

Who’s right?

Time will tell. Either way, a storm is coming.

Reference

http://science.nasa.gov/science-news/science-at-nasa/2006/10mar_stormwarning/

– –

Source: U.S. Federal Emergency Management Agency ( FEMA )

National Situation Update: Monday, April 9, 2007 Homeland Security Threat Level: YELLOW ( ELEVATED ).

National Weather

… [ EDITED-OUT FOR BREVITY ] …

Global Positioning System Is Impacted By Solar Radio Burst

During an unprecedented solar eruption last December [ 2006 ], researchers at Cornell University confirmed solar radio bursts can have a serious impact on the Global Positioning System ( GPS ) and other communication technologies using radio waves. The findings were announced on April 4, 2007 in Washington, D.C., at the first Space Weather Enterprise forum – an assembly of academic, government and private sector scientists focused on examining the Earth’s ever-increasing vulnerability to space weather impacts. Solar radio bursts begin with a solar flare that injects high-energy electrons into the solar upper atmosphere. Radio waves are produced which then propagate to the Earth and cover a broad frequency range. The radio waves act as noise over these frequencies, including those used by GPS and other navigational systems which can degrade a signal.
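
As a quick aside ( not from the Cornell study quoted above ), the mechanism – broadband radio noise raising a GPS receiver’s effective noise floor – can be sketched with a standard carrier-to-noise calculation; the nominal signal level and burst-to-thermal noise ratios below are illustrative ‘assumptions’:

    import math

    def cn0_loss_db(burst_to_thermal_ratio):
        """C/N0 degradation ( dB ) when burst noise power J adds to thermal noise N."""
        return 10 * math.log10(1 + burst_to_thermal_ratio)

    NOMINAL_CN0_DB_HZ = 45                 # a typical open-sky GPS value ( assumed )

    for ratio in (1, 10, 100):             # burst noise equal to, 10x and 100x thermal noise
        loss = cn0_loss_db(ratio)
        print(f"J/N = {ratio:>3} -> C/N0 drops {loss:4.1f} dB, to ~{NOMINAL_CN0_DB_HZ - loss:4.1f} dB-Hz")

    # A drop of ~20 dB ( J/N = 100 ) pushes a 45 dB-Hz signal down toward the level
    # where many receivers lose lock, consistent with receivers 'stopping tracking'.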

The National Oceanic and Atmospheric Administration ( NOAA ), the National Aeronautics and Space Administration ( NASA ), and partner agencies in the National Space Weather Program are looking to the future needs of our highly technical society, and are anticipating seamless specification and prediction of the atmosphere from the ground to the edges of the Earth’s magnetosphere and beyond to the Moon and Mars. The NOAA Space Environment Center is the nation’s ‘first alert of solar activity’.

Forecasters from the NOAA Space Environment Center in Boulder, Colorado, observed two [ 2 ] powerful solar flares on December 5, 2006 and December 6, 2006. These violent eruptions originated from a ‘large sunspot cluster’ identified by NOAA.

The December 6, 2006, solar flare created an unprecedented intense solar radio burst causing large numbers of receivers to stop tracking the GPS signal.

Using specially designed receivers – built at Cornell University – as sensitive space weather monitors, Cornell scientists were able to make the ‘first quantitative measurements of the effect’ of earlier solar radio bursts on GPS receivers. Extrapolations from a previous moderate event led to the prediction that larger solar radio bursts – expected during Solar Maximum – would disturb GPS receiver operation for some users.

“In December, we found the effect on GPS receivers was ‘more profound and widespread than we expected’,” said Paul M. Kintner Jr., Ph.D., professor of electrical and computer engineering at Cornell University.

“Now ‘we are concerned more severe consequences will occur during the next Solar Maximum’ ( approximately 2011 ).”

“NASA wants to better understand this solar phenomenon ‘so we can limit the adverse impacts on real-time systems’,” said Tony Mannucci, Ph.D., principal technical staff and supervisor, Ionospheric and Atmospheric Remote Sensing [ IRAS ] Group at the NASA Jet Propulsion Laboratory [ JPL ].

There are three [ 3 ] key points to remember about solar radio bursts:

“First, ‘society cannot become overly reliant on technology’ without an awareness and understanding of the effects of future space weather disruptions,” said Anthea Coster, Ph.D. of the MIT Haystack Observatory.

Second, the December 6, 2006 event dramatically shows the effect of ‘solar radio bursts is global and instantaneous’.

“Third, and equally important, ‘the size and timing of this burst were completely unexpected’ and ‘the largest ever detected’.

We do not know how often we can expect solar radio bursts ‘of this size or even larger’.” ( Excerpt from http://www.noaanews.noaa.gov/stories2007/s2831.htm )

Reference

http://www.fema.gov/emergency/reports/2007/nat040907.shtm

– –

So, in addition to other federal scientists and astrophysicists, we caught the U.S. Federal Emergency Management Agency ( FEMA ) in 2007 predicting “2011” as the year NASA says is something “we all need to be concerned about.”

During the entire month of March 2011, the fact of the matter is that the ‘incredibly active’ solar cycle SunSpot activity ‘spiked vertically higher’, far outside the ‘officially predicted bell-shaped graph’ it was thought to follow, and it ‘still has not stopped peaking’ for April 2011 ( see below ):

Helioseismology, the study of the inner regions of the Sun, provides advance predictions as to when and where on the Sun an explosive solar flare outburst is to erupt. Unfortunately, the data is ‘not provided to the public’.

A quick color video clip will serve to familiarize you with a very basic small part of what actually goes into mapping the Sun to provide classified Space Weather Prediction Report Alerts – more accurate than what many are ever told about today ( see below ):

Solar Maximum predictions for 2011 were bad enough, but then NASA THEMIS satellites discovered the Earth’s Magnetosphere began developing holes in it, which is officially documented by this short color video clip ( below ):

– –

Source: SPACE WEATHER, Vol. 9, S04003, 12 pp., doi:10.1029/2009SW000537

A Tool For Empirical Forecasting Of Major Flares, Coronal Mass Ejections, And Solar Particle Events From A Proxy Of Active-Region Free Magnetic Energy

by, David Falconer – – Space Science Office, NASA Marshall Space Flight Center ( Huntsville, Alabama, USA ); – Physics Department, University of Alabama ( Huntsville, Alabama, USA ); and, – Center for Space Plasma and Aeronomic Research, University of Alabama ( Huntsville, Alabama, USA ).

by, Abdulnasser F. Barghouty – – Space Science Office, NASA Marshall Space Flight Center ( Huntsville, Alabama, USA ).

by, Igor Khazanov – – Center for Space Plasma and Aeronomic Research, University of Alabama ( Huntsville, Alabama, USA ).

by, Ron Moore – – Space Science Office, NASA Marshall Space Flight Center ( Huntsville, Alabama, USA ).

Published: April 7, 2011

This paper describes a ‘new forecasting tool developed’ for and ‘currently being tested’ by the NASA Space Radiation Analysis Group ( SRAG ) at Johnson Space Center, responsible for the monitoring and forecasting of radiation exposure levels of astronauts.

The new software tool is designed for the empirical forecasting of M-Class and X-class Solar Flares, Coronal Mass Ejections ( CME ), and Solar Energetic Particle Events ( SEPE ).

For ‘each type of event’, the algorithm is based on the empirical relationship between the ‘event rate’ and a ‘proxy of the active region Free Magnetic Energy ( FME ).

Each empirical relationship is determined from a ‘data set’ consisting of ∼40,000 ‘solar active-region magnetograms’ from ∼1300 ‘solar active regions observed’ by the Solar Heliospheric Observatory ( SOHO ) spacecraft Michelson Doppler Imager ( MDI ) instrument possessing ‘known histories’ of Solar Flares, Coronal Mass Ejection ( CME ), and Solar Energetic Particle Event ( SEPE ) production.

The new tool automatically extracts ‘each strong-field magnetic area’ from a Michelson Doppler Imager ( MDI ) full-disk magnetogram, ‘identifies each’ as a National Oceanic and Atmospheric Administration ( NOAA ) active region, and ‘measures the proxy of the active region Free Magnetic Energy ( FME )’ from the ‘extracted magnetogram’.

For each solar active region, the empirical relationship is then used to convert the Free Magnetic Energy ( FME ) proxy into an ‘expected event rate’. The ‘expected event rate’ in-turn can be ‘readily converted into the probability’ that ‘the active region will produce such an event’ in a given ( queried ) forward ( ‘future’ ) ‘time’ window.
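
The conversion described above ( an expected event rate into a probability for a queried forward time window ) is, under the usual assumption that events arrive independently, a one-line Poisson calculation. The sketch below is only a minimal illustration of that conversion – it is not the SRAG software itself, and the rate and window values are placeholders.

```python
import math

def event_probability(expected_rate_per_day: float, window_days: float) -> float:
    """
    Probability that an active region produces at least one event inside the
    queried forward window, assuming events follow a Poisson process with the
    given expected rate ( events per day ).
    """
    return 1.0 - math.exp(-expected_rate_per_day * window_days)

# Placeholder example: a region whose FME proxy maps to 0.3 M/X-class flares per day,
# queried over the next 2 days.
print(round(event_probability(0.3, 2.0), 3))  # ~0.451
```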

Descriptions of the ‘data sets’, algorithm, software, sample applications, and validation test are presented.

Further ‘development and transition of the new tool’ – in anticipation of SDO HMI – are briefly discussed.

Citation: Falconer, D., A. F. Barghouty, I. Khazanov, and R. Moore ( 2011 ), A tool for empirical forecasting of ‘major flares’, ‘coronal mass ejections’, and ‘solar particle events’ from a proxy of active region free magnetic energy, Space Weather, 9, S04003, doi:10.1029/2009SW000537.

Received: October 23, 2009. Accepted: January 27, 2011.

Reference

http://www.agu.org/pubs/crossref/2011/2009SW000537.shtml

– –

5th Factor –

Solar Energetic Particle Event ( SEPE ) effects significantly impacting the way people now live on Earth will come from only one ( 1 ) ‘sunspot’ eruption of a ‘solar flare’ that ‘must contain at least’ the following four ( 4 ) factors ( below ):

1. Greater ‘amount’ of ‘particles’ ( visible and invisible Matter and AntiMatter ) ‘erupting from under the surface of the Sun’;

2. Greater ‘excitation’ of elements ( High Field Strength Elements / HFSE ) – ‘combining elements’ from ‘inside’ the Sun with those leaving ‘outside’ the Sun – becoming even more highly energetic ( energized ) upon leaving the Sun;

3. Greater ‘speed’ / ’distance’ moving large Mass of particles containing highly excited ( HFSE ) elements; and,

4. Earth-Directed.

The 5th factor of a SEPE is rarely discussed in public because it makes such an Earth Event ‘even worse’ ( than it was in 1864 ): information provided through NOAA, NASA and astrophysicists around the world indicates that the Earth’s protective solar shield, known as the Magnetosphere, has developed “holes” in it.
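
Taken together, the four ( 4 ) eruption factors plus the 5th ( Magnetosphere “holes” ) factor amount to a severity checklist. A minimal sketch of that checklist follows; the pass/fail logic and wording are assumptions made here for illustration, not an official NOAA or NASA scale.

```python
# Illustrative checklist for the SEPE factors listed above; the scoring itself
# is an assumption made here for demonstration, not an official scale.

def sepe_assessment(more_particles: bool, higher_excitation: bool,
                    greater_speed_and_mass: bool, earth_directed: bool,
                    magnetosphere_holes: bool) -> str:
    required = (more_particles, higher_excitation, greater_speed_and_mass, earth_directed)
    if not all(required):
        return "does not meet all four ( 4 ) required factors"
    # 5th factor: 'holes' in the Magnetosphere make a qualifying event even worse.
    if magnetosphere_holes:
        return "qualifying SEPE - aggravated by Magnetosphere holes"
    return "qualifying SEPE"

print(sepe_assessment(True, True, True, True, magnetosphere_holes=True))
```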

Well understood by U.S. government Global Environmental Intelligence ( GEI ) specialists and with NASA indicating that the current Solar Maximum period ( Cycle 24 ) is something, “we all need to be concerned about,” what will it take for people to want to understand how to prepare to survive the effects of a Solar Energetic Particle Event ( SEPE )?

ESA Solar Activity Data Plotting

Do ‘you’ think anything is funny about something that could jeopardize ‘your life’ or the ‘life of any of your family members, loved ones or close friends’?

Before concluding this knowledgeability brief – based on what NASA says “we all need to be concerned about” – this news broadcast video clip ( below ) shows how news personalities present the public with information on this subject by acting whimsical, asking silly questions, laughing and joking around so, after watching this, you should seriously ask yourself how the public could accept this information as being something serious they “all need to be concerned about?”

[ CARV NOTE: video clip – previously inserted ( here ) – has mysteriously disappeared off this KB. ]

For up-to-date official video and text information broadcasts and forecasts, click the research reference links provided in this knowledgeability brief ( below ).

Submitted for review and commentary by,

Paul Collin, Host

E-MAIL: KentronIntellectResearchVault@GMAIL.Com

CARV WWW [ NOTE: To attempt to review the now archived ‘official webpages’ providing the ‘former in-depth research References’, formerly contained within the ‘originally published report’, please click this link —> ]: http://ConceptActivityResearchVault.WordPress.Com

/

/

REFERENCES [ NOTE: some links ( below ) may no longer be reviewable so, refer to the former Concept Activity Research Vault ( CARV ) WWW link ( above ) ] –

http://www.lmsal.com/forecast

http://sdowww.lmsal.com/suntoday

http://stereo.nascom.nasa.gov/beacon/beacon_secchi.shtml

http://www.lmsal.com/hek/her

http://www.lmsal.com/isolsearch

http://lmsal.com/hek/her?cmd=home

http://lmsal.com/hek/her?cmd=view-ratings

http://sdo.gsfc.nasa.gov

http://jsoc.stanford.edu

http://hmi.stanford.edu

http://science.nasa.gov/science-news/science-at-nasa/2009/21jan_severespaceweather

http://stereo.nascom.nasa.gov/where.shtml

http://www.swpc.noaa.gov/stereo/index.html

http://www.swpc.noaa.gov/tiger/index.html

http://www.swpc.noaa.gov/tiger/Meped.html

http://soma.larc.nasa.gov

http://lws.larc.nasa.gov/solarprobe/PDF_FILES/10-208_B_Solar_Probe_Plus.pdf

http://www.ivoa.net/Documents/PR/VOE/VOEvent-20060810.html

http://www.springerlink.com/content/25mv11225702v520/fulltext.pdf

http://www.fema.gov/rebuild/recover/place.shtm

http://www.dhs.gov/files/prepresprecovery.shtm

http://www.disasterassistance.gov/disasterinformation

http://en.wikipedia.org/wiki/Post_Attack_Command_and_Control_System

http://conceptactivityresearchvault.wordpress.com/2011/03/17/earth-events-forecast

http://conceptactivityresearchvault.wordpress.com/2011/03/28/global-environmental-intelligence-gei

http://www.af.mil/information/factsheets/factsheet_print.asp?fsID=157&page=1

https://login.afwa.af.mil/register/

https://login.afwa.af.mil/amserver/UI/Login?goto=https%3A%2F%2Fweather.afwa.af.mil%3A443%2FHOST_HOME%2FDNXM%2FWRF%2Findex.html

http://www.globalsecurity.org/space/library/news/2011/space-110225-afns01.htm

http://www.wrf-model.org/plots/wrfrealtime.php

http://conceptactivityresearchvault.wordpress.com/2011/03/17/solar-maximum-sepe-effects

Secret HFSE Properties Part 1

[ PHOTO ( above ): HFSE Super-Conducting Electro-Magnetic Pyroclastic Magma – Iceland volcano eruption ( click image to enlarge ) ]

Secret HFSE Properties – Part 1
Emerging Superconducting Magnetic Element Properties
by, Concept Activity Research Vault ( CARV )

December 1, 2011 16:08:42 ( PST ) Updated ( Originally Published: December 9, 2010 )

CALIFORNIA, Los Angeles – December 1, 2011 – In 1961, BELL LABORATORIES ( USA ) physicist Eugene Kunzler and co-workers discovered that niobium-tin continued to exhibit superconductivity while in the presence of strong electric currents and magnetic fields, which made niobium-tin the first [ 1st ] material used to support the High Field Strength ( HFS ) electrical currents and magnetic field strengths necessary for high-power electro-magnets and electrical power machines.

20-years later, the aforementioned discovery allowed production of niobium-doped metallic wire, wrapped into multi-strand cables and wound into coils, creating the powerful electro-magnetic force applications seen in particle accelerators, particle detectors, and rotating machines.

High Field Strength Element ( HFSE ) Niobium ( Nb ) holds far greater use from such properties than most professionals and the public realize so, before advancing any further on this subject, a basic foundation of understanding of such materials, elements and their valuable properties must be established: about this ‘extremely old Earth energy resource’, ‘how to harvest it ( from its original molten state )’, ‘how to capture and contain its volatile properties’, and then how to ‘transform it into applications’ for transportation – ‘yet unrealized factual knowledge’ on this extremely low-cost, extremely powerful high-energy resource.

Niobium ( Nb ) –

[ photo ( above ): Niobium ( .9995 fine) crystals ( click to enlarge ) ]

The mineral Titanmagnetite ( a.k.a. Titanomagnetite ), under Fluorescent X-Ray Spectrography ( XRS ), shows high measured concentrations [ from 350 ppm ( parts per million ) up to 1,000 ppm ] of the geochemical High Field Strength Element ( HFSE ) Niobium [ Nb ].

‘Where’ do such a ‘mineral’ ( titanmagnetite ) and its ‘element’ ( niobium ) originate? Volcanic magma.

‘Where’ are the specific locations whose volcanic magma Niobium element properties possess even ‘higher field strength’ magnetics – ‘at least’ six ( 6 ) times greater? The Mariana Trench and other ultra-deep sea continental plate arc locations where volcanoes exist, either ‘quite active’ or ‘somewhat dormant’.

Readers may begin to awaken from any slumber when they realize that ultra-deep sea ground trench ( arc ) volcano magma – naturally located nearer to planet Earth’s molten liquid superconducting magnetic High Field Strength Element ( HFSE ) ‘core’ – sees element properties of Niobium absorbing far purer forms of High Field Strength Element ( HFSE ) magnetic properties while within its ‘natural molten liquid state’. Interestingly, nowhere ‘above ultra-deep sea Earth elevations’ holds more natural energy power.

How can such a volatile molten resource be ‘excavated’ while simultaneously being isolated from any exposure to contaminations from ‘seawater’, ‘oxygen’ or other environmentals to harness high-energy properties?

The molten material would  additionally have to be placed into a likewise uncontaminated ‘ultra-clean chamber’, whereupon after transport, ’high-energy properties extraction’ must undergo a ‘seamless application process’ for utilization.

What type of use? More basic fundamentals must be initially understood so, let’s begin by taking a look at what’s bubbling out of ultra-deep sea volcano vents and instantly becoming contaminated by seawater ( below ):

[ photo ( above ): Ultra-Deep Sea Trench Arc Volcanic Magma Vent ( click to enlarge ) ]

Preliminary report Part 1 ( herein ) displays a photograph of an above-ground volcano eruption in Iceland experiencing plenty of superconducting High Field Strength Element ( HFSE ) property fireworks; however, far too little information is ever realized by the public about what ultra-deep sea trench arc volcano magmatic material element properties hold ‘naturally’ as primary keys to a variety of other advancements.

Volcanic magma, in later stage differentiation, sees Niobium [ Nb ] and Titanium [ Ti ] ratios ‘increase’ five [ 5 ] to six [ 6 ] times above normal ( dry magma melts ).

To comprehend this, along with the importance of harnessing natural Earth high-energy magnetic properties, a basic understanding must be reached of what science, physics and astrophysics indications always seem to avoid sharing with the public.

The purpose of this preliminary report Part 1 is to bring rare knowledge into better public understanding while stimulating solution-minded professionals who wish to see more done.

Have scientists, physicists, and astrophysicists ‘missed some major solution’ in their discoveries? Are portions of certain discoveries ‘kept quiet’ considering serious repercussions? While it is highly doubtful that any small discovery having a major impact missed any application outlook, ramifications are nevertheless considered by professionals as to whether a socio-economic impact will be allowed to prove helpful or otherwise.

Good advice may be to buckle your seatbelt because you are about to venture into some information few have ever known about. Whether most of those reading this are able to absorb this information ( below ) now, or later, it is suspected that after the much easier reading in Part 2 through Part 5, most will probably refer back to Part 1 ( herein ).

– – – –

Preliminary Report ( Part 1 of 5 )

Introduction –

High Field Strength Elements ( HFSE )

Niobium

Niobium enrichment is possible using two ( 2 ) natural rock-forming minerals:

– Titanmagnetite [ 350 ppm – 1000 ppm Nb ( Niobium ) ]; and, – Kaersutite [ 38 ppm – 50 ppm Nb ( Niobium ) ].

General Information:

Rock-forming Minerals: Titanmagnetite ( element symbols: Ttn/Mag ), and Kaersutite ( Krs ).
Elements [ symbols ]: Titanite ( Ttn ), Magnetite ( Mag ).
Geochemical Element [ symbols ]: Niobium ( Nb ).

Studies & Resolutions –

Subject: Natural High Field Strength Element ( HFSE ) Materials

Titanmagnetite [ Ttn/Mag ] is a natural super-magnetic mineral that later experiences a geochemical alteration – reducing its magnetic High Field Strength Element ( HFSE ) Niobium ( Nb ) properties – as it exits ultra-deep sea arc volcanic magma influenced by hydrothermal fluid seawater ( see “Findings” below ). In its natural magma state, Titanmagnetite ( Ttn/Mag ) holds a High Field Strength Element ( HFSE ) Niobium ( Nb ) content up to six [ 6 ] times greater than above-ground.

Niobium ( Nb ) High Field Strength Element ( HFSE ) properties can be enhanced even further by adding only one ( 1 ) mineral, Kaersutite ( Krs ).
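
To see how the two minerals’ niobium contents would combine in a bulk rock, a simple mass-balance ( modal mixing ) sketch is given below. Only the titanmagnetite ( 350 ppm – 1,000 ppm Nb ) and kaersutite ( 38 ppm – 50 ppm Nb ) ranges come from the figures quoted in this report; the modal percentages and the 5 ppm background value for other silicates are illustrative assumptions.

```python
# Bulk-rock Nb estimate by modal mass balance: bulk_ppm = sum( weight_fraction * mineral_ppm ).
# Mineral Nb contents follow the ranges quoted in this report; the modal fractions and the
# low-Nb "other silicates" value are illustrative assumptions.

def bulk_nb_ppm(modes: dict, nb_ppm: dict) -> float:
    assert abs(sum(modes.values()) - 1.0) < 1e-6, "modal fractions must sum to 1"
    return sum(fraction * nb_ppm[mineral] for mineral, fraction in modes.items())

modes  = {"titanmagnetite": 0.10, "kaersutite": 0.05, "other silicates": 0.85}
nb_ppm = {"titanmagnetite": 700.0, "kaersutite": 44.0, "other silicates": 5.0}

print(f"bulk rock: ~{bulk_nb_ppm(modes, nb_ppm):.0f} ppm Nb")  # roughly 76 ppm Nb
```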

For applications, extracting this natural combined high-energy power may not be easily accomplished.

Extraction Processes: Capturing Natural High Field Strength Elements ( HFSE )

Obtaining these properties may be resolvable. Natural initial ingress of high-temperature ( Tave 713° C to Tave 722° C ) fluids into liquid melts [ magma ] has been observed, and amphibole-plagioclase thermometry suggests ‘fracture and grain’ boundary ‘permeability’ – with seawater-derived fluids – stays ‘open’ over a similar temperature interval. However, venturing into ultra-deep sea trench arcs, burrowing into natural-state volcanic magma dome vents to capture the natural essence of titanmagnetite, and then containing it for further HFSE property processing may be difficult so, is there an alternative to this type of extraction?

Above-ground experiment observations see major trace geochemical element fractionation trends in bulk rocks and minerals reproduced by Rayleigh fractional crystallization from dry [ magma ] melts ( < 0.5 wt. % H2O ) with oxygen fugacities of one [ 1 ] unit below the Quartz [ Qtz ] – Fayalite [ Fa ] – Magnetite [ Mag ] buffer ( QFM – 1 ).
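
The Rayleigh relationship invoked above can be written as C_liquid = C0 * F^( D - 1 ), where C0 is the starting melt concentration, F the remaining melt fraction and D the bulk solid/melt partition coefficient. The sketch below only illustrates why a strongly incompatible element such as Niobium becomes enriched in the residual melt; the C0 and D values are illustrative assumptions, not data from the studies cited here.

```python
# Rayleigh fractional crystallization: C_liq = C0 * F ** (D - 1)
#   C0 = starting melt concentration, F = remaining melt fraction,
#   D  = bulk solid/melt partition coefficient ( D << 1 for incompatible Nb ).
# The C0 and D values below are illustrative assumptions only.

def rayleigh_liquid(c0_ppm: float, melt_fraction: float, bulk_d: float) -> float:
    return c0_ppm * melt_fraction ** (bulk_d - 1.0)

c0, d = 65.0, 0.05   # e.g. ~65 ppm Nb in the parent melt, strongly incompatible
for f in (1.0, 0.7, 0.4, 0.2):
    print(f"melt fraction {f:.1f}: {rayleigh_liquid(c0, f, d):.0f} ppm Nb")
# The residual melt ends up roughly 4 - 5 times enriched once ~80% has crystallized,
# broadly consistent with the 5 - 6 times late-stage enrichment quoted in this report.
```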

[ photo ( above ): NASA Astrophysics Data System ( ADS ) ]

Preliminary Findings ( A – F  – below ) –

– – – –

A.

[ NOTE: Titanmagnetite mineral geochemical element Niobium … ]

Source: NASA Astrophysics Data System ( NADS ) [ Harvard University, access: http://adsabs.harvard.edu/cgi-bin/nph-abs_connect ]

Geochimica et Cosmochimica Acta, vol. 29, Issue 8, pp.807-820

DOI: 10.1016/0016-7037(65)90081-5

Bibliographic Code: 1965GeCoA..29..807H

Die Verteilung des Niobs in den Gesteinen und Mineralen der Alkalibasalt-Assoziation der Hocheifel [ The distribution of niobium in the rocks and minerals of the alkali-basalt association of the Hocheifel ] by, Hans Gerhard Huckenholz

Publication Date: August 1965

Abstract –

Niobium [ Nb ] contents are determined by the method of fluorescent X-ray spectrography [ FXRS ] for several rock types from the Tertiary Hocheifel volcanic province ( Western Germany ).

Niobium [ Nb ] contents ( average ppm by rock type ):

– Alkalic olivine basalt: 65 ppm Nb ( 7 samples );
– Hawaiite: 69 ppm ( 4 samples );
– Mugearite: 86 ppm ( 3 samples );
– Trachyte: 95 ppm ( 5 samples );
– Basanitoid: 77 ppm ( 6 samples );
– Ankaramite: 65 ppm ( 2 samples );
– Hauyne-bearing alkali basalt: 110 ppm ( 2 samples ); and,
– A monchiquite dike: 86 ppm.

Niobium enrichment, is observed in the alkalic olivine basalt trachyte [ magma ] series where, in later stage differentiation, Nb [ Niobium ] / Ti ratio increases five [ 5 ] to six [ 6 ] times.

Alkalic olivine basalt magma, however becomes poorer [ less rich ] in niobium [ Nb ] by accumulation of:

– olivine [ 10 ppm Nb ( niobium ) ]; and, – clinopyroxene [ 20 ppm Nb ( niobium ) ].

Enrichment of niobium [ Nb ] is possible by taking-up titanmagnetite ( 350 ppm – 1000 ppm Nb [ Niobium ] ) and kaersutite ( 38 ppm – 50 ppm Nb [ niobium ] ) without olivine and clinopyroxene ( ankaramite HF 5, and the basanitoids ).

The most Nb [ Niobium ] bearing rock-forming mineral is titanmagnetite, which carries 65% to 85% of the niobium ( in the volcanic rock ); ground-mass clinopyroxene, however, has Niobium [ Nb ] contents of up to 45 ppm, and feldspars have Nb [ Niobium ] contents of up to 67 ppm.

Reference

http://adsabs.harvard.edu/abs/1965GeCoA..29..807H

– – – –

B.

[ NOTE: ICP-MS analysis on trace geochemical element ( Niobium, etc. ) enrichment ( 5X greater than normal ) … ]

Source: NASA Astrophysics Data System ( NADS ) [ Harvard University, access: http://adsabs.harvard.edu/cgi-bin/nph-abs_connect ]

Publication Date: December 2008

Title: Trace Element Geochemistry including the HFSE in Magnetites of Calc-Alkaline Plutons: the Tanzawa Complex of the Izu – Bonin – Mariana Arc and the Ladakh Batholith Complex, NW Himalaya

Authors: Basu, A. R.; Ghatak, A.; Arima, M.; Srimal, N.

Affiliation:

AA ( University of Rochester, Department of Earth and Environmental Sciences, 227 Hutchison Hall, Rochester, NY 14627, United States; abasu@earth.rochester.edu ); AB ( University of Rochester, Department of Earth and Environmental Sciences, 227 Hutchison Hall, Rochester, NY 14627, United States; arun@earth.rochester.edu ); AC ( Yokohama National University, Division of Natural and Environmental Information, 79-1 Tokiwadai, Hodogaya-ku, Yokohama, 240-8501, Japan; arima@ed.ynu.ac.jp ); AD ( Florida International University, Department of Earth Sciences PC 344, University Park 11200 SW 8th Street, Miami, Fl 33199, United States; srimal@fiu.edu ).

Publication: American Geophysical Union, Fall Meeting 2008, abstract #V33C-2228

Origin: AGU

AGU Keywords: 1020 Composition of the continental crust, 1031 Subduction zone processes ( 3060, 3613, 8170, 8413 ), 1036 Magma chamber processes ( 3618 ), 1042 Mineral and crystal chemistry ( 3620 ), 1065 Major and trace element geochemistry

Bibliographic Code: 2008AGUFM.V33C2228B

Abstract

In this study we attempt to contribute to the understanding of a prominent feature, namely the Nb [ Niobium ] – Ta [ Tantalum ] depletion, in arc magmatic trace element geochemistry.

Traditionally, this depletion is explained by residual mantle-wedge phases with Nb [ Niobium ] and Ta [ Tantalum ] affinities, such as titaniferous ilmenite [ Ilm ], rutile [ Rt ] or titanite [ Ttn ], or by an amphibole.

Here, we propose a mechanism – long advocated – to explain the calc-alkaline trend ( Bowen vs. Fenner ) in MgO – FeO ( total Fe ) – ( Na2O + K2O ) ternary diagram by early crystallization and separation of magnetite in ‘subduction zone magmas’ associated with ‘high oxygen’ fugacity ‘environments’.

In support of our hypothesis, we provide high-precision multiple trace element data, including the High Field Strength Elements ( HFSE ), in separated magnetites and mafic mineral phases from mafic ‘magmatic enclaves’ associated with ‘tonalite suites’ of two [ 2 ] different ‘magmatic arcs’, the:

– Tanzawa Complex of the Izu-Tanzawa Collision Zone in Japan; and, – Ladakh Batholith Complex of NW Himalayas.

The Tanzawa Complex is composed of diverse rock suites with SiO2 varying from 43% – 75%, ranging from hornblende gabbro through tonalite to leuco-tonalite. The geochemical characteristics of low K – tholeiites, enrichment of Large Ion Lithophile Elements ( LILE ), and depletion of HFSE [ High Field Strength Elements ] in rocks of this plutonic complex are similar to those observed in the volcanic rocks of the IBM arc.

The Ladakh batholith Complex is one of the granitic belts exposed north of the Indus-Tsangpo suture zone in Ladakh, representing calc-alkaline plutonism related to the subduction of the Neotethys floor in the Late Cretaceous. This batholith comprises predominantly I-type granites with whole-rock δ18O values of 5.7 – 7.4 per mil, without major contribution from continental crustal material.

In separated magnetites, from five [ 5 ] gabbros of the Tanzawa tonalite-gabbro complex and from three [ 3 ] tonalitic gabbros of the Ladakh batholith, we analyzed 22 trace elements by ICP-MS, including:

Nb [ Niobium ]; Ta [ Tantalum ]; Hf [ Hafnium ]; and, Zr [ Zirconium ].

In NMORB [ N Mid-Ocean Ridge Basalt ] normalized plots, the trace element patterns of all the magnetites analyzed show enrichment ( 5X NMORB ), in:

Nb [ Niobium ]; Ta [ Tantalum ]; Pb [ Lead ]; Sr [ Strontium ]; and, ( 2X NMORB ), in: Zr [ Zirconium ] with characteristically high Nb [ Niobium ], Ta [ Tantalum ] and Zr[ Zirconium ] / Hf [ Hafnium ] ratios.

In contrast, the patterns show anomalously low ( less than 0.1 NMORB ), in:

La [ Lanthanum ]; Ce [ Cerium ]; Pr [ Praseodymium ]; Nd [ Neodymium ]; Sm [ Samarium ]; and, Hf [ Hafnium ] concentrations.

It is noteworthy that, in the normalized trace element plot, all the magnetites show ‘high’ Nb [ Niobium ] / Ta ratios, in contrast with the high Ta / Nb [ Niobium ] ratios observed in typical arc [ volcanic ] magmas.

These data support our hypothesis, that:

Magmatic crystallization, of Fe [ Iron ] – Ti [ Titanium ] oxides ( under high oxygen fugacity conditions ) during ‘initial crystallization and formation’ ( of the Izu-Bonin and Ladakh-type arc batholiths ) may be the primary cause of depletion of HFSE [ High Field Strength Elements ] in later magmatic differentiates of less mafic and more felsic granitic arc rocks.
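
The “NMORB-normalized” values quoted in the abstract above are simply sample concentrations divided by a reference N-MORB composition. The sketch below shows that arithmetic; the reference values used here are rounded figures from commonly used published compilations and the ‘sample’ values are invented placeholders, so treat the whole example as illustrative only.

```python
# N-MORB normalization: normalized value = sample concentration / reference N-MORB concentration.
# Reference values are rounded figures from common compilations; sample values are placeholders.

NMORB_PPM = {"Nb": 2.33, "Ta": 0.13, "Zr": 74.0, "Hf": 2.05, "Sr": 90.0, "Pb": 0.30}

def nmorb_normalize(sample_ppm: dict) -> dict:
    return {element: value / NMORB_PPM[element] for element, value in sample_ppm.items()}

# Placeholder magnetite-separate analysis ( ppm ), only to show the arithmetic:
sample = {"Nb": 11.7, "Ta": 0.65, "Zr": 148.0}
for element, ratio in nmorb_normalize(sample).items():
    print(f"{element}: {ratio:.1f} x N-MORB")   # ~5x for Nb and Ta, ~2x for Zr
```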


Reference

http://adsabs.harvard.edu/abs/2008AGUFM.V33C2228B

– – – –

C.

[ NOTE: multi-domain topographic elevation studies on magnetite [ Mag ] and hydrothermal fluid effects … ]

Publication Date: December 2007

Title: Spatial Distribution of Magnetic Susceptibility in the Mt. Barcroft Granodiorite, White Mountains, California: Implications for Arc Magmatic Processes

Authors: Michelsen, K. J.; Ferre, E. C.; Law, R. D.; Boyd, J. D.; Ernst, G. W.; de Saint-Blanquat, M.

Affiliation:

AA ( Virginia Tech, Department of Geosciences, Blacksburg, VA 24061, United States ; kmichels@vt.edu ); AB ( Southern Illinois University, Department of Geology, Carbondale, IL 62901, United States ; eferre@geo.siu.edu );’ AC ( Virginia Tech, Department of Geosciences, Blacksburg, VA 24061, United States ; rdlaw@vt.edu ); AD ( Southern Illinois University, Department of Geology, Carbondale, IL 62901, United States ; jdboyd77@yahoo.com ); AE ( Stanford University, Department of Geological and Environmental Sciences, Stanford, CA 94305, United States ; ernst@geo.stanford.edu ); AF ( Universite Paul Sabatier, LMTG, Toulouse, 31400, France ; michel@lmtg.obs-mip.fr ).

Publication: American Geophysical Union, Fall Meeting 2007, abstract #T11B-0567   Origin: AGU

AGU Keywords: 1020 Composition of the continental crust, 3640 Igneous petrology, 3660 Metamorphic petrology, 8104 Continental margins: convergent, 8170 Subduction zone processes ( 1031, 3060, 3613, 8413 )

Bibliographic Code: 2007AGUFM.T11B0567M

Abstract

The petrographic or chemical zonation of plutons, has been widely studied and used to constrain petrogenetic processes and emplacement mechanisms.

The time involved in modal data collection, as well as the cost of chemical analyses, makes the search for pluton-scale zoning patterns the exception rather than the norm in ‘magmatic arc studies’; however, the magnetic susceptibility ( Km ) of plutonic rocks – both magnetite-bearing and magnetite-free – can be an invaluable tool to quickly assess the internal organization of any pluton.

New field observations, new magnetic mineral data, and reprocessed Km data on the Barcroft granodiorite pluton ( White Mountains, California ) are presented.

The Km of 660 specimens from 76 stations ranges from 140 x 10-6 [ SI ] to 75,000 x 10-6 [ SI ], with an average of about 16,800 x 10-6 [ SI ].

The distribution of Km is unimodal.

The hysteresis parameters of the Barcroft rocks indicate that Km is controlled mainly by multi-domain magnetite.

The contribution of mafic silicates ( biotite and hornblende ) to Km ranges from 0.4% to 99%, with an average of about 1.8%.

As in many other ferromagnetic ( i.e. magnetite – bearing ) plutons, Km variations reflect different amounts of magnetite which itself results from petrographic variations.

This is supported by the positive correlation between major oxide variations ( e.g., SiO2, FeO ) and Km.

A new Km map of the Barcroft pluton shows several important features including, a:

(a) Low Km zone in the SW corner of the pluton, near areas that exhibit economic mineralization possibly related to hydrothermal fluids;

(b) Few isolated anomalies that may be attributed to transformation of normal magnetite into lodestone;

(c) North south high Km ridge that could possibly result from local mingling between the main granodiorite rock type and syn-plutonic mafic dikes;

(d) Broad reverse Km zonation ( i.e. higher Km in the centre ); and,

(e) Possible ‘positive correlation between Km’ and ‘topographic elevation ( between 5,000 and 13,000 feet )’, which could be explained by a higher fO2 at a higher structural level in the chamber.

These preliminary results suggest that, the:

(1) Syn-plutonic diking may play a significant role in the geochemical differentiation of granodiorite plutons;

(2) Classic dichotomy between ilmenite [ Ilm ] series and magnetite [ Mag ] series of granitoids might at least – to some extent – depend on the ‘exposure level’ if such intrusions are confirmed to be vertically differentiated; and,

(3) Mapping Km in a ferromagnetic pluton can be an efficient tool to constrain its internal organization.
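
The zonation and correlation statements in the abstract above ( Km against major oxides, and Km against topographic elevation ) are ordinary correlations over station data. A minimal sketch follows; the arrays below are invented placeholders, not the Barcroft measurements.

```python
import numpy as np

# Pearson correlation of magnetic susceptibility ( Km, in 1e-6 SI ) against elevation.
# The arrays are invented placeholders, not the Barcroft station data.
km        = np.array([1400, 5200, 9800, 16800, 31000, 48000, 75000], dtype=float)
elevation = np.array([5200, 6100, 7400, 8900, 10300, 11800, 12900], dtype=float)  # feet

r = np.corrcoef(km, elevation)[0, 1]
print(f"Pearson r( Km, elevation ) = {r:.2f}")  # a positive r would support vertical zonation
```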

Reference

http://adsabs.harvard.edu/abs/2007AGUFM.T11B0567M

– – – –

D.

[ NOTE: geochemistry and petrology of deep oceanic crust geothermal high-temperature ( Tave 713° C to Tave 722° C ) fluids … ]

Source: NASA Astrophysics Data System ( NADS ) [ Harvard University, access: http://adsabs.harvard.edu/cgi-bin/nph-abs_connect ]

Publication Date: November 2002

Title: Petrology and geochemistry of the lower ocean crust formed at the East Pacific Rise and exposed at Hess Deep: A synthesis and new results

Authors: Coogan, L. A.; Gillis, K. M.; MacLeod, C. J.; Thompson, G. M.; Hékinian, R.

Publication: Geochemistry, Geophysics, Geosystems, Volume 3, Issue 11, pp. 1, CiteID 8604, DOI 10.1029/2001GC000230 (GGG Homepage)

Origin: AGU [ http://www.agu.org ]

AGU Keywords: Marine Geology and Geophysics: Midocean ridge processes, Mineralogy, Petrology, and Mineral Physics: Igneous petrology, Mineralogy, Petrology, and Mineral Physics: Metamorphic petrology, Mineralogy, Petrology, and Mineral Physics: Minor and trace element composition.

Bibliographic Code: 2002GGG….3kQ…1C

Abstract

The geochemistry and petrology of the lower oceanic crust record information about the compositions of melts extracted from the mantle, how these melts mix and crystallize, and the role of hydrothermal circulation in this portion of the crust.

Unfortunately, lower oceanic crust formed at fast spreading ridges is rarely exposed at the seafloor making it difficult to study these processes.

At Hess Deep, crust formed at the East Pacific Rise ( EPR ) is exposed due to the propagation of the Cocos-Nazca spreading center westward.

Here we review our state of knowledge of the petrology of lower crustal material from Hess Deep, and document new mineral major and trace element compositions, amphibole-plagioclase thermometry, and plagioclase crystal size distributions.

Samples from the deeper parts of the gabbroic sequence contain clinopyroxene that is close to being in trace element equilibrium with erupted basalts but which can contain primitive ( moderate Cr, high Mg# ) orthopyroxene and very calcic plagioclase.

Because primitive Mid-Ocean Ridge Basalts ( MORB / MORBs ) are not saturated with orthopyroxene or very calcic plagioclase this suggests that melts added to the crust have variable compositions and that some may be in major but not trace element equilibrium with shallow depleted mantle.

These apparently conflicting data, are most readily explained if some of the melt – extracted from the mantle – is fully aggregated within the mantle but reacts with the shallow mantle during melt extraction.

The occurrence of cumulates, with these characteristics, suggests that ‘melts added to the crust’ do not all get mixed with normal MORB [ Mid-Ocean Ridge Basalts ] in the Axial Magma Chamber ( AMC ), but rather that ‘some melts partially crystallize’ in isolation within the lower crust.

However, evidence that primitive melts fed the AMC [ Axial Magma Chamber ], along with steep fabrics in shallow gabbros ( from near the base of the dyke complex ), provides support for models in which crystallization within the AMC followed by crystal subsidence is also an important process in lower crustal accretion.

More evolved bulk compositions of gabbros ( from the upper than lower parts of the plutonic section ), are due to greater amounts of ‘reaction with interstitial melt’ and not because their parental melt had become highly fractionated through the formation of large volumes of cumulates deeper in the crust.

Amphibole-plagioclase thermometry confirms previous reports that the initial ingress of fluid occurs at high temperatures in the shallow gabbros ( Tave 713° C ) and shows that the temperature of amphibole formation was similar in the deeper gabbros ( Tave 722° C ).

This thermometry also suggests that fracture and grain boundary permeability for seawater derived fluids was open over the same temperature interval.

Reference

http://adsabs.harvard.edu/abs/2002GGG….3kQ…1C

– – – –

E.

[ NOTE: analysis of new major data and trace element data from minerals … ]

Source: NASA Astrophysics Data System ( NADS ) [ Harvard University, access: http://adsabs.harvard.edu/cgi-bin/nph-abs_connect ]

Publication Date: December 2010

Title: The `Daly Gap’ and implications for magma differentiation in composite shield volcanoes: A case study from Akaroa Volcano, New Zealand

Authors: Hartung, E.; Kennedy, B.; Deering, C. D.; Trent, A.; Gane, J.; Turnbull, R. E.; Brown, S.

Affiliation:

AA ( Geological Sciences, University of Canterbury, Christchurch, New Zealand; eha63@uclive.ac.nz ); AB ( Geological Sciences, University of Canterbury, Christchurch, New Zealand; ben.kennedy@canterbury.ac.nz );

AC ( Earth and Space Sciences, University of Washington, Seattle, WA, USA; cdeering@u.washington.edu );

AD ( Geological Sciences, University of Canterbury, Christchurch, New Zealand; ajt121@pg.canterbury.ac.nz ); AE ( Geological Sciences, University of Canterbury, Christchurch, New Zealand; jtg29@uclive.ac.nz ); AF ( Geological Sciences, University of Canterbury, Christchurch, New Zealand; ret26@student.canterbury.ac.nz ); AG ( Geological Sciences, University of Canterbury, Christchurch, New Zealand; stephen.brown@canterbury.ac.nz ).

Publication: American Geophysical Union, Fall Meeting 2010, abstract #V52B-02

Origin: AGU

Keywords: [ 3610 ] MINERALOGY AND PETROLOGY / Geochemical modeling, [ 3618 ] MINERALOGY AND PETROLOGY / Magma chamber processes, [ 3620 ] MINERALOGY AND PETROLOGY / Mineral and crystal chemistry, [ 3640 ] MINERALOGY AND PETROLOGY / Igneous petrology

Bibliographic Code: 2010AGUFM.V52B..02H

Abstract

The origin of compositional gaps in volcanic deposits that are found worldwide, and in a range of different tectonic settings, has challenged petrologists since Daly’s first observations at mid-ocean ridges.

In the shield-forming Akaroa Volcano ( 9.6 – 8.6 Ma ) of Banks Peninsula ( New Zealand ), a dramatic compositional gap exists in both eruptive and co-genetic intrusive products between basalt and trachyte, and between gabbro and syenite respectively.

Rock compositions display mildly alkaline affinities ranging from picritic basalt, olivine alkali basalt, and hawaiite, to trachyte.

Intermediate mugearite and benmoreite ( 50 – 60 wt. % SiO2 ) are not exposed or absent.

Equivalent plutonic diorite, monzodiorite, and monzonite ( 45 – 65 wt. % SiO2 ) are also absent.

Previously, the formation of the more evolved trachyte and syenite has been ascribed to ‘crustal melting’; however, our analysis of new major and trace element data from bulk-rocks and minerals – of this hy-normative intraplate alkalic suite – provides evidence, based on crystal fractionation and punctuated melt extraction, for an alternative model.

In bulk rocks observed major and trace element fractionation trends can be reproduced by Rayleigh fractional crystallization from dry melts ( < 0.5 wt. % H2O ) at oxygen fugacities of one [ 1 ] unit below the Quartz [ Qtz ] – Fayalite [ Fa ] – Magnetite [ Mag ] buffer ( QFM – 1 ).

The results of our MELTS models are in agreement with experimental studies, and indicate a fractionation generated compositional gap where trachytic liquid ( 62 – 64 wt. % SiO2 ) has been extracted after the melt has reached a crystallinity of 65% – 70 %.

The fractionated assemblage, of:

– clinopyroxene [ depletes High Field Strength Element ( HFSE ) Niobium ( Nb ) ]; – olivine [ depletes High Field Strength Element ( HFSE ) Niobium ( Nb ) ]; – plagioclase; – magnetite; and, – apatite.

All [ of the aforementioned ] are left in a mafic cumulate residue ( 44 – 46 wt. % SiO2 ).

Calculated values of specific trace and minor elements ( Sr [ Strontium ], Cr [ Chromium ], P [ Phosphorus ] ) from a theoretical cumulate are consistent with measured concentrations from cumulate xenoliths. Compositional trends from individual mineral analyses are also supportive of fractional crystallization, but illustrate a disrupted ‘liquid line of descent’ for each mineral phase.

Olivine [ depletes High Field Strength Element ( HFSE ) Niobium ( Nb ) ] compositions progressively decrease in Mg [ Magnesium ] concentration ( Fo83-42 ) in basaltic [ magma ] melts and show high Fe [ Iron ] concentration in trachytic melts ( Fo5-10 ).

Clinopyroxene [ depletes High Field Strength Element ( HFSE ) Niobium ( Nb ) ] analyses also display higher Fe [ Iron ] / Mg [ Magnesium ] ratios in more evolved rocks.

Ternary feldspar [ depletes High Field Strength Element ( HFSE ) Niobium ( Nb ) ] compositions shift from plagioclase ( An84-56 ) in basalt to alkali feldspar ( Or8-65 Ab53-33 An39-2 ) [ depletes High Field Strength Element ( HFSE ) Niobium ( Nb ) ] in trachyte, but also lack the intermediate compositions.

On the other hand, analyses of mafic cumulate xenoliths reflect more evolved mineral compositions – towards the rim – than their volcanic equivalents, and complete the observed fractionation trends.

In summary, our results indicate that these compositional gaps formed from punctuated melt extraction within an optimal crystal fraction window ( 60% – 70 % crystallinity ).
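
A quick mass-balance check of the figures above: if a trachytic liquid of 62 – 64 wt. % SiO2 is extracted at 65% – 70% crystallinity and leaves a 44 – 46 wt. % SiO2 cumulate behind, the implied parent should be basaltic. The sketch below does that arithmetic as a consistency check only; it is not part of the cited study.

```python
# Mass balance: parent = f_liq * C_liquid + ( 1 - f_liq ) * C_cumulate,
# where f_liq is the melt fraction remaining when the trachytic liquid is extracted.

def parent_sio2(f_liq: float, liquid_sio2: float, cumulate_sio2: float) -> float:
    return f_liq * liquid_sio2 + (1.0 - f_liq) * cumulate_sio2

# 65% - 70% crystallinity leaves 30% - 35% liquid; SiO2 figures from the abstract above.
for f in (0.30, 0.35):
    print(f"liquid fraction {f:.2f}: parent ~{parent_sio2(f, 63.0, 45.0):.1f} wt.% SiO2")
# Roughly 50 - 51 wt. % SiO2, i.e. a basaltic parent - consistent with the basalt-trachyte suite.
```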

Reference

http://adsabs.harvard.edu/abs/2010AGUFM.V52B..02H

– – – –

F.

Source: NASA Astrophysics Data System ( NADS ) [ Harvard University, access: http://adsabs.harvard.edu/cgi-bin/nph-abs_connect ]

Publication Date: October 1996

Title: The Aurora volcanic field, California-Nevada: oxygen fugacity constraints on the development of andesitic magma

Authors: Lange, R. A.; Carmichael, Ian S. E.

Affiliation:

AA ( Department of Geological Sciences, University of Michigan, Ann Arbor, MI 48109, USA ); AB ( Department of Geology and Geophysics, University of California, Berkeley, CA 94720, USA ).

Publication: Contributions to Mineralogy and Petrology, Volume 125, Issue 2/3, pp. 167-185 (1996). (CoMP Homepage)   Origin: SPRINGER

DOI: 10.1007/s004100050214

Bibliographic Code: 1996CoMP..125..167L

Abstract

The Aurora volcanic field, located along the northeastern margin of Mono Lake in the Western Great Basin, has erupted a diverse suite of high-K and shoshonitic lava types, with 48 to 76 wt. % SiO2, over the last 3.6 million years.

There is no correlation between the age and composition of the lavas.

Three-quarters of the volcanic field consists of evolved ( < 4 wt. % MgO ) basaltic andesite and andesite lava cones and flows, the majority of which contain sparse, euhedral phenocrysts that are normally zoned; there is no evidence of mixed, hybrid magmas.

The average eruption rate over this time period was ˜200 m3/km2/year, which is typical of continental arcs and an order of magnitude lower than that for the slow-spreading Mid-Atlantic Ridge.

All of the Aurora lavas display a trace-element signature common to subduction-related magmas, as exemplified by Ba [ Barium ] / Nb [ Niobium ] ratios between 52 and 151.

Pre-eruptive water contents ranged from 1.5 wt. % in plagioclase – rich two-pyroxene andesites to ˜6 wt. % in a single hornblende lamprophyre and several biotite-hornblende andesites.

Calculated oxygen fugacities fall between -0.4 and +2.4 log units of the Ni-NiO [ Nickel – Nickel Oxide ] buffer.

The Aurora potassic suite, follows a classic calc-alkaline trend in a plot of FeOT / MgO vs. SiO2 and displays linear decreasing trends in FeOT and TiO2 with SiO2 content, suggesting a prominent role for Fe [ Iron ] – Ti [ Titanium ] oxides during differentiation.

However, development of the calc-alkaline trend – through fractional crystallization of titanomagnetite – would have caused the residual liquid to become so depleted in ferric iron that its oxygen fugacity would have fallen several log units below that of the Ni [ Nickel ] – NiO [ Nickel – Oxygen ] buffer.

Nor can fractionation of hornblende be invoked, since it has the same effect – as titanomagnetite – in depleting the residual liquid in ferric iron, together with a thermal stability limit that is lower than the eruption temperatures of several andesites ( ~1040 – 1080° C ; derived from two-pyroxene thermometry ).

Unless some progressive oxidation process occurs, fractionation of titanomagnetite – or hornblende – cannot explain a calc-alkaline trend in which all erupted lavas have oxygen fugacites ≥ the Ni-NiO [ Nickel – Nickel-Oxygen ] buffer.

In contrast to fractional crystallization, closed system equilibrium crystallization will produce residual liquids with an oxygen fugacity that is similar to that of the initial melt.

However, the eruption of nearly aphyric lavas argues against tapping from a magma chamber during equilibrium crystallization, a process that requires crystals to remain in contact with the liquid.

A preferred model involves the accumulation of basaltic magmas at the mantle crust interface, which solidify and are later remelted during repeated intrusion of basalt.

As an end-member case, closed-system equilibrium crystallization of a basalt, followed by equilibrium partial melting of the gabbro, will produce a calc-alkaline evolved liquid ( namely, high SiO2 [ Silicon Oxide ] and low FeOT / MgO ) with a relative fO2 ( corrected for the effect of changing temperature ) that is similar to that of the initial basalt.

Differentiation of the Aurora magmas by repeated partial melting of previous underplates in the lower crust, rather than by crystal fractionation in large stable magma chambers, is consistent with the low eruption rate at the Aurora volcanic field.

Reference

http://adsabs.harvard.edu/abs/1996CoMP..125..167L

– – – –

While the aforementioned information material concludes the introductory portion of Part 1 in this preliminary report, more easily understood material is intended to be documented in Part 2 through Part 5 ( previewed below ):

Part 2, of this Preliminary Report ( Part 1 of 5 ), is intended to focus on new ‘methods of capturing’ natural essence High Field Strength Elements ( HFSE );

Part 3, of this Preliminary Report ( Part 1 ), is intended to focus on new ‘discoveries of properties’ from captured natural High Field Strength Elements ( HFSE );

Part 4, of this Preliminary Report ( Part 1 ), is intended to focus on new ‘applied theories’ of natural High Field Strength Elements ( HFSE ); and,

Part 5, of this Preliminary Report ( Part 1 ), is intended to focus on new ‘transports’ having captured natural High Field Strength Elements ( HFSE ).

 

Submitted for review and commentary by,

 

Concept Activity Research Vault ( CARV ), Host
E-MAIL: ConceptActivityResearchVault@Gmail.Com
WWW: http://ConceptActivityResearchVault.WordPress.Com

/

/

NASA Elenin Coming

NASA Elenin Coming

 

[ PHOTO ( above ): April 15, 2011 Comet Elenin position and trajectory Earth dates ( click to enlarge ) ]

NASA Elenin Coming by, Concept Activity Research Vault

May 17, 2011 16:42:08 ( PST ) Updated ( May 16, 2011 11:20 )

CALIFORNIA, Los Angeles – May 17, 2011 – After much controversy over a celestial body referred to as “Elenin,” said to be entering our solar system, NASA admits there is a ‘comet’ it calls “Elenin” that ‘is’ in-fact ‘coming very soon’ toward Earth, which will simply go around our Sun and slingshot off into outerspace; however, there seem to be a few missing pieces to the NASA public report.

The first ( 1st ) problem is the ‘planetary or solar body slingshot fly-by effect’, which NASA spacecraft have used for decades to ‘increase speed’ and consequently propel satellites toward destinations – ‘faster than we can engineer a rocket motor to do’ – using various forms of ‘staring plane mosaic’ technology incorporated into ‘electronic telemetry guidance systems’. Comet Elenin will also be experiencing the ‘slingshot fly-by effect’, but with ‘no electronic telemetry guidance systems’ to guide it in any particular direction.
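
For readers unfamiliar with the slingshot ( gravity-assist ) fly-by effect mentioned above: in the fly-by body’s own reference frame the encounter only rotates the incoming velocity, but in the Sun’s frame the passing object can gain or lose heliocentric speed. The sketch below shows that bookkeeping with made-up numbers; it is a generic two-dimensional illustration, not a model of any NASA mission or of Comet Elenin’s actual trajectory.

```python
import numpy as np

def flyby(v_in_helio, v_planet, turn_angle_rad):
    """
    Idealized 2-D gravity assist: in the planet's frame the speed is unchanged and the
    velocity vector is merely rotated by the turn angle; transforming back to the Sun's
    frame changes the heliocentric speed. All inputs here are illustrative only.
    """
    v_in, v_p = np.asarray(v_in_helio, float), np.asarray(v_planet, float)
    c, s = np.cos(turn_angle_rad), np.sin(turn_angle_rad)
    rotation = np.array([[c, -s], [s, c]])
    return rotation @ (v_in - v_p) + v_p

v_in   = np.array([30.0, 0.0])   # km/s, heliocentric ( made-up )
planet = np.array([0.0, 13.0])   # km/s, planet's orbital velocity ( made-up )
v_out  = flyby(v_in, planet, np.radians(60))
print(f"{np.linalg.norm(v_in):.1f} -> {np.linalg.norm(v_out):.1f} km/s heliocentric")
```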

The second ( 2nd ) problem is that NASA’s Jet Propulsion Laboratory ( JPL ) Near Earth Object ( NEO ) Program Office solar system plotting system ( see below ) pictures Elenin ‘on a direct collision course’ with “Mercury” around May 21, 2011, yet NASA reports ‘nothing publicly’ about Elenin colliding with, hitting or glancing off-of “Mercury” so, you be the judge:

[ IMAGE ( above ): May 16, 2011 Elenin solar system slingshot pathway  ( NOTE: distance indicated from Earth ) – View #1 ( click to enlarge ) ]

[ IMAGE ( above ): May 16, 2011 Elenin solar system slingshot path to Mercury ( NOTE: distance indicated from Earth ) – View #2 ( click to enlarge ) ]

[ IMAGE ( above ): May 21, 2011 – Elenin solar system slingshot Mercury collision path ( NOTE: distance indicated from Earth ) – View #3 ( click to enlarge ) ]

NASA news ( below ) and NASA plots for May 16, 2011 do not appear to match NASA actual plots for May 21, 2011 ( above ) when looking at “Mercury.”

– –

Source: NASA Space.Com

Comet Elenin Trajectory 22,000,000 Miles Close To Earth

Don’t Fear Comet Headed Our Way — It’s A Wimp

May 10, 2011 6:37:23 PM ET Updated ( May 10, 2011 22:37:23 )

A comet first discovered just 6-months ago will be making a visit to the inner solar system soon, but don’t expect to be completely dazzled. This comet is a bit of a wimp, NASA says.

Comet Elenin ( also known, by its astronomical name: C/2010 X1 ), was first detected on December 10, 2010 in Lyubertsy, Russia by observer Leonid Elenin who found the comet while using the remote controlled ISON Observatory near Mayhill, New Mexico, USA.

At the time of its discovery, the comet was about 401 million miles ( 401,000,000 miles ) from Earth.

Over the past 4-1/2 months, the comet has closed the distance to Earth’s vicinity as it makes its way closer to perihelion ( its closest point to the Sun ).

As of May 4, 2011 the Comet Elenin distance is about 170 million miles ( 170,000,000 miles ).

“That is what happens with these long-period comets that come in from way outside our planetary system,” said NASA Jet Propulsion Laboratory ( JPL, located in Pasadena, California ) Near-Earth Object ( NEO ) Program Office manager Don Yeomans in a statement. “They make these long majestic speedy arcs through our solar system and sometimes put on a great show, but not Elenin; right now that comet looks kind of wimpy.”

The comet doesn’t offer much of a view and is quite dim to behold.

“We’re talking about how a comet looks, as it safely flies past us,” said Yeomans. “Some cometary visitors, arriving from beyond the planetary region like the Hale-Bopp comet in 1997, have really lit-up the night sky – where you can see them easily with the naked eye – as they safely transit the inner solar system. But Elenin is trending toward the other end of the spectrum, you’ll probably need a good pair of binoculars, clear skies and a dark secluded location to see it – even on its brightest night.”

Comet Elenin, An Icy Run

Comet Elenin should be at its brightest shortly before the time of its closest approach to Earth on October 16, 2011, when its closest point will be about 22 million miles ( 22,000,000 miles ) from Earth.

Even at such a distance, the Elenin comet will ‘not’ be able to shift tides or tectonic plates here on Earth – as some Internet rumors have suggested.

Some have even wondered if the comet could possibly be pushed closer to Earth than usual.

“Comet Elenin will not encounter any ‘dark bodies’ that could perturb its orbit, nor will it influence Earth in any way,” said Yeomans. “It will get no closer to Earth than 35 million kilometers.”

Not only is the Elenin comet far away but it is also on the ‘small side’ for comets, said Yeomans.

“So you’ve got a modest sized icy dirtball getting no closer than 35 million kilometers,” said Yeomans.

“It [ Elenin, the comet ] will have an immeasurably minuscule influence on our planet. By comparison, my sub-compact automobile exerts a greater influence on the ocean tide than will comet Elenin ever.”
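
Yeomans’ car-versus-comet remark above is easy to sanity-check with Newtonian gravity. The comet’s mass is not given in the NASA piece, so the roughly 10^13 kg figure below is an assumption for a few-kilometer ‘icy dirtball’; the car mass and distance are likewise illustrative.

```python
G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def grav_accel(mass_kg: float, distance_m: float) -> float:
    """Newtonian gravitational acceleration exerted by a mass at a given distance."""
    return G * mass_kg / distance_m ** 2

comet = grav_accel(1e13, 35e9)   # ~1e13 kg assumed for a few-km comet, at 35 million km
car   = grav_accel(1.5e3, 20.0)  # a 1,500 kg car parked 20 m away

print(f"comet: {comet:.1e} m/s^2, car: {car:.1e} m/s^2, ratio car/comet ~ {car / comet:.0e}")
# The nearby car 'wins' by more than eight orders of magnitude, and the differential ( tidal )
# effect falls off even faster ( 1/d^3 ), which only strengthens the comparison.
```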

This Fall – Cosmic Comet Show

But just because the Elenin comet ‘will not change much’ here on Earth, ‘does not mean skywatchers should not pay attention’.

“This comet [ Elenin ] may not put on a great show, just as certainly it will not cause any disruptions here on Earth, but there is a cause to marvel,” said Yeomans. “This intrepid little traveler [ Elenin, the comet ] will offer astronomers a chance to study a ‘relatively young comet’ that ‘came here from well beyond our solar system planetary region’. After a short while, it [ Elenin ] will be headed back out again, and we will not see or hear from Elenin for thousands of years. That’s pretty cool.”

NASA detects, tracks and characterizes asteroids and comets – passing relatively close to Earth – using both ground-based and space-based telescopes, and its Near Earth Object ( NEO ) Observations Program ( called: SpaceGuard ) discovers these objects, characterizes a subset of them and predicts their paths to determine if any could be potentially hazardous to Earth.

References

http://ssd.jpl.nasa.gov/sbdb.cgi?sstr=Elenin;orb=1;cov=1;log=0;cad=1#cad
http://ssd.jpl.nasa.gov/sbdb.cgi#top
http://ssd.jpl.nasa.gov
http://www.jpl.nasa.gov/news/news.cfm?release=2011-138
http://www.jpl.nasa.gov/asteroidwatch/newsfeatures.cfm?release=2011-129
http://www.jpl.nasa.gov/news/news.cfm?release=2010-144
http://www.space.com/11617-comet-elenin-wimpy-solar-system.html

– –

How big is comet Elenin? What does comet Elenin look like real close? Well, that just seems to be the third ( 3rd ) and fourth ( 4th ) problem because NASA is ‘not reporting anything about those two ( 2 ) additional items’ either.

The only public cross-section comparative information analysis on comet Elenin was found in a privately produced video by a ‘good ole boy’ whose factual references appear fairly represented ( below ) – except that he ‘overlooked his own mis-typed input’ of “1,700,000,000,000” ( entered without ‘commas’ – as seen ‘here’ ), a figure 10,000 times larger than the “170,000,000” ( 170 million ) miles he ‘thought’ he was typing-in during his diligent attempt at a calculation demonstration:

Submitted for review and commentary by,

Concept Activity Research Vault E-MAIL: ConceptActivityResearchVault@Gmail.Com WWW: http://ConceptActivityResearchVault.WordPress.Com

 

X-CIA Files Archives 2

Xcia Files main page image

[ NOTE: Legacy ( circa 1998 – 2003 ) X-CIA FILES website reports, briefs and images ( below ) ]

WARNING & DISCLAIMER: THIS IS NOT A GOVERNMENT WEBSITE

X-CIA Files Archive 2

” Details, Usually ‘Unavailable’ Elsewhere, Are Typically ‘Available’ Here! “

ExtraTerrestrial Technologies ( ETT )

Plasma Torch Rectenna Propulsion

3D Penrose Tiling Structures

Quasi-Crystal Materials Sciences

Lenticular Tactical Aerospace Vehicles ( LTAV )

Unmanned Combat Aerial Vehicle ( UCAV ) Linear Engines

Single-Stage To Orbit ( STO ) Vehicle Propulsion

Space-Time Continuum Manipulations

INTEL ( DOCILE ) Digital IC Orbit Communication Technologies

ExtraTerrestrial Biological Entities ( EBE )

Rare Unidentified Flying Objects ( UFO )

Linear AeroSpike Engines for Unmanned Combat Aerial Vehicles ( UCAV )

ROCKETDYNE linear AeroSpike XRS-2200 ( RS-2200 ) engines utilize a design first developed in the early 1970s, incorporating Apollo space mission era hardware from J-2S engines. Although this AeroSpike engine design began almost 30-years ago, it is of strategic importance today in the F-117-E Stealth reconnaissance aircraft and will be for future aerospace vehicles now under development.

21st Century propulsion technology was derived from a combination of 1960s era hardware – developed from several decades of engine designs – plus 1990s era design / analysis tools and fiscal realities that ushered in an entirely new era of commercial and military space flight, where ‘old technology’ was found to be a primary key to developing even newer technological advancements.

The AeroSpike team located vendors who, more than 30-years ago, manufactured the original J-2S legacy engine hardware that the AeroSpike turbo-machinery is based on.

Vendors, still in existence, were contracted to provide the Program with newly built J-2S hardware.

In cases where vendors had gone out-of business, new generation vendors were identified for producing new hardware from 30-year old legacy designs.

Panels, that make up the unique AeroSpike nozzle, presented a huge design challenge.

AeroSpike nozzles have a form like a large curved ramp – unlike traditional bell shaped nozzle characteristics for most rocket engines.

AeroSpike nozzle thermal and structural loads required development of new manufacturing processes and toolings to fabricate and assemble AeroSpike nozzle hardware.

The AeroSpike team found a way to accommodate huge thermal expansions and induced mechanical forces in mechanical and brazed joints of the assemblies where appropriate materials and attachment techniques were identified so effective manufacturing processes were developed.

In early 1997, a small half-span model of a Lifting Body Vehicle [ SR-74 Scramp ( see further below ) ] equipped with an 8 thrust-cell AeroSpike nozzle engine – called the LASRE experiment – was piggy-back mounted onto a LOCKHEED SR-71 BlackBird high-altitude reconnaissance aircraft, which was then tasked to operate like a ‘flying wind test tunnel’ to determine how a Single-Stage-To-Orbit ( STO ) Reusable Launch Vehicle ( RLV ) onboard AeroSpike engine plume would affect the aerodynamics of the Lifting Body Vehicle shape at specific altitudes and speeds, initially reaching approximately 750-mph.

The interaction of the aerodynamic flow with the engine plume could create drag, and design refinements minimized that interaction.

The lifting body model, the eight [ 8 ] nozzle AeroSpike engine, and the ‘canoe’ were collectively called the “pod.” The entire pod was 41-feet in length and weighed 14,300 pounds. The experimental pod, mounted onto an Air Force LOCKHEED SR-71 BlackBird stealth reconnaissance aircraft loaner, was completed in November 1998.

Successful completion of the braze of the first flight ramp was accomplished, as well as the fabrication and assembly of the parts for the first thrusters.

With completion of those milestones, the AeroSpike engine proceeded through fabrication, test, and delivery, enabling support for the planned first flight in 1999.

The AeroSpike XRS-2200 linear thrust-vectoring engine’s eventual ‘new placement’ was set for the sub-orbital technology air/space vehicle – referred to as the X-33 VentureStar – with the Rocketdyne team anticipating delivery of their first XRS-2200 AeroSpike flight engine by September of 1999.

Also, a “combined industry and government team” at LOCKHEED-MARTIN Skunk Works ( Palmdale, California ) was developing the X-33 for its AeroSpike XRS-2200 engine flight out of Edwards Air Force Base ( California ) scheduled for December of 1999.

The Linear Aerospike XRS-2200 ( RS-2200 ) engine was developed by the ROCKETDYNE PROPULSION AND POWER UNIT of the BOEING COMPANY, which indicated it completed the engine in early 2000, although the F-117A Stealth fighter was already secretly flying ‘long before’ their official public information release.

The difference between the linear AeroSpike engine and conventional rocket engines is the shape of the nozzle – unlike conventional rocket engines, which use a bell shaped nozzle to constrict expanding gases, the Aerospike nozzle is V-shaped and called a “ramp”.

Electro-Mechanical Actuators ( EMA ) are used in propellant valving for these engines, and their performance is being evaluated. EMA is seen as a technology of choice for new rocket engines that would be developed under the Space Launch Initiative ( SLI ).

The XRS-2200 gas generator operated successfully in the flow-rate range of proposed X-33 operating conditions. The gas generator essentially was a J2 gas generator modified for the higher chamber pressure and flow-rate required for the XRS-2200.

The gas generator must be able to operate in conditions significantly higher than normal J2 operating conditions.

A review of the data showed the gas generator operated in these conditions and that combustor shell wall temperatures were within acceptable tolerances. Post-test inspections also found the hardware to be in good operating condition, showing marked improvement over past hardware weakening. Engineers at Marshall Space Flight Center ( MSFC ) were able to demonstrate the gas generator could be started with a softer ramp – to minimize overpressure of the combustor shell – by accurately sequencing valve timings and ramps on the XRS-2200 AeroSpike engine.
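The ‘softer ramp’ start described above essentially means limiting how quickly the valves are commanded open so combustor pressure does not spike during the start transient. A minimal, illustrative sketch of such a ramped valve command follows; the timing numbers are assumptions for demonstration, not actual XRS-2200 sequencing values:

```python
# Illustrative valve-opening ramps: a slower ( "softer" ) ramp limits how fast
# propellant flow builds up, and therefore the start-transient overpressure in
# the combustor shell.  All numbers are assumed demonstration values only.

def valve_position(t, ramp_time):
    """Commanded valve position ( 0.0 = closed, 1.0 = full open ) at time t seconds."""
    if t <= 0.0:
        return 0.0
    return min(t / ramp_time, 1.0)

for label, ramp_time in (("hard ramp ( 0.2 s )", 0.2), ("soft ramp ( 1.0 s )", 1.0)):
    profile = [round(valve_position(step * 0.1, ramp_time), 2) for step in range(12)]
    print(label, profile)
```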

Successful component tests followed a series of AeroSpike multi-cell engine tests at Marshall Space Flight Center that successfully demonstrated hydrogen-oxygen combustion at full power, emergency power, and low throttle conditions.

The pressure fed thrusters and AeroSpike nozzles were developed at the Rocketdyne Division of Boeing under a technology agreement with NASA and Lockheed-Martin, which was set to build the VentureStar X-33 transport aerospace vehicle.

The XRS-2200 AeroSpike engine shoots hot gases from multiple linearly placed chamber nozzles along the outside of the ramp surface. This unusual design allows the engine to be more efficient and effective than today’s rocket engines by ‘modulating the thrust’ to variously positioned sets of these nozzles acting in concert with vectoring – shaping the direction of engine propulsion / thrust.
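Because the source describes steering by modulating thrust between sets of nozzles rather than gimbaling a single bell, a hedged sketch of the general idea – differential throttling of two thruster banks producing a net turning moment – may help; the thrust and geometry numbers are assumptions, not XRS-2200 data:

```python
# Illustrative differential throttling: running the upper and lower thruster
# banks of a linear aerospike at different power levels yields a net turning
# moment without gimbaling the engine.  Numbers are assumed for demonstration.

BANK_MAX_THRUST = 100_000.0   # assumed thrust per bank at 100% power, newtons
MOMENT_ARM = 1.5              # assumed distance of each bank from the centerline, meters

def bank_thrusts(pitch_command):
    """pitch_command in [-1, 1]: split thrust demand unevenly between the two banks."""
    upper = BANK_MAX_THRUST * (1.0 - 0.2 * pitch_command)
    lower = BANK_MAX_THRUST * (1.0 + 0.2 * pitch_command)
    return upper, lower

upper, lower = bank_thrusts(0.5)
net_thrust = upper + lower
pitch_moment = (lower - upper) * MOMENT_ARM
print(f"net thrust = {net_thrust:,.0f} N, pitching moment = {pitch_moment:,.0f} N*m")
```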

Hot test firings were performed on the powerpack – which included the turbo-machinery and gas generator – at the John C. Stennis Space Center, running a program duration of 45-seconds with a start to the 80% power level, a transition to mainstage operation at 100% power, and then a throttle-down to 57% power.

Test data indicated normal shutdown with no anomalies for the ROCKETDYNE AeroSpike XRS-2200 linear engine designed for use onboard the VentureStar X-33 Reusable Launch Vehicle ( RLV ), prior to delivery to the LOCKHEED-MARTIN VentureStar X-33 assembly facility ( Palmdale, California ). The X-33 was to be flown from Edwards Air Force Base ( California ) into outerspace, followed by a return touchdown at one ( 1 ) of two ( 2 ) landing sites ( i.e. Utah or Montana ).

Some VentureStar X-33 and XRS-2200 ROCKETDYNE engine project participants were:

– Gene Austin, Program Manager for NASA VentureStar X-33 at Marshall Space Flight Center;
– Cleon Lacefield, Vice-President LOCKHEED-MARTIN Space Systems ( Palmdale, California ) VentureStar X-33;
– Don Chenevert, Program Manager, NASA X-33, Aerospike Engine Testing at Stennis Space Center, MS;
– Mike McKeon, Program Manager X-33 Aerospike Engine, ROCKETDYNE Propulsion and Power Unit, BOEING ( Canoga Park, California ); and,
– Steve Bouley, Division Director, Propulsion Development, ROCKETDYNE Propulsion & Power Unit, BOEING.

Instead of hydraulics, future propulsion systems may use EMAs to control major propellant valves so gaining performance data in ‘real world testing’ has significant value.

There are six ( 6 ) EMAs – on each AeroSpike test engine – used to deliver propellants to the thruster banks and gas generators. Two ( 2 ) engines use a total of forty ( 40 ) thrusters – 20 per XRS-2200 AeroSpike engine – to achieve vehicle velocities exceeding Mach 13 +.

A total of seven ( 7 ) variations of the Advanced Linear AeroSpike Engines – built by the ROCKETDYNE DIVISION of BOEING – were to power the X-33 VentureStar RLV that was to have been built by LOCKHEED-MARTIN.

There were three ( 3 ) additional powerpack assemblies and four ( 4 ) full-up AeroSpike XRS-2200 linear engines – including two ( 2 ) flight units that existed during the remainder of the development program.

ROCKETDYNE developed the XRS-2200 Aerospike linear engine at its Canoga Park, California facility for the later cancelled ( 2001 ) VentureStar X-33 Single-Stage To Orbit ( STO ) Reusable Launch Vehicle ( RLV ) space transport program. A joint BOEING and NASA team at Stennis Space Center did the final XRS-2200 AeroSpike engine assembly.

The RS-2200 Linear Aerospike Engine was being developed for use on the LOCKHEED-MARTIN Skunk Works Reusable Launch Vehicle ( RLV ).

The Aerospike linear engine allows the smallest, lowest-cost RLV ( Reusable Launch Vehicle ) to be developed because the engine fills the vehicle base ( reducing base drag ) and is integral to the vehicle – reducing installed weight when compared to a conventional bell shaped rocket engine.

The Aerospike is somewhat the same as bell shaped rocket engines, except that its nozzle is open to the atmosphere. The open plume compensates for decreasing atmospheric pressure as the vehicle ascends – keeping engine performance very high along the entire trajectory.
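The altitude-compensation idea can be illustrated with the standard thrust relation F = mdot * v_e + ( p_e - p_a ) * A_e: a fixed bell nozzle has one exit pressure p_e, so near sea level ( where ambient pressure p_a exceeds p_e ) it pays a pressure-thrust penalty, while an open aerospike plume tends to adjust so that p_e stays near p_a. A minimal sketch follows; every number in it is an assumed, illustrative value, not X-33 data:

```python
# Illustrative look at the pressure-thrust term (p_e - p_a) * A_e from the
# standard relation  F = mdot * v_e + (p_e - p_a) * A_e.
# A bell nozzle sized for high altitude has a fixed, low exit pressure p_e,
# so near sea level ( p_a > p_e ) that term is a thrust penalty; an open
# aerospike plume tends to adjust so p_e stays near p_a, keeping the penalty
# near zero.  All numbers below are assumptions for demonstration only.

A_E = 1.0            # assumed nozzle exit area, m^2
P_E_BELL = 20_000.0  # assumed exit pressure of a vacuum-optimized bell, Pa

for altitude_label, p_ambient in (("sea level", 101_325.0),
                                  ("~10 km", 26_500.0),
                                  ("~30 km", 1_200.0)):
    bell_pressure_term = (P_E_BELL - p_ambient) * A_E
    compensating_term = 0.0   # plume pressure roughly follows ambient pressure
    print(f"{altitude_label:>9}: bell pressure-thrust term = {bell_pressure_term:>9,.0f} N, "
          f"compensating plume term ~ {compensating_term:,.0f} N")
```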

This altitude compensating feature allows a simple, low-risk gas generator cycle to be used. Over $500 million has been invested to-date in AeroSpike engines, and full size linear engines have accumulated seventy-three [ 73 ] tests and over 4,000 seconds of operation.

Following the series of tests, XRS-2200 AeroSpike engines were removed from the test stand facility and put into storage at the Stennis Space Center – awaiting NASA instructions on engine final dispositions.

The precursor to the AURORA Transport-Lift Vehicle, the X-43, placed three ( 3 ) such X-43A aerospace vehicles inside the Dryden Flight Research facility at Edwards Air Force Base, California, where a 12-foot-long under-wing test vehicle existed for the NASA “Hyper-X” multi-year hypersonic research program [ AURORA ] to demonstrate “airframe integrated and air breathing ( AeroSpike ) engine technologies” that promise to increase payload capacity for future vehicles by consuming ambient oxygen at altitudes higher than previously possible.

This will remove the need for carrying oxygen tanks onboard to promote combustion, as traditional rockets must do now.

Two flights are planned at Mach 7 ( approximately 5,000 mph ) and one ( 1 ) flight at Mach 10 ( almost 7,200 mph ) to a top speed of Mach 13 +.

By comparison, the world’s fastest “air-breathing plane” – to date – was the LOCKHEED SR-71 Blackbird, which could fly at an ‘unclassified airspeed’ of Mach 3 + to an estimated top speed of Mach 7.
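For reference, the Mach-to-mph conversions above depend on the local speed of sound, which varies with altitude; a minimal conversion sketch, where the speed-of-sound value is an assumption chosen to roughly reproduce the article’s figures:

```python
# Illustrative Mach-number conversion.  The speed of sound depends on air
# temperature ( and therefore altitude ); roughly 715 mph is assumed here,
# which approximately reproduces the figures quoted above.
SPEED_OF_SOUND_MPH = 715.0   # assumed local speed of sound, mph

for mach in (3, 7, 10, 13):
    print(f"Mach {mach:>2} ~ {mach * SPEED_OF_SOUND_MPH:,.0f} mph")
# Mach 7 ~ 5,005 mph and Mach 10 ~ 7,150 mph, close to the article's
# "approximately 5,000 mph" and "almost 7,200 mph".
```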

Future generations of EMAs will be even more compact – than those currently in operation – paving the way for linear AeroSpike acceleration thrust-vectoring engines to be deployed onboard ‘newly designed’ Unmanned Combat Aerial Vehicles ( UCAV ).

August 8, 2001 – The NASA Second Generation Reusable Launch Vehicle Program – also known as the Space Launch Initiative ( SLI ) – is making advances in propulsion technology with this third and final successful engine hot-fire designed to test electro-mechanical actuators. Information learned from this hot-fire test series about new electro-mechanical actuator technology – which controls the flow of propellants in rocket engines – could provide key advancements for the propulsion systems of future spacecraft. The test of twin ( 2 ) Linear Aerospike XRS-2200 engines originally built for the X-33 program, was performed Monday, August 6, 2001 at the NASA Stennis Space Center, Mississippi where the engines were fired for the planned 90-seconds and reached a planned maximum power of 85%. The test was originally slated to attain full power during 100-seconds of testing. Prior to the test, engineers determined the necessary results could be achieved at reduced duration and power. Based on this determination, both planned duration and planned power were reduced. Two [ 2 ] shorter hot-fires of the AeroSpike engines were performed last month [ July 2001 ] in preparation for the final test firing on August 6, 2001.

The Second Generation Reusable Launch Vehicle ( RLV ) Program, led by the NASA Marshall Space Flight Center in Huntsville, Alabama is a technology development program designed to increase safety and reliability while reducing costs for space travel.

“Because every engine proposed by industry for a second generation vehicle has electro-mechanical actuators, we took advantage of these AeroSpike engines already on the test stand to explore this relatively new technology now – saving us valuable time later,” said Garry Lyles, Propulsion Projects Office manager of the Second Generation Reusable Launch Vehicle Program at the Marshall Center. “This data is critical toward developing the confidence required to support the use of these actuators on future launch vehicles.”

Electro-mechanical actuators electronically regulate the amount of propellant (fuel and oxidizer) flow in the engine. The new technology is a potential alternative and improvement to the older pneumatic and hydraulic fluid systems currently used by the aerospace industry to drive and control critical rocket engine valves.
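As a hedged sketch of what ‘electronically regulating propellant flow’ can look like in principle, the snippet below runs a simple rate-limited proportional position loop driving a valve toward a commanded opening; the gains, limits, and time step are assumed demonstration values, not flight hardware parameters:

```python
# Illustrative electro-mechanical actuator ( EMA ) position loop: an electric
# motor drives a propellant valve toward a commanded position, replacing
# hydraulic or pneumatic actuation.  All constants are assumed values.

KP = 4.0         # proportional gain, 1/s
MAX_RATE = 0.5   # assumed maximum valve travel rate, fraction of full-open per second
DT = 0.05        # control-loop time step, seconds

def step(position, command):
    """Advance the valve position one control cycle toward the command."""
    rate = KP * (command - position)
    rate = max(-MAX_RATE, min(MAX_RATE, rate))   # actuator rate limit
    return position + rate * DT

position = 0.0
for _ in range(60):   # 3 seconds of simulated motion
    position = step(position, command=0.8)
print(f"valve position after 3 s: {position:.2f} ( commanded 0.80 )")
```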

“This series of engine firings tested the actuator control system in what we call a ‘real condition of use’ environment,” said Dr. Donald Chenevert, electro-mechanical actuator project manager at the Stennis Center. “Firing allows us to see how the integrated system handles the extreme cold of cryogenic propellants, the stress loads of the propellants pushing through the valves, and the dynamic response to commanded flow rate changes. Additionally, we have many other unique conditions such as shock and vibration loads not found in a lab, so we capture more realistic data about the true performance of the actuators.” Engineers are performing engine post-test inspections, and early indications are that all test objectives have been met, Chenevert said.

The final data is to be fed directly into the engine systems being considered for a second-generation reusable launch vehicle, Lyles said. “Propulsion is one of the highest and most critical technology areas that we are exploring,” said Dennis Smith, manager of the Second Generation Reusable Launch Vehicle Program Office at the Marshall Center. “Our goal also is to find, improve or develop technologies such as airframes, avionics, health management systems and ground operations – all to make getting people and payloads into space safer and cheaper.”

The Rocketdyne Propulsion and Power Unit of The Boeing Company in Canoga Park, California developed the AeroSpike engine with engine test support conducted at Stennis Space Center.

RESEARCH ( Full Photo Gallery ): https://web.archive.org/web/20081020050929/http://unwantedpublicity.media.officelive.com/Gallery.aspx

– –

[ PHOTO ( Insert Here ): No photo exists ( to-date ) to provide even an approximate exemplification – for display as a header image – depicting the two ( 2 ) certain large dark triangle craft I have personally witnessed as a trained intelligence professional and observer, and have only briefly described within some of my published reports. ]

Southern California High Desert Community Town Hall Meeting On ExtraTerrestrial Events –

USA, California, Hesperia – 1997 – Local area residents of the Victorville Valley area of southern California formed a collective, having focused new attention toward the sky after a rash of ‘satellite failures’, unidentified flying object ( UFO ) sightings – including one ( 1 ) that stopped 2-way traffic for 10-minutes at night along a popular California interstate, where a very large triangle craft hovered while actually blocking out starlight – and, further south, a few extraterrestrial biological entity ( EBE ) sightings.

Local newspapers only reported that local area residents, curious after too many unexplained sightings, were holding a Town Hall Meeting open to the public ( Main Street in Hesperia, California ) to discuss and compare what they were encountering.

The town hall meeting, held in a small retail center with a fast food store, unfortunately did not discuss UFO sightings because a woman – operating an overhead transparency slide projector – presented images of ‘foreign’ ( Mexico ) area Chupacabra sightings for discussion. Frustrated by the obviously lengthy Chupacabra distraction, which was not taking local area UFO questions, most residents in attendance abandoned the town hall office meeting to stretch their legs outside, where some began talking amongst themselves about the ‘UFO and alien local encounters’ they thought the town hall meeting was supposed to be allowing open discussion of.

Noticing town hall meeting attendees pouring outside, freelance reporter Paul Collin interviewed the disenfranchised residents who left their off-topic town hall meeting inside. Eyewitnesses came with family members, some providing additional eyewitness accounts. Providing one-on-one interviews covering only first-hand reports describing details, residents were also allowed to personally sketch drawings of their UFO and alien entity encounters.

A good investigative journalist may play unknowledgeable while subtly and quite effectively assessing normal human frailties versus purposeful deceit in getting to the bottom of the truth. Easy to spot are armchair storytellers ( with plenty of time on their hands, who invariably stray from the topic to talk about what they did or do for a living ), narcissists ( rambling on about themselves while exhibiting rather odd personal quirks ), and weirdos and opportunists ( some wearing partial Star Trek or Wonder Woman costumes, alien face masks, spring-wired tinfoil antenna balls sprouting from headbands, or constantly looking in their compact mirrors to see if their make-up is still on their face correctly ); one then moves on to interview others whose purpose stems from serious concerns as resident members of the community.

Even then, trying to determine fact from fictionalized accountings is not an easy task. You look deep into these people’s faces as they convey their stories. “Did they really see what they’re claiming?” Look at their faces, closer – any micromomentary facial expressions? Also look carefully at their eyes and the direction they quickly snap ‘just before beginning to answer your question’. Look carefully at their reactions after throwing their own statement back at them, but with a purposeful small inaccuracy, to see whether they correct it, become exasperated by your having just twisted what they conveyed, or continue as though that’s what they said – but actually didn’t. Can they provide details as to what they were doing ‘just before the time of their encounter’? Do they appear to be easily disturbed emotionally, or do they offer light-hearted remarks while discussing their more serious on-topic concerns?

Most were rather ‘original’, several did not match what was mostly being reported, and some interviewees were very apologetic for not having more than just a little to report. The culmination of many reports served to quickly narrow the scope of ‘believable encounters’ from those ‘otherwise’.

Analysis of all boiled down to the following six ( 6 ) essential facts:

1. High-volume ( PUBLIC ) sightings;

2. Short-term duration ( 30-DAY ) reportings;

3. Small region ( HIGH DESERT ) locations;

4. Near ground Low Earth Orbit ( LEO ) altitudes;

5. Limited design ( UFO ) triangles; and,

6. Incident ( MAJOR ) highways.

Of all reports, only four ( 4 ) really stood out:

A. There was the Hesperia, California family in their minivan – homebound east on Main Street ( Hesperia, California ) with a clear sunset behind them, having just left a soccer game – when all occupants began to comment about what appeared outside their windshield in the low horizon distance, where a slimline triangle-shaped UFO just lingered ( for about 5-minutes ) but then suddenly ( in seconds ) snapped its location due south and shot upward, where all of a sudden – in ‘mid-sky’ – it just blinked-out before even reaching the upper darkening sky. The triangle UFO exhibited ‘no contrails’, ‘no sound barrier boom’, nothing. Just a brief low earth hover, a quick snap south, and then up and out of sight in the blink of an eye. While the kids were all excited, the parents tried calming them down – along with their own unsettled nerves – explaining it all away as only being some new Air Force jet. Deep inside, the parents knew it was ‘not any aircraft’, but only one ( 1 ) of other unexplained sightings plaguing yet other residents over the past month;

B. All alone, a middle-aged man traveling west on Main Street ( Hesperia, California ), homebound for Phelan, California, spotted in the southwest sky over the Wrightwood mountains a large triangle craft slowly moving upward. Stopping at the California state highway 395 traffic light, he looked back up out his windshield and saw nothing there anymore, but it gave him something to tell his wife when he arrived home. The wife, rolling her eyes, put dinner on the table, but interestingly was also present by his side at the town hall meeting as well. Residents wanted to know what was going on in their local community, especially after local UFO sightings appeared to begin registering in their local paper;

C. Further southwest and beyond the Wrightwood mountains – in Azusa, California – a grandmother and her live-in daughter, a nurse, both witnessed – on two ( 2 ) separate occasions while driving home slowly down their semi-rural neighborhood street at night – two ( 2 ) glowing red eyes in the head of what appeared to them to be a small 2-legged ape-like creature hunched down by the side of their road; although the creature was well ahead of their vehicle, it suddenly darted across the street at what they both claimed was a ‘frightening blur’ of a pace. The women also spotted what they believed was the same 2-legged ape-like creature with red eyes three ( 3 ) additional times, but inside the furthest corner of their backyard, where it seemed to be glaring at them both through the rear kitchen window. Immediately scared to death, both residents – by the time they thought to call police – had witnessed the creature pop-up, rather unusually, bounding over and outside their backyard fence. I had to ask if they remained in the town hall meeting for the Chupacabra discussion, and they glanced at each other and let me know that what they saw was ‘not’ a Chupacabra. I asked, “Could it have been a baby Chupacabra?” They looked at each other and then back at me, shaking their heads in the negative. Their ‘thing’ was ‘not hairy’, did ‘not have head horn spikes’, and was ‘not a color shade of grey, blue, or eggshell’. It was ‘black’, ‘short’, and when it moved – it moved ‘extremely fast’ with an ‘odd blur’ you couldn’t focus-in on. I thought to myself, “Probably darn hard to target fire onto;” and,

D. What brought the freelance reporter to the meeting in the first place was his own personal encounter in the same general area of the High Desert of southern California, where 1-week earlier at night – while 20-minutes northeast of Victorville, California, in the middle of the desert on California Interstate 15 ( I-15 ) on his way to Las Vegas, Nevada – he noticed traffic on ‘both sides of that highway’ pulling over and stopping. He figured a serious accident had occurred, pulled over, and exited his vehicle to look out into the desert along the highway, but didn’t see any vehicles there. He walked back a couple of cars and noticed a group of people talking together and asked where the accident was. He was told to look up just a little off to the east of the interstate highway to see what was stopping traffic, and there ‘it’ was – an incredibly huge black triangle shaped object just hovering without any lights on. The oddest thing about it was that it was so huge that a whole section of the night sky had no starlight; anyone could easily see starlight all around the flying object, but no starlight directly above the behemoth. I asked the group what the thing was doing and what it had been doing. They said they didn’t know what it was doing now, but that it had been exhibiting a low hum, which stopped, and it was just continuing to linger where it had been for what they estimated had been 15-minutes. I called the California Highway Patrol office and they said they were already responding to it. I heard no emergency sirens and saw no red lights. I waited another 15-minutes and nothing happened. It just lingered a few hundred feet above ground off in the desert. Not being too much braver, I decided to get back into my car, turn around, and go back home.

The following week, I located one ( 1 ) particular Blockbuster video store on Bear Valley Road in southwest Victorville, California. That particular store carried an unusually large selection of UFO documentary videos placed in a special section. I decided to watch over 100 of those videos to determine if anyone else might have seen any huge triangle UFOs. At the time ( 1997 ), there were unfortunately ‘no flying triangle videos’ I could lay my hands on.

Apparently my frequent selections attracted the attention of the store owner, John Pflughoft of MPM INVEST, who eventually approached me and politely asked why I was interested in watching so many UFO videos. I think he knew something had startled me into that habit so, I conveyed what I had seen the previous week.

I also shared my late night experiences during 1972 while assigned to the Intelligence Section Station at George Air Force Base and later Edwards Air Force Base in the High Desert. I told him about strange red, orange, and yellow ‘firelight’ coming out the tops of some of the smaller mountains scattered between George AFB and Edwards AFB out in the middle of the desert. The video proprietor asked if I knew what the ‘firelights’ were. I told him I figured it was just rocket engine testing going on inside some of those small mountains.

He asked if I had ever seen any UFOs before last week. All I had to convey was an experience in 1976 while camping at night with a couple of military buddies of mine up in the Iron Mountain range between Randsburg, California and Mojave, California; while we ‘saw nothing’, all three ( 3 ) of us ‘heard’ a very unusual ‘electronic whirring sound’ that seemed to be travelling up and down both sides of the foothills a few hundred feet from where we were trying to sleep. I told him we walked in the direction of where we had last heard the whirring sound coming from but saw nothing. We then returned to our camp, where 30-minutes later we all heard it start back up again so, we left in the middle of the night and drove 90-miles to get home. He smiled and said, “Well, I guess that until last week you’ve been pretty lucky to have remained out of the UFO experience.”

He then asked if an upcoming town hall meeting in Hesperia, California – where residents were going to discuss their own personal UFO and extraterrestrial encounters during the recent month – might interest me. I knew nothing of any other sightings so, he suggested I attend and asked if I would report back to him what I learned. I agreed and attended the meeting, but when it began being abandoned, dug out a yellow legal pad of paper and began interviewing attendees upon exit. A final report was prepared and, along with resident sketches, placed in a manila envelope, sealed up, and dropped off at the Blockbuster store for his later review.

– –

[ PHOTO ( above ): Circa 12OCT62 – Ames Langley Research Center lenticular vehicle aero-space body designs ( click on image to enlarge ) ]

As far back as October 12, 1962 Ames Langley and Dryden Flight Research Centers began feasibility studies and designs for developing a lenticular design space re-entry air vehicle with speeds capable of reaching Mach 25 + to Mach 50 +.

[ photo ( above ) TR-3B Astra – Flying Triangle ( click to enlarge ) ]

In 1995, at Nellis Air Force Base Test Range S-4 ( near Papoose Lake, Nevada ) the TR3-B ( a lenticular-shaped aerial vehicle ) was seen and reported to be between 300-feet and 500-feet in diameter.

Reportedly, the TR3B flies at speeds of Mach 15 + and reflects a bright blue grey color believed to be biological electro-chromatic 3-D Penrose tiling polymer material providing highly advanced stealth qualities.

TR3B is also believed to have carried the INTEL company Direct Orbital Communication & Intelligence Link Electronics ( DOCILE ) computer processor unit ( CPU ) system.

TR3B is believed to have derived partial funding from Strategic Defense Initiative ( SDI – Star Wars ) links with the super secret AURORA Program Office global security defense operations mission.

TR3-B is believed to use a quasi-crystalline molecular property energy containment storage core driving a plasma-fluidic propulsion system employing combinatoric development of its Magnetic Field Disruptor ( MFD ) quantum-flux transduction field generator technology.

TR3B reportedly emits cyclotron radiation, performs pulse detonation acceleration, and carries EPR quantum receivers.

TR3B craft reportedly resembles a ‘very large triangle’ ( shaped ) air vehicle.

TR3B (aka) TR3-B (aka) TIER III B craft in no way resembled the TR-3/A MANTA air vehicle.

The flight-testing of all experimental and first-model military aircraft occurred here along an ancient dry lake, now called Rogers Dry Lake, located on the western edge of Southern California’s Mojave desert – south of Highway 58, between the two ( 2 ) towns of Mojave, California and Boron, California ( where the world’s largest open-pit borax mine is ) – an area along one of the first immigrant trails through California.

The first permanent settlers of this desert region were the Corum family, who located near the large dry lake area in 1910. Local residents later tried to get the local U.S. Post Office to name the settlement “Corum, California,” however another city with a similar name – “Coram, California” – already existed so, the name “Corum” was reverse spelled as “Muroc.” This is the Mojave desert area the U.S. Army Air Corps later named “Muroc Field,” followed by the naming of the NASA Muroc Flight Test Unit ( MFTU ) in this California area.

This dry lake was extremely ideal as what would become the major site of aviation flight-test history because, at about 2,300-feet above sea level, Rogers Dry Lake fills an area of about 44-square miles ( nearly half again as large as New York’s Manhattan Island ), making it one of the largest and best natural ( flat and hard surfaced ) landing sites on Earth. The arid desert weather also promotes excellent flying conditions on almost every day of the year ( about 320-days out of the year ).

Rogers Dry Lake is the sediment filled remnant of an ancient lake formed eons ago. Several inches of water can accumulate on the lakebed when it rains, and the water in combination with the desert winds creates a natural smoothing and leveling action across the surface. When the water evaporates in the desert sun, a smooth and level surface appears across the lakebed, one far superior to that made by humans.

[ photo ( above ) DOUGLAS AIRCRAFT Black Horse Project Manta ( click to enlarge ) ]

The AURORA Program Office consisted of lenticular-shaped and wingless aerospace vehicle Projects that industry sleuths speculate secretly held a billion-dollar high-speed, high-altitude surveillance air/space vehicle that leaves a ‘contrail’ behind it resembling ‘doughnut clouds on a string’.

According to some reports, AURORA Program aerospace vehicles are capable of high-speed maneuverability allowing abrupt course change corrections within their own flight path.

The information ( below ) consists of excerpts from two ( 2 ) individuals, Robert “Bob” Lazar and Edgar Fouche, during different time periods at different locations. These relevant excerpts should serve to familiarize readers and provide interesting relationships between similarities of ETT ( ExtraTerrestrial Technologies ), reverse-engineering ( backward engineering ) of extraterrestrial spacecraft seized by the U.S. government military, and current day advanced technologies controlled by the U.S. Department of Defense ( DOD ) and Defense Advanced Research Projects Agency ( DARPA ) programs and projects worldwide.

Obvious template similarities seem to have been successfully performed in order to produce fully operational high-performance defense and observation flightcrafts for exclusively U.S. government use, examples of which may be viewed in the section “ NEWS ALERTS! “ on this website.

[ photo circa: 1995 ( above ) Area 51, Groom Lake Nevada ( click to enlarge ) ]

While Bob Lazar ( below ) provides his interviews mentioning eyewitness accounts coupled with basically seamless theories covering time and space folding with alien spacecraft ET technology, inter alia ETT [ ExtraTerrestrial Technologies ], given his previous work at the Nellis Air Force Base, Nevada Test Range Site S-4 Area 51 ( near Groom Lake, Nevada ), Edgar Fouche provides another arena of detailed information coinciding in some areas with what Bob Lazar saw in his own experiences near ExtraTerrestrial Technology ( ETT ). Edgar Fouche depicts how ETT was converted into operational-use flying craft for United States government arenas.

Skeptics may no longer speculate on what current aerospace lenticular crafts the U.S. has developed and just what is planned for the not too distant future where most of these highly classified Programs and Projects will remain cloaked for some time to come.

[ NOTE: For a look at current lenticular crafts, developed by the U.S., search this website for photos and details on UAV, UCAV, MCAV, MAV and High-Energy Weapons ( HEW ) and Directed Energy Weapon ( DEW ) research and development. ]

Culminations of technology data and relevant associated theories have never before been produced in one ( 1 ) reading area until now ( here ). At first glance the following data may seem too fictionalized, due in large part to its unfamiliarity to most; however, these technologies are very much a large part of reality in what research provides, of which only a very small percentage is presented in the multiple interviews ( below ):

Interview Excerpts of Bob Lazar ( 09DEC89 ) on KLAS TV ( Las Vegas, Nevada ), below:

Producer / Host: George Knapp Lazar: The first thing was hands-on experience with the anti-matter reactor. Knapp: Explain what that is, how it works, and what it does. Lazar: It’s a plate about 18-inches in diameter with a sphere on top. Knapp: We have a tape of a model that a friend of yours made. You can narrate along. There it is… Lazar: Inside that tower is a chip of Element 115 they just put in there. That’s a super-heavy element. The lid goes on top. And as far as any other of the workings of it, I really don’t know, you know, [ such as ] what’s inside the bottom of it ( i.e. Element 115 ), sets up a gravitational field around the top. That little waveguide, you saw being put on the top, it essentially siphons off the gravitywave – and that’s later amplified in the lower portion of the craft. But, just in general, the whole technology is virtually unknown. Knapp: Now we saw the model. We saw the pictures of it there. It looks really, really simple, almost too simple to actually do anything. Lazar: Right. Knapp: Working parts? Lazar: None detectable. Essentially what the job was, to back-engineer [ reverse engineer ] everything, where you have a finished product and to step backwards and find out how it was made or how it could be made with earthly materials. There hasn’t been very much progress. Knapp: How long do you think they’ve had this technology up there? Lazar: It seems like quite a while, but I really don’t know. Knapp: What could you do with an anti-matter generator? What does it do? Lazar: It converts anti-matter . . . It DOESN’T convert anti-matter! There’s an annihilation reaction. It’s an extremely powerful reaction, a 100% conversion of matter to energy, unlike a fission or fusion reaction which is somewhere around eight-tenths of one percent conversion of matter to energy. Knapp: How does it work? What starts the reaction going? Lazar: Really, once the 115 [ Element 115 ] is put in, the reaction is initiated. Knapp: Automatic. Lazar: Right. Knapp: I don’t understand. I mean, there’s no button to push or anything? Lazar: No, there’s no button to push or anything. Apparently, the 115 under bombardment with protons lets out an anti-matter particle. This anti-matter particle will react with any matter whatsoever, which I imagine there is some target system inside the reactor. This, in turn, releases heat, and somewhere within that system there is a one-hundred-percent-efficient thermionic generator, essentially a heat-to-electrical generator. Knapp: How is this anti-matter reactor connected to gravity generation that you were talking about earlier? Lazar: Well, that reactor serves two purposes; it provides a tremendous amount of electrical power, which is almost a by-product. The gravitational wave gets formed at the sphere, and that’s through some action of the 115, and the exact action I don’t think anyone really knows. The wave guide siphons off that gravity wave, and that’s channeled above the top of the disk to the lower part where there are three gravity amplifiers, which amplify and direct that gravity wave. Knapp: In essence creating their own gravitational field. Lazar: Their own gravitational field. Knapp: You’re fairly convinced that science on earth doesn’t have this technology right now? We have it now at S-4, I guess, but we didn’t create it? Lazar: Right. Knapp: Why not? Why couldn’t we? Lazar: The technology’s not even — We don’t even know what gravity IS! Knapp: Well, what is it? What have you learned about what gravity is? Lazar: Gravity is a wave. 
There are many different theories, wave included. It’s been theorized that gravity is also particles, gravitons, which is also incorrect. But gravity is a wave. The basic wave they can actually tap off of an element: why that is I’m not exactly sure. Knapp: So you can produce your own gravity. What does that mean? What does that allow you to do? Lazar: It allows you to do virtually anything. Gravity distorts time and space. By doing that, now you’re into a different mode of travel, where instead of traveling in a linear method — going from Point A to B — now you can distort time and space to where you essentially bring the mountain to Mohammed; you almost bring your destination to you without moving. And since you’re distorting time, all this takes place in between moments of time. It’s such a far-fetched concept! Knapp: Of course, what the UFO skeptics say is, yeah, there’s life out there elsewhere in the universe; it can never come here; it’s just too darn far. With the kind of technology you’re talking about, it makes such considerations irrelevant about distance and time and things like that. Lazar: Exactly, because when you are distorting time, there’s no longer a normal reference of time. And that’s what producing your own gravity does. Knapp: You can go forward or backward in time? Is that’s what you’re saying? Lazar: No not essentially. It would be easier with a model. On the bottom side of the disk are the three gravity generators. When they want to travel to a distant point, the disk turns on its side. The three gravity generators produce a gravitational beam. What they do is they converge the three gravity generators onto a point and use that as a focal point; and they bring them up to power and PULL that point towards the disk. The disk itself will attach ONTO that point and snap back — AS THEY RELEASE SPACE BACK TO THAT POINT! Now all this happens in the distortion of time, so time is not incrementing. So the SPEED is essentially infinite. Knapp: We’ll get into the disks in a moment. But the first time you saw the anti-matter reactor in operation or a demonstration — you had a couple of demonstrations — tell me about that. Lazar: The first time I saw it in operation, we just put — a friend I worked with, Barry — put the fuel in the reactor, put the lid on as, as was shown there. Immediately, a gravitational field developed, and he said, “Feel it!” And it felt like you bring two like poles of a magnet together; you can do that with your hand. And it was FASCINATING to do that, impossible, except on something with great mass! And obviously this is just a . . . And it was a REPULSION field. In fact, we kind of fooled around with it for a little while. And we threw golf balls off it. And it was just a really unique thing. Knapp: And you had other demonstrations to show you that this is pretty wild stuff, right? Lazar: Yeah, they did. They were able to channel the field off in a demonstration that they created an INTENSE gravitational area. And you began to see a small little black disk form, and that was the bending of the light. Knapp: Just like a black hole floating around? Lazar: Yeah, well, a black hole is a bad analogy, but yeah, essentially.
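For a sense of scale behind the conversion figures quoted in the interview above ( a claimed 100% matter-to-energy annihilation versus roughly eight-tenths of one percent for fission or fusion ), here is a minimal E = mc^2 sketch; the one-kilogram reaction mass is an arbitrary assumption chosen only for comparison:

```python
# Illustrative energy-yield comparison using E = m * c**2.
# The 1 kg reacted mass is an arbitrary assumption for scale.
C = 299_792_458.0   # speed of light, m/s
MASS = 1.0          # assumed reacted mass, kg

annihilation_energy = MASS * C**2           # the 100% conversion claimed in the interview
partial_conversion = 0.008 * MASS * C**2    # the ~0.8% figure quoted for fission / fusion

print(f"total annihilation : {annihilation_energy:.2e} J")
print(f"~0.8% conversion   : {partial_conversion:.2e} J")
# roughly 9.0e16 J versus 7.2e14 J, about a factor of 125 apart
```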

Interview ( MAR – APR 1990 ) Excerpts of Bob Lazar at KLAS TV ( Las Vegas, NV ), below:

Lazar: The craft does not create an “antigravity” field, as some have surmised. “It’s a gravitational field that’s out of phase with the current one,” Lazar explained in a 1989 radio interview. “It’s the same gravitational wave. The phases vary from 180 degrees to zero … in a longitudinal propagation.” Assuming they’re in space, they will focus the three [ 3 ] gravity generators on the point they want to go to. Now, to give an analogy: If you take a thin rubber sheet, say, lay it on a table and put thumbtacks in each corner, then take a big stone and set it on one end of the rubber sheet and say that’s your spacecraft, you pick out a point that you want to go to -which could be anywhere on the rubber sheet – pinch that point with your fingers and pull it all the way up to the craft. That’s how it focuses and pulls that point to it. When you then shut off the gravity generator[s], the stone (or spacecraft) follows that stretched rubber back to its point. There’s no linear travel through space; it actually bends space and time and follows space as it retracts. In the first mode of travel – around the surface of a planet – they essentially balance on the gravitational field that the generators put out, and they ride a “wave”, like a cork does in the ocean. In that mode they’re very unstable and are affected by the weather. In the other mode of travel – where they can travel vast distances – they can’t really do that in a strong gravitational field like Earth, because to do that, first of all, they need to tilt on their side, usually out in space, then they can focus on the point they need to with the gravity generators and move on. If you can picture space as a fabric, and the speed of light is your limit, it’ll take you so long, even at the speed of light, to get from point A to point B. You can’t exceed it – not in this universe anyway. Should there be other parallel universes, maybe the laws are different, but anyone that’s here has to abide by those rules. The fact is that gravity distorts time and space. Imagining that you’re in a spacecraft that can exert a tremendous gravitational field by itself, you could sit in any particular place, turn on the gravity generator, and actually warp space and time and “fold” it. By shutting that off, you’d click back and you’d be at a tremendous distance from where you were, but time wouldn’t have even moved, because you essentially shut it off. It’ s so farfetched. It’s difficult for people to grasp, and as stubborn as the scientific community is, they’ll never buy it that this is in fact what happens.

According to Lazar, the propulsion system he worked on at S-4 gives rise to certain peculiar effects, including INVISIBILITY of the craft: “You can be looking straight up at it, and if the gravity generators are in the proper configuration you’d just see the sky above it – you won’t see the craft there. That’s how there can be a group of people and only some people can be right under it and see it. It just depends how the field is bent. It’s also the reason why the crafts appear as if they’re making 90- degree turns at some incredible speed; it’s just the time and space distortion that you’re seeing. You’re not seeing the actual event happening.” If the crafts look like they’re flying at seven thousand miles per hour and they make a right-angled turn, it’s not necessarily what they’re doing. They can ‘appear’ that way because of the gravitational distortion. I guess a good analogy is that you’re always looking at a mirage – [ it’s only when ] the craft is shut off and sitting on the ground, ‘that is’ what it ‘looks like’. Otherwise, you’re just looking at a tremendously distorted thing, and it will appear like it is changing shape, stopping or going, and it could be flying almost like an airplane, but it would never look that way to you. Knapp: How close do you think you have to get before time distortion takes place? Lazar: It’s tough to say, because it depends on the configuration of the craft. If the craft is hovering in the air, and the gravity amplifiers are focused down to the ground and it’s standing on its gravity wave, you would have to get into that focused area. If you’re directly underneath the craft at any time there’s a tremendous time distortion, and that’s in proportion to the proximity of the craft. Lazar: I don’t know if I mentioned it before, but the amplifiers always run at 100%. They are always outputting a maximum gravity wave, and that wave is phase-shifted from zero to 180 degrees. That’s essentially the attraction and repulsion, and it’s normally at a null setting somewhere in between. It’s a very straightforward system. It looks more like a coal-fired engine than very hi-tech.

Interview ( JUN – JUL 1990 ) Excerpts of Bob Lazar at KLAS TV ( Las Vegas, NV ), below:

Lazar: …And there are two specific different types of Gravity: Gravity A and Gravity B. Gravity A works on a smaller, micro scale while Gravity B works on a larger, macro scale. We are familiar with Gravity B. It is the big gravity wave that holds the Earth, as well as the rest of the planets, in orbit around the Sun and holds the moon, as well as man-made satellites, in orbit around the Earth. We are not familiar with Gravity A. It is the small gravity wave, which is the major contributory force that holds together the mass that makes up all protons and neutrons. Gravity A is what is currently being labeled as the Strong Nuclear Force in mainstream physics, and Gravity A is the wave that you need to access and amplify to enable you to cause space-time distortion for interstellar travel. To keep them straight, just remember that Gravity A works on an atomic scale, and Gravity B is the big gravity wave that works on a stellar or planetary level. However, don’t mistake the size of these waves for their strength, because Gravity A is a much stronger force than Gravity B. You can momentarily break the Gravity B field of the Earth simply by jumping in the air, so this is not an intense gravitational field. Locating Gravity A is no problem because it is found in the nucleus of every atom of all matter here on Earth, and all matter everywhere else in our universe. However accessing Gravity A with the naturally occurring elements found on Earth is a big problem. Actually, I’m not aware of any way of accessing the Gravity A wave using any Earth element, whether naturally occurring or synthesized, and here’s why. We’ve already learned that Gravity A is the major force that holds together the mass that makes up protons and neutrons. This means the Gravity A wave we are trying to access is virtually inaccessible as it is located within matter, or at least the matter we have here on Earth. The most important attribute of these heavier stable elements is that the Gravity A wave is so abundant that it actually extends past the perimeter of the atom. These heavier, stable elements literally have their own Gravity A field around them in addition to the Gravity B field that is native to all elements. No naturally occurring atoms on Earth have enough protons and neutrons for the cumulative Gravity A wave to extend past the perimeter of the atom so you can access it. Even though the distance the Gravity A wave extends is infinitesimal, it IS accessible and has amplitude, wavelength and frequency just like any other wave in the electromagnetic spectrum. Once you can access the Gravity A wave, you can amplify it just like we amplify any other electromagnetic wave. So, back to our power source. Inside the reactor, element 115 is bombarded with a proton that plugs into the nucleus of the 115 atom and becomes element 116 which immediately decays and releases or radiates small amounts of antimatter. The antimatter is released in a vacuum into a tuned tube that keeps it from reacting with the matter that surrounds it. It is then directed toward the gaseous matter target at the end of the tube. The matter and antimatter collide and annihilate, totally converting to energy. The heat from this reaction is converted into electrical energy in a near 100% efficient thermoelectric generator. This is a device that converts heat directly into electrical energy. Many of our satellites and space probes use thermoelectric generators, but their efficiency is very, very low. 
All of these actions and reactions inside of the reactor are orchestrated perfectly like a tiny little ballet, and in this manner the reactor provides an enormous amount of power. So, back to our original question: What is the power source that provides the power required for this type of travel? The power source is a reactor that uses element 115 as a fuel, and uses a total annihilation reaction to provide the heat which it converts to energy, making it a compact, lightweight, efficient, onboard power source. I’ve got a couple of quick comments, on Element 115, for those of you that are interested. By virtue of the way it’s used – in the reactor – it depletes very slowly, and only 223 grams ( just under ½ pound ) of Element 115 can be utilized for a period of 20 to 30-years. Element 115 melting point is 1740 C. I need to state here that even though I had hands-on experience with Element 115, I didn’t melt any of it down and I didn’t use any of it for twenty to thirty years to see if it depleted. Now when a disk travels near another source of gravity, such as a planet or moon, it doesn’t use the same mode of travel that we learned about in our science lesson. When a disk is near another source of gravity, like Earth, the Gravity A wave, which propagates outward from the disk, is phase-shifted into the Gravity B wave propagating outward ( from the Earth ), creates lift. The gravity amplifiers ( of the disk ), can be focused independently, and they are pulsed and do not remain ‘on’ continuously. When all three [ 3 ] of these amplifiers are being used for travel, they are in the delta wave configuration, and when only one [ 1 ] is being used, for travel, it is in the omicron wave configuration. As the intensity of the gravitational field around the disk increases, the distortion of space-time around the disk also increases. And if you could see the space-time distortion, this is how it would look [ Bob Lazar draws a side-view picture of saucer hovering above ground, with field surrounding it and running straight down to the ground. Picture a disk on the end of a pole, then throw a sheet over it. ] As you can see, as the output of the gravitational amplifiers becomes more intense, the form of space-time around the disk not only bends upward but – at maximum distortion – actually folds over into almost a ‘heart shape design around the top’ of the disk. Now remember, this space-time distortion is taking place 360 degrees around the disk, so if you were looking at the disk from the top, the space-time distortion would be in the shape of a doughnut. When the gravitational field around the disk is so intense, that the space-time distortion around the disk achieves maximum distortion, and is folded up into this heart shaped form, the disk cannot be seen from any angle vantage point – and for all practical purposes is invisible. All you could see would be the sky surrounding it.

Interview ( 28DEC89 ) of Bob Lazar at KVEG Radio Station ( below ):

KVEG Radio Incoming Caller: With the gravity generators running, is there thermal radiation?

Lazar: No, not at all. I was never down on the bottom ‘while’ the gravity generators were running, but the reactor itself – there’s no thermal radiation whatsoever. That was one of the really shocking things because that violates the first law of thermodynamics. Lazar: In fact, I’m in the process of fabricating the gravity amplifier, but then I’m at a tremendous shortage for power. So yeah, I have even tried to do that stuff on my own. Caller: Is there any electronics, as we know it, chips or transistors? Lazar: No, nothing like that. Because, of the tremendous power involved too, there was ‘no direct connection between the gravity amplifiers and the reactor’ itself. Caller: Are the waveguides similar to what we use with microwaves? Lazar: Very similar. Caller: In regard to the long-range method of travel, isn’t a ‘propulsion unit’ the wrong idea? I feel this device is creating a situation where it is diminishing or removing the localized gravitational field, and the long-distance body – that they’re heading toward – is actually ‘pulling’ the vehicle rather than it [ the vehicle ] being pushed. Am I correct in this? Lazar: The vehicle is not being pushed. But being ‘pulled’ implies it’s being pulled by something externally; it’s pulling something else to ‘it’. ‘It’ is ‘creating the gravitational field’. Caller: Is there any relation to the ‘monopoles’, which [ scientists ] have been looking for? Lazar: Well, they’ve been looking for the ‘monopole magnet’, but then this [ the UFO force ] is a gravitational force. Caller: What is the top speed of the craft? Lazar: It’s tough to say a top speed because to say ‘speed’ you have to ‘compare distance and time’. And when you’re screwing around with time, and distorting it, you can ‘no longer judge a velocity’. They’re ‘not traveling in a linear mode’ – where they just fly and cover a certain distance in a certain time. That’s the ‘real definition of speed’. They’re ‘bending and distorting space’ and then essentially snapping it back with the craft so, the ‘distances they can travel’ are phenomenal – in ‘little or no time’. So ‘speed has little bearing’. Caller: You’ve mentioned anti-gravity generator and anti-matter generator. Are they different? Lazar: It’s ‘not a gravity generator’ – it’s a ‘gravity amplifier’. I get tongue-twisted all too often. The ‘anti-matter reactor provides the power’ for the craft and the basic ‘low-amplitude gravitational wave’ – ‘too low of amplitude’ to do anything – is ‘piped into the gravity amplifiers’ – found at the bottom of the craft – amplify that to an ‘extremely powerful wave’, and ‘that is what the craft travels along’. But there is ‘an anti-matter reactor’ that ‘provides the power’. Caller: I understand there’s an antenna section in this device; what is the resonant frequency that that operates at? Lazar: The resonant frequency of the gravity wave I ‘do know’ but I don’t know it off hand – I just cannot recall it just now. Mark: Can you give me a ballpark, like 2,000 kilohertz? Lazar: I really don’t remember. It’s a really odd frequency. Mark: Is it measured in kilohertz or gigahertz or megahertz? Lazar: I really don’t remember. Burt: You were talking about the low- and high-speed modes and the control factors in there. Can you describe those modes and what the ship looks like each time it is going through those modes? 
Lazar: The low-speed mode — and I REALLY wish I could remember what they call these, but I can’t, as I can’t remember the frequency of the wave –The low-speed mode: The craft is very vulnerable; it bobs around. And it’s sitting on a weak gravitational field, sitting on three gravity waves. And it just bounces around. And it can focus the waves behind it and keep falling forward and hobble around at low speed. The second mode: They increase the amplitude of the field, and the craft begins to lift, and it performs a ROLL maneuver: it begins to turn, roll, begins to turn over. As it begins to leave the earth’s gravitational field, they point the bottom of the craft at the DESTINATION. This is the second mode of travel, where they converge the three gravity amplifiers — FOCUS them — on a point that they want to go to. Then they bring them up to full power, and this is where the tremendous time-space distortion takes place, and that whips them right to that point. Burt: Did you actually bench-test a unit away from the craft itself? Lazar: The reactor, yeah. Burt: About how large is this, and could you describe it? Lazar: The device itself is probably a plate about 18-inches square; I said diameter before but it is square. There’s a half-sphere on top where the gravity wave is tapped off of, but that’s about the size of it. Caller Jim ( Las Vegas, NV ): On TV [ television ], you spoke of observing a demonstration of this anti-matter gravity wave controller device. And you made a mock-up copy? Lazar: A friend made one, yeah. Jim: I heard you speak of bouncing golf balls off of this anti-gravity field? Lazar: Yeah. Jim: And also about the candle, the wax, and the flame stood still? Lazar: Right. Jim: And then the hole that you saw appear – Lazar: It wasn’t a hole; it was a little disk. Jim: Under what conditions did you see this demonstrated? Elaborate on this. And how large was the force field? Lazar: The force field where the candle was? Jim: The force field created by the anti-matter device. Lazar: It was about a 20-inch radius from the surface of the sphere. Jim: Where was this area, just above the device? Lazar: Yeah, surrounding the sphere. Jim: Did the sphere surround the device? Lazar: No, the sphere sits in the center of the device. It’s a half-sphere sitting on a plate, and a field surrounds the half-sphere. Jim: And you just place a candle in there? Lazar: No, no, no. That was a separate demonstration. I’m just telling you from where the field extends. Jim: Oh, that’s what I’m curious about. Lazar: No, they tap the field off using a wave-guide, off of the sphere. And this is a completely different setup, where they had a mockup small gravity amplifier, and there were three focused into a point, and that area of focus was probably nine or ten inches in diameter. Jim: They displaced this area or moved this area? Lazar: No, it wasn’t displaced; it’s just where the field was generated. Jim: And in there you put the candle? Lazar: Right. Jim: And that thing can actually bounce golf balls off of it? Lazar: No, no. The golf ball thing, again, had nothing to do with that setup. The golf ball thing had something to do with just when the reactor was energized, before the wave-guide was put on or anything. We were just pushing on the field; it was being demonstrated to me; and we just bounced a golf ball off the top.

Interview Excerpts of Bob Lazar at Seminar; Rachel, Nevada ( 01MAY1993 ), below:

Question: I’m interested in a little bit more about the physics of the power generation – from the development of the anti-matter to the Gravity “A” wave, the amplification, the process of generation of that, and being able to fold space.
Lazar: Well, it’s… I can give you, I guess, a brief overview of essentially how that works. If you want an in-depth description, you can give me your address and I can send you a paper on it. Essentially, what the reactor does is provide electrical power and the base gravity wave to amplify, and it does that by interacting matter and antimatter, essentially. The way it does that is by injecting an accelerated proton into a piece of [ element ] 115. That spontaneously generates anti-hydrogen, essentially. That’s reacted in a small area. It’s a compressed gas, probably compressed atmospheric gas, and the antimatter reacting with matter produces the energy, mainly heat energy, and that is converted into electrical energy by a thermionic [ thermal ion / thermion ] generator that appeared to be 100% efficient, which is a difficult concept to believe anyway. Also, the reactor has two functions. That’s one of them; the other function is, it provides the basic gravity wave that’s amplified, and that appears at the upper sphere of the amplifier itself, and that’s tapped off with a wave-guide, similar to microwaves, and is amplified and focused, essentially.
Question: So how is the electrical energy related to the amplification of the gravitational “A” wave energy?
Lazar: The electrical energy is transmitted essentially without wires, and I related it to almost a Tesla setup. It seemed like each sub-component on the craft was attuned to the frequency that the reactor was operating at, so essentially the amplifiers themselves received the electrical energy, like a Tesla coil transmits power to a fluorescent tube – and what was the rest of the question?
Question: Yeah, in other words, what is the relationship between… I think you basically answered it.
Lazar: Yeah, that’s how the amplifiers receive the power, and through the wave-guide they receive the basic wave. It’s almost… It’s very, very similar to a microwave amplifier…
Question: Was the local means of propulsion the same as these across-space distances? What was the local means of propulsion?
Lazar: The local means of propulsion is essentially them balancing on an out-of-phase gravity wave, and it’s not as stable as you would think. When the craft took off, it wobbled to some degree. I mean, a modern day Hawker Harrier, or something along those lines of vertical takeoff craft, is much more stable than it is in the omicron configuration, which is that mode of travel. The delta configuration is where they use the three amplifiers. Those are the only two methods I know about for moving the craft.
Question: When you listen to some abduction reports, whether or not people believe it or not, there seems to be a common thread of people being hit by blue beams of light…
Lazar: Any of the three gravity amplifiers could do that – could lift something off the ground, or for that matter compact it into the ground. That’s not a problem, because the craft can operate on one amplifier, in omicron mode, hovering. That would leave the other [ two ] amplifiers free to do anything. So I imagine they could pick up cows or whatever else they want to do.
On the craft I worked on there was absolutely no provision for anything to come in through the bottom of the craft, or anything along those lines…
Question: So what was the course of energy? How did it go from one area to another area?
Lazar: The best guess is essentially it operated like a Tesla coil does: a transmitter and, essentially, a receiver tuned to the transmitting frequency receives electrical power. There again, that’s not real advanced technology. Tesla did that in the 30s, I think.
Question: You mentioned the photon earlier. Do you think that physics is taking a wrong turn by looking for exchange particles, when you’re talking about the strong force of gravity again? I’m not clear why you’re skeptical about the graviton.
Lazar: About the graviton?
Question: Every other force seems to have exchange particles connected with it.
Lazar: No, not necessarily. I mean, they make it have one, but as time goes on, that really hasn’t held true. The bottom line is, they don’t… First of all, they don’t even believe there’s a graviton anymore, so I’m not the only one. As far as exchange particles go, some of them, like the zeta particle, maybe that’s an actual thing, but when they’re looking at transfers of energy, I think these are scapegoats for the most part. A lot of experiments that I was doing at Los Alamos were essentially along these same lines, but other exchange particles, like the intermediate vector boson – I don’t believe that thing exists. I really don’t. I think they’re grabbing at straws and just coming up with excuses.
Question: What about the small gravity, the Gravity “A”; how can you detect that one? What is the frequency of that?
Lazar: Well, the frequency that the actual reactor operates at is like 7.46 Hertz. It’s a very low frequency.
Question: That’s the frequency of Earth’s gravity, or universally, all gravity?
Lazar: That’s the frequency the reactor operates at.
Question: I can understand a reactor functioning – theoretically I can understand a reactor functioning at, say, ( unintelligible word ) 7.46 Hertz. There’s a wave-guide involved. I don’t buy 7.46…
Lazar: No, that’s the basic… The frequency of the gravity wave that’s produced has to be a higher frequency, because you’re in a microwave range to follow a conduit like that.
Question: I understand from Lear’s lecture that it had a tendency to conduct on the outside of the reactor also.
Lazar: Right. Well, that’s all… this was the electric field we were talking about. The basic frequency, I think, was the way the reactor was operating. The pulses that we detected out of it were probably, instead of a straight DC power supply, more along the lines of a pulse, as if we were getting a burst of particles coming out: an antimatter emission, then a reaction, a pulse of energy, and that would repeat. That’s about seven and a half Hertz, something along those lines.
Question: Bob, the microwave frequency going to the wave-guide is electromagnetic, or that’s gravitational?
Lazar: They’re one and the same.
Question: I don’t understand what you mean by that.
Lazar: Gravity is… Unfortunately, physics hasn’t gotten to that part yet, but gravity essentially is part of the electromagnetic spectrum.
Question: Then what frequency is it?
Lazar: That’s something I’m reserving for myself.
Question: Something about the microwave range?
Lazar: Something about the microwave range. Well, you can sort of figure it out by the dimensions of the waveguide itself, and that’s about it.
Question: Positive energy versus regular photon?
Lazar: No, it’s not photon.
Question: Electromagnetic energy?
Lazar: Right. I’m not trying to be secret, but this is part of the equipment that I’m working on, and I want to get it operating before…
Question: I hope we’ll find out one day.
Lazar: Absolutely.
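
For a sense of the energy scale behind the matter-antimatter reaction Lazar describes above, here is a purely illustrative back-of-envelope calculation – the 1-gram figure and the Python arithmetic are assumptions for illustration, not anything stated in the transcript. The annihilation energy follows E = mc², and the claimed 100% efficient thermionic conversion ( which, as Lazar himself concedes, is hard to believe ) would turn all of it into electrical power:

    # Illustrative only: energy released if 1 gram of matter annihilates with
    # 1 gram of antimatter ( E = m * c**2 ); the mass is an arbitrary example.
    c = 299_792_458.0                     # speed of light, m/s
    mass_kg = 0.001                       # 1 gram of matter + 1 gram of antimatter
    energy_joules = 2 * mass_kg * c**2    # both masses convert to energy
    kilotons_tnt = energy_joules / 4.184e12
    print(f"{energy_joules:.3e} J  (~{kilotons_tnt:.0f} kilotons of TNT)")
    # Real thermionic converters fall far short of 100% efficiency; they are
    # heat engines, bounded well below the Carnot limit in practice.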

Speech of Edgar Fouché at International UFO Congress ( Summer 1998 ), below:

[ photo ( above ) LOCKHEED SR-71 Blackbird surveillance aircraft ( click to enlarge ) ]

I’m here to speak about government technology, special programs, and the TR-3B Flying Triangle. Thousands of sightings of the Flying Triangle have been reported, photographed, and investigated around the world – yet the USAF denies having such a vehicle. It also denies having replaced the strategic reconnaissance spy plane – the SR-71 Blackbird. Keep this in mind as I proceed:

Astronauts Edgar Mitchell and Gordon Cooper say that new investigations into UFOs are warranted.

Edgar Mitchell, who became the sixth [ 6th ] man on the moon during the Apollo 14 mission said, “The evidence points to the fact that Roswell [ New Mexico crash of UFO ] was a real incident and that indeed an alien craft did crash and that material was recovered from that crash site.” Mitchell doesn’t say he’s seen a UFO, but he says he’s met with high-ranking military officers who admitted involvement with alien technology and hardware.

Gordon Cooper told a United Nations ( UN ) committee recently: “Every day in the USA, our radar instruments capture objects of ‘form’ and ‘composition’ unknown to us.” Cooper speculates public skepticism toward UFOs will shift dramatically.

Now, a little about my background:

I’ve held positions within the United States Air Force ( USAF ) that required me to have Top Secret and ‘Q’ clearances, and Top Secret Crypto access clearances.

I’ll show you pictures of some of the aircraft programs I’ve worked on. I’ll also show you some pictures of classified aircraft. And I’ll share with you some of the information and stories I’ve gathered through my research in developing [ my book ] Alien Rapture. In many cases I’ve been able to obtain actual details of this black technology [ for my book, “Alien Rapture – The Chosen” ].

I was born to fifth [ 5th ] generation French-Americans, and many of my relatives – for generations – have historically been involved with the government in fields of intelligence, black programs, cryptography, and classified development projects.

This is true, as far back as the French revolution, where Joseph Fouché was Prime Minister under Napoleon. He was the head of the French secret National Police Force and was a direct ancestor of mine. Joseph Fouché started and controlled the world’s first professionally organized intelligence agency with agents throughout Europe.

The CIA, the Russia KGB ( now FSB ), the UK MI-5 ( MI-6 ), Israel’s Mossad, and many other intelligence agencies have used and expanded on his methods of intelligence gathering, networking information, and political survival.

I have also worked intelligence and cryptography related programs, but because of oaths of secrecy, I will ‘not be able to share any details of this work’.

My career background spans 30-years, and since the government isn’t about to support my claims, you will see from the positions I’ve held and the Programs I worked that I was in a position to gather the information I am presenting.

Before, I gave a presentation to the International UFO Congress in Laughlin, Nevada during August [ 1998 ], I brought over 200 documents as an offer of proof to substantiate my credibility. These documents contained information on the positions and assignments I held in the U.S. Air Force and as a DOD [ U.S. Department of Defense ] contractor. They also detailed clearances I held, classified and non-classified ( DOD and military ) technical training I received ( over 4,000 hours ), and performance reviews from 1968 to 1995.

As a civilian, from 1987 to 1995, I performed as engineering program manager, site manager, and Director of Engineering for several DOD contractors.

Ken Seddington, of the International UFO Congress, Jim Courrant, a UFO investigator, and Tim Shawcross and John Purdie, of Union Pictures in London, England viewed these documents. Some of these documents are shown in “Riddle of the Skies” special, which will be on The Learning Channel next month.

With my training and experiences with intelligence equipment, special electronics, black programs, and crypto-logical areas, I received other government opportunities. I filled positions as major command liaison, headquarters manager, and DOD factory representative for TAC, SAC, ATC, and PACAF following the Viet Nam War.

Later in my career, as a manager of defense contractors, I dealt with classified ‘black programs’ developing state-of-the-art electronics, avionics, and automatic test equipment [ ATE ].

I was considered an Air Force expert with classified electronics counter-measures test equipment, certain crypto-logical equipment – owned by the National Security Agency – and Automatic Test Equipment [ ATE ].

I’ve worked with many of the leading military aircraft and electronics manufacturers in the U.S. At different times I participated as a key member in design, development, production, and flight operational test and evaluation in classified aircraft development programs, state-of-the-art avionics, including electronic countermeasures, satellite communications, crypto-logic support equipment.

During my military career, I was ‘hand picked’ ( Development Cadre ) for many of the Air Force newest fighter and bomber development programs. I also represented many of these programs for TAC, SAC, PACAF, and ATC.

Other research and development programs I worked as far back as the 1970s are still classified Top Secret.

My involvement with black programs, developing stealth aircraft, is classified.

I am perhaps the only person who has actually worked at the Top Secret Groom Lake Air Base, within Area 51 of the Nellis Range, and has proved that I had the position, training, and clearances to be there.

[ photo circa: 1974 ( above ) DOD DARPA Have Blue Project F-117 stealth fighter ( click to enlarge ) ]

This [ NOTE: ‘not photo above’ ] is an F-117 Stealth fighter being readied at Groom Air Base at night. Notice the fog engines at work for cover.

My last position for the Air Force was as a Strategic Air Command Headquarters’ Liaison. As a Defense Contractor-Manager, I performed as an engineering program manager and site manager for DOD contractors involved in classified development, logistics support, electronic engineering, and technical data development from 1987 – 1995.

I have completely disassociated myself from the defense industry. I consider myself a writer and inventor now.

I undertook this trip to do research for my book Alien Rapture, which included a meeting with five [ 5 ] close friends who had agreed to release confidential information to me and discuss their closely guarded personal experiences.

I also interviewed other contacts that had worked classified programs or flown classified military aircraft to gather information about UFO sightings and contact.

Later, I was blessed to team up with a great man and a great writer, Brad Steiger. I had decided to get out of the defense industry, as I felt that fraud, waste, and abuse were rampant – both on the government and contractor sides.

Who were the five [ 5 ] friends and co-conspirators and a host of other insiders?

It started when some old friends of mine met in the spring of 1990 in Las Vegas [ Nevada ]. There were five [ 5 ] of us then; all of us had remained close following the Vietnam War. I’ve always been the networker for my DOD, military, and contractor friends, so I’m the one who set up the meeting with the five [ 5 ]:

1. The first friend, Jerald *, was a former NSA or TREAT Team member. T.R.E.A.T. stands for Tactical Reconnaissance Engineering Assessment Team. Jerald * worked for the DOE [ U.S. Department of Energy ] as a national security investigator. That was his cover, but he really worked for the NSA [ U.S. National Security Agency ]. His job required him to manage a team – to ‘watch employees’ with Top Secret and Q clearances in the mid-west, in Los Alamos, Sandia, and White Sands ( New Mexico ), and in the Nevada Test Site and Nellis Range, which includes Area 51. Area 51 is where the most classified aerospace testing in the world takes place. You may know the base as Groom Lake Air Base, Watertown, The Ranch, or Dreamland. He [ Jerald ] was found dead of a heart attack 1-year after our last meeting.

2. The second friend, Sal *, was a person who had worked directly for the NSA [ U.S. National Security Agency ] with Electronic Intelligence ( ELINT ) and became a defense contractor after his retirement.

3. The third friend, Doc *, was a former SR-71 spy plane pilot and USAF test pilot at Edwards Air Force Base [ California ].

4. The fourth friend, Dale *, and I were in the service together during the Viet Nam conflict [ war ], and I’ve known him [ Dale ] since the early 1970s. His father worked for over 20-years for the NSA [ U.S. National Security Agency ] and he [ Dale ] is the one who sent me the MJ-12 [ Majestic ] documents his father had obtained. These documents, the New MJ-12 Charter signed by proxy during the Reagan [ Ronald Reagan ] Administration and Attachment D to the Eisenhower [ U.S. Army General Dwight D. Eisenhower ] MJ-12 briefing document, which is the Autopsy Report from Roswell [ New Mexico ], are included as attachments in my book Alien Rapture.

5. The fifth friend, Bud *, was a DOD contractor and electronics engineer. He [ Bud ] had worked on Top Secret development programs dealing with Electronic CounterMeasures [ ECM ], radar homing and warning, ECM jammers, and Infra-Red [ IR ] receivers. He [ Bud ] retired as a program manager and later died of a brain tumor within 30-days after his symptoms appeared.

*All names and identifying factors have been changed.

It bothered each of us that we had experiences with unusual phenomena, extremely advanced technology, and witnessed unidentified aerial contact that had not been previously reported. We sat at a table in a dark corner of the Silver Dollar Saloon and Casino in Las Vegas [ Nevada ], discussing our experiences and swapping knowledge.

In 1990, I had no intention of writing about programs I was involved-with due to the Secrecy Act and classification documents I had signed.

Jerald asked me if I had ever heard of the ‘Flying Triangle’.

Of course I had heard rumors of Delta shaped and bat winged shaped prototypes being tested at Groom Air Base.

He [ Jerald ] said that an early test model of the Flying Triangle was sighted by hundreds of people over the Hudson Valley in the mid-1980s, that there was a major flap in Belgium – the year [ 1989 ] before our meeting [ 1990 ] – and that thousands of people had witnessed the Triangle and the F-16 chase that followed. He definitely piqued my curiosity.

Over the next 4-years, each member of the group wrote down as much information as he could remember about unusual phenomena and personal sightings.

From my close friends came their contacts. I agreed to interview these contacts in person. I interviewed four [ 4 ] other [ LOCKHEED ] SR-71 pilots, two [ 2 ] [ LOCKHEED ] U-2 [ Dragon Lady ] pilots, a [ 1 ] TR-1 pilot, and about two dozen [ 24 ] bomber and fighter jocks [ jockeys ]. None of the people I interviewed wanted to be known or quoted – and they wanted me to swear never to reveal their names. I have and will continue to honor their wishes.

Many were afraid of what the government would do to them for talking about Top Secret ‘Black Programs’ they were involved with, and others were just worried about losing their retirement pensions.

I’ll share some of these secrets and unusual phenomena with you:

[ photo ( above ) LOCKHEED A-12 ( single and dual cockpit versions ) ]

The SR-71 was designed as a spy plane for the CIA in the 1960s and designated the A-12.

[ extremely rare photo circa: 1954 ( above ) LOCKHEED YF-12 Prototype ( click to enlarge ) ]   [ rare photo circa: 1958 ( above ) LOCKHEED A-12 ( click to enlarge ) – NOTE: USAF brand new 1958 Edsel station wagon ( blue with white top ) and Dodge Power Wagon ( blue with white top ) pick-up truck ( mid image far right ) ]

[ photo ( above ) LOCKHEED A-11 ( click to enlarge ) ]

The Mach 3 plus aircraft first flew in 1962 [ ? ], taking off from Groom AFB [ ? ] in Area 51.

[ rare photo circa: 1960 ( above ) LOCKHEED SR-71 ( click to enlarge ) ]

Later, once the Air Force operated it as a reconnaissance plane, it was designated the SR-71 BlackBird.

My friend Chuck, an SR-71 pilot, related to me an in-flight incident he experienced in the 1970s. He was returning from a reconnaissance flight, and while at an altitude of 74,000 feet, at a speed of almost Mach 3 ( three times the speed of sound ), he noticed something flickering in his peripheral vision. Hovering over his left wing tip was a ball of dense, plasma-like light. It was so bright that when he stared at it for more than a few seconds, his eyes hurt.

Chuck tried to use his UHF-HF and VHF communications sets to no avail. There was nothing but static. Repeatedly glancing briefly at the ball of light, he watched in amazement as it moved effortlessly about his aircraft.

At one point the light positioned itself a few feet in front of the large spiked cone at the air Intake Inlet. The enormous amount of air rushing into the engines should have sucked in and shredded almost anything in its path, but the light orb was mysteriously unaffected.

The light, he noted, acted in a curious manner – if something inanimate could act at all. It moved from time to time to other parts of the vehicle, staying with him until his approach to Beale AFB in California. He was in sight of the Air Base when the light swung away from his aircraft in a wide arc with ever increasing speed.

Of course, after reading his incident report, his operations commander told him not to ever speak about his experience. When Chuck related the story to me, he told me he was absolutely convinced that the ball of light was controlled by some form of intelligence. I have about two dozen [ 24 ] stories from pilots of similar in flight incidents with UFOs and plasma balls. There have been thousands of reported sightings of plasma balls, energy filled orbs, or foo fighters as they were named during World War II.

In 1944, while fighting the Japanese and Germans, pilots started reporting strange flares and bright orange and red lights. These lights moved rapidly, were under intelligent control, and could come to a complete stop, remain stationary, and then disappear in an instant.

Foo means ‘fire’ in French [ feu ]. The pilots coined the term ‘foo fighters’ for the haunting glowing balls that doggedly paced their jets. Most were unnerved by the radical maneuvers of the foo fighters, which could climb vertically, accelerate, and make high-G turns at speeds far beyond any known allied aircraft.

Not far from the Royal Air Force base, MacRahanish, a triangular shaped aircraft was spotted off Western Scotland. MacRahanish has been rumored to be a base for black aircraft operations for a number of years. It’s also a NATO standby base.

RAF personnel have admitted that they have witnessed the operation of large triangular aircraft from RAF Boscombe in February 1997.

It was widely reported that a secret U.S. spy plane crash landed at Boscombe Down in 1994. It had been rumored for some time that the Triangle spotted over Belgium was based at Boscombe Down and Royal Naval Air Station ( RNAS ) Yeovilton where other sightings of the Triangle were reported.

The British RAF [ Royal Air Force ] has a long history of close involvement with U.S. aerospace black programs. Key RAF officers and British scientists have been involved at Groom Air Base [ Nevada ] since 1957 and the [ LOCKHEED ] U-2 [ DragonLady ] program.

In 1995 and 1996 the National UFO Reporting Center alone received forty-three [ 43 ] reports of sightings of a triangular aircraft:

11 in the State of Washington; 8 in the State of California; and 18 from other states – from the State of Hawaii to the State of New York.

A few years ago The British Magazine, UFO Reality, published this information:

“A top BBC [ British Broadcasting Corporation ] executive let slip recently that there is a D-Notice on media reporting of the so-called ‘Black Triangle’. The executive is the former producer of a very popular BBC science program. He told one of our team that the black Triangle ‘craft’ – first witnessed by the hundreds in the Hudson Valley region of the U.S. in the mid-1980s, then by the thousands in Belgium in 1989 – 1990, and more in Britain – has been ‘heavily D-Noticed’ by the government. For this reason the BBC will NOT be reporting on the enigmatic craft, no matter how many witness reports there are. According to this producer, the government’s restrictive notice on reporting the Triangle was authorized under secrecy laws, in order to protect secret new military projects.”

From 1973 through 1976, I was home-based out of Edwards AFB. It is near Lancaster, California and even nearer to the San Andreas fault [ plate tectonic earthquake demarcation line ].

Edwards [ AFB ] has a long history with secret technology and experimental aircraft. The YB-49, which looks a lot like the B-2 Stealth Bomber, was flown at Edwards AFB in 1948.

[ photo ( above ) NORTH AMERICAN XB-70 Valkyrie with 6 GENERAL ELECTRIC engines ( click to enlarge ) ]

The XB-70 flown in 1964 looks a lot like the still Top Secret SR-75 that, the Air Force says doesn’t exist.

Edwards A.F.B. is the home of the U.S. Air Force Test Pilot School and is responsible for Flight Operational Test and Evaluation [ FOTE ] of the Air Force’s newest aircraft.

Edwards [ AFB ] hosts a number of tenant organizations, from NASA to the Jet Propulsion Laboratory [ Pasadena, California ] facility.

Edwards [ AFB ] developed various versions of the flying wing [ shaped aircraft ] from the B-35, YB-49, B-2 [ Spirit ], and exotic aircraft sometimes ahead of their time – like the XB-70, F-117, and YF-22.

I worked with the F-111 swing-wing bomber, the F-15 air superiority fighter, the F-16 fighter, the A-10 [ Warthog ] close air support attack aircraft, and B-1 stealth bomber. I was involved with these and other classified development programs, when they were just a gleam in some pilot trainee’s eyes.

One night – in the mid 1970s – a long-time friend of mine and I were standing on top of the Fairchild A-10 [ Wart Hog ] hangar at Edwards AFB in southern California. It was about 02:00 a.m. on a clear night, with millions of stars visible to the naked eye. I noticed a group of stars that seemed to be shifting in color. I pointed out to my friend that the three [ 3 ] bright stars in triangular formation were not part of the Big Dipper.

We watched as the strobing stars shifted from bright blue to a red yellow color. After a period of about 20-minutes we could tell the objects probably weren’t stars because they were getting larger. This was somewhat unnerving.

* It was further unnerving when ‘the space in-between the enlarging lights began blocking out the stars in the background’. [ See, e.g. beginning of this report, i.e. Victor Valley northeast area of California alongside Interstate 15 ( further above ) ]

We decided it probably was a Top Secret Air Force vehicle of some type; still, we weren’t sure. The vehicle had gone from half the size of the Big Dipper to twice its size in under a half hour [ 30-minutes ] and had moved from the west to the east, towards the base.

About the time we could make out a silhouette or outline of the triangular vehicle, the lights – or possibly exhausts – flared brighter and vanished from the sky in an instant.

This experience wasn’t my first sighting, but it was one of the few where I had a witness. In the summer of 1976, I relocated to Nellis Air Force Base – north of Las Vegas. I spent the next 3-1/2 years there. I worked primarily with the F-15, electronics countermeasures, and Automatic Test Equipment [ ATE ]. I had heard rumors of airbases located in the desert, at places called:

Mercury; Indian Springs; and others that didn’t even have names.

Before the collapse of the [ Russia ] USSR ( CCCP ), no one talked about their classified work experience, nor did anyone repeat rumors of Top Secret technology and aircraft.

Most of us who had Top Secret clearances never even told our wives what we were doing, and where we were going – when on these type projects. I once spent 6-months, in Viet Nam, while my ex-wife thought I was attending a classified technical school in Colorado.

The military, in a court of law, actually denied the existence of a classified Air Force Base inside the Nellis Range out in the Nevada Desert. Don’t you know, the Plaintiffs – who had worked at Groom – and their lawyer were surprised to hear this, but that’s another story.

I was one of the few personnel at Nellis who had a Top Secret clearance with Crypto access. I was certified to work on Mode 4 IFF, ( an aircraft system which responded to classified, encrypted codes ). I was also certified to work on other Crypto equipment that I cannot discuss.

Due to a combination of coincidences and my technical experience, I was requested for a temporary assignment at a place that had no name. My commander told me I was to report to an office on the base, and he didn’t have a clue where I was going or what I was going to be working on. And let me tell you, he wasn’t too happy about being left in the dark.

I left one Monday morning long before sunrise. It was 4:30 AM when I boarded a dark blue Air Force bus with all of the windows blacked out. There were 28 other people on the bus, not including the 2 security policemen holding M-16 automatic weapons and the bus driver. We were each told when boarding, “Do not speak on this bus unless you are spoken to.” Not one of us uttered a word. Believe me, there is nothing that can inspire compliance like an M-16 sticking in your face, I assure you!

The bus drove through the desert for several hours – this much I know from the poor air-conditioning and the amount of fine dust that came through every crack in the old vehicle – and it was soon obvious where I was.

In the 1950s, the U.S. government started building the super secret Groom Lake facilities – for the CIA U-2 ( DragonLady ), a high-altitude global surveillance aircraft. The site was acquired in 1951 – with $300,000,000 [ $300 million ] in seed money from the CIA. S-4, or Site 4, is the name insiders use for the Papoose Lake, Nevada facilities south of Groom Lake, Nevada. Area 51 is located in the north central part of the Nellis Air Force Base ( AFB ) Range. Construction of facilities within the Nellis Range continues even today.

The SR-71 Blackbird surveillance aircraft, TR-1, F-117 stealth fighter, B-2 bomber, TR-3A Manta, and TR-3B Astra – or flying triangle – were tested at Groom Lake, Nevada. Under certain circumstances, had persons sighted these craft, they might well have been looked upon as unidentified flying objects ( UFOs ) – extraterrestrial spacecraft.

SR-75, replaced the SR-71 Blackbird.

SR-74 SCRAMP is a drone that appears to ride under the SR-75.

TR-3B Flying Triangle are operated there – as well as other Top Secret prototype and operational aerospace vehicles.

Many of these aircraft have been misidentified as UFOs

When we reached Groom Lake, the bus pulled into a hangar and they shut the doors. The security personnel checked me in – while other security personnel dispatched the others to their places of work. I was given a pair of heavy glasses to wear, which can only be described as looking like welder’s goggles. The lenses were thick, and the sides of the goggles were covered to block my peripheral vision. Once I had these goggles on, I could only see about 30-feet in front of me. Anything beyond that distance became increasingly blurred. If an M1 Tank barrel had been pointed at me from about 50-feet away, I would not have seen it. It was very disconcerting to have to wear those glasses.

The whole time I was there – some 10 consecutive days, followed by several follow-up visits – the routine was the same: leave Nellis before sunrise and return home after dark every day.

Only once did I get a chance to see the whole base, and that was when I was flown up – from Nellis – in a helicopter to Groom Lake for emergency repairs of their crypto test equipment. For those stationed at Groom Lake, or commuting there daily – with flight schedules posted for classified flights – everyone ‘not cleared for that particular program and flight’ must be off the ramp – inside 30-minutes – prior to the scheduled operation.

A couple of thousand personnel are flown into Area 51 daily, from McCarran Airport in Las Vegas, Nevada and – from Edwards AFB in California – on contractor aircraft. Several hundred commute from Tonopah and Central Nevada via the north entrance near Rachel, Nevada. Other commuters use its [ Area 51 ] south entrance via Mercury, Nevada or Indian Springs, Nevada west of Las Vegas, Nevada.

While at Groom Lake I made contacts and met people from other programs. Over time, a few became friends and we exchanged stories.

On my 3rd day on the job at Groom Lake, I had to remove a module from a multi-bay piece of satellite communications equipment used to support certain special mission aircraft. I noticed, while inside the bay checking out the wiring, that it contained a sealed unit about the size of a large briefcase. It had a National Security Agency ID plate on it. The nomenclature on the nameplate was Direct Orbital Communication Interface Link Emitter [ DOCILE ]. I thought this was strange, as the unit was part of a digital communications link used solely to communicate with classified Air Force vehicles. I was unaware – at the time – of any military orbital missions not related to NASA. Remember, this was in the late 1970s – the shuttle didn’t fly until 1981. I disconnected the unit, and – out of curiosity – I removed the rear access cover. To my amazement, there were some half-dozen [ 6 ] large hybrid integrated circuit [ IC processor ] chips inside. The largest chip had over 500 hair-thin leads attached and was approximately the size of a Zippo lighter. The ‘paper inspection stamp’ on the chip was dated 1975.

In 1975, the most advanced processor speeds – on the most classified projects – were equivalent to an IBM [ Intel ] 8088, which ran at 4,000,000 [ 4 million ] cycles per second; but this unit had a processor speed of 1,000,000,000 [ 1 billion ] cycles per second.
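
For scale – a quick illustrative calculation using the two clock rates quoted above, not a figure from the speech itself:

    # Ratio of the two processor clock rates quoted above ( illustrative only ).
    clock_1975_hz = 4_000_000         # ~4 MHz, the 8088-class figure quoted
    claimed_unit_hz = 1_000_000_000   # 1 GHz, the speed claimed for the NSA unit
    print(f"speed ratio: {claimed_unit_hz / clock_1975_hz:.0f}x")   # -> 250x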

It wasn’t until more than a dozen [ 12 ] years had passed that I saw comparable integrated circuit chip technology, and then it was at a Top Secret avionics development project at the ITT company.

In the mess hall at Groom Lake, I heard words like:

Lorentz Forces; Pulse Detonation; Cyclotron Radiation; Quantum Flux Transduction Field Generators; Quasi-Crystal Energy Lens; and, EPR Quantum Receivers.

I was told ‘quasi-crystals’ were the key to a whole new field of propulsion and communication technologies.

To this day, I’d be hard pressed to explain to you the unique electrical, optical and physical properties of Quasi-Crystals and why so much of the research is classified.

Even the unclassified research [ on quasi crystals ] is funded by agencies like the Department of Energy [ DOE ] and the Department of Defense [ DOD ].

Why is the U.S. Department of Energy and Ames Laboratory so vigorously pursuing research with Quasi-Crystals?

What is the DOE new Initiative in Surface and Interface Properties of Quasi-crystals?

“Our goal is to understand, and facilitate exploitation of, the special properties of Quasi-crystals. These properties include ( but are not limited to ) low thermal and electrical conductivity, high-hardness, low friction, and good oxidation resistance.” That’s the ‘unclassified’ part.

What are Quasi-Crystals?

In 1984, a paper was published, which marked the discovery of quasi crystals, two ( 2 ) distinctly different metallic crystals joined symmetrically together.

[ NOTE: See, e.g. Project CARET ( 1984 ), at: http://upintelligence.wordpress.com/2010/10/23/secret-extraterrestrial-information-technologies/ ]

By 1986, several Top Secret advanced studies were going on – funded by DARPA – with leading scientists already working in the field.

In classic crystallography, a crystal is defined as a three dimensional ( 3-D ) periodic arrangement of atoms with translational periodicity along three ( 3 ) principal axes.

Since Quasi-crystals ‘lose periodicity’ in at least one [ 1 ] dimension, it is not possible to describe them in 3D-space as easily as normal crystal structures. Thus, it becomes more difficult to find mathematical formulae for interpretation and analysis of diffraction data.

After the official release of the discovery of Quasi-crystals in 1984, a close resemblance was noted between icosahedral quasi-crystals and 3-D Penrose patterning.

Before quasi-crystals were discovered in 1984, British mathematician Roger Penrose devised a way to cover a plane in a non-periodic fashion using two ( 2 ) different types of tiles arranged so that they obey certain matching rules; a later generalization – using rhombohedra instead of rhombi – became known as 3D Penrose Tiling.

12-years later, ‘Penrose Tiling’ became the prototype of very powerful models explaining structures within Quasi-crystals discovered in rapidly quenched metallic alloys.

14-years of quasi-crystal research established the existence of a wealth of stable and meta-stable quasi-crystals with 5-, 8-, 10-, and 12-fold symmetry, with strange ‘structures’ and interesting ‘properties’. New tools had to be developed for the study and description of these extraordinary materials.
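
The simplest way to see what ‘order without translational periodicity’ means is a one-dimensional toy analogue, the Fibonacci word, which is commonly used as a teaching model of quasi-crystalline order. The sketch below is purely illustrative and is not drawn from the speech:

    # Build a 1-D quasi-periodic sequence ( the Fibonacci word ) by repeated
    # substitution: L -> LS, S -> L. The result is perfectly ordered but never
    # repeats periodically – the property that sets quasi-crystals apart from
    # ordinary crystals.
    def fibonacci_word(generations: int) -> str:
        word = "L"
        for _ in range(generations):
            word = "".join("LS" if tile == "L" else "L" for tile in word)
        return word

    w = fibonacci_word(8)
    print(w[:21])                        # LSLLSLSLLSLLSLSLLSLSL
    print(w.count("L") / w.count("S"))   # tile ratio approaches the golden ratio ~1.618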


I’ve discovered classified research showing quasi-crystals as promising candidates for high-energy storage materials, metal matrix components, thermal barriers, exotic coatings, infrared sensors, high-energy laser applications and electromagnetics. Some high strength alloy surgical tools are already on the market.

One of the stories I was told more than once was that one of the crystal pairs used in the propulsion of the Roswell crash was a Hydrogen Crystal. Until recently, creating a hydrogen crystal was beyond the reach of our scientific capabilities. That has now changed. In one Top Secret Black Program, under the DOE, a method to produce hydrogen crystals was discovered, and then manufacturing began in 1994.

The lattice of hydrogen quasi-crystals, and another material not named, formed the basis for the plasma shield propulsion of the Roswell [ New Mexico, USA UFO crash ] craft and was an integral part of the bio-chemically engineered vehicle.

A myriad of advanced crystallography – undreamed-of by scientists – were discovered by the scientists and engineers who evaluated, analyzed, and attempted to reverse-engineer the technology presented with the Roswell vehicle and eight [ 8 ] more vehicles which have crashed since then.

I wrote down everything I saw, heard and touched in my log – every night before going to bed. By the way, the food at the Groom Lake mess hall was excellent, but what would you expect – there was no cable TV, no alcohol and no women. I guess they figured they’d better do something right.

Later, while back at the base, my routine went on as normal – as did my part-time job that summer at the Silver Dollar Saloon [ Las Vegas, Nevada ].

My NSA friend, Jerald – who managed a team that investigated and ‘watched’ those with highly classified jobs at the Nevada Test Site and the Nellis Range, among other highly classified facilities – happened to show up.

I met Jerald in 1976, when I was part of a team that moved the F-15 operation from Edwards AFB to Nellis AFB and set up the joint Navy / Air Force AIMVAL – ACEVAL program.

He was checking up on a guy who had a drinking problem who worked at the Nevada Test Site ( S-4 ) where they set off underground atomic explosions.

He [ Jerald ] happened to mention a vehicle that could be boosted into orbit and return and land in the Nevada desert. This was during the late 1970s. It was an Unmanned Reconnaissance Vehicle ( URV ) – launched from a B-52 bomber – and used booster rockets to place it in temporary low earth orbit ( LEO ) for taking high-altitude reconnaissance photographs. I thought he was feeding me a line of bull. Then he said, “This vehicle is remote-piloted ( unmanned ) with communications via the DOCILE system at Groom.”

I’m not usually too slow, but it didn’t hit me until he repeated, “You know, the Direct Orbital Communication Interface Link Emitter ( DOCILE ).” Bingo, the light bulb went on: I had seen – at Groom Lake [ Nevada ] – a part of the DOCILE equipment, the NSA briefcase-sized unit with the large chips.

These are old pictures of the Virtual Reality Laboratory ( VRL ) at Brooks Air Force Base ( BAFB ) where the software – to remotely fly exotic aircraft – was developed.

Let me get back to the development of [ my book ] Alien Rapture – The Chosen.

After I agreed to write my co-conspirators’ story, I talked to several military Judge Advocate General ( JAG ) lawyers, who were told I wanted to write about some of my experiences in the military and that I had been on many classified projects. I was told I had to write my story as ‘fiction’, which I have.

I was told, I couldn’t name any real individuals with clearances or covers, or use their working names, which I haven’t.

I was also told, I couldn’t discuss any secrets of programs I had been personally assigned to, which I have ‘not done’ and ‘will never do’.

Then I was told that, so long as I did that, I could damn well write anything I wanted to. Of course I did ‘not’ tell them that I was going to write about the government conspiracy to cover-up UFO contact and reverse engineering of alien technology. Or that I was interviewing pilots who flew classified ‘air craft’, plus others who had worked Black Programs.

In the summer of 1992, we again met in Las Vegas, Nevada. I had compiled my notes from our first meeting, my interviews, and the input that my five ( 5 ) friends had passed on to me, each of whom reached out to ‘their friends and contacts’, uncovering even more information.

We agreed I was the only one who could get away with writing about our experiences, since I was planning on getting out of the D.O.D. ( Department of Defense ) industry.

My friends for the most part, were still connected.

Bud, one of my co-conspirators and close friends, informed me he had a cancerous tumor and was going through some severe depression. He was dead 30-days later. It was a real blow to us. We had lost Jerald, 1-year before, to his heart attack.

Of the remaining three ( 3 ) friends, Sal dropped off the face of the earth, and neither his contacts nor mine have been able to locate him for 2-years now. Sal was extremely paranoid about the two ( 2 ) deaths ( of Bud and Jerald ), and had second thoughts about the book. Sal said he was going to move and didn’t know when or if he would contact me next. I like to think of him sipping a tropical drink on some Pacific island.

Let me talk about my friend Doc, who has a theory that UFOs are drawn to fast aircraft. Doc, an SR-71 pilot whom I knew well, was stationed at Kadena AFB ( KAFB ) during 1973, on the SAC ( Strategic Air Command ) side of the base.

While flying back across the South China Sea from a reconnaissance mission, Doc encountered a shadow over his cockpit. Doc said his avionics systems went totally haywire, and he felt the aircraft nose down slightly, which can be dangerous at 2,000 miles per hour ( about 33 miles per minute ). When Doc looked up, he was so startled that he almost panicked and immediately made an evasive maneuver to the right and down – one of many maneuvers made if an approaching missile is detected.

Doc said the object was so big that it totally blocked out the sun. His estimate was that it was 250-feet to 300-feet across, oval shaped and appeared bright blue-grey in color, however Doc wasn’t sure because it also had a shimmering halo of energy surrounding the vehicle.

About 3-minutes later, and some thousands of feet lower, the vehicle reappeared to the side of his left wing tip. He tried his UHF radio and all he could pick up was a deep electrical hum. He abandoned his attempts to use his radio, as his immediate survival was more important.

For the next 10-minutes, the large oval vehicle moved from his left wing tip to the rear of the SR-71 and then to his right wing tip. The movement from the left, to the rear, to the right wing tip took about 2-minutes, and then it reversed. During the UFO’s last swing to the rear of his SR-71, his aircraft started buffeting wildly – which is terrifying at Mach 3 – then it stopped after about 15-seconds, and he never saw the object again.

Doc said he heard a sound in his head, ‘like a swarm of bees in my brain,’ as he described it. When Doc returned from the mission he immediately went to his debriefing. The minute he mentioned the incident with the Unidentified Aerospace Vehicle ( UAV ) to his commander, he was pulled away from the debriefing and taken to his commander’s office. His commander, a colonel, filled out an incident report, in detail, and then told Doc not to mention the incident to anyone or he would be subject to severe and speedy penalty under military regulations.

Doc told me he knew of not a single other SR-71 pilot or astronaut who hadn’t had a close encounter or a UFO sighting. Doc felt none would ever go on record with their experiences, because of fear of retaliation from the U.S. Department of Defense ( DOD ) and loss of their retirement pay and benefits for breaking their oaths of secrecy.

During the 9-years after this in-flight incident, Doc related that a few of his trusted friends related similar incidents, with the same type of vehicles, or glowing orbs of dense light, dancing around their SR-71.

Then Doc told me another story about his friend Dave, another SR-71 Blackbird pilot, who – while drunk on sake in Japan – told him in whispers that he hadn’t been a drinker until a reconnaissance flight over the eastern border of Russia 6-months earlier, from which Dave had returned delirious and semi-conscious.

Dave’s SR-71 crew had to pull him out of the cockpit. The Flight Surgeon attributed his symptoms to loss of oxygen. Dave didn’t share his nightmares ( with U.S. Air Force base physicians ) for fear the Flight Surgeon would ground him and he would possibly lose his flying status; however, under the influence of alcohol – in a quiet bar – with a trusted fellow SR-71 Blackbird pilot and friend ( Doc ), Dave opened up.

Dave tearfully related – in an emotional story – having frequent nightmares that something had gotten to him during his flight over Russia. For Dave, what made matters worse was that he had absolutely no memory of the flight from the time he lifted off from Osan Air Base ( Korea ) until the day after he returned and found himself in the Naval Regional Hospital in Okinawa, Japan.

I managed to track down Dave, who lives in Southern California, and he confirmed off-the-record that the incident Doc relayed to me was true. Dave said he was actually happy someone was writing about stories of contact and sightings by military pilots, and he was sure he had had some type of contact with a UFO.

One day, while still at Nellis Air Force Base test range S-4, we were informed there was an F-15 that crashed on the Nellis Range, where Area 51 is located. The F-15 crash happened in 1977. A Lieutenant Colonel and Doc Walters, the Hospital Commander, actually flew into the side of a mountain while doing a routine Functional Check Flight.

A sergeant who worked for me, and who had earlier been assigned to the Accident Investigation Team, recovered the F-15 Heads-Up Display film canister. He told me that a guy ( in a dark jump suit ) out of Washington, D.C. personally took that film canister away from him – unusual, because everything else was picked up, logged first, and then taken back to the assigned hangar for analysis. A ‘prototype video camera’ ( also on the aircraft ) and the flight data recorder were likewise handed over to the guy from Washington, D.C.

One night, a couple of weeks after the crash, my U.S. National Security Agency ( NSA ) friend Jerald relayed to me – at the Silver Dollar Saloon ( Las Vegas, Nevada ) – that the aforementioned Lieutenant Colonel had radioed the Nellis Tower that he had an ‘extremely large thing’ hovering over his aircraft and was experiencing loss of flight systems. His communications went dead, and a few seconds later his aircraft exploded against the side of a mountain-top.

Jerald, who was the most ‘connected’ person I’ve ever known, told me the ‘video’ showed some type of oval vehicle – of tremendous size – so close to the F-15 that its camera appeared to have gone out of focus.

When Doc and the Lieutenant Colonel ejected, the UFO was still above them; their bodies were torn to shreds. Officially, it was determined – as is always the case – that ‘pilot error’ caused the perfectly functional aircraft – in clear airspace with maximum visibility – to crash.

Nevada calls itself the silver state, the battle-born state, and the sagebrush state. A more appropriate motto would be, the conspiracy state.

Of the 111,000 square miles of land in Nevada, over 80% is controlled by the federal government – the highest percentage of any state in the U.S. If it were not for the gaming industry in Nevada, the federal government would be the largest employer in the State of Nevada, with 18,000 federal and military personnel and another 20,000 government contractors and suppliers.

The Nevada Test Site, Nellis Air Force Base and Range [ Nevada ], Fallon Naval Air Station [ Nevada ], the Tonopah Range, and the aerospace industry eat-up a lot of U.S. tax dollars.

The Nevada Test Site and the Nellis Range have myriad secrets yet to be revealed, including a super secret laboratory: the Defense Advanced Research Center ( DARC ).

DARC is located, inside the Nellis Range, 10 stories underground.

DARC was built in the 1980s with SDI ( Strategic Defense Initiative ) money.

DARC is next to a mountain – near Papoose Lake, south of Groom Lake, Nevada – where the TR-3Bs ( TIER III B ) are stored in an aircraft hangar built into the side of a mountain near DARC. The Nellis Range covers more than 3,500,000 [ 3.5 million ] acres.

EG&G, a government contractor, provides classified research and development services for the military and government, and supplies technical and scientific support for nuclear testing and energy research and development programs.

EG&G provided large diameter drilling, mining and excavation for DARC underground and mountainside facilities.

EG&G built these DARC hidden bunkers, mountain hangers and vast underground facilities at Groom Lake, Papoose Lake, and Mercury for the government – facilities and observations posts well camouflaged inside the Nevada Test Site and the Nellis Range.

Starting in 1971 and continuing through 1975, a massive amount of excavation took place at the Groom Lake facility and Papoose Lake facility where most subsequent construction also took place underground.

In 1972, EG&G was granted an ‘indefinite contract’ called “Project Red-Light” to support the U.S. Department of Energy ( DOE ) and the military. This contract gave EG&G responsibility to assist in the recovery of nuclear materials in cases of mishaps and to provide aerial and ground security for highly classified government and military sites.

My sources say the DOE and NSA are primarily responsible, to the MJ-12 committee, for reacting to UFO sightings and recovering UFO artifacts in cases of a crash.

So what’s going on more recently, you may ask? Let’s talk about the newest secrets and rumors:

[ photo ( above ) AVRO VZ-9 Avrocar ( 1957 BELL Laboratory ( Canada ) contractor for CIA ) ]

The Hiller platform, the AVRO saucer, and the NORTHROP sections – where aerospace vehicles with advanced technology were developed and tested – each emulated some characteristic of UFOs as described by the late Dr. Paul Hill, a NASA UFO investigator. Hill’s posthumously published book, Unconventional Flying Objects, discusses UFO technology extensively. If you have not read this illustrious tome, I suggest you do so.

Newly declassified documents show that AVRO [ Canada ] built and tested a number of saucers – at Area 51 in Nevada [ USA ] – contrary to DOD lies that the program was canceled because it failed to meet expectations. LOCKHEED Advanced Development Projects Division ( ADPD ) – also known as the “Skunk Works” – developed the A-12 for the U.S. Central Intelligence Agency ( CIA ) and, in the early 1960s, a later version, the SR-71, for the USAF.

30-years later, the SR-71 was still breaking world speed records. The sleek, matte-black, stiletto-shaped SR-71 Blackbird spy plane – traveling 2,000 miles in 1-hour 4-minutes ( including multiple orbits, touch-down and take-off ) – broke the world air speed record from Los Angeles, California to Washington, D.C. on its retirement flight in 1990.

Area 51 ( the Groom Lake Air Base, Nevada facilities ) has a 6-mile long runway – the longest in the United States – where the U.S. Department of Defense ( DOD ) and the CIA’s most exotic aerospace vehicles are tested and modified. Area 51 is a place where curious outsiders circulate rumors about aliens and extra-terrestrial technology being utilized to accelerate the various programs at Area 51.

Why a 6-mile long runway? You need a runway this long if the minimum, or stall, speed of an aircraft is very high. Aircraft without conventional wings – like wedge-shaped lifting bodies – or those with 75-degree swept-back wings have very high stall speeds, and while they lift off very fast, they land even faster.
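
To see why a high landing speed translates into a very long runway, here is a purely illustrative stopping-distance estimate using the standard kinematics relation d = v² / ( 2a ). The touchdown speeds and the braking deceleration are assumed round numbers, not figures from the speech:

    # Stopping distance d = v^2 / (2 * a), with assumed illustrative values.
    def stopping_distance_m(touchdown_speed_ms: float, decel_ms2: float) -> float:
        return touchdown_speed_ms ** 2 / (2 * decel_ms2)

    MPH_TO_MS = 0.44704
    for touchdown_mph in (150, 250, 350):     # ordinary jet vs. a very 'hot' lander
        d = stopping_distance_m(touchdown_mph * MPH_TO_MS, 2.0)   # ~2 m/s^2 braking
        print(f"{touchdown_mph} mph touchdown -> ~{d / 1609.34:.1f} miles to stop")
    # Doubling the touchdown speed roughly quadruples the ground roll.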

My sources estimate that up to 35% of SDI ( Strategic Defense Initiative – “Star Wars” ) funding was siphoned off to provide the primary expenditures for the U.S. Air Force’s most secret ‘Black Program’, begun in 1982: the AURORA Program.

AURORA, the Program, ‘is’ the ‘code name’ of the ‘ongoing Program’ to have advanced aerospace vehicles built and tested.

AURORA, contrary to popular belief, is ‘not’ the name of an individual aircraft.

The AURORA Program takes its name from the “aurora borealis” – excited gas in the upper atmosphere. As early as 1992 the Air Force had already made contingency plans to move some of its aircraft out of Groom Air Base ( Area 51 ); the public eye was on the base, and they ( government officials ) did ‘not’ like that one bit.

Everything needing a long runway, like the SR-75, was removed by early 1992 to other bases in Utah, Colorado, Alaska, Greenland, Korea, Diego Garcia, and remote islands in the Pacific.

Short take-off and landing vehicles ( STOL ) – especially the bat wing TR-3A ( TIER III A ) Manta and the TR-3B ( TIER III B ) Astra, the ‘Flying Triangle’ – were relocated to Papoose Lake, Nevada ( USA ) in the southern part of the Nellis Air Force Base ( AFB ) Test Range, at the site called Area S-4. Apart from the SR-75 – still being dispersed to other locations – research and development ( R&D ) and flight operational test and evaluation continue in Area 51, more so now than ever before.

For the last few years, high-tech buffs have speculated that at least one ( 1 ) new and exotic aerospace vehicle existed: the SR-75 – the first operational AURORA Program vehicle, which went operational in 1989, 2-years after flight testing and modifications were complete ( 1987 ).

The SR-75 is a hypersonic strategic reconnaissance vehicle ( SRV ) spy plane called the Penetrator – a mother ship, as I will explain shortly.

Hypersonic speeds start at approximately Mach 5.

The SR-75 replaced the SR-71 spy plane, retired in 1990 by the U.S. Air Force, which said, “There is no replacement; all we really need is our spy satellites to do the job.” Hmm?

The DOD, upon analysis of Desert Storm, admitted satellites ( alone ) could not provide the necessary quick response real-time reconnaissance information needed by various military agencies. Yet they have repeatedly fought some U.S. Congressional efforts to bring back the SR-71. Why? The answer should be obvious.

SR-75, is capable of positioning itself anywhere on the globe in less than 3-hours and is equipped with multi-spectral sensors ( optical, radar, infrared, and laser ) for collecting images, electronics intelligence ( ELINT ), signals intelligence ( SIGINT ), illuminating ( lighting-up via laser ) targets, and more.

SR-75, far exceeds the classified military speed and altitude records set by its predecessor, the SR-71 – still officially listed at Mach 3.3+ ( confidently add another Mach 3 to 4 ) with ceiling altitudes of ( at least ) 85,000 feet.

SR-75, attained altitudes of over 120,000 feet and speeds exceeding Mach-5 ( 5 times faster than the speed of sound ) – equating to more than 3,300 miles per hour.

SR-75, from take-off to landing, can make the round trip from central Nevada to Northeast Russia and back in under 3-hours.

SR-75, is 162-feet long and has a wing span of 98-feet.

SR-75, fuselage begins at 10-feet above ground.

SR-75, carries a flight crew of three ( 3 ), i.e. pilot, reconnaissance officer and launch control officer, the latter of whom doubles as the SR-75 Penetrator electronics warfare ( EW ) officer. Two ( 2 ) methane and LOX fueled high-bypass turbo-ramjet ( combined cycle ) engines are housed under each wing, in bays that run 40-feet and terminate at the trailing edge of the wing.

The SR-75 Penetrator’s two ( 2 ) Pulse Wave Detonation engines push speeds above Mach 5 – now reported at Mach 7+, or 4,500 mph ( miles per hour ), with new engine modifications. Although this plane has been sighted on numerous occasions, picked up on military radar, and its pulse wave detonation contrail has been seen, the Air Force vehemently denies its existence.

The SR-75’s large Pulse Wave Detonation engine bay inlets – located under each wing of the black mother ship, hanging downward 7 feet – are 12 feet wide.

The SR-71 Blackbird, the SR-75 Penetrator ( black mother ship ), and its daughter ship, the SR-74 ( Scramp ), were all built by LOCKHEED Advanced Development Company ( Skunk Works ).

SR-74 Scramp, daughtership of the SR-75, is a ‘drone’ launched from ‘under’ the SR-75 black mothership fuselage.

SR-74 Scramp, after launch, utilizes a supersonic combustion ram-jet engine ( Scramjet ).

Jerald witnessed the flight of the big black Air Force SR-75 carrying the little unmanned SR-74 while inside Area 51, where the SR-74 was initially positioned piggyback ( on an upper raised platform ) atop the SR-75 Penetrator.

[ photo ( above ) LOCKHEED YF-12-A / SR-71 Blackbird drone: D-21 ( drone ) ( click to enlarge ) ]

Remember, the SR-74 Scramp ‘cannot’ launch itself from the ground.

I heard SR-75 talk as far back as the late 1970s when I was at Groom [ Groom Lake ], and I have 2 additional friends who saw the SR-75 at Groom Lake, Nevada.

The SR-74 Scramp can only launch from its SR-75 mother ship at an altitude above 100,000 feet, from where it can then attain orbital altitudes of over 800,000 feet ( 151+ miles ).

The U.S. Air Force SR-74 Scramp is for launching small highly classified ‘ferret satellites’ for the U.S. National Security Agency ( NSA ). SR-74 Scramp, can launch at least two ( 2 ) 1,000-pound satellites measuring 6-feet by 5-feet.

The SR-74 Scramp is roughly the equivalent size and weight of an F-16 fighter.

The SR-74 can easily attain speeds of Mach 15, or a little less than 10,000 miles per hour. The NASA Space Shuttle is ‘technologically antique’ by comparison – a taxpayer joke.

If you think these rumors are far-fetched, look at the YB-49 and XB-70 flown in 1948 and 1964 respectively, then look at the SR-75 that has been spotted numerous times.

Some say, “The government cannot keep a secret.” Those who think the government can’t are wrong.

There are new rumors that the U.S. placed two ( 2 ) new vehicles in permanent orbit. One of these, being the Space Orbital Nuclear – Service Intercept Vehicle ( SON-SIV ) code name LOCUST.

The SR-74 SCRAMP and the TR-3B [ TIER III B ] can deliver Spare Replacement Units ( SRU ), service fuels, fluids and chemicals to the SON-SIV.

The SON-SIV robotically uses those deliverables to service, calibrate, repair, and replace parts on the newer NSA, CIA and NRO ( National Reconnaissance Office ) satellites built to be maintained ‘in space’.

Finally, I’ve saved the best for last. The Operational model of the TR-3B [ TIER III B ].

The early information I gathered from interviewing my contacts and their closest friends who worked black programs resulted in the basic specifications of the TR-3B [ TIER III B ] Flying Triangle. I had this simple drawing by late 1990.

On the night of March 30, 1990 a Captain of the Belgian ( Belgique ) National Police decided to pursue incoming reports about a ‘triangular’ shaped UFO. Two ( 2 ) radar installations – one a NATO defense group and the other a Belgian ( Belgique ) civilian and military radar – verified the triangle UFO.

Excellent atmospheric conditions prevailed, and there was no possibility of false echoes due to temperature inversions. At 5 AM, 2 dispatched F-16 fighters spotted the Triangle on their radar screens and locked onto the target.

Six seconds later the object sped up from an initial velocity of 280 kilometers per hour to 1,800 kilometers per hour while simultaneously descending from an altitude of 3,000 meters to 1,700 meters, then down to 200 meters, causing the F-16 radars to lose lock-on. This maneuver happened all in a matter of 1 second. The 40 G acceleration of the Triangle was some 32 gravitational forces higher than what a human pilot can stand.
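The ~40 G figure follows directly from the reported speed change; a minimal sketch of that arithmetic, using only the numbers given above and ignoring the simultaneous descent:

# Quick check of the ~40 G claim from the reported horizontal speed change alone.
v_initial_kmh, v_final_kmh = 280.0, 1800.0
dt_s = 1.0                                  # the maneuver is said to take about 1 second
delta_v_ms = (v_final_kmh - v_initial_kmh) / 3.6
g = 9.81
print(delta_v_ms / dt_s / g)                # -> ~43 G, in line with the ~40 G cited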

Contrary to normal aeronautical expectations, no sonic boom was heard. This phenomenal game of hide and seek was observed by twenty [ 20 ] Belgian National Policemen and hundreds of other witnesses, who all saw the Triangular vehicle and the F-16 fighters. The chase was repeated twice more during the next hour.

The Belgians have made all information on this event public, unlike our U.S. government, which admits nothing and denies everything to do with UFOs – even when some are ours.

[ photo ( above ) another Triangle craft believed over Turkey en route to Iraq ( click to enlarge ) ]

The original TR-3B photo was taken with a digital camera by a U.S. Air Force Special Operations sergeant aboard a special black-operations C-130 cargo plane flying mission support for the TR-3B.

I’ve seen this picture personally and have interviewed several people who worked on the program. I’m sure of my facts and specifications.

You can see for yourselves, from the Belgian pictures, that the computer-generated software composite rendition – built from the European sightings plus my original schematic taken from interviews – ‘is an accurate rendition’ of the TR-3 [ TIER III ].

From the ‘original digital photograph’ of the TR-3B, a ‘computer graphic enhanced representation’ – made using 3D graphic rendered software – hangs on the wall inside a ‘black vault’ at the AURORA Program Office. I am not at liberty to divulge further details about the digital picture except to say a friend took a great career risk taking it and showing it to me.

We have used these highly accurate computer graphic pictures of the Prototype and Operational models of the TR-3B to get further verification of their accuracy. You will not get a clearer picture of what the Flying Triangles are until one lands in full public view and is captured by CNN or other news media.

Jerald said he would never forget the sight of the alien looking TR-3B based at Papoose Lake, Nevada where the pitch black triangular shaped craft was rarely mentioned – and then only in hushed whispers – at the Groom Lake facility where he worked.

The TR-3B [ TIER III B ] flew over the Groom Lake runway in complete silence and magically stopped above Area S-4, where it hovered silently in the same position for about 10 minutes before gently settling vertically to the tarmac.

At times a corona of silver-blue light glowed around the circumference of the massive TR-3B. The operational model of the TR-3B is 600 feet across.

TR3B ‘prototype’ ( 200-foot ) and TR3B ‘operational’ ( 600-foot ) version crafts are code named ASTRA.

The TR-3B tactical reconnaissance version’s first [ 1st ] ‘operational flight’ was in the early 1990s.

TR-3A Manta [ TIER III A Manta ] is a subsonic reconnaissance vehicle shaped like a bat wing and is in no way related to the TR-3B.

The nomenclature for the TR-3B is unconventional, chosen purposely to confuse those tracking black-budget Programs and Projects; rumors are leaked to be purposefully confusing, as most are in the aerospace industry, where one would think there ‘must be a relationship’ between the TR-3A and the TR-3B, although there is ‘no relationship’.

The TR-3B triangular-shaped nuclear-powered aerospace platform was developed under the Top Secret AURORA Program with SDI ( Strategic Defense Initiative – Star Wars ) and black-budget monies. At least three [ 3 ] TR-3Bs, at a cost of more than $1,000,000,000 ( one billion dollars ) each, were flying by 1994.

AURORA is the most classified aerospace development program in existence – with its TR-3B being the most exotic vehicle created from the AURORA Program – funded and operationally tasked by the National Reconnaissance Office ( NRO ), NSA, and CIA.

The TR-3B flying triangle is ‘not fiction’ and was built with technology available in the mid-1980s; it uses more reverse-engineered alien technology than any vehicle ever before, but ‘not’ “every” UFO spotted is one of theirs.

The TR-3B vehicle’s outer coating is electro-chemically reactive and changes with radar electrical frequency ( EF ) stimulation, causing reflective color-spectrum changes and radar absorption.

The TR-3B is the first [ 1st ] U.S. aerospace vehicle employing quasi-crystals within its polymer skin, used in conjunction with the TR-3B’s electronic countermeasures ( ECM ), which can make the vehicle look like a ‘small aircraft’ or a ‘flying cylinder’ and can also fool radar receivers into falsely detecting either a ‘variety of aircraft’, ‘no aircraft’, or ‘several aircraft placed in various locations’.

Electro-chromatic polymer skins can be uncovered by performing research into stealth-material properties.

A couple in Ohio spotted the Triangle in early 1995; the man first noticed its orange ball of light and then its triangle shape, with three [ 3 ] bright spots at each corner. As it moved slowly southward they were awestruck by its enormous size. “The damn thing is real,” he exclaimed to his wife, “It’s the flying triangle.” The man said it was the size of “two [ 2 ] football fields” – making it 200 yards, or 600 feet, across – the same as the TR-3B operational version craft.

From the collection of pictures, analysis, and further refinement we now have a better schematic layout of the Top Secret USAF Flying Triangle [ TR-3B ] seen by thousands, which the U.S. Department of Defense ( DOD ) and U.S. government claim ‘does not exist’.

A circular, plasma-filled accelerator ring called the Magnetic Field Disrupter surrounds the rotatable crew compartment and is far ahead of any imaginable technology.

Sandia and Livermore U.S. National Laboratories developed the reverse-engineered MFD technology.

The government will go to any lengths to protect this technology. The plasma, mercury based, is pressurized at 250,000 atmospheres at a temperature of 150 kelvin and accelerated to 50,000 rpm to create a super-conductive plasma with the resulting gravity disruption.

The MFD generates a magnetic vortex field, which disrupts or neutralizes the effects of gravity on mass within its proximity by 89 percent. Do not misunderstand. This is ‘not’ anti-gravity. Anti-gravity provides a ‘repulsive force’ that can be ‘used for propulsion’.

The MFD creates a disruption – of the ‘Earth gravitational field’ – upon the mass within the circular accelerator.

The mass of the circular accelerator and all mass within the accelerator, such as the crew capsule, avionics, MFD systems, fuels, crew environmental systems, and the nuclear reactor, are reduced by 89%.

A side note to the Magnetic Field Disruptor development: one source who worked at the GD Convair Division in the mid-1960s described a mercury-based plasma that was cooled to super-conductive temperatures, rotated at 45,000 rpm, and pressurized at thousands of atmospheres. This would be considered state-of-the-art technology even by today’s standards, some 30 years after he worked on this project.

He related that the project achieved its objective. Instruments and test objects within the center of the accelerator showed a 50 percent loss of weight, attributed to a reduction in the gravitational field. He had worked on MFD as far back as 1965 and was told by a senior scientist that the research had been going on for a decade. See: Convair, notes from Gravitics research and Gravity RAND article.

The current MFD in the TR-3B causes the effect of making the vehicle extremely light, and able to outperform and outmaneuver any craft yet constructed, except of course, the UFOs we did not build.

The TR-3B is a high altitude, stealth, reconnaissance platform with an indefinite loiter time. Once you get it up there at speed, it doesn’t take much propulsion to maintain altitude.

At Groom Lake there have been whispered rumors of a new element that acts as a catalyst to the plasma.

Recently NASA and Russia have admitted breakthroughs in technology that would use a plasma shield for exotic aerospace vehicles. If you know anything about the history of classified black programs, you know that by the time NASA starts researching something, it’s either proven or old technology. They are the poor stepchildren when it comes to research and development technology and funding.

With the vehicle mass reduced by 89%, the craft can travel at Mach 9, vertically or horizontally. My sources say the performance is limited only by the stresses that the human pilots can endure. Which is a lot, really, considering that along with the 89% reduction in mass, the G forces are also reduced by 89%.

The crew of the TR-3B should be able to comfortably take up to 40 Gs – the same flight characteristics described in the Belgian sightings and many other sightings. Reduced by 89%, the occupants would feel about 4.2 Gs.
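A minimal sketch of that reduction arithmetic as the claim is stated above (this only restates the claim, not established physics):

# Felt G-load under the claimed 89% reduction: felt = applied x (1 - 0.89).
applied_g = 40.0
reduction = 0.89
print(applied_g * (1.0 - reduction))        # -> 4.4 G, close to the ~4.2 G figure cited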

The TR-3B’s propulsion is provided by 3 multimode thrusters mounted at each bottom corner of the triangular platform. The TR-3 is a sub-Mach 9 vehicle until it reaches altitudes above 120,000 feet – then who knows how fast it can go.

The three [ 3 ] multimode rocket engines mounted under each corner of the craft use hydrogen or methane and oxygen as a propellant. In a liquid oxygen / hydrogen rocket system, 85% of the propellant mass is oxygen.
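The 85% figure can be checked from the oxidizer-to-fuel mixture ratio. A minimal sketch follows; the 6:1 mass mixture ratio is my assumption of a typical liquid oxygen / liquid hydrogen engine value, not a figure given in the text.

# Oxygen share of propellant mass for a LOX / LH2 engine.
mixture_ratio = 6.0                          # assumed kg of oxygen per kg of hydrogen
oxygen_fraction = mixture_ratio / (mixture_ratio + 1.0)
print(oxygen_fraction)                       # -> ~0.857, i.e. roughly the 85% cited above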

The nuclear thermal rocket engine uses a hydrogen propellant, augmented with oxygen for additional thrust.

The reactor heats the liquid hydrogen and injects liquid oxygen in the supersonic nozzle, so that the hydrogen burns concurrently in the liquid oxygen afterburner.

The multimode propulsion system can operate in the atmosphere, with thrust provided by the nuclear reactor; in the upper atmosphere, with hydrogen propulsion; and in orbit, with the combined hydrogen/oxygen propulsion.

What you have to remember is that the 3 multi-mode rocket engines only have to propel 11% of the mass of the Top Secret TR-3B. Rockwell reportedly builds the engines.
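To illustrate what “only 11% of the mass” would mean for hover thrust, here is a hedged sketch; the 100-tonne take-off mass is a hypothetical placeholder of mine, not a figure from my sources.

# Thrust needed to hover if only 11% of the vehicle's mass is felt.
takeoff_mass_kg = 100_000.0                  # hypothetical placeholder mass
effective_fraction = 0.11
g = 9.81
thrust_needed_n = takeoff_mass_kg * effective_fraction * g
print(thrust_needed_n / 3)                   # -> ~36 kN per thruster, under these assumptions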

From the evolution of exotic materials, advanced avionics, and newer propulsion engines the stealth aircraft were born. Leaps in technology have been obtained with reverse engineering of Alien Artifacts as described in the newly released MJ-12 Revised Charter, signed during the Reagan [ Ronald Reagan ] Administration.

According to Jerald’s account, the technology developed at Papoose far exceeded any known within the world scientific community. Jerald was in his late 50s when I first met him in LV. He had actually spoken to scientists who analyzed the Roswell vehicle and technology – technology that we can assuredly assume was developed from reverse engineering of recovered alien artifacts. The control of all Alien Artifacts, i.e. the research, the reverse engineering, and the analysis of the Extraterrestrial Biological Entities ( aka ) EBE, was transferred to the super-secret laboratory called the Defense Advanced Research Center ( DARC ), in Area S-4.

Many sightings of triangular UFOs are not alien vehicles but the top secret TR-3B. The NSA, NRO, CIA, and USAF have been playing a shell game with aircraft nomenclature.

The TR-3 was created, then modified into the TR-3A, the TR-3B, and the Tier 2, 3, and 4 designations, with suffixes like Plus or Minus added on to further obscure the fact that each of these designators is a different aircraft [ some of them Unmanned Aerial Vehicles ( UAV ) ], and not the same aerospace vehicle. [ See, e.g. “ DARPA Photos ” on this website for more on UAV and UCAV with Tier designations provided. ]

A TR-3B is as different from a TR-3A as a banana is from a grape. Some of these vehicles are manned and others are unmanned.

Before Jerald died, we had a long conversation. He was sure he had documentation that would prove the existence of the MJ-12 committee and our using crashed alien vehicles to reverse engineer their technology. I told him that I did not want any classified documents in my possession. I never found out what happened to them.

I also believe the recently deceased Colonel Corso, who disclosed the government’s involvement with alien technology, was an honest and honorable man. I believe he was on the inside of administering alien-artifact protocol for the Army, though he might have embellished the depth of his involvement.

I don’t have time to go into the two [ 2 ] unique MJ-12 documents that I acquired.

The characters in [ my book ] Alien Rapture are fictional, but the facts of the covert government agenda to suppress alien artifacts, reverse engineering of alien [ ExtraTerrestrial ] technology, and the details of Black Programs are absolutely true.

Part of our agreement was that every one of my five close friends would get a chance to look at the manuscript before I sent it to any Literary Agents.

Dale discussed the Alien Rapture manuscript with his father, who worked high up in the NSA for over 20-years. His father asked him how much he trusted me, and Dale told him ‘completely’.

Dale’s father provided him with two MJ-12 documents, and told him to retype them, and send them to me with the understanding that I would not ever reveal the source of them.

The documents that Dale retyped had most of the names and dates blacked out. This is how I received these documents, and I was so naive that I didn’t even know what the histories of the MJ-12 documents were.

From as far back as the Viet Nam conflict [ war ], I knew Dale, and he was as close to his father as a son can get. I do not feel that his father used him for distributing disinformation. Whether his father was duped, I have no idea. In my personal opinion, I believe the MJ-12 committee was real, and still exists in some form.

The TR-3B’s anti-gravity physics can be explained, insofar as the theory of general relativity can be considered an explanation for anti-gravity flight anomalies.

Edgar Fouché describes (above) the TR-3B’s propulsion system as follows: “A circular, plasma filled accelerator ring called the Magnetic Field Disrupter, surrounds the rotatable crew compartment and is far ahead of any imaginable technology…

The plasma, mercury based, is pressurized at 250,000 atmospheres at a temperature of 150 kelvin, and accelerated to 50,000 rpm to create a super-conductive plasma with the resulting gravity disruption.

The MFD generates a magnetic vortex field, which disrupts or neutralizes the effects of gravity on mass within proximity, by 89%…

The current MFD in the TR-3B causes the effect of making the vehicle extremely light, and able to outperform and outmaneuver any craft yet constructed… My sources say the performance is limited only by the stresses that the human pilots can endure. Which is a lot, really, considering that along with the 89% reduction in mass, the G forces are also reduced by 89%.

The crew of the TR-3B should be able to comfortably take up to 40 Gs… Reduced by 89%, the occupants would feel about 4.2 Gs.

The TR-3B’s propulsion is provided by 3 multimode thrusters mounted at each bottom corner of the triangular platform. The TR-3 is claimed by some to be anywhere from a sub-Mach 9 to a Mach 25+ lenticular-shaped vehicle until it reaches altitudes above 120,000 feet. No one really knows how fast it can go.

Many have been skeptical of Mr. Fouché’s claims; however, in an interesting scientific article another claim is made: the charged particles of the plasma don’t just spin uniformly around in a ring, but tend to take up a synchronized, tightly pitched, helical or screw-thread motion as they move around the ring. This can be understood in a general way, where the charged particles moving around the ring act as a current that in turn sets up a magnetic field around the ring.

It is a well-known fact that electrons ( or ions ) tend to move in a helical fashion around magnetic field lines. Although it is a highly complex interaction, it only requires a small leap of faith to believe that the end result of these interactions between the moving charged particles ( current ) and associated magnetic fields results in the helical motion described above. In other words, the charged particles end up moving in very much the same pattern as the current on a wire tightly wound around a toroidal core.
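The helical motion around field lines mentioned above is ordinary gyromotion, with gyroradius r = m·v⊥ / ( q·B ). A minimal sketch for a singly ionized mercury ion follows; the ion speed and magnetic field strength are assumed values for illustration only, since the article gives neither.

# Gyroradius of a singly ionized mercury ion around a magnetic field line.
m_hg_kg = 200.59 * 1.66054e-27               # mercury ion mass (~200.59 u)
q_c = 1.602e-19                              # single elementary charge
v_perp_ms = 1.0e5                            # assumed perpendicular ion speed
b_t = 1.0                                    # assumed magnetic field, tesla
print(m_hg_kg * v_perp_ms / (q_c * b_t))     # -> ~0.21 m gyroradius under these assumptions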

In an article entitled “Guidelines to Antigravity” by Dr. Robert Forward, written in 1962 ( available at: http://www.whidbey.com/forward/pdf/tp007.pdf ), Dr. Forward describes several little-known aspects of Einstein’s general relativity theory indicating how moving matter can create unusual gravitational effects. Figure 5 indicates how the moving-matter pattern describes what is necessary to generate a gravitational dipole, which is exactly the same as the plasma-ring pattern described in the physics article discussed above.

If Fouché’s description is even close to correct, then the TR-3B utilizes this little-known loophole in the General Relativity Theory to create its antigravity effects. Even though the TR-3B can only supposedly cancel 89% of gravity ( and inertia ) today, there is no reason why the technology can’t be improved to exceed 100% and achieve true antigravity capability.

In theory, this same moving-matter pattern could be mechanically reproduced by mounting a number of small gyroscopes all around the larger ring – with their axes on the larger ring – and then spinning both the gyroscopes and the ring at high speeds; however, as Dr. Forward points out, any such mechanical system would probably fly apart before any significant anti-gravity effects could be generated. However, as Dr. Forward states, “By using electromagnetic forces to contain rotating systems, it would be possible for the masses to reach relativistic velocities; thus a comparatively small amount of matter, if dense enough and moving fast enough, could produce usable gravitational effects.”

The requirement for a dense material moving at relativistic speeds would explain the use of Mercury plasma (heavy ions). If the plasma really spins at 50,000 RPM and the mercury ions are also moving in a tight pitched spiral, then the individual ions would be moving probably hundreds, perhaps thousands of times faster than the bulk plasma spin, in order to execute their “screw thread” motions. It is quite conceivable that the ions could be accelerated to relativistic speeds in this manner. I am guessing that you would probably want to strip the free electrons from the plasma, making a positively charged plasma, since the free electrons would tend to counter rotate and reduce the efficiency of the antigravity device.
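The gap between the bulk spin and relativistic speeds is easy to see with a rough sketch; the 20 m ring radius below is an assumption of mine for illustration, since the article gives no radius.

# Bulk rim speed of the plasma ring at 50,000 RPM for an assumed radius.
import math
rpm = 50_000.0
radius_m = 20.0                              # assumed ring radius, illustrative only
omega = rpm / 60.0 * 2.0 * math.pi           # angular speed, rad/s
rim_speed_ms = omega * radius_m
c = 2.998e8
print(rim_speed_ms, rim_speed_ms / c)        # -> ~1.05e5 m/s, ~0.035% of light speed

Under these assumptions the bulk spin alone is nowhere near relativistic, which is exactly why the helical, much faster individual ion motion described above would be required.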

One of Einstein’s postulates of the Theory of General Relativity says that gravitational mass and inertial mass are equivalent. This is consistent with Mr. Fouché’s claim that inertial mass within the plasma ring is also reduced by 89%. This would also explain ‘why the vehicle is triangular’ shaped. Since it still requires conventional thrusters for propulsion, the thrusters would need to be located outside of the “mass reduction zone”, or else the mass of the thrusters’ reaction material would also be reduced, making them terribly inefficient. Since it requires a minimum of three [ 3 ] legs to have a stable stool, it follows that a minimum of three [ 3 ] thrusters is needed for a stable aerospace platform. Three [ 3 ] thrusters – located outside of the plasma ring – plus appropriate structural support would naturally lead to a triangular shape for the vehicle.

Some remain skeptical of the claimed size for the TR-3B, at approximately 500 to 600 feet across. Why would anyone build a tactical reconnaissance vehicle almost two ( 2 ) football fields long? However, the answer to this may also be found in Dr. Forward’s paper. As Dr. Forward puts it, “…even the most optimistic calculations indicate that very large devices will be required to create usable gravitational forces. Antigravity…like all modern sciences will require special projects involving large sums of money, men and energy.” Dr. Forward has also written a number of other articles, at: http://www.whidbey.com/forward/TechPubs.html

The TR-3B was spotted by U.S. military personnel acting as the ‘initial penetration bomb delivery vehicle’ – prior to the follow-up work done by the F-117 stealth fighter and B-2B bomber mop-up crews – and was not what Americans were told by the media were the initial strike penetration vehicles used during the Persian Gulf War air strikes inside Iraqi airspace.

Recent sightings in December of 2000 have the TR-3B flying south from Utah across the mountain regions near the National Monument valley of Taos, New Mexico. For more information and images, see, e.g. “ UFO & ETT Pics ” on this X-CIA Files website.

Aerial vehicles such as this ( below ) appear to have been around for decades.

[ photo ( above ) VRIL VII Manned Combat Aerial Vehicle ( MCAV ) Circa: 1941 ( East Germany ) ]

November 9, 2001

The Old Technology

Reverse-engineered information production – a methodology rarely used, even by some of the most sophisticated intelligence-gathering agencies in the world, because of how time-consuming it is – may be utilized to uncover data previously gathered but somehow overlooked.

The concept is not ‘new’, but it is the most feared form of information assimilation because it tears down the walls of classification and censorship surrounding how things are, or were previously, perceived as fundamental beliefs. Little-known and forgotten practices and methodologies can easily circumvent new technologies by using the “Old Technology”.

An example of “Old Technology” was demonstrated in early-1970s-era Soviet MiG defense aircraft using avionics the West had last seen in 1950s-era television ( TV ) sets, i.e. electron vacuum tubes. During the 1970s, the West – sitting smug with transistorized miniature circuits in its fighter aircraft avionics – laughed at the Soviet use of electron vacuum tube technology in these advanced fighter jets as utterly outdated.

Years later, however, the West discovered that Soviet defense aircraft avionics equipped with electron vacuum tube technology withstood the EMP ( electromagnetic pulse ) radiation produced during above-ground nuclear detonations – radiation that disrupts transistorized microcircuit technology. EMP wreaked havoc on Western ‘new technology’ electronics at great distances ( sometimes hundreds of miles ) away from a ground-zero nuclear blast or the airburst of electronic bombs, or e-bombs.

Old “TV tube” technology was actually “higher technology” than what the West used for military defense fighter aircraft during the 1970s. The West was forced to develop other technology strategies, one of which was actually based on an even older technology, the Faraday Cage, a defense used against e-bombs.

That hurdle, and yet another, were eventually conquered – even better, once again – with an ‘old technology school of thought’. The West found that the age-old adage of ‘fighting fire with fire’ would apply to protecting communication and electronic devices by simply bombarding their “pre-production material structures” with gamma particle radiation. Thus the term “rad-hard”, or “radiation hardened”, was coined.

In order to prevent EMP telecommunication disruption, fiber-optic communication cables using light waves to transmit communications were bombarded ( before installation ) with ‘gamma radiation particles’, providing a gamma-to-gamma resistance, or vaccination, against EMP telecommunication disruption – a “rad-hard technique” incorporated for years within U.S. defense Command, Control, Communication, and Computer Intelligence ( C4I ).

Old Technology – i.e. electron vacuum tubes, solid-state, analog, and arithmetic calculations – is still considered sensitive and continues to be classified by the U.S. government. On the other hand, Extra-Terrestrial Technology ( ETT ) has brought unusual products into the consumer marketplace, which many take for granted: “Smart Structures” such as “Hybrid Composites” and “Smart Materials” such as “Shape Memory Alloys”, the latter of which is now in the public domain and found in new-technology ‘eyewear frames’ that will reshape back to their original form – after being bent out of shape – when water is poured over the frames.

Some believe that “cellphone technology” is a form of ETT. Nevertheless, more ETT products are coming our way; however, what is ‘not’ coming our way is the background information we should be focusing on, and for this reason the following information is revealed.

Research and development on ETT materials was decompartmentalized into un-recognizable facilities after Area 51 began receiving so much publicity. Highly classified material and research projects began being conducted off government installations in universities and private firm research facilities around the world.

Initially, material pieces and sections of covertly held extra-terrestrial spacecraft began undergoing research studies at the Wright-Patterson Air Force Base General Electric Research and Development Division in Ohio, the Nellis Air Force Base test range sites at S-4 and Area 51 near Groom Lake in Nevada, and on a U.S. Army base in Dugway, Utah.

The U.S. Central Intelligence Agency ( C.I.A. ), U.S. National Security Agency ( N.S.A. ), and U.S. National Reconnaissance Office ( N.R.O. ) didn’t feel they had enough real estate at Nellis Air Force Base ( A.F.B. ) to sufficiently “test” alien technology or ETT spacecraft insofar as their reverse-engineering programs went, so they went about an ingenious way of slowly but surely expanding their real estate coverage area, which served to keep sensitive information even further hidden from the prying eyes of the outside World.

Additional funds were also used to bunkerize ETT-developed U.S. defense air, sea, and even the new land-tunneling vehicular programs, where most of the extremely sensitive and larger programs went literally underground – away from the ever-so-watchful eyes of not only our own uncontrollable and subverted satellites but friendly foreign ones, and enemy-based ones as well. With everything in place, the Government could move forward.

– – – –

Area 51 Lawsuits – 1995 thru 2000

Lack of oversight creates opportunities for violations of environmental law to go undetected and unpunished. Some have charged that the Department of Defense, as recently as 1993, used secrecy as a cover for violations of environmental law. Recent lawsuits against the Department of Defense and the Environmental Protection Agency (EPA) allege that:

(1) Illegal open-air burning of toxic wastes took place at a secret Air Force facility near Groom Lake, Nevada; and,

(2) EPA has not exercised its required environmental oversight responsibilities for this facility.

Responding to the second ( 2nd ) of these lawsuits, the EPA reported that in early 1995 it had seven ( 7 ) regulators on staff with Special Access [ access to “Black Programs” ] clearance who inspected the Groom Lake facility regarding “unknown health dangers” suffered by U.S. government contractors who worked at the U.S. Nellis Air Force Base ( AFB ) experimental testing range near Groom Lake, Nevada known as “Area 51″.

The U.S. government would not release information to the victims or their professional advisors (medical and legal) as to the exact nature of “what” those workers had been exposed to, and consequently lawsuits were filed against the U.S. government.

What the public wasn’t aware of was what occurred ‘before’ a U.S. Presidential directive in 2000.

What Happened In 1995:

In a federal 9th Circuit Court of Appeals case, these same injured government workers – from Area 51 – were battling for the release of U.S. government information. In one such instance, the U.S. Air Force base “Nellis AFB Nevada” claimed that one ( 1 ) of its “manuals” – an “unclassified manual” in its entirety – should be considered a “classified” manual by the federal court. [ It is suspected that because the particular “unclassified” manual was simply “found to be at Area 51” it automatically became “classified” by reason of its “location”. See, e.g. Declaration of Sheila E. Widnall ( below ). ] Hence, the 9th Circuit Court of Appeals ruled in favor of the government, prohibiting the government-worker Plaintiffs from pursuing their case further through any other court.

Interesting information on this was revealed by an official of the United States Air Force during the federal hearings. ( See immediately below )

UNCLASSIFIED DECLARATION AND CLAIM OF MILITARY AND STATE SECRETS PRIVILEGE OF:

SHEILA E. WIDNALL, SECRETARY OF THE AIR FORCE

I, SHEILA E. WIDNALL, HEREBY DECLARE THE FOLLOWING TO BE TRUE AND CORRECT:

1. Official Duties: I am the Secretary of the United States Air Force and the head of the Department of the Air Force. In that capacity, I exercise the statutory functions specified in section 8013 of Title 10, U.S. Code. I am responsible for the formulation of Air Force policies and programs that are fully consistent with the national security directives of the President and the Secretary of Defense, including those that protect national security information relating to the defense and foreign relations of the United States. As the Secretary of the Air Force, I exercise authority over the operating location near Groom Lake, Nevada, and the information associated with that operating location. As the head of an agency with control over the information associated with the operating location near Groom Lake, I am the proper person to assert the military and state secrets privilege with regard to that information. Under Executive Order 12356, I exercise original TOP SECRET classification authority, which permits me to determine the proper classification of national security information on behalf of the United States. Executive Order No. 12356, Sec. 1.2, 47 Fed. Reg. 20,105 (1982), reprinted in 50 U.S. Code Section 401 (1991); Presidential Order of May 7, 1982, Officials Designated to Classify National Security Information, 50 U.S. Code Section 401 (1991).

2. Purpose: This Declaration is made for the purpose of advising the court of the national security interests in and the security classification of information that may be relevant to the above captioned lawsuits. The statements made herein are based on (a) my personal consideration of the matter, (b) my personal knowledge; and (c) my evaluation of information made available to me in my official capacity. I have concluded that release of certain information relevant to these lawsuits would necessitate disclosure of properly classified information about the Air Force operating location near Groom Lake, Nevada. I am satisfied that the information described in the classified Declaration is properly classified. I have further determined that the information described in the classified Declaration, if released to the public, could reasonably be expected to cause exceptionally grave damage to the national security. It is not possible to discuss publicly the majority of information at issue without risking the very harm to the national security that protection of the information is intended to prevent.

3. Security Classification: Under Information Security Oversight Office guidance, “[certain information that would otherwise be unclassified may require classification when combined or associated with other unclassified information.” (32 CFR 2001.3(a)) Protection through classification is required if the combination of unclassified items of information provides an added factor that warrants protection of the information taken as a whole. This theory of classification is commonly known as the mosaic or compilation theory. The mosaic theory of classification applies to some of the information associated with the operating location near Groom Lake. Although the operating location near Groom Lake has no official name, it is sometimes referred to by the name or names of programs that have been conducted there. The names of some programs are classified; all program names are classified when they are associated with the specific location or with other classified programs. Consequently, the release of any such names would disclose classified information.

4. National Security Information: As the head of the agency responsible for information regarding the operating location near Groom Lake, I have determined that information that concerns this operating location and that falls into any of the following categories, is validly classified:

a. Program(s) name(s); b. Mission(s); c. Capabilities; d. Military plans, weapons, or operations; e. Intelligence sources and methods; f. Scientific or technological matters; g. Certain physical characteristics; h. Budget, finance, and contracting relationships; i. Personnel matters; and, j. Security sensitive environmental data.

The following are examples of why certain environmental data is sensitive to the national security. Collection of information regarding the air, water, and soil is a classic foreign intelligence practice, because analysis of these samples can result in the identification of military operations and capabilities. The presence of certain chemicals or chemical compounds, either alone or in conjunction with other chemicals and compounds, can reveal military operational capabilities or the nature and scope of classified operations. Similarly, the absence of certain chemicals or chemical compounds can be used to rule out operations and capabilities. Revealing the composition of the chemical waste stream provides the same kind of exploitable information as does publishing a list of the chemicals used and consumed. Analysis of waste material can provide critical information on the makeup as well as the vulnerabilities of the material analyzed. Disclosure of such information increases the risk to the lives of United States personnel and decreases the probability of successful mission accomplishment.

5. Role of State and Federal-Environmental Agencies: Since 1990, appropriately cleared representatives of Nevada’s Department of Conservation and Natural Resources have been authorized access to the operating location near Groom Lake. The state representative’s role is and has been to monitor and enforce compliance with environmental laws and regulations and to advise on remedial efforts, if required. Appropriately cleared officers of the U.S. Environmental Protection Agency were recently granted access to the operating location near Groom Lake for inspection and enforcement of environmental laws. Federal inspectors from the Environmental Protection Agency commenced an inspection pursuant to the Solid Waste Disposal Act, commonly referred to as a “RCRA inspection,” at the operating location near Groom Lake, Nevada on December 6, 1994.

[EDITOR’S NOTE: Groom Lake: On May 19, 1995, the Director of the FFEO and the Deputy Assistant Secretary of the U.S. Air Force signed a memorandum of agreement ensuring that EPA has continued access to the operating location near Groom Lake for administering environmental laws. Moreover, due to national security concerns, the Air Force agreed to provide reasonable logistical assistance to EPA. Finally, EPA agreed that any classified information obtained by EPA would be treated in accordance with applicable laws and executive orders regarding classified materials.]

The Air Force has taken these steps to ensure full compliance with all applicable environmental laws. At the same time that the operating location near Groom Lake is being inspected for environmental compliance, it is essential to the national security that steps also be taken to prevent the disclosure of classified information.

6. Invoking Military and State Secrets Privilege: It is my judgment, after personal consideration of the matter, that the national security information described in this Declaration and in the classified Declaration, concerning activities at the U.S. Air Force operating location near Groom Lake, Nevada, constitutes military and state secrets. As a result, disclosure of this information in documentary or testimonial evidence must be barred in the interests of national security of the United States. Pursuant to the authority vested in me as Secretary of the Air Force, I hereby invoke a formal claim of military and state secrets privilege with respect to the disclosure of the national security information listed in paragraph four of this Declaration and more fully discussed in the classified Declaration, whether through documentary or testimonial evidence.

7. Environmental Compliance: Although I have found it necessary to invoke the military and state secrets privilege, I believe it important to comment on the Air Force’s commitment to full compliance with the environmental laws of the United States. Our goal is to be the best possible environmental steward of the lands comprising the Nellis Range. To meet that goal we are cooperating and will continue to cooperate with both federal and state environmental agencies.

8. Under penalty of perjury, and pursuant to section 1746 of Title 28, U.S. Code, I certify and declare that the foregoing statements are true and correct.

Executed this 21st day of February 1995 at Arlington, Virginia.

Sheila E. Widnall Secretary of the Air Force

What Happened In 2000:

The outcome – five ( 5 ) years later – was that before President Clinton left office, he signed a document sealing the lid on the secret once and for all, which also sealed the U.S. Government workers’ fate. Interests of “national security” were cited.

Below, is an exact copy of the document signed by the U.S. President.

THE WHITE HOUSE Office of the Press Secretary

For Immediate Release February 1, 2000

TO THE CONGRESS OF THE UNITED STATES:

Consistent with section 6001(a) of the Resource Conservation and Recovery Act (RCRA) (the “Act”), as amended, 42 U.S.C. 6961(a), notification is hereby given that on September 20, 1999, I issued Presidential Determination 99-37 (copy enclosed) and thereby exercised the authority to grant certain exemptions under section 6001(a) of the Act.

Presidential Determination 99-37 exempted the United States Air Force’s operating location near Groom Lake, Nevada from any Federal, State, interstate, or local hazardous or solid waste laws that might require the disclosure of classified information concerning that operating location to unauthorized persons.

Information concerning activities at the operating location near Groom Lake has been properly determined to be classified, and its disclosure would be harmful to national security. Continued protection of this information is, therefore, in the paramount interest of the United States.

The determination was not intended to imply that in the absence of a Presidential exemption, RCRA or any other provision of law permits or requires the disclosure of classified information to unauthorized persons. The determination also was not intended to limit the applicability or enforcement of any requirement of law applicable to the Air Force’s operating location near Groom Lake except those provisions, if any, that would require the disclosure of classified information.

WILLIAM J. CLINTON

THE WHITE HOUSE,

January 31, 2000

– –

Submitted for review and commentary by,

Concept Activity Research Vault ( CARV ), Host
E-MAIL: ConceptActivityResearchVault@Gmail.Com
WWW: http://ConceptActivityResearchVault.WordPress.Com