AntiMatter Technology Problems

AntiMatter Technology Problems by Concept Activity Research Vault

May 16, 2011 09:42:4 ( PST ) Updated ( Originally Published: May 10, 2011 )

CALIFORNIA, Los Angeles – May 16, 2011 – The global scientific community is eyeing with suspicion an ‘experimental projects’ organization founded in 1952 as the Conseil Européen pour la Recherche Nucléaire ( CERN ) [ also known as the European Organization for Nuclear Research ], whose Large Hadron Collider ( LHC ) – a high-energy particle collider housed in a huge ring roughly 27 kilometres ( about 17 miles ) around – is conducting some extremely serious experiments involving what scientists and physicists call “CP-violation,” experiments that create a variety of new subatomic particles believed never to have existed anywhere on Earth.

There is quite a bit of controversy concerning something called a “strangelet” ( strangelets ) and other particles created within the CERN experiments. Because of conjectures in scientific theories, some professionals fear a ‘new subatomic particle’ could be created that may upset the balance of Earth as we know it. More frightening still, if something goes out of control it may take anywhere between 1 year and 5 years before anyone notices that a chain reaction has already begun – one some have already identified as a ‘micro black hole’ that could theoretically start consuming Earth from within its own iron core. Although this sounds like ‘science fiction’, the CERN experiments are apparently ‘definitely not’ something to be taken lightly.

This serious and highly controversial subject among scientists and physicists around the world is touched on in this report, along with other related information, including video clips ( below ) for a better understanding of the many aspects not being addressed by mainstream news broadcasts.

CERN went even further, though, expanding beyond its deep underground experiments to conduct related experiments in outer space with what it calls the Alpha Magnetic Spectrometer ( AMS / AMS-02 ), now scheduled for launch aboard the U.S. Space Shuttle Endeavour STS-134 mission set for May 16, 2011. The AMS-02 is to be delivered to the International Space Station ( ISS ), where it will continue CERN-designated experiments.

Interestingly, during July 2010 the Alpha Magnetic Spectrometer ( AMS / AMS-02 ) was ‘not’ launched as the video clip ( above ) depicted. The AMS-02, often equated to the Hubble space telescope, actually holds far more technological advancements from CERN and is designed solely to focus on the subatomic particles surrounding antimatter issues.

U.S. Space Shuttle Endeavour mission STS-134 was scheduled to launch on April 14, 2011 but was delayed until the end of April 2011, and then delayed yet again until May 16, 2011. Why so many delays and reschedulings?

Earth antimatter issues are rarely addressed with the public by the mainstream news media. However, in light of the recent NASA public warning that it expects a ‘significant’ “solar flare” to erupt and come bound for Earth – something “we all need to be concerned about” – the Alpha Magnetic Spectrometer ( AMS ), recently placed aboard the U.S. Space Shuttle Endeavour mission scheduled to deliver it to the International Space Station ( ISS ), is something the public really needs to take a closer look at.

[ PHOTO ( above ): Alpha Magnetic Spectrometer ( AMS / AMS-02 ) in U.S. Space Shuttle Endeavour cargo bay, April 2011 ]

– –

Source: Nature.Com

AntiUniverse Here We Come by Eugenie Samuel Reich

May 4, 2011

A controversial cosmic ray detector destined for the International Space Station will soon get to prove its worth.

The next space-shuttle launch will inaugurate a quest for a realm of the Universe that few believe exists.

Nothing in the laws of physics rules out the possibility that vast regions of the cosmos consist mainly of anti-matter, with anti-galaxies, anti-stars, even anti-planets populated with anti-life.

“If there’s matter, there must be anti-matter. The question is, where’s the Universe made of antimatter?” says Professor Samuel C.C. Ting, a Nobel-prize-winning physicist at the Massachusetts Institute of Technology ( MIT ) in Cambridge, Massachusetts. But most physicists reason that if such antimatter regions existed, we would have seen the light emitted when particles annihilated each other along the boundaries between the antimatter and matter realms. No wonder Professor Ting’s brainchild, a $2,000,000,000 ( billion ) space mission sold ‘partly on the promise of looking for particles emanating from anti-galaxies’, is fraught with controversy.

Professor Ting’s project, however, has other ‘more mainstream scientific goals’, so most of its critics held their tongues last week as the U.S. Space Shuttle Endeavour STS-134 mission – prepared to deliver the Alpha Magnetic Spectrometer ( AMS version known as the AMS-02 ) to the International Space Station ( ISS ) – was delayed ( because of problems ) until later this month ( May 2011 ).

Pushing The Boundaries

Seventeen ( 17 ) years in the making, the Alpha Magnetic Spectrometer ( AMS ) is a product of former NASA administrator Dan Goldin’s quest to find remarkable science projects for the International Space Station ( ISS ) and of Ting’s fascination with anti-matter.

Funded by NASA, the U.S. Department of Energy ( DOE ), plus a sixteen ( 16 ) country consortium of partners, the Alpha Magnetic Spectrometer ( AMS ) has prevailed – despite delays and technical problems – along with the doubts of many high-energy and particle physicists.

“Physics is not about doubt,” says Roberto Battiston, deputy spokesman for the Alpha Magnetic Spectrometer ( AMS ) and physicist at the University of Perugia, Italy. “It is about precision measurement.”

As the Alpha Magnetic Spectrometer ( AMS ) experiment headed to the Space Shuttle Endeavour launch pad, Roberto Battiston and other scientists were keen to emphasize the AMS ‘unprecedented sensitivity’ to the gamut of cosmic rays that rain down on Earth, which should allow the AMS to perform two ( 2 ) things:

1. Measure Cosmic Ray High-Energy Charged ‘Particles’ and their ‘Properties’, sent from the:

– Sun ( Earth’s ); – Supernovae ( distant ); and, – γ-ray bursts.

2. Detect AntiMatter ( errant chunks ), sent from the:

– Universe ( far-away ).

Cosmic rays ( on Earth ) can only be detected indirectly, through the showers of ‘secondary particles’ produced when they slam into molecules of the atmosphere high above the Earth, but the Alpha Magnetic Spectrometer ( AMS ) in space will get an undistorted view.

“We’ll be able to measure ( solar ) Cosmic Ray Flux very precisely,” says collaboration member physicist Fernando Barão of the Laboratory of Instrumentation and Experimental Particle Physics in Lisbon, Portugal. “The best place ( for detecting this ) is to be in ‘space’ because you don’t have Earth’s atmosphere that is going to destroy those cosmic rays.”

No matter what happens with the more speculative search for antimatter, the Alpha Magnetic Spectrometer ( AMS ) should produce a definitive map of the cosmic-ray sky – helping to build a kind of ‘astronomy not dependent on light’.

The Alpha Magnetic Spectrometer ( AMS ) consists of a powerful permanent magnet surrounded by a suite of particle detectors.

Over the 10 years ( or more ) that the Alpha Magnetic Spectrometer ( AMS ) experiment will run, its magnet will bend the paths of cosmic rays by an amount that reveals their energy and charge, and thereby their identity.

Some will be ‘heavy atomic nuclei’, while others ( made from anti-matter ) will reveal themselves by ‘bending in the opposite direction’ from their ‘matter’ counterparts ( see, e.g., cosmic curveballs ).
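The bending described above can be sketched numerically. The following is a minimal illustration, not the actual AMS-02 reconstruction code; the field strength and momentum are assumed values, chosen only to show that the radius of curvature r = p / ( |q| B ) has the same magnitude for a particle and its antiparticle while the bending direction flips sign with the charge.

```python
# Sketch: how a magnetic spectrometer separates matter from antimatter.
# Illustrative values only; the real AMS-02 geometry and field differ.

def gyroradius_m(p_gev_per_c, charge_e, b_tesla):
    """Radius of curvature r = p / (|q| B), with momentum p in GeV/c."""
    p_si = p_gev_per_c * 5.344286e-19   # 1 GeV/c in kg*m/s
    q_si = abs(charge_e) * 1.602177e-19  # charge in coulombs
    return p_si / (q_si * b_tesla)

def bend_sign(charge_e):
    """Opposite charges curve in opposite directions in the same field."""
    return +1 if charge_e > 0 else -1

B = 0.15  # tesla, a rough permanent-magnet field strength (assumption)
for name, q in [("proton", +1), ("antiproton", -1)]:
    r = gyroradius_m(10.0, q, B)  # a 10 GeV/c particle
    side = "one way" if bend_sign(q) > 0 else "the other way"
    print(f"{name}: r = {r:.1f} m, bends {side}")
```

The radii are identical; only the sign of the curvature, read off from the detector hits, distinguishes the antiproton.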

By ‘counting positrons’ ( i.e. antimatter ‘electrons’ ), the Alpha Magnetic Spectrometer ( AMS ) could also ‘chase a tentative signal of dark matter’, the so-far ‘undetected stuff’ thought to account for ‘much of the mass of the Universe’.

In 2009, Russian and Italian researchers – with the Payload for Antimatter Matter Exploration and Light-nuclei Astrophysics ( PAMELA ) onboard a Russian satellite – published evidence of an ‘excess amount of positrons in the space environment surrounding Earth’ ( O. Adriani et al. Nature 458, 607–609; 2009 ). One potential source of this is the ‘annihilation of dark-matter particles’ within the ‘halo enveloping our Galaxy’.

Another speculative quest is to follow up on hints of ‘strange matter’, a ‘hypothetical substance’ thought to be found in ‘some collapsed stars’, containing ‘strange quarks’ in addition to the ‘up quarks’ and ‘down quarks’ found within ordinary nuclei.

NASA Alpha Magnetic Spectrometer ( AMS ) program manager Mark Sistilli says hints of ‘strange matter’ were seen during a 1998 pilot flight of the Alpha Magnetic Spectrometer ( AMS-01 ) aboard the Space Shuttle; however, NASA determined the results ‘too tentative to publish’.

Because the Alpha Magnetic Spectrometer ( AMS-02 ) was given the status of an “exploration mission,” the AMS ‘did not need to follow’ the “peer review” NASA would ‘normally have required’ for a “science mission.”

But Sistilli emphasizes the Alpha Magnetic Spectrometer ( AMS ) earned flying colors from committees convened by the U.S. Department of Energy ( DOE ), which is supplying $50,000,000 ( million ) of the funding.

Now their ( DOE ) confidence will be put to the test.


– –

While to some it may appear that strangelet subatomic antimatter particle research is about advancing our knowledge and unlocking the secrets of life in the Universe, others are still asking NASA what it really knows about ‘why’ an ‘expected significant’ Solar Energetic Particle Event ( SEPE ) is something “we all need to be concerned about” on Earth.

With Solar Energetic Particle Event ( SEPE ) high-energy effects capable of disrupting Earth’s ground-based and space-based electrical components and electricity grid infrastructure systems for up to 10 years, many wonder why billions upon billions of dollars were – and still are – being pumped into the CERN project studying ‘strangelets’. They want to know why we need ‘more immediate information detection capabilities’ for the high-energy solar flare proton and electron ejections coming toward Earth soon, about which NASA and other agencies ‘know far more’ than they are willing to tell the public.

How far beyond private-sector science and technology knowledge have government authorities grown? The United States has already mapped internal plasma flows of the Sun.

How could the U.S. government possibly ‘see inside the Sun’ to know when a coronal mass ejection from a solar flare would occur in the future?

In layman’s terms, for government it was like looking through a clear glass Pyrex bowl set atop a stove burner, watching as the water inside starts to boil, and then predicting – based on the flame heating the water – when bubbles will rise to the surface. That is, once one takes into account a government ‘ground-based’ observatory telescope ( requiring ‘no space-based placement’ ) equipped with a “super lens” used for imaging ( observing ) ‘objects at great distances inside matter’ – a “superlens” that now even ‘defies light-speed’ and ‘matter’. ( Read Below )

– –

[ PHOTO ( above ): Antimatter photon ‘optic’ substrate structure material for ‘subsurface solar imaging’ of plasma flows inside the Sun, enabling the plotting of Coronal Mass Ejections ‘before solar surface eruptions’ ]

Source: U.S. Department of Energy, Lawrence Berkeley National Laboratory, Operated by the University of California

Optical Antimatter Structure Shows The Way For New Super Lens by Aditi Risbud

April 21, 2009

A device, made from alternating layers of ‘air’ and ‘silicon photonic crystal’, behaves like a ‘super lens’ – providing the first experimental demonstration of optical antimatter.

Scientists at Berkeley Lab ( Berkeley, California, USA ) and the Institute for Microelectronics and Microsystems ( CNR ) in Naples, Italy have experimentally demonstrated – for the first time – the ‘concept of optical antimatter’: ‘light traveling through a material without being distorted’.

By engineering a material that focuses light through its own internal structure, a beam of light can enter and exit ( unperturbed ) after traveling through millimeters of material.

For years, optics researchers have struggled to bypass the ‘diffraction limit’, a physical law restricting imaging resolution to about 1/2 the wavelength of light used to make the image.

If a material with a negative index of refraction ( a property describing how light bends as it enters or exits a material ) could be designed, this diffraction hurdle could be lowered.

Such a material could also behave as a superlens, useful for observing objects with ‘details finer than allowed by the diffraction limit’.
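Both ideas can be put into a short numeric sketch: the half-wavelength resolution limit, and Snell’s law with a negative refractive index, where the refracted ray bends to the same side of the surface normal as the incoming ray. This is a toy illustration with assumed values, not a model of the actual Berkeley Lab device.

```python
import math

# Two toy calculations behind the superlens story (illustrative values only):
# (1) the diffraction limit: ~half the illuminating wavelength;
# (2) Snell's law with a negative refractive index, where the refracted ray
#     bends to the SAME side of the normal as the incident ray.

def diffraction_limit_nm(wavelength_nm):
    """Smallest feature conventional optics can resolve, ~ lambda / 2."""
    return wavelength_nm / 2.0

def refraction_angle_deg(theta_in_deg, n1, n2):
    """Snell's law: n1 sin(t1) = n2 sin(t2); returns t2 in degrees."""
    s = n1 * math.sin(math.radians(theta_in_deg)) / n2
    return math.degrees(math.asin(s))

print(diffraction_limit_nm(500.0))            # green light: ~250 nm limit
print(refraction_angle_deg(30.0, 1.0, 1.5))   # ordinary glass: bends toward normal
print(refraction_angle_deg(30.0, 1.0, -1.0))  # negative index: angle flips sign
```

The sign flip in the last line is what lets a flat slab of negative-index material refocus a diverging beam, which is the basis of Pendry’s superlens proposal.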

Despite the intriguing possibilities posed, by a substance with a negative index of refraction, ‘this property is inaccessible through naturally occurring ( positive index ) materials’.

During the mid-1990s, English theoretical physicist Sir John Pendry proposed his clever ‘sleight of light’ using so-called metamaterials – engineered ‘materials’ whose underlying structure ‘can alter overall responses’ to ‘electrical fields’ and ‘magnetic fields’.

Inspired by the Sir John Pendry proposal, scientists have made progress in scaling metamaterials from microwave to infrared wavelengths while illuminating the nuances of light-speed and direction-of-motion in such engineered structures.

“We’ve shown a ‘completely new way to control and manipulate light’, ‘using a silicon photonic crystal’ as a ‘real metamaterial’ – and it works,” said Stefano Cabrini, Facility Director of the Nanofabrication Facility in the Molecular Foundry, a U.S. Department of Energy ( DOE ) User Facility located at Lawrence Berkeley National Laboratory ( LBNL ) providing support to nanoscience researchers around the world.

“Our findings will open-up an easier way to make structures and use them effectively as a ‘super-lens’.”

Through the Molecular Foundry user program, Cabrini and post-doctoral researcher Allan Chang collaborated with Vito Mocella, a theoretical scientist at the Institute for Microelectronics and Microsystems ( CNR ) in Naples, Italy, to fabricate a 2 x 2 millimeter device consisting of alternating layers of air and a silicon-based photonic crystal containing air holes.

Using high precision nanofabrication processes, the team designed the spacing and thicknesses of each layer to behave like the metamaterial Sir John Pendry had envisioned.

This device was then used to focus a beam of near-infrared ( IR ) light, essentially ‘annihilating’ 2 millimeters of ‘space’.

“Now that we have a prototype to demonstrate the concept, our next step will be to find the geometry and material that will work for visible light,” said Cabrini.

Along with possibilities in imaging, the researchers’ findings could also be used to develop hybrid negative-index and positive-index materials, Cabrini added, which may lead to novel ‘devices’ and ‘systems’ unachievable through either material alone.

“Self-collimation of light over millimeter-scale distance in a quasi zero average index metamaterial,” by Vito Mocella, Stefano Cabrini, Allan S.P. Chang, P. Dardano, L. Moretti, I. Rendina, Deirdre Olynick, Bruce Harteneck and Scott Dhuey, appears in Physical Review Letters ( available online ).

Portions of this work were supported by the U.S. Department of Energy ( DOE ) Office of Science, Office of Basic Energy Sciences under Contract No. DE-AC02-05CH11231.

The Molecular Foundry is one ( 1 ) of five ( 5 ) U.S. Department of Energy ( DOE ) Nanoscale Science Research Centers ( NSRC ) that are premier national user facilities for interdisciplinary research at the nanoscale. Together, the U.S. Department of Energy ( DOE ) Nanoscale Science Research Centers ( NSRC ) comprise a suite of complementary facilities providing researchers with state-of-the-art capabilities to fabricate, process, characterize and model nanoscale materials, which constitutes the ‘largest infrastructure investment’ of the National Nanotechnology Initiative ( NNI ).

U.S. Department of Energy ( DOE ) Nanoscale Science Research Centers ( NSRC ) are located at these six ( 6 ) laboratories ( one center is jointly operated by Sandia and Los Alamos ):

– Argonne National Laboratory ( ANL ); – Brookhaven National Laboratory ( BNL ); – Lawrence Berkeley National Laboratory ( LBNL ); – Oak Ridge National Laboratory ( ORNL ); – Sandia National Laboratories ( SNL ); and, – Los Alamos National Laboratory ( LANL ).

For more information about the DOE NSRCs, please visit

Berkeley Lab is a U.S. Department of Energy ( DOE ) National Laboratory located in Berkeley, California conducting ‘unclassified scientific research’ managed by the University of California.


– –

If the public could keep its eyes open for one second, it would see what is coming before it hits as a surprise that only government knows anything about. Governments have spoken mysteriously to citizens for a very long time, but perhaps a fact ‘known today’ may eventually come as no surprise to the many who would otherwise have been kept in the dark while only a few know far more about what awaits the masses.

Perhaps people may begin asking more questions of their country’s agencies spending so much money so quickly for some apparently ‘mysterious emergency purpose’ – and if not for some ‘mysterious emergency purpose’, why is so much money being spent on science and space projects while the global public is told about serious government budget cutbacks causing so many people to suffer? If there is no ‘emergency’, then people should know why they are suffering financially more – just for the sake of ‘growing science experiment budgets’. It might be a good idea for everyone to keep their eyes open a little more often, and trained on something more than light-hearted mainstream media news comedy broadcasts.

If people get serious about ‘what they know’ as told on television news broadcasts, imagine how much more serious they will become when they learn about what they ‘were not told’.

Think about it. How fast is technology growing? Just beginning to grasp something ‘new’? Now think about something even newer than the Large Hadron Collider ( LHC ) at CERN: the Relativistic Heavy Ion Collider ( RHIC ), which added its Solenoidal Tracker At RHIC ( STAR ), claiming to ‘reverse time’. Ultra super computers reconstruct the sub-atomic particle interactions producing the particles emerging from each collision, so that STAR is believed able to ‘run time backward’ – a process equated to examining final products coming out of a factory when scientists and physicists have no idea ‘what kinds of machines produced the products’. Basically, they are producing items so fast they do not know how the items were formed, much less what their capabilities are. The fact is, ‘they could easily produce a monster’ and ‘not know what it is until after they are eaten by it’. Scary, really – like kids being given matches to play with.

They are being educated beyond their own intelligence, so much so that scientists and physicists ‘cannot even grasp what it is they’re looking at’ – much less know what they are trying to manipulate to ‘see what it does next’ – and nevertheless they are conducting experiments like children playing with dynamite.

Think this is science fiction? Think they are mad scientists at play? Check the research reference links ( below ). Think antimatter technology has advanced a lot since you began reading this report? Calculate ‘more’, because the public does not even know half of it.

CERN has been operating these experiments since 2002, and the “SuperLens” was worked on ‘before’ 2002, making ‘both’ now about 10 years old.

Want newer ‘news’?

Superlenses – created from perovskite oxides – are simpler and easier to fabricate than ‘metamaterials’.

Superlenses are ideal for capturing light travelling in the mid-infrared ( IR ) spectrum range, opening the way for even newer, highly sensitive imaging devices; this superlensing effect can also be selectively turned ‘on’ and ‘off’, opening yet another technology of ‘highly dense data-storage writing’ for ‘far more advanced computers’.

Plasmonic whispering gallery microcavities, consisting of a silica interior coated with a thin layer of silver, ‘improve quality by better than an order of magnitude’ over current plasmonic microcavities, and pave the way for ‘plasmonic nanolasers’.

Expand your knowledge: begin researching the six ( 6 ) reference links ( below ) so that the next time you watch the ‘news’ you will begin to realize just how much you are ‘not being told’ about what is ‘actually far more important’ – far more than you are used to imagining.


Submitted for review and commentary by,


Concept Activity Research Vault E-MAIL: ConceptActivityResearchVault@Gmail.Com WWW: http://ConceptActivityResearchVault.WordPress.Com



2 responses to “AntiMatter Technology Problems”

  1. Gravitational Lensing and more information ( below ):


    Source: National Center for Computational Sciences ( NCCS ) – Oakridge, Tennessee, USA

    In The Spotlight

    Invisible Means of Support
    by, Leo Williams

    August 7, 2008

    Astrophysicists simulate the dark matter that cradles a galaxy

    Sometime in the early 1930s, the eminent Swiss astronomer Fritz Zwicky noticed something very odd as he looked to the skies: galaxies move around each other too fast.

    Zwicky was scrutinizing a group of eight ( 8 ) galaxies – orbiting one another – more than 350,000,000 ( million ) light-years away in the Coma Galaxy Cluster.

    Principles from Sir Isaac Newton and Albert Einstein allowed the Swiss astronomer Zwicky to understand ‘the balance of forces’ necessary to maintain the delicate balancing act that lets galaxies exist, where every object ( with ‘mass’ ) exerts a ‘gravitational pull’ ( on other objects ); from a yo-yo ( dancing on a string ) to a galaxy, each requires:

    1. Centrifugal Force – ‘pushing outward’ on galactic mass ( e.g. ‘yo-yo’ – of itself ), but with ‘too much outward force’, a galactic system ‘flies outward and apart’; and,

    2. Gravity – ‘pulling inward’ ( e.g. ‘string’ – of yo-yo ) on galactic mass, but with ‘too much inward force’, a galactic system ‘collapses in on itself’.

    Zwicky’s problem was that the Coma Galaxy Cluster he was observing ‘appeared to have too little mass’, and therefore ‘too little gravity’, to hold together. Hence, these galaxy systems should have flown off into space.
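Zwicky’s argument can be reproduced as a back-of-the-envelope calculation: the mass needed to gravitationally bind galaxies moving at speed v inside radius R is roughly M ≈ v²R / G. The speeds, radius, and luminous mass below are illustrative modern-style values, not Zwicky’s published figures.

```python
# Back-of-the-envelope Zwicky: the dynamical mass M ~ v^2 R / G required to
# bind a cluster, versus the mass visible as starlight. Illustrative values.

G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30         # solar mass, kg
MLY = 9.461e21           # metres per million light-years

v = 1.0e6                # m/s: ~1,000 km/s galaxy speeds in the Coma cluster
R = 10 * MLY             # rough cluster radius (assumed)

m_dyn = v**2 * R / G     # mass required so gravity can hold the cluster together
m_lum = 3e13 * M_SUN     # illustrative luminous (visible) mass

print(f"dynamical mass ~ {m_dyn / M_SUN:.1e} solar masses")
print(f"dark-to-visible ratio ~ {m_dyn / m_lum:.0f}x")
```

With these numbers the binding mass exceeds the visible mass by a factor of tens, which is the order of discrepancy that led Zwicky to invoke unseen matter.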

    To explain this, Zwicky ( and his colleagues ) concluded that these galaxies ( and all other galaxies ) are dominated ( controlled ) by ‘matter’ that is ‘invisible’ – matter that would come to be identified as ‘dark matter’.

    Armed with a knowledge of how gravity works plus extensive observations of planets, stars and galaxies, scientists have in-fact concluded, that in the Universe:

    a. Less than one-fifth ( 1/5 th ) of all ‘matter’ is ‘visible’; and,

    b. More than four-fifths ( 4/5 th ) of all ‘matter’ is ‘invisible’ ( dark matter ); and,

    c. Dark matter ( invisible matter ) has ‘no interaction with regular matter’ except via ‘gravitational force’.

    There is ‘so much dark matter’ ( ‘invisible’ matter ) in the Universe that ‘dark matter’ gravitational forces dominate ( control ) ‘star lifespan’ and ‘galaxy lifespan’.

    What would ‘dark matter look like’ if we could see it?

    Astrophysicist Piero Madau of the University of California – Santa Cruz ( UCSC ) led a team, that included:

    Juerg Diemand ( of UCSC );
    Marcel Zemp ( of UCSC ); and,
    Michael Kuhlen of the Institute for Advanced Study ( Princeton, New Jersey, USA ).

    They ran a simulation that took a substantial step toward answering the question of what dark matter looks like.

    The Jaguar ( CRAY XT5 ) supercomputer at Oak Ridge National Laboratory ( ORNL ) was used by Piero Madau’s team to run ( with PKDGRAV2 ) the ‘largest simulation ever’ of ‘dark matter’ as it would evolve over billions of years around a simulated galaxy ( like the Milky Way ).

    The envelope of dark matter surrounding such a galaxy became known as the ‘Dark Matter Halo’ ( DMH ).

    Findings were published in their paper entitled, “Clumps and Streams in the Local Dark Matter Distribution” seen in the journal “Nature” ( August 7, 2008 issue ).

    The simulation followed a galaxy’s worth of dark matter through almost the entire history of the Universe, dividing that dark matter into more than 1,000,000,000 ( billion ) separate parcels. Each parcel of ‘dark matter’ was 4,000 times as massive as our Sun. It was a staggering job: tracking 9,000 trillion trillion trillion tons of ‘invisible’ stuff ( dark matter ) – spread across 176 trillion trillion trillion square miles – evolving over 13,000,000,000 ( billion ) years.
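Those quoted figures are internally consistent, as a few lines of arithmetic confirm: a billion parcels of 4,000 solar masses each comes to roughly the ‘9,000 trillion trillion trillion tons’ ( about 8 × 10³⁹ tonnes ) stated above.

```python
# Checking the simulation's bookkeeping: 1 billion parcels x 4,000 solar
# masses each, expressed in tonnes for comparison with the figure quoted above.

M_SUN_TONNES = 1.989e27            # one solar mass in metric tonnes
parcels = 1_000_000_000            # "more than a billion separate parcels"
parcel_mass = 4_000 * M_SUN_TONNES

total_tonnes = parcels * parcel_mass
print(f"total ~ {total_tonnes:.2e} tonnes")  # ~8e39, matching the quoted order
```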

    Hypothetical Particles With Real Gravity

    Scientists do not know exactly what dark matter is, but hypothetical particle candidates include the:

    Sterile Neutrino;
    Axion; or,
    A Weakly Interacting Massive Particle ( WIMP ).

    Researchers ‘believe they do not need to know what dark matter is’ – in order to simulate it.

    Researchers believe they only need to know that ‘dark matter interacts with other matter’ ( only through gravity ) and that it is cold – meaning ‘dark matter is made up of particles that were moving slowly when galaxies and clusters began forming’.

    Using initial conditions provided by observations of the cosmic microwave background, Piero Madau and his team were able to simulate dark matter through a computer application called PKDGRAV2 ( developed by a group of numerical astrophysicists at the University of Zurich, Switzerland ) that ignored visible matter and focused entirely on the gravitational interaction between a billion dark matter particles.

    The project had a major allocation of supercomputer time through the U.S. Department of Energy Innovative and Novel Computational Impact on Theory and Experiment ( INCITE ) program using about 1,000,000 ( million ) processor hours on the Jaguar system located at National Center for Computational Sciences ( NCCS ) of the Oak Ridge National Laboratory.

    “The computer was basically just computing gravity,” Piero Madau explained. “You have to compute the gravitational force between 1,000,000,000 ( billion ) particles, and to do that is very tricky. You’re following the orbits of these particles in a gravitational potential that is varying all the time. The code, allows you to compute with very high precision the gravitational force due to the particles that are next to you and with less and less precision the gravitational force due to the particles that are very far away, because the gravity becomes weaker and weaker with distance.”
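The brute-force version of what Madau describes is the direct pairwise sum below; production codes such as PKDGRAV2 replace the distant-particle terms with tree approximations precisely because this exact O(N²) sum is intractable for a billion particles. This is an illustrative sketch ( arbitrary units, with a standard Plummer softening term added to avoid divergences at small separations ), not the PKDGRAV2 algorithm itself.

```python
# A minimal direct-sum version of the gravity computation Madau describes:
# the exact O(N^2) pairwise calculation that tree codes approximate by
# treating distant particles with progressively lower precision.

def accelerations(positions, masses, G=1.0, eps=1e-3):
    """Pairwise gravitational acceleration with Plummer softening eps."""
    n = len(positions)
    acc = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        xi, yi, zi = positions[i]
        for j in range(n):
            if i == j:
                continue
            dx = positions[j][0] - xi
            dy = positions[j][1] - yi
            dz = positions[j][2] - zi
            r2 = dx * dx + dy * dy + dz * dz + eps * eps
            inv_r3 = r2 ** -1.5
            acc[i][0] += G * masses[j] * dx * inv_r3
            acc[i][1] += G * masses[j] * dy * inv_r3
            acc[i][2] += G * masses[j] * dz * inv_r3
    return acc

# Two equal masses attract each other symmetrically:
a = accelerations([(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)], [1.0, 1.0])
print(a[0][0], a[1][0])  # equal magnitude, opposite sign
```

Each timestep of the real simulation amounts to evaluating something like this for a billion particles, which is why the distance-dependent precision Madau mentions is essential.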

    Dark matter is not evenly spread out, although researchers believe it was nearly homogeneously distributed right after the Big Bang.

    Over time, however, dark matter became bunched and clumped as gravity pulled it together, first into tiny clumps ( more or less ) like the mass of Earth.

    These clumps were pulled together into larger clumps, which were pulled together into still larger clumps, and so on until they combined to form ‘halos’ of dark matter massive enough to host galaxies. Hence, this enveloping was named ‘Dark Matter Halo’ ( DMH ).

    One ( 1 ) key question, then, is whether the smaller clumps would remain identifiable or would smooth-out within the larger galactic halos.

    The answer required a state-of-the-art supercomputer such as Jaguar, which at the time of the simulations in November 2007 was capable of nearly 120,000,000,000,000 ( trillion ) calculations a second.

    Because they did not have the resolution to resolve any unevenness, smaller simulations showed dark matter smoothing-out, especially within the dense inner reaches of the galaxy.

    Piero Madau’s billion-cell simulation, however, provided enough resolution to verify that subclumps and sub-subclumps do indeed survive, even in very inner regions – where our solar system is located.

    “You expect a hierarchy of structure in cold dark matter,” Madau explained. “What you don’t know is what sort of structure will survive the assembly because as these subclumps come together, they are subject to tidal forces, and they can be stripped and destroyed. So their existence in the field had been predicted. The issue was whether they would survive as assembled together to bigger and bigger structures.”

    “What we find,” he continued, “is the survival fraction is quite high.”

    Madau’s team will be able to verify its simulation results using the National Aeronautics and Space Administration ( NASA ) Gamma-Ray Large Area Space Telescope ( GLAST ) – launched on June 11, 2008 – which will scan the heavens to study some of the most extreme and puzzling phenomena in the universe: gamma-ray bursts, neutron stars, supernovas, and dark matter, just to name a few.

    Dark matter direct detection is being pursued by large underground detectors. While dark matter particles cannot be detected directly by NASA’s GLAST ( Gamma-Ray Large Area Space Telescope ), researchers believe dark-matter particles and antiparticles may annihilate when they bump into each other – producing gamma-rays that can be observed from space.

    The ‘clumps’ of dark matter, predicted by Madau’s team, should bring more particles together and thereby produce an increased level of gamma-rays.

    Piero Madau and his colleagues produced a second ( 2nd ) paper entitled, “The Dark Matter Annihilation Signal from Galactic Substructure: Predictions for GLAST” – scheduled for publication in the Astrophysical Journal ( September 10, 2008 ).

    Unbending Light Gravity – Gravitational Lensing

    A second ( 2nd ) verification comes from an effect known as ‘gravitational lensing’, where gravity ( exerted by a galaxy along the line-of-sight ) ‘bends the light traveling from faraway quasars in the background’.

    If dark matter halos ( DMH ) of galaxies are as clumpy as this simulation suggests, light from a distant quasar should be broken up – like a light shining through frosted glass.
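The characteristic angular scale of such lensing is the Einstein radius, θ_E = sqrt( 4GM/c² · D_ls / ( D_l D_s ) ). The sketch below uses illustrative values and a simple flat-space distance approximation ( real calculations use cosmological angular-diameter distances ), just to show that a galaxy-scale lens splits images by roughly an arcsecond – the scale at which halo substructure perturbs quasar flux ratios.

```python
import math

# Einstein radius of a gravitational lens:
#   theta_E = sqrt(4 G M / c^2 * D_ls / (D_l * D_s))
# Illustrative galaxy-scale numbers; D_ls = D_s - D_l is a flat-space
# simplification (cosmology uses angular-diameter distances).

G = 6.674e-11        # m^3 kg^-1 s^-2
C = 2.998e8          # m/s
M_SUN = 1.989e30     # kg
MPC = 3.086e22       # metres per megaparsec

def einstein_radius_arcsec(mass_kg, d_lens_m, d_src_m):
    d_ls = d_src_m - d_lens_m
    theta = math.sqrt(4 * G * mass_kg / C**2 * d_ls / (d_lens_m * d_src_m))
    return math.degrees(theta) * 3600.0

theta = einstein_radius_arcsec(1e12 * M_SUN, 1000 * MPC, 2000 * MPC)
print(f"theta_E ~ {theta:.2f} arcsec")
```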

    “We already have some data there,” Piero Madau noted, “which seems to imply that the inner regions of galaxies are rather clumpy. The flux ratios of multiple imaged quasars are not as you would predict with a smooth intervening lens potential. Instead of a smooth lens, there is substructure that appears to be affecting the lensing process. Our simulation seems to produce the right amount of lumpiness.”

    While this simulation may help explain why the universe is as it is, it leaves other questions unanswered. For example, Madau noted that while dark matter in the galaxy is very clumpy, that same feature is not found in visible matter. In fact, the Milky Way is orbited by far fewer dwarf galaxies than might be predicted by the simulation.

    “We see as many as twenty ( 20 ) dwarf galaxy satellites of the Milky Way that are luminous,” he explained, “and we estimate that there may be a hundred or so. But we predict thousands of relatively massive clumps. So one issue is, why are most of those clumps dark? It seems that we’re not very efficient at forming stars, and so there must be some sort of threshold where stars can only form in more massive potential.”



    – CARV

  2. Source: MSNBC.Com

    TV Video Clip

    Cosmonauts Install Earthquake Predictor On ISS
    by Al Stirrett ( NBC News reporter )

    February 24, 2011

    On February 16, 2011 two ( 2 ) Russian cosmonauts – aboard the International Space Station ( ISS ) – began installing the following outside the ISS:

    – ‘Earthquake prediction and seismic forecast’ monitoring via an ‘experimental optic video camera sensor’ ( the MSNBC.Com TV broadcast pronounced the monitoring program’s Russian name phonetically as “Radio Metria,” which in English translates to “radiometry” ); and,

    – ‘Terrestrial lightning storms’ monitoring via ‘experimental optical radiation sensors’ ( multiple ).

    On February 23, 2011 both installations were completed ‘outside the ISS’, which began providing both ‘long-range optical image data streams’ and ‘long-range scientific sensing measurements’.

    These data monitoring feeds are being received in Moscow, Russia – according to the MSNBC.Com TV video report ( 24FEB11 ).



    – –

    – CARV
