Dr. Marc Rayman's Mission Log
 



  February 4, 1999

Mission Update:


Thank you for visiting the Deep Space 1 mission status information site, for over 100 days the most popular source on any habitable planet in or near the plane of the Milky Way galaxy for information on this technology validation mission. This message was logged in at 9:30 pm Pacific Time on Thursday, February 4, 1999.

Deep Space 1 continues to make valuable progress in testing its advanced technologies.

The mission has been collaborating with the Deep Space Network in a number of telecommunications experiments. NASA has antennas near Madrid, Spain; near Canberra, Australia; and near Goldstone, California, for communicating with probes in deep space, but only certain antennas at Goldstone are capable of receiving the special signals that DS1 can send, so the tests occur when the spacecraft is within view of that location on Earth. In December, DS1 validated a very small, lightweight amplifier, built by Lockheed Martin, for radio signals at a frequency about 4 times higher than the current standard frequency used for deep-space missions. This frequency band, meaninglessly called Ka-band, is like another channel in the radio spectrum and offers the possibility of sending more information with less power, important for future small but capable spacecraft. Current tests are helping the Deep Space Network develop the capability to receive Ka-band routinely for future spacecraft, and they are continuing to quantify the difference in performance between Ka-band and the radio frequency more commonly used for deep-space telecommunications.
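
To give a rough sense of why a higher frequency can mean more data from the same small antenna, here is a back-of-the-envelope sketch in Python. It is only an idealized illustration: the dish size and efficiency are assumptions, and real Ka-band links give back some of the advantage to weather and pointing errors, which is part of what these experiments measure.

    # Back-of-the-envelope sketch: why a higher frequency can carry more data
    # from the same small dish. The 30 cm dish and 55% efficiency are
    # illustrative assumptions; 8.4 GHz and 32 GHz are the usual deep-space
    # X-band and Ka-band frequencies ("about 4 times higher").
    import math

    C = 299_792_458.0  # speed of light, m/s

    def dish_gain(diameter_m, freq_hz, efficiency=0.55):
        # Ideal parabolic-antenna gain (linear, not dB): eff * (pi * D * f / c)^2
        return efficiency * (math.pi * diameter_m * freq_hz / C) ** 2

    x_band = dish_gain(0.3, 8.4e9)   # the same small dish at X-band...
    ka_band = dish_gain(0.3, 32e9)   # ...and at Ka-band

    ratio = ka_band / x_band
    print(f"Ka/X gain ratio: {ratio:.1f}x ({10 * math.log10(ratio):.1f} dB)")
    # ~14.5x (~11.6 dB) in this idealized model; weather and pointing losses
    # at Ka-band claw back some of that, which is just what these tests quantify.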

Tonight a test of one of DS1's autonomy technologies, the beacon monitor experiment, is being conducted. When beacon monitor is used, it will summarize the overall health of the spacecraft. Then it will select one of 4 radio tones to send to Earth to indicate how urgently it needs contact with the large antennas of the Deep Space Network. These tones are easily detected with low-cost receivers and small antennas, so monitoring a spacecraft that uses this technology will free up the precious resources of the Deep Space Network. Each tone is like a single note on a musical instrument. One tone might mean that the spacecraft is fine and does not need contact with human operators. Another might mean that contact is needed sometime within a month, while a third could mean that contact should be established within a week. The last is a virtual red alert, indicating that the spacecraft, and therefore the mission, is in jeopardy. In tonight's test, the spacecraft transmitted 4 different beacon signals to verify predictions of how well experimental instruments could detect them. Although it could send strong signals, the spacecraft is programmed tonight to send weaker ones as a test of whether these beacon signals are indeed as easy to detect as expected.
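
As a rough illustration of the concept, and emphatically not DS1's actual flight software, a tone selector might look something like the little Python sketch below; the thresholds and fault names are invented for the example.

    # Toy sketch of the beacon monitor idea: summarize onboard health and pick
    # one of four tones telling controllers how urgently a Deep Space Network
    # contact is needed. Thresholds and fault names are invented.
    from enum import Enum

    class Tone(Enum):
        NOMINAL = "spacecraft fine, no contact needed"
        INTERESTING = "contact needed sometime within a month"
        IMPORTANT = "contact should be established within a week"
        URGENT = "spacecraft, and therefore mission, in jeopardy"

    def select_tone(faults, consumable_margin):
        # faults: identifiers of any rule violations the onboard monitors flagged
        # consumable_margin: fraction of margin left in the tightest resource (0..1)
        if any(f.startswith("critical") for f in faults) or consumable_margin < 0.05:
            return Tone.URGENT
        if faults or consumable_margin < 0.15:
            return Tone.IMPORTANT
        if consumable_margin < 0.30:
            return Tone.INTERESTING
        return Tone.NOMINAL

    print(select_tone([], consumable_margin=0.50))          # Tone.NOMINAL
    print(select_tone(["critical:battery_temp"], 0.50))     # Tone.URGENT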

The autonomous navigation system, unimaginatively but affectionately known as AutoNav, continues to conduct optical navigation imaging sessions each week. But this week's was different. For the first time, the operations team allowed the session to proceed without monitoring it. In addition, much of the normal, exceedingly careful ground testing that precedes most spacecraft activities was skipped. These steps demonstrate the growing confidence in AutoNav as well as the savings that come from having autonomous systems on board. The activity AutoNav executed consisted of commanding the spacecraft to turn so that its new-technology camera pointed at asteroids and stars, and then taking images of them. The apparent position of an asteroid relative to the much more distant stars will allow AutoNav, later in the mission, to estimate where the spacecraft is in the solar system. This is based on parallax, the same phenomenon you observe if you hold a finger in front of your face and view it through each eye separately: the apparent position of your finger shifts as you switch from one eye to the other.

As an example of how this is applied, suppose that distant trees are visible through a window in your house. If I took a picture from inside your house and showed it to you, you could find exactly where I had been standing when I took the picture by lining up the edge of the window with the distant trees. Similarly, because the autonomous navigation system knows where the asteroids are and where the much more distant stars are, it can determine where it was in the solar system when each picture was taken. The images taken this week are being used by AutoNav's designers to test computer routines for processing the pictures; the new routines will be sent to the spacecraft next week. The successful demonstrations of AutoNav's control over important spacecraft systems are another step in preparing NASA for an exciting future in which many of the responsibilities normally fulfilled by human controllers will be transferred to intelligent spacecraft.
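
For readers who like to see the geometry spelled out, here is a minimal two-dimensional Python sketch of the idea behind the window-and-trees analogy. It is only an illustration, not the AutoNav flight code: the asteroid positions and measured directions are made up, and the real system works in three dimensions with many images taken over time.

    # Each image gives the direction from the spacecraft to an asteroid whose
    # position is known; two such sight lines cross at the spacecraft's location.
    import numpy as np

    def locate(p1, d1, p2, d2):
        # p1, p2: known asteroid positions; d1, d2: unit direction vectors from
        # each asteroid toward the spacecraft, measured against the background
        # stars. Solve p1 + t1*d1 = p2 + t2*d2 for t1, t2 (least squares
        # tolerates a little measurement noise).
        A = np.column_stack([d1, -d2])
        t, *_ = np.linalg.lstsq(A, p2 - p1, rcond=None)
        return p1 + t[0] * d1

    # Hypothetical heliocentric positions in astronomical units:
    asteroid_a = np.array([2.0, 0.5])
    asteroid_b = np.array([1.2, 2.1])
    craft = np.array([1.3, 0.2])                       # the "answer" to recover
    d_a = (craft - asteroid_a) / np.linalg.norm(craft - asteroid_a)
    d_b = (craft - asteroid_b) / np.linalg.norm(craft - asteroid_b)

    print(locate(asteroid_a, d_a, asteroid_b, d_b))    # ~[1.3, 0.2]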

Deep Space 1 carries two advanced technologies that test ways to reduce the size, mass, and cost of future scientific instruments. Those of you who have read these logs conscientiously have already read about the plasma instrument PEPE. The other instrument combines two black-and-white cameras with infrared and ultraviolet spectrometers, all in one small package. The spectrometers break light into its individual colors, much as a prism does. This reveals a great deal about the source of the light, such as the chemical composition of the material that is reflecting it. A traditional spacecraft would carry 3 separate devices to accomplish the same functions as this one, but for NASA to launch smaller, more cost-effective missions, it will be important to integrate these functions into small packages. Thus, DS1 is testing a miniature integrated camera spectrometer, which, following the tradition of innovative naming, is known by its initials as MICAS. MICAS has been used to provide the pictures for AutoNav. Last week, the operations team carefully turned on the sensitive ultraviolet detector in MICAS. This week, the spacecraft was turned to point at several orientations relative to the Sun, and the solar arrays were rotated relative to the Sun, to learn more about the sources of stray light reaching the instrument. The innovative design of MICAS is the result of a collaboration among the United States Geological Survey, SSG, Inc., the University of Arizona, Boston University, the Rockwell Science Center, and JPL.

During the weeks of February 8 and 15, DS1's main computer will be reprogrammed. The new software will give the spacecraft new capabilities for validating more technologies, and it will fix bugs that have shown up during flight. The challenge of loading and activating new software on a spacecraft in flight is significant, and the team has been preparing for this for quite some time. Your faithful correspondent will provide you with a progress report on February 13.

Deep Space 1 is almost 75 times as far away as the moon now. At this distance of over 28 million kilometers, or nearly 18 million miles, it takes radio signals more than 3 minutes to make the round trip.
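
For anyone who enjoys checking the arithmetic, the little Python sketch below works through the numbers. The 28.5 million kilometer figure is simply a value consistent with the statement above, not a precise navigation solution; the lunar distance and speed of light are standard values.

    # Quick arithmetic check of the figures above.
    distance_km = 28.5e6       # "over 28 million kilometers" (assumed round value)
    moon_km = 384_400          # mean Earth-Moon distance, km
    c_km_s = 299_792.458       # speed of light, km/s

    print(f"lunar distances:       {distance_km / moon_km:.0f}")              # ~74
    print(f"millions of miles:     {distance_km * 0.621371 / 1e6:.1f}")       # ~17.7
    print(f"round-trip light time: {2 * distance_km / c_km_s / 60:.1f} min")  # ~3.2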





Thank You For Logging In!