By measuring the time interval between transmission and reception of a signal from one satellite, the electronics in a GPS receiver on Earth can compute its exact distance from that satellite. When that information is combined with the distances from at least three other satellites, the location of the receiver (longitude, latitude, and altitude) is computed electronically via trilateration. Depending on the specifics of the system and the processing of the data, position can be fixed with an accuracy ranging from about 100 meters to less than a centimeter.
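The geometry behind that computation can be sketched in a few lines of code. The following is a minimal illustration, not a real receiver algorithm: it assumes exact, noise-free ranges and ignores the receiver clock bias that an actual GPS solution must also solve for (which is why the fourth satellite is needed in practice). Subtracting one range-sphere equation from the others turns the problem into a small linear system.

```python
import math

def trilaterate(sats, ranges):
    """Recover a receiver position from four satellite positions and
    exact ranges. Subtracting the sphere equation for satellite 0 from
    the others yields three linear equations in (x, y, z):
        2*(s_i - s_0) . x = |s_i|^2 - |s_0|^2 - (r_i^2 - r_0^2)
    solved here with Cramer's rule."""
    (x0, y0, z0), r0 = sats[0], ranges[0]
    A, b = [], []
    for (xi, yi, zi), ri in zip(sats[1:], ranges[1:]):
        A.append([2 * (xi - x0), 2 * (yi - y0), 2 * (zi - z0)])
        b.append((xi**2 + yi**2 + zi**2) - (x0**2 + y0**2 + z0**2)
                 - (ri**2 - r0**2))

    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    d = det3(A)
    coords = []
    for col in range(3):
        m = [row[:] for row in A]
        for row, bi in zip(m, b):
            row[col] = bi  # replace one column with b (Cramer's rule)
        coords.append(det3(m) / d)
    return tuple(coords)
```

Given four satellites at known coordinates and the measured ranges to each, the function returns the unique point consistent with all four spheres.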
The explosive growth of GPS applications can be traced to plummeting costs for the technology and leveraging the concept to provide value beyond simple position-location information. “GPS capability is now being integrated with a broad range of different technologies to create brand-new capabilities,” says Swiek.
For instance, embedded in farm equipment, GPS facilitates site-specific delivery of fertilizer. First, soil samples are taken at known locations in a field to create a nitrogen-content “map.” These soil-condition data are fed into a software program in the spreading equipment, which operates in conjunction with a GPS receiver on the tractor. GPS tells where the tractor is, and the mapping data show the requirements of the soil at any point. As the equipment moves across the field, fertilizer is spread at only the rate required at that location. Savings to the farming industry resulting from less over-fertilization are estimated at millions of dollars.
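In software terms, this kind of site-specific spreading is a position-indexed lookup: the GPS fix selects a cell in the soil map, and that cell's nitrogen need sets the application rate. A toy sketch of the idea, with an invented grid size, field origin, and rates (none of these come from an actual precision-agriculture product):

```python
# Hypothetical soil map: nitrogen need (kg/ha) for each 100 m grid cell,
# keyed by (east_cell, north_cell) from the field origin.
SOIL_MAP = {
    (0, 0): 40.0, (0, 1): 55.0,
    (1, 0): 30.0, (1, 1): 70.0,
}
CELL_SIZE_M = 100.0

def application_rate(easting_m, northing_m):
    """Map a GPS position (meters from the field origin) to the
    fertilizer rate for the grid cell containing that position."""
    cell = (int(easting_m // CELL_SIZE_M), int(northing_m // CELL_SIZE_M))
    return SOIL_MAP.get(cell, 0.0)  # spread nothing outside the mapped field
```

As the tractor crosses a cell boundary, the lookup returns a new rate and the spreader adjusts accordingly.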
Likewise, a GPS receiver in a vehicle that has gone over an embankment could communicate data over a phone link to pinpoint its location. In fact, the application is currently under development at Ashtec Inc., Sunnyvale, Calif. Tracking of fleet vehicles and inventory for just-in-time delivery are related applications that combine onboard GPS receivers and a communications link.
The basics of a GPS receiver are power source, antenna, display panel, and the GPS circuitry to make the necessary computations. As recently as 1990, the GPS circuitry itself cost about $1,500. “Today, we offer semiconductor chips that perform the same function and are now the size of a postage stamp for about $50,” says Roger Stevens, general manager-automotive electronics at Rockwell Automotive, Troy, Mich. “As a result, GPS has gone from being a fascinating, kind of sexy technology in the early ’90s that everyone wanted to use to [one whose] applications are actually going to be taking off.”
While cost reduction should open the floodgates in all uses, the most impact will be felt in consumer applications, estimated by the U.S. GPS Industry Council to be 60% of the market in the year 2000, including such things as automobile navigation, integration of GPS into cellular phones and laptops, and portable units for outdoor recreation.
Automobile navigation alone is expected to grow tenfold from 1995 to 2000, to $3 billion, and will be the largest single application. Drivers plug in a destination and are given turn-by-turn instructions as they drive. They can pop in a CD-ROM holding information on popular restaurants, check out the menus, and make a selection, and the navigation system will direct them to the eatery. Key to such a system is a digital map database of the locale with which the navigation system can coordinate.
“Currently, mapping is available for some 15 major cities and connecting areas in the U.S.,” says Stevens. “The cost of GPS is almost a nonissue anymore, so the major driver will be how soon we can have national mapping data available.”
Hertz, Avis, and National have vehicles equipped with GPS in major markets, and Oldsmobile offers a driver-information system as an option.
THE BIG MARKET FOR automotive navigation right now, however, is in Japan, where 700,000 GPS units were installed in Japanese cars in 1995, compared with 20,000 in European luxury cars and 2,000 in U.S. cars, according to Stevens. Even in major Japanese cities, many streets lack name signs, and addresses are not in numerical order. Consequently, need has driven extensive digital mapping, providing the infrastructure for GPS navigation systems.
Consumer/cellular, the second-largest market category for GPS, is also expected to realize tenfold growth from 1995 to 2000, to greater than $2 billion. Already, savvy hikers use handheld GPS units no bigger than a TV remote control to aid navigation and store exact locations of scenic spots, while fishermen use GPS to return to productive fishing locations. These recreational models have dropped in price from about $3,000 in 1989 to less than $200 for an entry-level model, according to Jim White, spokesman for consumer-GPS specialist Magellan Corp., San Dimas, Calif.
Available at Kmart, Wal-Mart, L.L. Bean, and sporting-goods stores, Magellan units also have found application in tracking tortoises in the Mojave Desert, mapping the ozone layer, tracking oil spills, navigating among oil wells in the jungles of New Guinea, and precisely locating the Titanic.
Skyrocketing growth of GPS in the consumer segment will likewise be propelled by its increasing combination with wireless communications. “GPS is one of the three legs of the stool involved in any mobile computing environment,” says Charlie Trimble, president of GPS equipment manufacturer Trimble Navigation Ltd., Sunnyvale, Calif. “It provides the mobile address you need to go with the communication link and computational capability.”
Wireless networks will use the precision timing capability for video conferencing, notes Scott Pace, GPS policy analyst at the Rand Corp., Washington. “You’ll need precision time information to handle the higher data rates that distributed multimedia requires. So you’ll see laptop computers with GPS cards embedded, not just for location but also for precision time. GPS and embedded time-support information will be a crucial utility.”
AT&T Corp. already uses GPS-originating time signals to synchronize the timing of digital communications across 300,000 kilometers of transmission network. And flow of information packets across the Internet is controlled via GPS-based timing by the Slow Start protocol, where release of packets can be delayed by milliseconds based on network availability to avoid congestion. Time-based applications also include security and authentication of electronic transactions, especially internationally.
In fact, GPS satellites send out two different signals: one highly encrypted and difficult to jam for secure military use (P-code) and one for commercial users (C/A-code), which, like any radio signal, can be picked up freely by anyone. To discourage hostile use, the commercial signal is purposely made less accurate by the U.S. Air Force, which adjusts the perceived accuracy of the time signal sent by the satellites. Accuracy from the C/A-code commercial signal is 80 to 100 meters, while the military P-code signal provides 10-to-15-meter accuracy.
This selective availability (SA) of a more accurate signal to the military is of concern to some commercial developers that equate accuracy with market potential. A marketing study by Booz, Allen & Hamilton shows a 50% increase in potential between 1995 and 2000 if SA were “turned off,” making the commercial signal accurate to 15 to 20 meters.
ON MAR. 29, THE OFFICE OF Science & Technology Policy released its national policy on GPS, which gives continued support via the defense budget for the foreseeable future, continued worldwide availability of free C/A-code signals, “and to replace SA with some other, more effective means to safeguard U.S. forces in the next four to 10 years,” Swiek summarizes.
The SA issue is a bit of a moot point, because the most accurate expression of GPS is not P-code but differential GPS (DGPS), the route many commercial developers are choosing. In DGPS, a ground-based transmitter at a precisely known location broadcasts a reference signal. By comparing the station's known position with the position GPS reports, an error, or bias (say, five meters northwest), is established for that vicinity. Establishing a communications link with the reference station allows constant refinement of the GPS data, resulting in significant increases in accuracy and less drift.
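The correction itself is simple vector arithmetic: because the reference station's true position is surveyed, the difference between its GPS-reported position and its true position is the local bias, which nearby receivers subtract from their own fixes. A minimal sketch, treating positions as illustrative east/north/up offsets in meters:

```python
def dgps_bias(ref_true, ref_gps):
    """Per-axis bias at the reference station: GPS-reported minus true."""
    return tuple(g - t for g, t in zip(ref_gps, ref_true))

def dgps_correct(rover_gps, bias):
    """Apply the reference station's bias to a nearby rover's raw fix."""
    return tuple(r - b for r, b in zip(rover_gps, bias))
```

Because the dominant error sources (satellite clock dither, orbit error, atmospheric delay) are nearly identical for receivers in the same vicinity, subtracting the station's bias removes most of the rover's error as well.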
For instance, as of January, the U.S. Coast Guard has established some 50 DGPS reference transmitters for coastal U.S. and inland waterways, allowing a stated accuracy of less than 10 meters within 150 to 200 miles of the strategically located transmitters. Anyone equipped with a DGPS-beacon receiver added to the standard GPS receiver can pick up the signal.
Likewise, the Federal Aviation Administration is implementing a DGPS network to aid airline navigation in the U.S. and is encouraging other countries to do the same to create an international system of DGPS-based guidance.
The more number crunching that can be accomplished with the combination of GPS and DGPS signals, the greater the accuracy that can be realized. For instance, scientific teams set up DGPS transmitters to allow “monitoring of earth and glacier movements to less than one centimeter with data post-processing,” says Len Kruczynski, director of strategic relationships at Ashtec. Surveyors can process data in real time in moving vehicles to yield 5-to-10-centimeter accuracy.
And for accuracy where it really counts, Proshot Golf, Newport Beach, Calif., offers a Trimble Navigation-based system that shows distance from golf cart to the green and even suggests club selection when set up at courses with appropriate transmitter and communications links.
DGPS correction signals are now available pretty much nationwide on a subscription basis from private companies that leverage FM radio signals and pager networks. “You subscribe to these services like cable TV for about $10 per month, and that gets you down to the one-to-five-meter range,” says Swiek.
The three components of GPS are the satellites that transmit signals, the network of ground stations and the users’ passive receivers. The satellite constellation consists of 24 satellites. The first Block II GPS satellite was launched on February 14, 1989, and the 24th (the last needed to complete the initial operational constellation) was launched on March 9, 1994. Each GPS satellite has an estimated life expectancy of about seven years. Thus, on March 27, 1996, the first replacement launch of a GPS satellite was conducted. Twenty replacement satellites, Block IIR, will be procured from Lockheed Martin for deployment in the late 1990s through 2009.
As the Block IIR satellites are launched and, in turn, reach the end of their lives, they will be succeeded by Block IIF, a series of follow-on sustainment satellites currently in the planning stages. Deliveries of Block IIF birds are planned for 2001 through 2010. Built by Boeing North American, Inc. (formerly Rockwell Space Systems), the satellites will have a life expectancy of 12.7 years.
The satellites are deployed in a 10,900-nmi circular orbit with a 12-hr period. Four satellites are located in each of six planes inclined at 55° to the plane of the earth’s equator. Each satellite continuously broadcasts pseudorandom codes on two frequencies: L1 at 1,575.42 MHz and L2 at 1,227.6 MHz. L1 is modulated with two types of code, the C/A (coarse/acquisition) code and the P (precision) code; L2 carries only the P-code.
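The quoted orbit is self-consistent: Kepler's third law for a circular orbit at 10,900 nmi altitude gives a period of very nearly 12 hours. A quick verification, using standard values for Earth's gravitational parameter and mean radius:

```python
import math

MU_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius, m
NMI_TO_M = 1852.0

def orbital_period_hours(altitude_nmi):
    """Period of a circular orbit at the given altitude,
    via Kepler's third law: T = 2*pi*sqrt(a^3 / mu)."""
    a = EARTH_RADIUS_M + altitude_nmi * NMI_TO_M  # semi-major axis
    return 2 * math.pi * math.sqrt(a**3 / MU_EARTH) / 3600.0
```

Running this for 10,900 nmi yields just under 12 hours, so each satellite passes over the same ground track twice a day.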
The network of ground stations consists of monitors at widely spaced, precisely known locations. Transmissions from the satellites are received, and the data are forwarded to the master station, where they are analyzed and GPS time is compared with universal standard time. The master station prepares signal-coding corrections and change orders for the satellite control facility, which uploads the data to the satellites.
The final components are the users’ GPS receivers, which are the subject of this month’s sampling. A generic GPS receiver might consist of an antenna, one or more receiving channels (with required downconverters), a microprocessor, memory, command and display units and an appropriate power supply. The antenna, usually of omnidirectional hemispheric-coverage design, may consist of a monopole, dipole, quadrifilar helix (volute), spiral helix, microstrip patch or other configuration. System architectures include single-channel sequential, single-channel multiplexed and a single channel per satellite. With the single-channel sequential receiver, one of the required four satellites is tracked continuously for a number of seconds before the next is acquired or reacquired and tracked. Repeated sequencing through the four yields enough information to provide the desired fix.
With single-channel multiplexing, the sequencing rate is so high that the individual bits from the signals of the four satellites are viewed in an almost simultaneous fashion. This approach requires less time to deliver a first fix but suffers signal loss from self-jamming introduced by the multiplexing process. With multiple-channel receivers, at least four channels are required, one dedicated to each satellite. In practice, at least five channels are used, with the fifth used to acquire the next satellite needed for a continuous update.
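The difference between the sequential and multiplexed architectures comes down to the dwell time of a single channel's round-robin schedule. A toy sketch of such a schedule (the dwell times below are illustrative, not taken from any particular receiver):

```python
from itertools import cycle

def single_channel_schedule(sats, dwell_s, num_slots):
    """Round-robin one receiver channel across satellites, returning
    (start_time_s, satellite) tuples for the first num_slots dwells."""
    slots, t = [], 0.0
    for sat in cycle(sats):
        if len(slots) >= num_slots:
            break
        slots.append((t, sat))
        t += dwell_s
    return slots

def time_to_first_fix_s(sats, dwell_s):
    """A first fix needs at least one dwell on every satellite."""
    return len(sats) * dwell_s
```

With a sequential receiver dwelling 5 s per satellite, the channel needs 20 s to visit all four; shrink the dwell to milliseconds and the first fix arrives almost immediately, which is the multiplexed approach, at the cost of the self-jamming loss the article notes.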
MILITARY GPS USAGE
The first single-channel manpack receivers were awkward and heavy, tipping the scales at close to 20 lbs. In March 1993, Rockwell won a contract to deliver nearly 94,000 AN/PSN-11 Precision Lightweight GPS Receiver (PLGR) units to the US Department of Defense (DOD). The PLGR is a five-channel differential GPS receiver that weighs less than 3 lbs, has built-in or remote antenna capability and uses RS-232 and RS-422 data ports. To date, more than 60,000 PLGR units have been delivered to customers worldwide. PLGR, the first military GPS receiver built with “best commercial parts and practices,” has spawned a line of products for surface mobility applications.
The DOD awarded a $2.4 million contract in May 1995 for the introduction of enhanced PLGR features. Enhancements include reduced power consumption, improved operational software and a change in color from tan to green. Form, fit and function are unchanged. The enhanced PLGR will be compatible with the standard PLGR.
Military applications have advanced far beyond the use of GPS to find positional information. The incorporation of GPS and inertial measurement units has converted dumb iron bombs into “smart” bombs (see Zachary Lum’s “New Concepts in Precision Air-to-Ground Targeting,” JED, July 1993, p. 32).
During the incubation of GPS, concern was raised over measures that might be taken to prevent an enemy from using the GPS system against the US. To thwart such usage, the ground-control station can deliberately introduce satellite timing and position errors into satellite transmissions. This strategy, called selective availability (SA), reduces the accuracy of civilian and unauthorized users to 100 m 90% of the time, enough for navigation but inadequate for weapons delivery. A further security measure is the encryption of the P-code, yielding the P(Y) mode. But the effects of SA can be virtually eliminated in a local area through differential GPS (DGPS), a technique requiring two receivers, one of which is at a known location. DGPS is widely used for precise surveying work even when SA is invoked.
On March 28, 1996, a Presidential Decision Directive (PDD) dealing with the future of the GPS system was issued by the White House. While reaffirming that the number-one goal of the system is national security, the PDD acknowledged the growing civil dependence upon the availability of accurate positional information. Therefore, the decision was made to cease the use of SA by the year 2006. Yearly reviews are scheduled to be held after 2000, with the goal of expediting the elimination of SA.
Depending on which company designs the search technology and how the client chooses to program it, a search engine using visual-navigation techniques might display information as 3-D maps, as starbursts of icons, or as folders within folders.
The goal of all this is not just to make pictures but to save time and eliminate frustration in the search process. “A vast amount of information is useless unless you can manage it,” says Karen Lachtanski, marketing communication manager for Inxight Software Inc. in Palo Alto, CA, a software company that is 80 percent owned by Xerox Corp.
Inxight’s Hyperbolic Tree design tool allows designers to organize large Web sites into branching-tree maps of related categories. Instead of drilling through page after page, users see separate icons for the related categories and subcategories. For example, a depiction of the Louvre Museum’s Web site (a demo can be seen on Inxight’s Web page at www.inxight.com) groups painters in one area, with separate clusters for impressionists, cubists, etc. Another area of the map is devoted to sculptors, with links that branch into various periods or styles. Yet another provides branching links to the different books and magazines published by the museum.
Knowledge Organizer, from Verity Software in San Francisco, uses site maps that resemble an overhead view of a golf course to display information. “Knowledge is organized around categories that are determined by the user,” says Laura Ramos, director of product marketing. And along with category titles, Knowledge Organizer uses color-coding. The more relevant the topics in a cluster of links, the brighter the color will be, she says, which makes the search process quicker.
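The color-coding idea maps a cluster's relevance score onto display brightness. One way such a mapping might work is sketched below; the linear scale and grayscale color model are assumptions for illustration, not Verity's actual scheme:

```python
def relevance_to_gray(score, max_score=1.0):
    """Map a relevance score in [0, max_score] to an 8-bit gray level,
    so that more relevant clusters render brighter on screen."""
    score = max(0.0, min(score, max_score))  # clamp out-of-range scores
    return round(255 * score / max_score)
```

Scanning a map colored this way, the eye jumps straight to the brightest clusters, which is exactly the shortcut to relevant material that Ramos describes.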
Why would you want to see all this information on one page? The visual-display layout lets users see associated topics that may be of interest. “Most people have an idea of what they are searching for and look only for that,” says Inxight’s Lachtanski. “Seeing associated data may trigger them to think differently about their topic.”
One obvious use for the visual-navigation approach is knowledge management. Imagine the potential complexity of a Web page that tried to show every employee in a company along with each person’s areas of expertise. But if you organized that information in easy-to-navigate topic clusters, you could greatly speed the process of discovering who knows what about a given topic. A visual depiction of a customer information database, another potential application, might be clustered into states, products purchased or sales reps, according to the needs of the user.
VIT, a Palo Alto, CA, company that develops Web-based applications for tracking supply-chain data, uses Inxight’s Hyperbolic Tree in its See Chain supply-chain management tool. The visual-navigation metaphor lets customers view every step of the supply-chain process, says Subhash Chowdery, VIT’s chief technology officer and founder. Depending on how the user customizes the software, key performance indicators, such as on-time delivery or quotas, might show up in red if the numbers are down, he says. “This way you don’t have to look through 1,000 reports to find the problems.”
The drawback with standard navigation tools, Chowdery says, is that as you choose one path of information, you eliminate others. “The more you drill down, the less you see,” he says. “Visual navigation lets you zoom in on one type of information without giving up the rest. With this metaphor you don’t lose context.”
If embraced, visual navigation could alter the look and feel of the Web. Will the world be ready to learn and design for yet another Web tool once we’ve finally mastered the search software of today? Maybe – provided the new approach does a better job of delivering the information we want.