US20020069220A1 - Remote data access and management system utilizing handwriting input - Google Patents
- Publication number
- US20020069220A1 (application US09/533,564)
- Authority
- US
- United States
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/1454—Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
Definitions
- the present invention relates to a data management system, and more particularly, to a data management system for a mobile computer.
- each cell can store formulas or special instructions specifying calculations to be performed on the numbers stored in the cells. Upon receipt of new data, the formulas are automatically recalculated to support “what if” scenarios.
- Computerized spreadsheets offered many advantages over the old pen-and-paper approach. For one, these programs could handle very large spreadsheets that would be unwieldy to maintain by hand. Further, they supported scenario calculations in which the entered information could be quickly recalculated under different assumptions. Thus, computerized spreadsheets offered dramatic improvements in the ease of creating, editing and applying mathematical models such as financial forecasts. Similarly, databases allowed users to maintain vast quantities of data and to manipulate the information via query commands. Thus, the usefulness of spreadsheets, databases and other business applications made them staple software for data summary, advanced numerical analysis and charting applications.
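The "what if" recalculation described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the `Sheet` class and cell names are hypothetical.

```python
# Minimal "what-if" recalculation sketch: cells hold either values or
# formulas over other cells; formulas recompute whenever they are read,
# so changing an input automatically updates dependent results.
class Sheet:
    def __init__(self):
        self.cells = {}

    def __getitem__(self, ref):
        v = self.cells[ref]
        return v(self) if callable(v) else v   # formulas recompute on read

    def __setitem__(self, ref, value):
        self.cells[ref] = value

sheet = Sheet()
sheet["A1"] = 100                              # units sold
sheet["A2"] = 2.5                              # price per unit
sheet["A3"] = lambda s: s["A1"] * s["A2"]      # revenue formula
base = sheet["A3"]                             # 250.0
sheet["A2"] = 3.0                              # "what if" the price rises?
revised = sheet["A3"]                          # 300.0
```

Real spreadsheet engines instead track dependency graphs and recalculate only affected cells, but the lazy-evaluation trick above captures the user-visible behavior.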
- the portable computing appliance needed to provide information integration advantages, including the ability to capture data from scanners, barcode readers, or the Internet, over the cheaper pen and paper approach to further justify the expense associated with such electronic computer systems.
- information advantages arising from integrating data from a global positioning system (GPS) are needed in the management and control of field personnel to ensure that employees are actually at their expected locations.
- an ability to link information generated at the client's site with follow-up discussions and letters necessary to close the transaction is needed to enhance the efficiency of field personnel.
- the present invention provides a spreadsheet and a database on a portable computer which accepts data from an input recognizer which includes a non-cursive handwriting recognizer or a speech recognizer.
- the portable computer can communicate data directly with another computer or over the Internet using wireless media such as radio and infrared frequencies or over a landline. It is endowed with a plurality of built-in or snap-on expansion accessories to enhance the data capture capability as well as the ease of reading data from the limited screen of the present invention.
- These accessories include a camera, a scanner, a voice recorder or voice capture unit, a global positioning system (GPS) receiver and a remote large-screen television.
- the camera and scanner allow visual data to be captured
- the voice recorder allows the user to make quick verbal annotations into a solid state memory to minimize the main memory requirements
- the voice capture unit allows the voice to be captured into memory for subsequent transmission over the Internet or for voice recognition purposes.
- the spreadsheet or database receives data from the Internet or from the accessories and further can graph or manipulate the data entered into the spreadsheet as necessary.
- the database has a smart search engine interface which performs fuzzy searches so that inexact queries can still result in matches. The smart search engine thus allows users to locate information even when the exact spelling or concept is not known.
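A forgiving lookup in the spirit of this smart search engine can be sketched with string similarity. The patent does not specify an algorithm; the use of `difflib` here, and the sample records, are purely illustrative.

```python
# Fuzzy-search sketch: rank stored records by similarity to the query so
# that a misspelled query ("Jonson") still matches the intended record.
import difflib

records = ["Johnson & Sons Hardware", "Jonsen Plumbing", "Acme Software"]

def fuzzy_search(query, records, cutoff=0.5):
    # get_close_matches returns candidates whose similarity ratio
    # exceeds the cutoff, best match first
    return difflib.get_close_matches(query, records, n=3, cutoff=cutoff)

matches = fuzzy_search("Jonson Hardware", records)
```

A production engine would also index phonetic codes or n-grams so the search scales beyond a linear scan, but the interface idea is the same: inexact input, ranked matches out.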
- the spreadsheet and database can spawn and train an intelligent agent to capture data from a suitable remote source such as the Internet and transmit the data to the spreadsheet or database for further analysis.
- the user can capture data directly by scanning or dictating the information into the spreadsheet or database.
- the geographical information can be generated automatically via the GPS receiver.
- Data from the receiver is communicated via a suitable pager or wireless transceiver back to either a mapping application or other management tools to allow management to monitor the field user's whereabouts.
- a pan and zoom capability gives the user an appropriately scaled view of the data for ease of reading.
- when the portable computer is within range of a larger display device, such as an appropriately equipped television display or a personal computer with a larger display, the present invention's wireless link transmits the video information to that device to allow the user to view the data on the larger display unit.
- the portable computer when the portable computer is within range of a suitably equipped stereo receiver, the portable computer transmits MIDI data streams to the receiver such that the MIDI sound generator can produce high quality sound for multimedia applications running on the portable computer, even though the stereo receiver is not tethered to the portable computer of the present invention.
- FIG. 1 is a block diagram of a portable computer system for providing data management support in accordance with the present invention
- FIG. 1B is a flowchart illustrating a first embodiment of a file system for the computer of FIG. 1 which is IBM-PC compatible;
- FIG. 1C is a flowchart illustrating a second embodiment of the file system for the computer of FIG. 1;
- FIG. 2A is a block diagram showing in more detail a scanner from FIG. 1;
- FIG. 2B is a block diagram showing in more detail another scanner for the computer system of FIG. 1 having a wireless link;
- FIG. 3 is a block diagram showing a remote display unit with a wireless link which is adapted to communicate with the computer system of FIG. 1;
- FIG. 4 shows a block diagram showing in more detail protocol layers linking a network aware application operating on the computer of FIG. 1 and another application over the Internet;
- FIG. 5 is an illustration of a connectivity architecture of the computer system of FIG. 1 and the Internet as well as the data flow among computers connected to the Internet and the computer system of FIG. 1;
- FIG. 6 is a flowchart illustrating the process for handling events in a spreadsheet data management system in the computer system of FIG. 1;
- FIG. 7 is a flowchart illustrating the process for handling system events in FIG. 6;
- FIG. 8 is a flowchart illustrating in more detail the scroll process of FIG. 7;
- FIG. 9 is a flowchart illustrating the process for editing cell contents of FIG. 7;
- FIG. 10 is a flowchart illustrating the process to save a cell in FIG. 9;
- FIG. 11 is a flowchart illustrating the process for evaluating a formula of FIG. 10;
- FIG. 12 is a flowchart illustrating the process for handling menu events of FIG. 6;
- FIG. 13 is a flowchart illustrating the zoom process of FIG. 12;
- FIG. 14 is a flowchart illustrating the process for updating the spreadsheet cells of FIG. 6 using a remote database
- FIG. 15 is a flowchart illustrating the process for identifying rows/columns to update in FIG. 14;
- FIG. 16 is a flowchart illustrating the process for retrieving data over a network such as the Internet using a browser
- FIG. 16A is a flowchart illustrating the process for executing the browser of the present invention.
- FIG. 17 is a flow chart of the process for scanning information using the scanner of FIG. 2A and updating the data management system of FIG. 1;
- FIG. 18 is a flow chart of the process for copying information using the scanner of FIG. 2A and storing or transmitting the data using the computer system of FIG. 1;
- FIG. 19 is a flow chart of the process for linking and transmitting display information from the computer system of FIG. 1 to a larger display device for ease of reading;
- FIG. 19A is a flow chart of the process for teleconferencing with a remote user and for visually sharing an electronic chalkboard
- FIG. 20 is a flowchart illustrating the process for capturing voice annotation using a voice recorder shown in FIG. 1;
- FIG. 21 is a flowchart illustrating the process for capturing and processing voice commands and annotations using the microphone and analog to digital converter of FIG. 1;
- FIG. 22 is a flow chart of the process for operating an intelligent agent in conjunction with the computer system of FIG. 1;
- FIG. 23 is a flowchart illustrating the process for operating a database in accordance with the computer system of FIG. 1;
- FIG. 24 is a flowchart illustrating the process for generating a form for the database of FIG. 23;
- FIG. 25 is a flowchart illustrating the process for searching the database of FIG. 23;
- FIG. 26 is a flowchart illustrating the process for using the GPS receiver of FIG. 1;
- FIG. 27 is a flowchart illustrating an agent on the computer of the present invention which prepares information needed for a meeting
- FIG. 28 is a flowchart illustrating the process for collecting data and interacting with various information networks using the computer of the present invention during the meeting;
- FIG. 29 is a flowchart illustrating the process for following up on outstanding action items using the data management computer of the present invention.
- FIG. 1 illustrates the computer system of the present invention for managing data.
- the computer system is preferably housed in a small, rectangular portable enclosure.
- a processor 20 or central processing unit (CPU) provides the processing capability for the data management system of the present invention.
- the processor 20 can be a reduced instruction set computer (RISC) processor or a complex instruction set computer (CISC) processor.
- the processor 20 is a low-power CPU such as the MC68328V DragonBall device available from Motorola Inc.
- the processor 20 is connected to a read-only-memory (ROM) 21 for receiving executable instructions as well as certain predefined data and variables.
- the processor 20 is also connected to a random access memory (RAM) 22 for storing various run-time variables and data arrays, among others.
- the RAM 22 is sufficient to store user application programs and data. In this instance, the RAM 22 can be provided with a back-up battery to prevent the loss of data even when the computer system is turned off.
- it is generally desirable to have some type of long-term storage, such as a commercially available miniature hard disk drive, or non-volatile memory such as electrically erasable programmable ROM (EEPROM) or flash memory, in addition to the ROM 21 for data back-up purposes.
- the RAM 22 stores a database of the spreadsheet of the present invention, among others.
- the computer system 10 of the present invention has built-in applications stored in the ROM 21 or downloadable to the RAM 22 which include, among others, an appointment book to keep track of meetings and to-do lists, a phone book to store phone numbers and other contact information, a notepad for simple word processing applications, a world time clock which shows time around the world and city locations on a map, a database for storing user specific data, a stopwatch with an alarm clock and a countdown timer, a calculator for basic computations and financial computations, and a spreadsheet for more complex data modeling and analysis.
- add-on applications such as time and expense recording systems taught in U.S. application Ser. No.
- the processor 20 is also connected to an optional digital signal processor (DSP) 23 which is dedicated to handling multimedia streams such as voice and video information.
- DSP 23 is optimized for video compression using JPEG/MPEG standards known to those skilled in the art.
- the DSP 23 is equipped to handle the needs of a voice recognition engine.
- although the DSP 23 is shown as a separate unit from the CPU 20, the present invention contemplates that the DSP 23 can also be integrated with the CPU 20, whereby the CPU 20 can rapidly execute multiply-accumulate (MAC) instructions in either scalar or vector mode.
- the computer system of the present invention receives instructions from the user via one or more switches such as push-button switches in a keypad 24 .
- the processor 20 is also connected to a real-time clock/timer 25 which tracks time.
- the clock/timer 25 can be a dedicated integrated circuit for tracking the real-time clock data, or alternatively, the clock/timer 25 can be a software clock where time is tracked based on the clock signal clocking the processor 20 .
- where the clock/timer 25 is software-based, it is preferred that the software clock/timer be interrupt driven to minimize CPU loading. However, even an interrupt-driven software clock/timer 25 requires certain CPU overhead in tracking time.
- the real-time clock/timer integrated circuit 25 is preferable where high processing performance is needed.
- the timer portion of the clock/timer 25 can measure a duration count spanning one or more start times and completion times, as activated by the user.
- the timer portion has a duration count register which is cleared upon the start of each task to be timed.
- the timer portion of the clock/timer 25 has an active state and a passive (suspended) state. During operation, when the user toggles the timer portion into the active state, the duration count register is incremented. When the user toggles the timer portion into the passive state, the duration count is preserved but not incremented. Finally, when the user completes a particular task, the value of the duration count register is indicated and stored in a database to track time spent on the task.
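The state machine of the timer portion can be modeled as follows. This is an illustrative software model only; ticks stand in for clock interrupts and all names are hypothetical.

```python
# Model of the timer's active/passive states and its duration count
# register: the count is incremented only while active, preserved while
# suspended, and reported then cleared when the task completes.
class TaskTimer:
    def __init__(self):
        self.count = 0          # duration count register, cleared per task
        self.active = False

    def toggle(self):
        self.active = not self.active

    def tick(self):             # one clock interrupt
        if self.active:
            self.count += 1     # incremented only in the active state

    def complete(self):         # task done: report the count and clear it
        total, self.count = self.count, 0
        return total

t = TaskTimer()
t.toggle()                      # activate
for _ in range(5):
    t.tick()
t.toggle()                      # suspend: count preserved, not incremented
for _ in range(3):
    t.tick()
elapsed = t.complete()          # 5 ticks were spent in the active state
```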
- the processor 20 drives a PCMCIA bus 26 which provides a high speed interface or expansion bus.
- PCMCIA represents both the PC Card standard (which specifies both card hardware and system software requirements) and the organization responsible for developing it. Originally, the standard was designed exclusively for memory cards (Release 1.0), known as Type I cards, used in small handheld and laptop systems in lieu of a floppy disk drive. Later releases (Release 2.0 and up) were expanded to include I/O cards, such as modems or network cards. Thicker Type III and Type IV cards were also defined and are often used for hard drives. Each PCMCIA slot or socket is connected to one PCMCIA adapter, which may control one or more slots.
- Socket Services is a memory-resident driver.
- Socket Services talks directly to the PCMCIA adapter hardware, and other programs talk to Socket Services to control a PC Card in one of that adapter's slots.
- the PCMCIA standard also describes a software layer called Card Services, which acts as a librarian of system resources.
- these system resources include I/O ranges, interrupts, and memory ranges.
- the program that does the resource allocation may be a stand-alone program, or it may be built into an enabler
- the resources used may be listed in a file, or might be specified on the command line of the enabler or Card Services.
- the PCMCIA port of the present invention can accept hard disk drives such as ATA compatible hard drives.
- ATA stands for AT Attachment (IBM AT personal computer attachment), and is an interface which is electrically identical to a common hard disk interface.
- ATA mass storage devices, whether mechanical hard disks or solid-state memory cards which appear as disk drives, require another driver to be loaded in the system. ATA drivers must be loaded after Socket Services and Card Services.
- a filing system is provided.
- the disk drive is a solid state disk drive employing flash memory or other non-volatile memory.
- a Flash Filing System (FFS) is provided to handle the peculiarities of flash memory cards, including a limited write-cycle life, often on the order of 10,000 writes or so before wearing out, as well as the delay associated with erasing and rewriting information on these cards.
- the FFS driver performs wear-balancing to avoid wearing out the media prematurely, and works to hide performance delays in writing to the card.
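The wear-balancing idea can be sketched in a few lines. This is a toy model under stated assumptions (a fixed pool of erase blocks, least-worn-first allocation); real flash filing systems additionally remap logical sectors and relocate static data.

```python
# Toy wear-balancing sketch: each write is directed to the least-worn
# erase block, so no single block exhausts its (~10,000) write-cycle
# budget while others sit idle.
class FlashArray:
    def __init__(self, n_blocks):
        self.erase_counts = [0] * n_blocks   # wear per erase block

    def write(self, data):
        # pick the block with the fewest erase cycles so far
        block = self.erase_counts.index(min(self.erase_counts))
        self.erase_counts[block] += 1        # erase-before-write wears it
        return block

flash = FlashArray(4)
for _ in range(8):
    flash.write(b"record")
# wear is spread evenly: every block has been erased the same number of times
spread = max(flash.erase_counts) - min(flash.erase_counts)
```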
- the computer system can also acquire visual information via a charge-coupled device (CCD) or a CIS unit 27.
- the CCD/CIS unit 27 is further connected to a lens assembly 28 for receiving and focusing light beams to the CCD or CIS for digitization.
- the CCD/CIS unit 27 thus can be either a digital camera or a page scanner, as shown in more detail in FIG. 2. Images scanned via the CCD/CIS unit 27 can be compressed and transmitted via a suitable network such as the Internet, via cellular telephone channels or via facsimile to a remote site.
- the CPU 20 and/or the DSP 23 operate to meet the ITU's H.324 standard on multimedia terminals for low-bit-rate visual services over analog telephony.
- the DSP 23 supports H.261 video encoding and decoding at CIF resolution of up to 15 frames per second, H.263 video and G.723 audio, MPEG-1 audio and video playback, MPEG-1 video encoding, JPEG and motion JPEG, advanced motion estimation, input and output video scaling, noise-reduction filters and forward error correction.
- the PCMCIA expansion bus 26 is also adapted to receive a radio tuner or a TV tuner 29 , which is in turn connected to a built-in antenna.
- the radio/TV tuner 29 receives radio and/or TV signals and digitizes the information for suitable processing by the CPU 20 . In this manner, the user of the computer 10 can listen to the radio or watch television while he or she works.
- the PCMCIA bus 26 is also adapted to receive a data storage device, or disk 30 . Additionally, the PCMCIA bus 26 can receive a wireless transceiver 31 , which is connected to an antenna 32 .
- the wireless communication device 31 satisfies the need for access to electronic mail, paging, modem/facsimile, remote access to home computers and the Internet.
- the wireless communication device 31 can be an analog cellular telephone link, where the user simply accesses a cellular channel much as when making a regular voice call.
- Digital wireless networks such as cellular digital packet data (CDPD) can be used.
- CDPD provides data services on a non-interfering basis with existing analog cellular telephone services.
- PCS: Personal Communication Services
- the two-way communication device 31 can also be a two-way pager where the user can receive as well as transmit messages.
- the two-way communication device supports a Telocator Data Protocol by the Personal Communications Association for forwarding binary data to mobile computers. The standard facilitates transmission of images and faxes over paging and narrowband PCS networks.
- the two-way communication device 31 can be substituted with a cellular telephone, whose block diagram and operation are discussed in detail in a co-pending U.S. application Ser. No. 08/461,646, hereby incorporated-by-reference.
- the present invention contemplates that the two-way communication device 31 be compatible with pACT (personal Air Communications Technology), an open standard developed by Cirrus Logic-PCSI and AT&T Wireless Services Inc. (Kirkland, Wash.).
- pACT is a narrowband, 900 MHz-range PCS technology derived from the cellular digital packet data transmission standard.
- pACT is optimized for applications such as acknowledgment paging, mobile e-mail, wireless Internet access, voice paging, personal home security and dispatch services.
- pACT provides full data encryption and authentication, enabling the efficient, full delivery of reliable and secure messages.
- a REFLEX protocol from Motorola Inc. may be used. The REFLEX protocol is commercially supported by SkyNet-Mtel Corporation.
- the two-way communication device 31 has a receiver, a transmitter, and a switch, all controlled by the CPU 20 via the bus of the portable computer system of FIG. 1.
- the switch receives an input from the antenna 32 and appropriately routes the radio signal from the transmitter to the antenna 32 , or alternatively, the radio signal from the antenna 32 to the receiver in the event the processor 20 is expecting a message.
- the processor 20 controls the receiver, the transmitter, and the switch to coordinate the transmission and receipt of data packets.
- the receiver and transmitter are standard two-way paging devices or standard portable cellular communication chips available from Motorola, Inc. in Schaumburg, Ill. or Philips Semiconductors in Sunnyvale, Calif.
- the antenna 32 is preferably a loop antenna using flat-strip conductors such as printed circuit board wiring traces as flat strip conductors have lower skin effect loss in the rectangular conductor than that of antennas with round-wire conductors.
- a plurality of fields are provided, including a header field, a destination address field, a source address field, a date/time stamp field, a cyclic redundancy check field, and a data field.
- the header field functions as a synchronization field.
- the destination address field specifies the unique address for the receiving two-way communication device 31 .
- the source address field in the transmitting direction from the base station to the two-way communication device 31 specifies the base station identification address, which may change to account for base station rerouting in the event that two-way communication device roaming is allowed.
- the source address field contains the two-way communication device 31 address to permit the base station to identify the reply pages transmitted to a previously sent page.
- the date/time stamp field contains data reflecting the second, minute, hour, day, month and year of the transmitted packet.
- the date/time stamp information allows the base station to determine whether to send a time-out message to the individual initiating the page request in the event that the two-way communication device 31 does not timely acknowledge receipt of the page message.
- the cyclic redundancy check (CRC) field allows the two-way communication device 31 to verify the integrity of the messages received and transmitted by the two-way communication device 31 .
- the data status includes coherency information enabling the host computer and the computer of FIG. 1 to synchronize data.
- the data status field carries information such as whether the data has been modified, the staleness of the data, the replacement status of the data, among others.
- the data field is a variable length field which allows variable length messages to be transmitted to and from the two-way communication device 31 .
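The packet layout just described (header/sync field, destination and source addresses, date/time stamp, CRC, variable-length data) can be sketched as follows. The field widths, the sync word value, and the use of CRC-32 are assumptions for illustration; the patent does not fix these details.

```python
# Hedged sketch of the two-way paging packet: a CRC field protects the
# sync word, addresses, timestamp, and variable-length data that follow.
import struct
import zlib

SYNC = 0xAA55                                  # hypothetical sync word

def pack_page(dst, src, timestamp, payload):
    # >HIIq = sync (2B), dest addr (4B), src addr (4B), timestamp (8B)
    body = struct.pack(">HIIq", SYNC, dst, src, timestamp) + payload
    crc = zlib.crc32(body)                     # integrity check over the rest
    return struct.pack(">I", crc) + body

def unpack_page(packet):
    crc, = struct.unpack_from(">I", packet)
    body = packet[4:]
    if zlib.crc32(body) != crc:
        raise ValueError("CRC mismatch: corrupted page")
    sync, dst, src, ts = struct.unpack_from(">HIIq", body)
    return dst, src, ts, body[18:]             # data field is variable length

pkt = pack_page(0x1234, 0x5678, 946684800, b"CALL 555-0100")
fields = unpack_page(pkt)
```

The variable-length data field simply occupies whatever bytes remain after the fixed header, which is why `unpack_page` slices past the 18-byte fixed portion.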
- the user can be paged and can reply as well.
- a page originator wishing to send an alphanumeric page to the user of the computer system of FIG. 1 places a call using a telephone or an originating computer.
- the telephone interface routes the call to the base station.
- the computer at the base station will either digitally or verbally query the page originator to enter an identification number such as a telephone number of the two-way communication device 31 .
- the base station computer further prompts the page originator for the message to be sent as well as the reply identification number, or the call-back number.
- the base computer Upon receipt of the paging information, the base computer transmits a page message to the computer system with the two-way communication device 31 of FIG. 1, which may reply using predetermined text templates, or more flexibly using keyboard, voice, handwriting, sketching or drawing, as discussed in the incorporated by reference U.S. patent application Ser. No. 08/684,842, entitled “GRAPHICAL DATA ENTRY SYSTEM.”
- the present invention is compatible with Always-On-Always-Connected (AOAC) mobile clients connected to the Internet via wireless communications.
- the wireless messaging networks based on GSM SMS, pACT, Reflex, and similar narrowband two-way paging services are significantly different from existing packet networks in that (1) packet sizes are small (typically around 100 bytes); (2) typical latency is much longer (on the order of seconds to minutes); and (3) the connection is much more intermittent.
- Narrowband Sockets (NBS) was created to overcome these limitations by providing fragmentation and reassembly, reliability, and tolerance for intermittency, to complement circuit-switched connections with low-bandwidth connections over wireless messaging networks.
- the NBS enables a new class of mobile usage, AOAC, with applications like automatic (background) forwarding of e-mail, up-to-date news, weather, traffic, and personal messaging. It enables the existing cellular and wireless messaging infrastructure to send arbitrary data, rather than just alphanumeric pages.
- the present invention in conjunction with NBS, allows data to find the user, rather than the user always having to initiate the retrieval for the information.
- the Datagram Protocol for NBS is a general-purpose, unreliable, connectionless datagram service. It is not intended for applications which require 100% reliable delivery, such as file transfers. NBS datagrams are usually formatted to be readable on devices without NBS, which is useful for sending text-based messages to legacy pagers and voice phones. NBS has a core set of required features that must be supported in order to provide consistent functionality to developers. However, each narrowband network has different features and header formats. The NBS stack will use existing transport features to implement these core requirements where possible. The NBS stack queries the hardware interface (NB-SERIAL NDIS miniport) for available network features. If required features such as ports and fragmentation are not supported by the network, then the NBS stack adds these features to the payload of each message.
- Datagram packets are transferred using the message services of an underlying network, where typically the bandwidth is small and communication is wireless.
- the protocol assumes the device addressing (Destination Address and the Originating Address) is handled at a higher level, and only adds those features necessary (generally port level addressing and fragmentation).
- DstPort: the logical port of the destination application (dd); SrcPort: the logical port of the sender application (as in UDP) (oo);
- RefNum: the sequence number of the datagram (kk); MaxNum: the total number of fragments in a single datagram (mm); SeqNum: the sequence number of the fragment within the datagram (nn).
- the datagram protocol headers are part of the data segment of the NBS message fragment.
- the headers can be presented in a few formats, depending on the environment used. In the Windows environment, the NBS stack queries the hardware driver (through NDIS) for the current network and packet format. Messages sent shall primarily use the binary format or the full text-based header, including the concatenation scheme.
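The fragmentation scheme implied by the header fields above (DstPort, SrcPort, RefNum, MaxNum, SeqNum) can be sketched like this. The tuple layout and the 100-byte payload limit are illustrative assumptions, not the NBS specification's wire format.

```python
# NBS-style fragmentation sketch: a message larger than one narrowband
# packet is split into numbered fragments, each carrying port addressing
# (DstPort/SrcPort), a datagram reference (RefNum), the fragment total
# (MaxNum), and its own sequence number (SeqNum) for reassembly.
def fragment(data, ref, dst_port, src_port, max_payload=100):
    chunks = [data[i:i + max_payload] for i in range(0, len(data), max_payload)]
    return [
        (dst_port, src_port, ref, len(chunks), seq, chunk)
        for seq, chunk in enumerate(chunks, start=1)
    ]

def reassemble(fragments):
    # order by SeqNum; a real stack would also match RefNum and detect
    # gaps against MaxNum before delivering the datagram
    ordered = sorted(fragments, key=lambda f: f[4])
    assert len(ordered) == ordered[0][3], "missing fragment"
    return b"".join(f[5] for f in ordered)

msg = b"X" * 250                               # larger than one ~100-byte packet
frags = fragment(msg, ref=7, dst_port=5000, src_port=5001)
restored = reassemble(list(reversed(frags)))   # arrival order may differ
```

Because narrowband networks may reorder or delay fragments by minutes, SeqNum-based reassembly rather than arrival order is what makes the scheme tolerant of intermittent connections.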
- the processor 20 of the preferred embodiment accepts handwriting as an input medium from the user.
- a digitizer 34 , a pen 33 , and a display LCD panel 35 are provided to capture the handwriting.
- the digitizer 34 has a character input region and a numeral input region which are adapted to capture the user's handwritten words and numbers, respectively.
- the LCD panel 35 has a viewing screen exposed along one of the planar sides of the enclosure.
- the assembly combination of the digitizer 34 , the pen 33 and the LCD panel 35 serves as an input/output device. When operating as an output device, the screen 35 displays computer-generated images developed by the CPU 20 .
- the LCD panel 35 also provides visual feedback to the user when one or more application software execute.
- the digitizer 34 When operating as an input device, the digitizer 34 senses the position of the tip of the stylus or pen 33 on the viewing screen 35 and provides this information to the computer's processor 20 .
- the present invention contemplates that display assemblies capable of sensing the pressure of the stylus on the screen can be used to provide further information to the CPU 20 .
- the preferred embodiment accepts pen strokes from the user using the stylus or pen 33 which is positioned over the digitizer 34 .
- the position of the pen 33 is sensed by the digitizer 34 via an electromagnetic field as the user writes information to the data management computer system.
- the digitizer 34 converts the position information to graphic data that are transferred to graphics processing software of the data logger computer system.
- the data entry/display assembly of pen-based computer systems permits the user to operate the data logging computer system as an electronic notepad. For example, graphical images can be input into the pen-based computer by merely moving the stylus over the surface of the screen.
- as the CPU 20 senses the position and movement of the stylus, it generates a corresponding image on the screen to create the illusion that the pen or stylus is drawing the image directly upon the screen.
- the data on the position and movement of the stylus is also provided to a handwriting recognition software, which is stored in the ROM 21 and/or the RAM 22 .
- the handwriting recognizer suitably converts the written instructions from the user into text data suitable for saving time and expense information. The process of converting the pen strokes into equivalent characters and/or drawing vectors using the handwriting recognizer is described below.
- Recognition of the input strokes and recognition of higher level combinations of strokes forming characters and words is performed using recognizers, or recognition domains, each of which performs a particular recognition task.
- a controller is provided for controlling the hypotheses database and for scheduling the recognition tasks in the recognition domains.
- Arbitration resolves conflicts among competing hypotheses associated with each interpretation.
- the recognition domains, or recognizers generate two or more competing interpretations for the same input.
- the recognizers use a data structure called a unit, where a unit is a set of sub-hypotheses together with all their interpretations generated by a single recognizer.
- the handwriting recognizer operates at a first level for identifying one or more groups of related sub-hypotheses using grouping knowledge. For each group, the recognizer generates a unit with no interpretations and stores the unit in the database in what is called a piece-pool memory.
- the Beernink recognizer has a second level of operation where each unit generated in the grouping stage is classified to provide the unit with one or more interpretations. The classified units are stored in a unit pool memory. Two or more interpretations of the input data are combined in a hierarchical structure according to a predetermined scheme in successive steps to form higher level interpretations.
- although the Beernink recognizer is flexible and does not require that the user learn special gestures, its accuracy is not perfect. Because the letters in a cursive-lettered word are connected, the recognizer must guess at how to segment the strokes into individual characters. Since ambiguities exist even in stellar samples of penmanship, cursive handwriting recognizers such as those in Beernink face a challenging task in deciphering handwriting. For example, handwriting recognizers have difficulty determining where a cursive lower-case “n” and “m” begin and end when the two letters, distinguishable from one another only by their number of humps, are strung together in a word.
- the handwriting recognizer tackles such ambiguities by looking in its dictionary to determine, for instance, which words have “m” before “n” and which have “n” before “m.”
- the user can improve accuracy by writing characters further apart than usual, but that is inconvenient and contrary to the way humans write.
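The dictionary-guided disambiguation described above can be illustrated with a small sketch. This is not the patented recognizer: the hump model (two humps for “n”, three for “m”) and the word list are assumptions made for illustration.

```python
# Illustrative sketch: disambiguating a run of connected cursive humps into
# "n" (2 humps) and "m" (3 humps) by keeping only dictionary words.

def segmentations(humps, prefix=""):
    """Enumerate every way to split a hump count into n's and m's."""
    if humps == 0:
        yield prefix
        return
    if humps >= 2:                      # "n" consumes 2 humps
        yield from segmentations(humps - 2, prefix + "n")
    if humps >= 3:                      # "m" consumes 3 humps
        yield from segmentations(humps - 3, prefix + "m")

def disambiguate(before, humps, after, dictionary):
    """Keep only segmentations that form a dictionary word."""
    return [before + seg + after
            for seg in segmentations(humps)
            if (before + seg + after) in dictionary]

words = {"hymn", "autumn"}              # assumed dictionary entries
# Five ambiguous humps after "hy" could be "nm" or "mn"; the dictionary
# lookup keeps only "hymn".
print(disambiguate("hy", 5, "", words))  # ['hymn']
```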
- the handwriting recognizer of the present invention recognizes non-cursive characters. Unlike the Beernink approach to recognizing handwriting in which the user can print or write cursively, the non-cursive handwriting recognizer requires the user to learn to print characters in its fixed style using a basic character set, preferably a 36-character alphanumeric character set. In addition to the basic 26 letters and 10 digits, the non-cursive handwriting recognizer includes multi-step pen strokes that can be used for punctuation, diacritical marks, and capitalization.
- the non-cursive handwriting recognizer is a software module called GRAFFITI, commercially available from U.S. Robotics, Palm Computing Division, located in Los Altos, Calif.
- Each letter in the non-cursive alphabet is a streamlined version of the standard block character—the letter A, for example, looks like a pointy croquet hoop, and the hoop must be started at the dot indicator at the lower right corner—as illustrated and discussed in more detail in the above incorporated-by-reference U.S. patent applications.
- the non-cursive handwriting recognizer achieves more accurate recognition and, as with stenography, supports an alphabet consisting of characters that can be written much more quickly than conventional ones.
- the computer system is also connected to one or more input/output (I/O) ports 42 which allow the CPU 20 to communicate with other computers.
- I/O ports 42 may be a parallel port, a serial port, or alternatively a proprietary port to enable the computer system to dock with the host computer.
- the I/O port 42 is housed in a docking port 84 (FIG. 5)
- the I/O ports 42 and software located on a host computer 82 FIG. 5
- the synchronization software runs in the background mode on the host computer 82 and listens for a synchronization request or command from the computer system 10 of the present invention. Changes made on the computer system and the host computer will be reflected on both systems after synchronization.
- the synchronization software only synchronizes the portions of the files that have been modified to reduce the updating times.
- the I/O port 42 is preferably a high speed serial port such as an RS-232 port, a Universal Serial Bus, or a Fibre Channel for cost reasons, but can also be a parallel port for higher data transfer rate.
- the I/O port 42 has a housing which is adapted to snappably connect to the housing of a Musical Instrument Digital Interface (MIDI) player 37 , a fax modem 40 , a voice recorder 43 , a GPS receiver 46 and a barcode reader 48 .
- MIDI Musical Instrument Digital Interface
- the computer system 10 drives high quality audio speakers 38 and 39 which connect to the MIDI player 37 to support multimedia applications on the computer 10 .
- MIDI protocol is used in generating sound for games and multimedia applications.
- One advantage of MIDI is storage space, as MIDI data files are quite small when compared with sampled audio sounds. The reduction in storage space follows from the fact that MIDI file does not contain sampled audio data, but the instructions needed by a synthesizer to play the sound.
- other advantages of MIDI are the ability to edit the file and to change the speed, pitch or key of the sound.
- the output of the MIDI player 37 is provided to an external multi-timbral MIDI synthesizer which can play many instruments such as piano, bass and drums simultaneously.
- the output of the MIDI player 37 can be connected to the synthesizer by wire or wirelessly such as by the infrared communication. In this manner, the MIDI player 37 generates high quality sound to enhance the user experience.
- a fax-modem 40 is adapted to receive information over a telephone 41 via a plain old telephone system (POTS) landline or over the radio frequencies and allow the user to access information untethered. Further, the modem 40 may serve as part of a wide-area-network to allow the user to access additional information.
- POTS plain old telephone system
- the fax-modem 40 can receive drawings and text annotations from the user and send the information over a transmission medium such as the telephone network or the wireless network to transmit the drawings/text to another modem or facsimile receiver, allowing the user to transmit information to the remote site on demand.
- the fax-modem 40 can be implemented in hardware or in software with a few additional components such as a DAA, as is known in the art.
- the fax-modem 40 is a 56 kbps modem using Lucent Technologies' DSP1643, a member of the Apollo modem chip set.
- the Lucent Technologies' modem chips are designed to accommodate software upgrades for future enhancements to V.flex2 technology from Lucent, so customers' investments will be protected as standards for 56 kbps modems evolve.
- users of the present invention can achieve Internet connections at rates up to 56 kbps when both they and their Internet service providers (ISPs) or online service providers (OSPs) use V.flex2-compatible modems.
- the fax-modem device 40 can be a two-way communication device which can receive text messages and graphics transmitted via radio frequency to the user for on-the-spot receipt of messages.
- the fax-modem 40 can also be a digital simultaneous voice and data modem (DSVD), also available from Lucent Technologies.
- DSVD digital simultaneous voice and data modem
- DSVD modems, as specified in the ITU G.729 and G.729A specifications, appear as a conventional modem with a downline phone, which allows users to place and carry telephone conversations and digital data on a single phone line. These modems multiplex voice and data by capturing voice and digitally compressing it. Next, the compressed voice data and the digital data are multiplexed and transmitted to the remote end which, if compatible, decompresses the voice data and converts it back to sound. Further, the digital data is presented to the remote computer as usual. In this manner, the DSVD modem allows audiographic conferencing systems that rely on modems for data communication. DSVD modems are utilized in the blackboard conferencing system of the present invention, as discussed in more detail below.
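The multiplexing step described above can be sketched in miniature: compressed voice frames and digital data frames are tagged, placed on one stream, and split apart again at the remote end. The frame format (a type tag plus a one-byte length) is an assumption for illustration, not the DSVD wire format.

```python
# Minimal sketch of DSVD-style voice/data multiplexing over one stream.
VOICE, DATA = 0x01, 0x02

def mux(voice_frames, data_frames):
    """Tag each frame with a type and a one-byte length and append it to a
    single stream (voice first here for simplicity; a real DSVD modem
    interleaves frames over time)."""
    stream = bytearray()
    for kind, frames in ((VOICE, voice_frames), (DATA, data_frames)):
        for frame in frames:
            stream += bytes([kind, len(frame)]) + frame
    return bytes(stream)

def demux(stream):
    """Split a multiplexed stream back into voice and data frames."""
    voice, data, i = [], [], 0
    while i < len(stream):
        kind, length = stream[i], stream[i + 1]
        frame = stream[i + 2:i + 2 + length]
        (voice if kind == VOICE else data).append(frame)
        i += 2 + length
    return voice, data

v, d = demux(mux([b"\x10\x20"], [b"hello"]))  # round-trips both streams
```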
- the I/O port 42 can also receive a voice recorder 43 .
- although voice can be captured, digitally processed by the DSP 23 or the CPU 20 , and stored internally in the RAM 22 for conversion into text or inclusion in a document or a file to be transmitted via a suitable network such as the Internet to a remote site for review, as discussed below, voice data can also be stored externally and more economically using the voice recorder 43 , which stores audio information in its own storage rather than the RAM 22 and thus can operate independently of the computer system of FIG. 1.
- the voice recorder 43 is an ISD33240 from Information Storage Devices Inc. in San Jose, Calif.
- the voice recorder 43 captures analog sound data and stores the analog signals directly into solid state electrically erasable programmable ROM (EEPROM) memory cells which have been adapted to store 256 different voltage levels per cell. As the voice recorder 43 captures voice and audio signals directly into its EEPROM cells, the analog to digital conversion process is not needed.
- the CPU 20 communicates with the voice recorder 43 by sending it an address along with other control signals. In this manner, the CPU 20 can control the location where sound is to be played and/or recorded. Furthermore, as the voice recorder 43 can operate even when it is detached from the computer system of the present invention, the user can simply separate the computer system and carry only the voice recorder 43 when necessary.
- EEPROM electrically erasable programmable ROM
- the voice recorder 43 is connected to the processor 20 via the I/O port 42 .
- the I/O port 42 is connected to the CPU 20 via the bus and can forward commands from the processor 20 to the voice recorder 43 .
- the voice recorder 43 , the microphone 44 and the speaker 45 are located in an external housing which snappably connects to the housing of the computer 10 .
- the voice recorder 43 could be commanded by the CPU 20 to play or record audio segments at specific cell addresses when particular conditions are met.
- the CPU 20 provides the ability to edit, delete, or supplement messages stored by the voice recorder 43 on the fly.
- although the voice recorder 43 is normally controlled by the CPU 20 , it also has one or more switches (not shown) to allow the user to manually operate the voice recorder 43 in the event that it has been ejected from the computer system.
- the switches provide user selectable options which include: “Goto Begin”, “Skip to Next”, “Record”, “Stop”, “Play Next”, “Play Last.” In this manner, even when the voice recorder 43 is separated from the computer of the present invention, the user can still use the voice recorder 43 in a stand-alone mode.
- a global positioning system (GPS) receiver 46 is connected to the I/O port 42 to sense the physical position of the user.
- the GPS receiver 46 senses positional data from a constellation of 24 satellites orbiting around the earth such that four satellites are visible at a time.
- the GPS receiver 46 provides a stream of data to the processor 20 which includes latitude, longitude, elevation and time information.
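The patent does not specify the format of the receiver's data stream. A common choice for GPS receivers of this class is the NMEA 0183 $GPGGA sentence, which carries exactly the time, latitude, longitude, and elevation fields mentioned above; the parsing sketch below assumes that format.

```python
# Assumed-format sketch: extracting time, latitude, longitude, and elevation
# from an NMEA 0183 $GPGGA sentence.

def parse_gga(sentence):
    f = sentence.split(",")
    assert f[0] == "$GPGGA"
    def to_deg(value, hemi, width):
        # NMEA packs degrees and decimal minutes together, e.g. 4807.038
        deg = float(value[:width])
        minutes = float(value[width:])
        d = deg + minutes / 60.0
        return -d if hemi in ("S", "W") else d
    return {
        "time":      f[1],                   # hhmmss UTC
        "latitude":  to_deg(f[2], f[3], 2),  # ddmm.mmmm
        "longitude": to_deg(f[4], f[5], 3),  # dddmm.mmmm
        "elevation": float(f[9]),            # metres above sea level
    }

fix = parse_gga("$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47")
```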
- the GPS receiver 46 is available from a number of sources, including NavTech Corporation and Rockwell International Corporation.
- a wand or a bar-code reader 48 can be removably attached to the I/O port 42 to allow the data management computer of the present invention to read bar codes.
- the wand is a pen-type of scanner that requires physical contact with the bar code when scanning.
- a laser bar code scanner is a non-contact scanner which uses a laser beam to read a bar code. Due to the active laser power, the laser bar code scanner is better at reading bar codes from a distance or when the bar code itself is poorly printed.
- the bar code reader 48 is snappably attached to the I/O port 42 such that the barcode reader 48 can be quickly attached and removed, as necessary.
- the barcode reader 48 captures the bar-code information from a barcoded label and converts the optically encoded information to serial data before it is transmitted to the computer of FIG. 1.
- the wired link can be replaced by a wireless link such as radio or infrared.
- the barcode reader 48 has an additional transceiver, which may be either radio-based or infrared based, and which can transmit captured data to the computer of FIG. 1 for subsequent processing.
- an infrared transceiver 49 can be connected directly to the bus of the computer 10 or to the I/O port 31 (not shown) to provide an infrared link to a nearby personal computer which is equipped with a corresponding infrared transceiver.
- the infrared transceiver is available from suppliers such as Hewlett-Packard, IBM, or Siemens.
- the transceiver 49 provides the received optical data to a Universal Asynchronous Receiver/Transmitter (UART) which converts the data into a suitable format for the bus.
- UART Universal Asynchronous Receiver/Transmitter
- a remote, large display device 52 is wirelessly linked to the computer 10 via the IR transceiver 49 or a radio transceiver 31 .
- the large display device 52 can be a suitably equipped television receiver with a wireless link and a video generator, as discussed further in FIG. 3, or it can simply be the display of a conventional personal computer having a matching transceiver.
- the large display device 52 thus enlarges the characters onto an easier-to-read display.
- the large display device 52 can offer higher resolution than available through the LCD display 35 . In such case, the computer 10 is suitably informed so that software running on the computer 10 can change its display interface to take advantage of the higher resolution, as discussed in FIG. 19.
- the present invention also supports remote stereo amplifier 93 and speakers 94 and 95 to provide a total multimedia experience to the user, even if the hand-held computer 10 cannot support high power amplifiers and speakers onboard.
- a receiver is provided to receive data transmission from either the IR transceiver 49 or the wireless transceiver 31 .
- the stereo amplifier is a MIDI compatible synthesizer or sound module.
- the MIDI protocol provides an efficient format for conveying musical performance data. Due to MIDI's more efficient data storage format, only a portion of the bandwidth of the transceivers 31 and 49 need be used to transmit MIDI instruction streams.
- the MIDI data stream is a unidirectional asynchronous bit stream at 31.25 kbits/sec with 10 bits transmitted per byte.
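The numbers above work out as follows: at 31.25 kbit/s with 10 bits per transmitted byte (one start bit, eight data bits, one stop bit), the stream carries 3,125 bytes per second, so a standard 3-byte message such as Note On takes under a millisecond to send. A small illustration:

```python
# MIDI serial throughput and a standard 3-byte Note On message.
BIT_RATE = 31_250        # bits per second (MIDI serial rate)
BITS_PER_BYTE = 10       # 1 start bit + 8 data bits + 1 stop bit

def note_on(channel, key, velocity):
    """Build a standard MIDI Note On message: status byte 0x9n,
    then 7-bit key and velocity data bytes."""
    return bytes([0x90 | (channel & 0x0F), key & 0x7F, velocity & 0x7F])

msg = note_on(channel=0, key=60, velocity=100)       # middle C
bytes_per_second = BIT_RATE // BITS_PER_BYTE         # 3125
transmit_ms = len(msg) * BITS_PER_BYTE / BIT_RATE * 1000
print(bytes_per_second, round(transmit_ms, 2))       # 3125 0.96
```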
- the remote stereo 93 in turn can include a MIDI sequencer, which allows MIDI data sequences to be captured, stored, edited, combined, and replayed.
- the MIDI data stream is provided to a MIDI sound generator which responds to the MIDI messages by playing sounds.
- the present invention also contemplates more elaborate remote stereo MIDI setups, where the music can be composed to have different parts for different instruments. Furthermore, in this set-up, a different sound module is used to play each part. However, sound modules that are capable of playing several different parts simultaneously, or multi-timbral, can still be used.
- the stereo 93 drives a pair of speakers 94 and 95 .
- the remote stereo unit 93 receives MIDI commands from the processor 20 and plays high quality sound on the speakers 94 and 95 .
- voice recognition can be used in conjunction with and/or replace the handwriting recognizer of the present invention.
- a microphone 51 is connected to an analog to digital converter (ADC) 50 which interfaces with the central processing unit (CPU) 20 .
- ADC analog to digital converter
- CPU central processing unit
- a speech recognizer is stored in the ROM 21 and/or the RAM 22 . The speech recognizer accepts the digitized speech from the ADC 50 and converts the speech into the equivalent text.
- the user's speech signal is next presented to a voice feature extractor which extracts features using linear predictive coding, fast Fourier transform, auditory model, fractal model, wavelet model, or combinations thereof.
- the input speech signal is compared with word models stored in a dictionary using a template matcher, a fuzzy logic matcher, a neural network, a dynamic programming system, a hidden Markov model (HMM), or combinations thereof.
- the word model is stored in a dictionary with an entry for each word, each entry having word labels and a context guide.
- a word pre-selector receives the output of the voice feature extractor and queries the dictionary to compile a list of candidate words with the most similar phonetic labels.
- candidate words are presented to a syntax checker for selecting a first representative word from the candidate words, as ranked by the context guide and the grammar structure, among others.
- the user can accept or reject the first representative word via a voice user interface. If rejected, the voice user interface presents the next likely word selected from the candidate words. If all the candidates are rejected by the user or if the word does not exist in the dictionary, the system can generate a predicted word based on the labels.
- the voice recognizer also allows the user to manually enter the word or spell the word out for the system. In this manner, a robust and efficient human-machine interface is provided for recognizing speaker independent, continuous speech and for converting the verbal instructions from the user into text data suitable for data management purposes.
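The pre-selector and syntax-checker stages described above can be sketched as a two-key ranking: candidates are scored by similarity of phonetic labels, with the context guide breaking ties. The tiny dictionary, label strings, and similarity measure below are all assumptions for illustration, not the patent's matcher.

```python
# Hypothetical sketch of the word pre-selector + syntax checker ranking.
from difflib import SequenceMatcher

DICTIONARY = {                 # word -> (phonetic labels, context guide)
    "write": ("r ay t", "verb"),
    "right": ("r ay t", "noun"),
    "rate":  ("r ey t", "noun"),
}

def preselect(labels, expected_context):
    """Rank dictionary words by phonetic-label similarity, preferring
    words whose context guide matches the expected grammar context."""
    def score(entry):
        word, (phones, context) = entry
        similarity = SequenceMatcher(None, labels, phones).ratio()
        return (similarity, context == expected_context)
    ranked = sorted(DICTIONARY.items(), key=score, reverse=True)
    return [word for word, _ in ranked]

# "r ay t" in a verb context ranks "write" ahead of its homophone "right".
candidates = preselect("r ay t", expected_context="verb")
```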
- the case is a rectangular plastic casing with a major opening on the top of the case to receive the LCD panel 35 and the digitizer 34 .
- the case has a receptacle which is adapted to receive and store the pen 33 .
- a plurality of push-buttons in the keypad 24 are positioned on the top side of the case.
- the push-buttons of the keypad 24 preferably allow the user to invoke one or more pre-installed software applications on the computer of FIG. 1.
- the case has an opening on the backside which is adapted to receive a connector carrying the electrical impulses to and from the I/O port 42 .
- a snap-on block is provided which interlocks with the bottom of the computer and which is electrically connected to the I/O port 42 .
- the casing of FIG. 1 resembles the Pilot handheld computer available from the Palm Computing division of U.S. Robotics.
- the application enters an event loop. It typically remains in that event loop until the system tells it to shut itself down by sending an appStopEvent.
- the call to SysHandleEvent may generate new events and put them on the queue.
- the system handles Graffiti input by translating the pen events to key events. Those, in turn, are put on the event queue and are eventually handled by the application.
- SysHandleEvent returns TRUE if the event was completely handled, that is, no further processing of the event is required. The application can then pick up the next event from the queue. If SysHandleEvent did not completely handle the event, the application calls MenuHandleEvent.
- MenuHandleEvent handles two types of events: if the user has tapped in the area that invokes a menu, MenuHandleEvent brings up the menu; if the user has tapped inside a menu to invoke a menu command, MenuHandleEvent removes the menu from the screen and puts the events that result from the command onto the event queue. MenuHandleEvent returns TRUE if the event was completely handled. If MenuHandleEvent did not completely handle the event, the application calls ApplicationHandleEvent, which handles only the frmLoadEvent; it loads and activates application form resources and sets the event handler for the active form.
- if ApplicationHandleEvent did not completely handle the event, the application calls FrmDispatchEvent.
- FrmDispatchEvent first sends the event to the application's event handler for the active form. This is the event handler routine that was established in ApplicationHandleEvent. Thus the application's code is given the first opportunity to process events that pertain to the current form. The application's event handler may completely handle the event and return TRUE to FrmDispatchEvent. In that case, FrmDispatchEvent returns to the application's event loop. Otherwise, FrmDispatchEvent calls FrmHandleEvent to provide the system's default processing for the event.
- an application frequently has to first close the current form and then open another one, as follows: the application calls FrmGotoForm to bring up another form. FrmGotoForm queues a frmCloseEvent for the currently active form, then queues a frmLoadEvent and a frmOpenEvent for the new form. When the application gets the frmCloseEvent, it closes and erases the currently active form. When the application gets the frmLoadEvent, it loads and then activates the new form. Normally, the form remains active until it is closed.
- the application's event handler for the new form is also established. When the application gets the frmOpenEvent, it does whatever initialization of the form is required, then draws the form on the display. After FrmGotoForm has been called, any further events that come through the main event loop to FrmDispatchEvent are dispatched to the event handler for the form that is currently active. The event handler knows, for a particular dialog box or form, how it should respond to events, for example opening and closing, among others. FrmHandleEvent invokes the default UI functionality. After the system has done all it can to handle the event for the specified form, the application finally calls the active form's own event handling function.
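The dispatch chain described above can be simulated in miniature: each handler in turn gets a chance at the event and returns true when it fully handles it, and the loop runs until an appStopEvent arrives. The event names and one-line handlers below are simplified stand-ins for the Palm OS routines, not their real behavior.

```python
# Simplified model of the SysHandleEvent -> MenuHandleEvent ->
# ApplicationHandleEvent -> FrmDispatchEvent dispatch chain.

def sys_handle(event):          return event == "pen_in_silkscreen"
def menu_handle(event):         return event == "menu_tap"
def application_handle(event):  return event == "frmLoadEvent"
def form_dispatch(event):       return True   # default UI handling catches the rest

CHAIN = [sys_handle, menu_handle, application_handle, form_dispatch]

def event_loop(queue):
    """Process events until appStopEvent, recording which handler
    claimed each event."""
    handled_by = []
    for event in queue:
        if event == "appStopEvent":
            break
        for handler in CHAIN:
            if handler(event):               # TRUE: event completely handled
                handled_by.append((event, handler.__name__))
                break
    return handled_by

log = event_loop(["menu_tap", "frmLoadEvent", "pen_stroke", "appStopEvent"])
```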
- FIG. 1B illustrates a flowchart for a process for accessing data in accordance with a first embodiment of a file management system that is compatible with an IBM personal computer file management system.
- the DOS needs to know the beginning of the data storage area.
- the disk has defined sections in addition to the boot sector and the partition table: a root directory and a file allocation table (FAT).
- the root directory starts after the boot sector and the FAT.
- the root directory holds the necessary information on location, size, date and time of the last change of the files and sub-directories, as well as a directory entry.
- the directory entry also contains a start cluster pointer and a file length field. The start cluster entry specifies the beginning of the file or subdirectory, and the file size field provides the length of the file.
- a dirty bit is provided in the directory entry for indicating whether the file has been updated since the last synchronization of the computer of FIG. 1 with the host computer. If the file has been updated, the dirty bit is set such that upon synchronization, the copy of the file on both the host computer and the computer of FIG. 1 is made consistent with each other using the same synchronization process performed by the database routines of the Pilot.
- the files on both the host computer and the Pilot handheld are correlated and updated to one coherent copy in both computers.
- the FAT values in the file management system of the first embodiment conform to the FAT entries in IBM-compatible computers. These FAT entries essentially contain pointers to the next cluster in the cluster chain.
- although the file management system has cluster chains that are only forward-directed (forward-chained), the present invention contemplates that bidirectional chains can be supported by using a doubly linked list of cluster chains.
- from step 700 , the routine proceeds to step 701 , where it reads the directory entry.
- in step 702 , the routine determines the start cluster, as pointed to by the directory entry.
- in step 703 , the routine checks if the requested data is located in the present cluster. If not, the routine of FIG. 1B proceeds from step 703 to step 704 , where it retrieves the FAT entry and traverses to the next cluster, as based on the cluster pointer. From step 703 , in the event that data is located in the current cluster, the routine of FIG. 1B proceeds to step 705 , where it determines the sector containing the data and accesses the data from the cluster.
- in step 706 , the routine of FIG. 1B checks if the access has been successful. If not, the routine indicates a failure in step 707 . Alternatively, if the access is successful, the routine of FIG. 1B proceeds from step 706 to step 708 , where it accesses the sector and transfers data to and from the application, as requested. From step 708 or 707 , the routine of FIG. 1B exits.
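The cluster-chain walk of steps 701 through 705 can be sketched as follows: the directory entry supplies the start cluster, and each FAT entry points at the next cluster until an end-of-chain marker is reached. The FAT contents and the end-of-chain value below are illustrative.

```python
# Minimal sketch of following a forward-chained FAT cluster chain.
EOC = 0xFFF  # end-of-chain marker (FAT12-style value, for illustration)

def cluster_chain(fat, start_cluster):
    """Follow FAT pointers from the start cluster to the end of the file."""
    chain, cluster = [], start_cluster
    while cluster != EOC:
        chain.append(cluster)
        cluster = fat[cluster]   # step 704: traverse to the next cluster
    return chain

# Illustrative FAT where cluster 2 -> 3 -> 5 -> end of chain.
fat = {2: 3, 3: 5, 5: EOC}
directory_entry = {"start_cluster": 2, "file_length": 1200}
print(cluster_chain(fat, directory_entry["start_cluster"]))  # [2, 3, 5]
```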
- although the file management system of the present invention is preferably IBM PC compatible, it can also be a Unix-type file management system.
- in FIG. 2A, a block diagram of the CCD unit 27 is shown in more detail.
- a CCD or CIS array 53 is connected to a CCD/CIS processor 55 .
- the CCD/CIS processor 55 is connected to a voltage reference 54 as well as an optional correction/data RAM 56 which can be eliminated for cost saving reasons.
- the CCD sensor may be a TCD1250D, a MN3610H or a similar CCD. Alternatively, CIS sensors such as those available from Dyna Image Corporation may be used.
- the CCD/CIS processor 55 is preferably a LM9801 IC from National Semiconductor, Inc.
- the output of the CCD/CIS processor 55 is provided to the bus to be presented to the processor 20 .
- a wireless scanner 27 ′ is shown.
- a wireless transceiver 58 is connected to a Universal Asynchronous Receiver/Transmitter (UART) 57 , which is in turn connected to the CCD/CIS processor 55 , as previously discussed in FIG. 2A.
- the UART 57 serializes data regarding the scanned image and presents the data to the wireless transceiver 58 for transmitting back to the computer 10 of FIG. 1.
- the wireless transceiver 58 can be an infrared unit for communicating with the IR transceiver 49 of the computer 10 .
- the wireless transceiver 58 can be a radiobased unit for communicating with the wireless transceiver 31 of FIG. 1.
- the scanner 27 ′ does not have to be physically connected to the computer 10 , thus providing more convenience and flexibility for the user during use.
- although the use of the CCD/CIS processor 55 is disclosed, the present invention contemplates that, to cut cost, an operational amplifier, an analog-to-digital converter, and software running on the CPU 20 to compensate for scanning-related signal noise are all that is needed to implement a low-cost scanner system.
- although gray-scale CCD/CIS devices are preferred for cost reasons, the present invention contemplates that color CCD devices may be used as well.
- a video driver and a large-screen cathode ray tube (CRT) 52 are provided to deliver easier reading of information from the small mobile computer 10 .
- although the TV is not recommended for computing functions such as CAD/CAM, it is suitable for playing games and browsing the Internet.
- high-level primitives of the display data are transmitted using a suitable medium such as infrared or radio waves from the computer 10 to the CRT, preferably a television display unit commonly available to consumers.
- the high level primitive data transmitted, including characters and form definitions, is received by a wireless transceiver 60 and is presented to a UART 61 for conversion into parallel data.
- the data is presented to a video processor or controller 62 which is connected to a video RAM 63 and a character generator 64 for rasterization into bit-maps.
- the bit-mapped display data is delivered to triple digital-to-analog converters (DACs) in the video controller 62 which generate suitable color RGB video signals.
- DACs digital to analog converters
- the video signal is provided into driver electronics for generating a composite video signal to be delivered to the video input of the TV.
- sound primitives are converted and delivered to an audio amplifier which drives the left and right audio inputs of the TV. In this manner, users of advanced age can have the benefits of reading ease and small-form-factor portability.
- the high level video and sound primitives can be sent via the wireless network such as the infrared transmission (IrDA) and subsequently rasterized by the processor of the desktop computer to be displayed on the desktop display for ease of reading.
- a VGA graphics adapter may be used.
- a scan converter may be attached to the VGA adapter to generate the NTSC/PAL video signal. In performing the conversion from computer video to TV, the VGA scan frequency is roughly twice the TV video frequency.
- VGA displays tend to be progressively scanned (non-interlaced) while TV video uses interlaced video, a remnant of the NTSC video scheme.
- the preferred embodiment provides an adaptive finite impulse response filter and highly linear D/A and A/D converters to minimize flickers.
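A simple, non-adaptive illustration of the flicker problem and its fix: interlaced TV output flickers when detail is only one scan line high, because that line appears in only one field; a vertical low-pass filter spreads such detail across neighboring lines. The three-tap kernel below is an assumption for illustration; the patent's filter is adaptive.

```python
# Illustrative three-tap vertical flicker filter over scan-line intensities.

def flicker_filter(lines, taps=(0.25, 0.5, 0.25)):
    """Apply a vertical low-pass FIR across scan lines (edges clamped)."""
    out = []
    for i in range(len(lines)):
        above = lines[max(i - 1, 0)]
        below = lines[min(i + 1, len(lines) - 1)]
        out.append(taps[0] * above + taps[1] * lines[i] + taps[2] * below)
    return out

# A single bright line between dark lines is spread across three lines,
# so it now appears (attenuated) in both interlaced fields:
print(flicker_filter([0.0, 1.0, 0.0]))  # [0.25, 0.5, 0.25]
```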
- in FIG. 4, the major protocol layers for connecting the computer of FIG. 1 to a suitable network such as the Internet 150 are shown.
- the user has connected to a suitable Internet service provider (ISP) 100 which in turn is connected to the backbone of the Internet 150 , typically via a T1 or a T3 line.
- the ISP 100 communicates with the computer of the present invention via a protocol such as point to point protocol (PPP) or a serial line Internet protocol (SLIP) 100 over one or more media or telephone network 102 , including landline, wireless line, or a combination thereof.
- PPP point to point protocol
- SLIP serial line Internet protocol
- a similar PPP or SLIP layer 103 is provided to communicate with the ISP 100 computer.
- a PPP or SLIP client layer 104 communicates with the PPP or SLIP layer 103 .
- a network aware application 105 such as a browser or a spreadsheet with Internet capability of the present invention receives and formats the data received over the Internet 150 in a manner suitable for the user.
- in FIG. 5, a typical Internet system is shown with one or more portable computers 10 , 11 , 12 , and 13 dispersed in nearby cell regions.
- Computers 10 and 11 are located in one cell and communicate with a cell mobile support station (MSS) 70 .
- computers 12 and 13 communicate with a cell mobile support station 71 .
- the MSS stations 70 and 71 are connected to a radio frequency (RF) network 151 which relays the messages via stations positioned on a global basis to ensure that the user is connected to the network, regardless of his or her location relative to home.
- the RF network 151 eventually connects to a gateway 72 which is in turn connected to the Internet 150 .
- the gateway 72 provides routing as well as reachability information to the network such as the Internet 150 .
- a plurality of large scale computing resources such as a supercomputer 51 and a mainframe 52 are connected to the Internet 150 .
- the mainframe 52 in turn is connected to a corporate version of the Internet, called an Intranet 54 , which supplies information to one or more office computers or workstations 55 .
- the Internet 150 is a super-network, or a network of networks, interconnecting a number of computers together using predefined protocols to tell the computers how to locate and exchange data with one another.
- the primary elements of the Internet 150 are host computers that are linked by a backbone telecommunications network and communicate using one or more protocols.
- the most fundamental of Internet protocols is called Transmission Control Protocol/Internet Protocol (TCP/IP), which is essentially an envelope where data resides.
- TCP protocol tells computers what is in the packet, and the IP protocol tells computers where to send the packet.
- the IP transmits blocks of data called datagrams from sources to destinations throughout the Internet 150 . As packets of information travel across the Internet 150 , routers throughout the network check the addresses of the data packets and determine the best route to send them to their destinations. Furthermore, packets of information are detoured around non-operative computers if necessary until the information finds its way to the proper destination.
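The detour behavior described above can be illustrated with a short sketch (hypothetical Python; the `route` function and the toy network are inventions for illustration, not part of the disclosed system):

```python
from collections import deque

def route(graph, src, dst, down=frozenset()):
    """Breadth-first search for a path from src to dst,
    detouring around routers listed in `down`."""
    queue = deque([[src]])
    seen = {src}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == dst:
            return path
        for nxt in graph.get(node, ()):
            if nxt not in seen and nxt not in down:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # destination unreachable

# a toy network of routers
net = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
normal = route(net, "A", "D")            # preferred route
detour = route(net, "A", "D", down={"B"})  # route when B is non-operative
```

When router B goes down, the packet still reaches D via the alternate path through C, which is the essence of the detouring behavior.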
- although the Internet 150 provides a pathway for users to communicate and share information, the original user interface had been rather unfriendly.
- a system was developed to link documents stored on different computers on the Internet 150 .
- the system is an elaborate distributed database of documents, graphics, and other multimedia content, known as the World Wide Web (Web).
- the Web is based on a client/server model where Web pages reside on host computers that “serve up” pages when the user's computer (client computer) requests them.
- as the user “surfs” the Web, the browser on the client computer can request data from a database on a server computer, which processes the request and sends the desired data back to the computer system of FIG. 1 for display when the request is fulfilled by the server.
- the client computer runs browser software which asks for specific information by sending an HTTP request across the Internet 150 connection to the host computer. When the host computer receives the HTTP request, it responds by sending the data back to the client.
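The request/reply exchange may be sketched as follows (an illustrative Python model of a minimal HTTP/1.0 request line and status line; real browsers send many additional headers, and the function names here are invented for illustration):

```python
def build_http_request(host, path="/"):
    """Compose a minimal HTTP/1.0 GET request such as the
    browser would send to the host computer."""
    return (f"GET {path} HTTP/1.0\r\n"
            f"Host: {host}\r\n"
            f"\r\n")

def parse_status_line(reply):
    """Pull the numeric status code and reason phrase out of
    the first line of the server's reply."""
    version, code, *reason = reply.split("\r\n")[0].split(" ")
    return int(code), " ".join(reason)

req = build_http_request("example.com", "/index.html")
code, reason = parse_status_line(
    "HTTP/1.0 200 OK\r\nContent-Type: text/html\r\n\r\n")
```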
- the browser commonly features a graphical user interface with icons and menus across the top along with a field to supply the URL for retrieval purposes.
- Navigational buttons guide the users through cyberspace in a linear manner, either one page forward or backward at a time.
- Pull down menus provide a history of sites accessed so that the user can revisit previous pages.
- a stop button is typically provided to cancel the loading of a page.
- a bookmark is provided to hold the user's favorite URLs in a list such as a directory tree.
- the browser typically provides a temporary cache on the data storage device or in RAM. The cache allows a more efficient Internet access as it saves bandwidth and improves access performance significantly.
- each entry in the bookmark has a list of links typically accessed by the user while he or she accesses the Web site represented by the bookmark entry.
- the entry's Web page is displayed first.
- the browser retrieves pages of additional links associated with the bookmark entry in the background. In this manner, the browser prefetches pages likely to be accessed by the user when the bookmark entry page is clicked, thus avoiding delays when the user actually clicks on the links of the bookmark entry Web page.
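The prefetch-into-cache behavior can be modeled with a small sketch (hypothetical Python; `fetch` stands in for a real network retrieval, and the URLs are illustrative):

```python
cache = {}  # the browser's temporary cache

def fetch(url):
    # stand-in for a real network fetch (hypothetical)
    return f"<contents of {url}>"

def prefetch(bookmark_links):
    """Fetch each link associated with a bookmark entry into the
    cache in the background, so a later click is served locally."""
    for url in bookmark_links:
        if url not in cache:
            cache[url] = fetch(url)

def load(url):
    """Serve a page from the cache when possible; fall back to
    the network only on a cache miss."""
    if url not in cache:
        cache[url] = fetch(url)
    return cache[url]

prefetch(["http://a.example/1", "http://a.example/2"])
```

Because the likely links are already cached when the user clicks them, `load` returns immediately without an on-line delay.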
- the use of the cache and the prefetcher enhances the Web viewing experience, as the user is not hampered by delays on-line.
- the browser also interprets HyperText Markup Language (HTML) which allows web site creators to specify a display format accessible by HTML compatible browsers.
- TCP/IP opens a connection between the host and client computers.
- the browser then generates a request header to ask for a specific HTML document.
- the server responds by sending the HTML document as text to the client via the TCP/IP pipeline.
- the client computer acknowledges receipt of the page and the connection is closed.
- the HTML document is stored in the browser's cache.
- the browser parses the HTML document for text and tags. If the browser runs across tags that link to images/pictures and sounds, the browser makes separate requests for these files to the server and displays or generates sounds to the user.
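The tag scan described above may be sketched with Python's standard `html.parser` module (which tags are treated as image or sound links is an assumption made for illustration):

```python
from html.parser import HTMLParser

class MediaLinkParser(HTMLParser):
    """Walk the tags of an HTML document and collect the separate
    image/sound files the browser must request from the server."""
    def __init__(self):
        super().__init__()
        self.media = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "src" in attrs:
            self.media.append(attrs["src"])      # image to fetch
        if tag == "a" and attrs.get("href", "").endswith(".wav"):
            self.media.append(attrs["href"])     # sound to fetch

p = MediaLinkParser()
p.feed('<html><img src="logo.gif"><a href="hello.wav">play</a></html>')
```

After parsing, `p.media` lists the files for which the browser issues separate requests to the server.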
- Java was developed originally by Sun Microsystems of Mountain View, Calif.
- the specification for the Java language is stored at the Java web site http://java.sun.com/.
- the web site contains the Java development software, a HotJava web browser, and on-line documentation for all aspects of the Java language, hereby incorporated by reference.
- Java can download and play applets on a browser system of the receiver, or reader.
- Applets are Java programs that are downloaded over the Internet World Wide Web, as dictated by tags such as the <applet> tag, and executed by a Web browser on the reader's machine.
- the compiler takes the instructions and generates bytecodes, which are system independent machine codes.
- a bytecode interpreter executes the bytecodes.
- the bytecode interpreter can execute stand-alone, or in the case of applets, the bytecode interpreter is built-in Java compatible browsers.
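The idea of system-independent bytecodes executed by an interpreter can be illustrated with a toy stack machine (hypothetical Python; the opcodes are invented and greatly simplified relative to real Java bytecode):

```python
def interpret(bytecodes):
    """A toy stack-machine interpreter: the same system-independent
    'bytecodes' run unchanged on any host that has an interpreter."""
    stack = []
    for op, *args in bytecodes:
        if op == "push":
            stack.append(args[0])
        elif op == "add":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "mul":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack.pop()

# computes (2 + 3) * 4 without any host-specific machine code
result = interpret([("push", 2), ("push", 3), ("add",),
                    ("push", 4), ("mul",)])
```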
- the Internet 150 is transformed from a passive giant book of information into an active network capable of supporting electronic commerce and virtual ecosystems.
- although the supercomputer 51 , the mainframe computer 52 and the gateway 59 are shown in FIG. 5 as being connected to the Internet 150 via landlines such as T1 and T3 lines, the Internet may also be connected to a satellite transmission system 56 which transmits and receives high bandwidth data over a satellite 57 .
- the satellite 57 in turn relays the information to one or more local stations 58 which are connected to one or more servers.
- the portable computer 10 can easily request information from a variety of sources which may exist locally or on the other side of the world via the Internet 150 .
- An important goal of the personal computer 10 is its ability to allow users to move about freely within and between cells while transparently maintaining all connections, particularly with the Internet 150 .
- the Internet 150 suite of protocols had been designed with the assumption that each user is assigned a fixed Internet 150 address associated with a fixed location.
- the movement or migration of users in the wireless network violates this implicit assumption of the Internet 150 protocols.
- because wireless bandwidth is at a premium, particularly when voice and video data are involved, it is inefficient to require end-to-end retransmission of packets as done in TCP.
- due to the unpredictable movements of mobile computers with wireless links, large variations exist in the available bandwidth in each cell, affecting the transmission characteristics between the mobile computer 10 and the Internet 150 .
- a number of virtual circuits are used within the mobile network to route connections to mobile computers 10 - 13 via MSS 70 and 71 .
- every mobile computer 10 - 13 has a globally unique virtual Internet protocol (VIP) address and an IP address which is assigned by the gateway 72 or the MSS 70 or 71 .
- each of the MSS 70 and 71 has a VIP as well as a fixed IP address.
- Each of the MSS 70 and 71 also tracks all mobile nodes within its domain or cell range. The connection from a remote endpoint on the Internet 150 to the mobile computer 10 terminates at the fixed IP of the MSS 70 or 71 .
- the MSS 70 and 71 each maintains a cache of time-stamped VIP to IP mappings, also called an address mapping table. Whenever the mobile computer 10 moves from one MSS 70 to another MSS 71 , the address mapping table is updated. During the table update, all packets destined to the mobile computer 10 continue to be sent to the old MSS 70 . These packets are returned to the sender, which forwards the returned messages to the new MSS 71 . Thus, based on the address mapping table, the sender and the MSS 70 or 71 can route packets to the mobile computer 10 .
- the MSS 70 buffers the incoming packet and forwards the packet to the known or predicted cell covering the mobile computer 10 . Once received, the mobile computer 10 acknowledges receipt and requests the MSS 70 to discard the packet. Alternatively, in the event that the mobile computer 10 has moved to another cell which is covered by the MSS 71 , it is assigned a new IP address. The mobile computer 10 sends the new IP address and the VIP address to its home gateway 72 , which in turn sends the new IP address to intermediate gateways to update their address mapping tables as well. The MSS 70 continues to send packets to the mobile computer 10 until either the connections are closed or until the MSS 71 sets up its own connection via the address mapping table with the remote endpoint having the open connection with the mobile computer 10 .
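The time-stamped VIP-to-IP address mapping table may be sketched as follows (hypothetical Python; the class name and the sample addresses are illustrative, not part of the disclosure):

```python
import time

class AddressMap:
    """Time-stamped VIP-to-IP mapping table kept by an MSS.
    When a mobile computer moves to a new cell, the entry is
    updated and later lookups route packets to the new IP."""
    def __init__(self):
        self.table = {}  # vip -> (ip, timestamp)

    def update(self, vip, ip, ts=None):
        self.table[vip] = (ip, ts if ts is not None else time.time())

    def lookup(self, vip):
        entry = self.table.get(vip)
        return entry[0] if entry else None

mss = AddressMap()
mss.update("vip-10", "10.0.0.5", ts=1)  # mobile computer 10 in the old cell
mss.update("vip-10", "10.0.1.9", ts=2)  # entry refreshed after handoff
```

After the handoff update, packets addressed to the fixed VIP are routed to the mobile computer's current IP.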
- the MSS 70 or 71 preferably provides a loss profile transport sub-layer which determines the appropriate disposition of the data, based on markers placed on the packet by the sender and based on the available bandwidth negotiated between the MSS 70 and 71 and the mobile computers 10 , 11 , 12 or 13 .
- redundant non-critical data such as every other frame of a video clip may be appropriately edited by rearranging, clipping or compressing the data at the MSS 70 or 71 end before transmitting to the mobile computer 10 , 11 , 12 or 13 in the event that the bandwidth is severely constrained.
- a number of software modules may reside in the RAM 22 to provide added functionality to the portable computing appliance.
- the portable computing appliance may provide a sketching system, described in the patent application entitled “GRAPHICAL DATA ENTRY SYSTEM” incorporated by reference, to support fast, convenient and accurate annotated drawings in the field.
- a spreadsheet and database engine may be used to support the analysis of data captured from a number of sources over the Internet 150 .
- FIG. 6 illustrates in more detail the spreadsheet of the present invention.
- the spreadsheet of the present invention is essentially a list of memory locations or data storage cells that are related, or linked together. Preferably, the data storage cells are organized using a linked list for ease of traversal.
- the data storage cells can be specified using row and column identifiers.
- the cells of the spreadsheet are linked using dynamic rows and columns. The ability to offer both dynamic rows and columns simplifies and reduces the data storage requirement on the system RAM 22 .
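The linked-list organization of data storage cells may be sketched as follows (hypothetical Python; because only the cells actually in use are linked and stored, the dynamic rows and columns reduce the demand on the system RAM 22):

```python
class Cell:
    """One data storage cell, linked to the next cell so that the
    sparse spreadsheet stores only the cells actually in use."""
    def __init__(self, row, col, value, next_cell=None):
        self.row, self.col = row, col
        self.value = value
        self.next = next_cell

class Sheet:
    def __init__(self):
        self.head = None

    def set(self, row, col, value):
        node = self.head
        while node:  # traverse the linked list for an existing cell
            if (node.row, node.col) == (row, col):
                node.value = value
                return
            node = node.next
        # not found: allocate a new cell and link it in
        self.head = Cell(row, col, value, self.head)

    def get(self, row, col):
        node = self.head
        while node:
            if (node.row, node.col) == (row, col):
                return node.value
            node = node.next
        return None  # empty cells consume no memory

s = Sheet()
s.set(1, "A", 42)
s.set(2, "B", 7)
```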
- the spreadsheet provides the user of the handheld computer with on-the-fly data processing capability.
- the spreadsheet can also acquire data via the barcode scanner 48 , the CCD unit 27 and OCR software, or alternatively via the microphone 51 , ADC 50 and a speech recognizer.
- the spreadsheet can deploy intelligent agents to seamlessly hunt for information relevant to the user's needs. In this manner, the spreadsheet of the present invention turns the handheld computer system of the present invention into an intelligent data management system which can acquire and process data on the fly.
- in FIG. 6 , a spreadsheet handler 200 is shown.
- the spreadsheet handler initializes the spreadsheet in step 201 . Such initialization includes the clearing of the spreadsheet memory and the setting of the current row to “1” and column to “A”.
- in step 202 , the spreadsheet handler 200 of FIG. 6 draws the spreadsheet cells and displays the row/column labels as well as the menu.
- the spreadsheet handler 200 then proceeds to check in step 203 whether certain cells of the spreadsheet need to be updated using a remote data source such as an Internet database. Step 203 is illustrated in more detail in FIG. 14.
- from step 203 , the routine of FIG. 6 proceeds to set an event handler to the spreadsheet form in step 204 .
- from step 204 , the routine waits for an event in step 205 .
- the routine of FIG. 6 tests to see if the event is a system event in step 206 . If so, the routine processes the system event in step 207 before it loops back to step 205 to process the next event.
- the routine of FIG. 6 tests whether the event is a menu event in step 208 . If so, the routine of FIG. 6 processes the menu event in step 209 before it loops back to step 205 . If the event is not a menu event in step 208 , the routine 200 tests whether the event is a form load event in step 210 . If so, the new form is loaded in step 211 . From step 211 , the routine of FIG. 6 loops back to step 205 to process the next event. If the event is not a form load event, the routine 200 of FIG. 6 checks to see if the application handler for the spreadsheet has completed operation in step 212 .
- routine 200 provides default processing for the application in step 212 before it exits in step 214 .
- the routine dispatches the event as necessary in step 213 before the routine 200 loops back to step 205 to process the next event in the spreadsheet handler 200 .
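The event loop of steps 205 through 214 may be modeled with a short sketch (hypothetical Python; the event kinds and handler bodies are illustrative stand-ins for the system, menu and form load handlers):

```python
def spreadsheet_handler(events):
    """Sketch of the FIG. 6 event loop: classify each event and
    dispatch it to the matching handler, looping until the
    application handler has completed."""
    log = []
    handlers = {
        "system":    lambda e: log.append("system:" + e["name"]),
        "menu":      lambda e: log.append("menu:" + e["name"]),
        "form_load": lambda e: log.append("load:" + e["name"]),
    }
    for event in events:  # step 205: wait for the next event
        if event["kind"] == "quit":  # application handler completed
            log.append("exit")
            break
        # dispatch the event as necessary, with a default fallback
        handlers.get(event["kind"],
                     lambda e: log.append("dispatch:" + e["name"]))(event)
    return log

trace = spreadsheet_handler([
    {"kind": "system", "name": "pen_down"},
    {"kind": "menu", "name": "File"},
    {"kind": "quit", "name": ""},
])
```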
- the system event handler 207 of FIG. 6 is shown in more detail in FIG. 7. From step 207 , the routine of FIG. 7 checks to see if the user is actuating the pen in step 218 . If the pen 33 is pressed down on the LCD screen 35 in step 218 , the routine updates the active cell and recalculates the values of the spreadsheet in step 220 . Alternatively, if the pen 33 is not down in step 218 , the routine 207 checks if the pen 33 is in a scroll bar region in step 222 . If so, the routine 207 performs the scroll operation in step 223 before it exits.
- the routine 207 checks if the pen is dragged down in step 224 . If so, the user has selected a particular block of cells for purposes such as cutting, pasting, or generating graphs, among others. In such case, the routine 207 highlights and selects the blocked region in step 225 before it exits. Alternatively, if the pen 33 is not dragged in step 224 , the cell is being edited in step 226 . From step 226 , the routine checks if the user has selected menu items in step 227 . If so, the menu event is dispatched in step 228 before the routine 207 exits in step 229 . From step 227 , if no menu events have been generated, the routine 207 of FIG. 7 exits.
- in step 223 , the scroll routine checks to see if the user requested a scroll-up operation in step 241 . If so, the routine 223 then checks if the spreadsheet is already at the top of the page in step 242 . If not, the scroll up operation is performed to show the previous page in step 243 . Furthermore, the page pointer is updated in step 243 before the routine 223 is completed in step 253 . From step 242 , if the spreadsheet is already at the top of the page, the routine 223 is simply exited.
- the routine 223 checks if the user is requesting a scroll down in step 244 . If so, the routine 223 further checks if the spreadsheet is at the bottom most page in step 245 . If it is, the routine 223 exits. Alternatively, if the spreadsheet is not at the bottom most page, the routine 223 shows the next page and updates the page pointer in step 246 before it exits in step 253 .
- from step 244 , if the operation is not scroll-down, the routine 223 checks if the user has requested a scroll left operation in step 247 . If so, the routine 223 checks to see if the user is already at the left most page in step 248 . If the current spreadsheet page is not the left most page in step 248 , the routine 223 proceeds to step 249 where it displays the left page and updates the pointer to point one page to the left. Alternatively, from step 248 if the user is already at the left most page, or from step 249 , the routine 223 exits in step 253 .
- from step 247 , in the event that the user is not requesting a scroll left operation, the routine 223 checks if the user has requested a scroll right operation in step 250 . If so, the routine checks if the current page is the right most page in step 251 . If not, the routine 223 transitions from step 251 to step 252 where it shows the page on the right of the current page and updates the page pointer appropriately. From step 252 , or from step 251 in the event that the user is already at the right most page, or from step 250 where the user did not request a scroll right operation, the routine 223 exits in step 253 .
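The four bounded scroll operations of steps 241 through 253 reduce to moving a page pointer while refusing to cross the edges of the spreadsheet, as this sketch illustrates (hypothetical Python; the page-coordinate representation is an assumption):

```python
def scroll(page, direction, max_row_page, max_col_page):
    """Move the (row_page, col_page) pointer one page in the given
    direction, refusing to scroll past the top, bottom, left or
    right most pages as in steps 242/245/248/251."""
    row, col = page
    if direction == "up" and row > 0:
        row -= 1
    elif direction == "down" and row < max_row_page:
        row += 1
    elif direction == "left" and col > 0:
        col -= 1
    elif direction == "right" and col < max_col_page:
        col += 1
    return (row, col)

p = scroll((0, 0), "up", 3, 3)    # already at the top: unchanged
p = scroll(p, "down", 3, 3)       # moves one page down
```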
- Scroll panels are supported. Scroll panels are used to lock the display of the spreadsheet in particular horizontal and/or vertical directions. For instance, scroll panels are useful for constantly displaying period information such as the months on the spreadsheet, regardless of the user's scrolls.
- the routine 223 of FIG. 8 checks if the user has specified a row and a column relating to scroll panels. If the user has specified that particular row and/or column be the scroll panels, the routine locks the row and/or column such that the locked row/column defines the top and left most pages of step 242 and 248 . Once locked, the locked row and/or column are always displayed relative to other rows and columns.
- in the cell edit routine 226 of FIG. 9, the routine examines if the function button has been selected in step 261 . If so, the routine 226 proceeds to step 262 where it displays a function list before it proceeds to step 263 . In step 263 , the routine 226 checks if the cancel button has been pressed. If so, the routine loops back to step 261 . If not, the routine 226 checks if the user has selected a function from the displayed list in step 264 . If not, the routine 226 loops back to step 262 where it awaits an action from the user.
- the routine 226 checks if the user has provided the correct parameter in step 265 . If the parameters are incorrect, the routine displays error messages in step 266 before it loops back to step 262 . On the other hand, if the correct parameters are provided, the routine 226 loops from step 265 back to step 261 to continue processing the cell edit operation.
- from step 261 , if the function button has not been selected, the routine 226 checks to see if the user has completed entering the formula in step 267 . If not, the routine loops from step 267 back to step 261 to continue processing user requests. Alternatively, if the user has completed entering the formula into the cell in step 267 , the routine 226 transitions from step 267 to step 268 where it saves the cell contents. Upon completing step 268 , the routine 226 of FIG. 9 exits in step 269 .
- a cell edit window is brought up.
- the cell edit window shows the row and column index on the left hand side. Further, the content of the cell, which is editable, is displayed.
- an enter (done) button, a functions button, and a cancel button are provided.
- the enter or done button is used to indicate that the user has completed his or her editing operations and that the user wishes to accept the changes and go back to the display of the rest of the spreadsheet.
- the functions button when actuated, shows a scrollable list of the functions supported by the spreadsheet system.
- the functions button further awaits selection of one of the functions displayed, allowing the spreadsheet to support number intensive data analysis. When a function has been selected, the function is entered into the space and may be edited if necessary.
- the cancel button allows the user to terminate his or her cell editing function and return back to the spreadsheet without affecting anything.
- in step 268 , the routine of FIG. 10 first checks to see if the current cell already exists in step 271 . If not, the routine transitions to step 272 where it checks if sufficient memory exists for the new cell. If the remaining memory is insufficient, the routine 268 transitions from step 272 to step 274 where it displays one or more error messages before transferring to step 280 . Alternatively, in the event that sufficient memory exists to support another spreadsheet cell, the routine 268 creates a new cell in step 273 and links the new cell to the prior cells.
- from step 271 , if the cell already exists, or from step 273 where the new cell has been created, the routine evaluates the formula; if the result of the formula evaluation is successful, the routine 268 displays the result in step 278 and saves the formula in the current cell in step 279 before the routine 268 exits in step 280 .
- if the formula evaluation fails, the routine 268 transitions from step 277 to step 274 where it displays an error message before it moves to step 279 to save the formula, even if the formula has an error of some type, so that the user can subsequently edit the formula. From step 279 , the routine 268 exits in step 280 .
- the formula evaluation routine of step 276 is discussed in more detail below. This routine is executed by a pen down event handler to process the data input.
- the routine 276 first checks if the current character in the string under evaluation is an “(” in step 281 . If so, the routine recursively calls itself in step 276 to resolve the value of the string enclosed within the “( )” pair. This method is known in the art as “substringing.” From step 281 , the routine 276 next checks if the character is a “)” in step 282 . If so, the routine returns to its caller in step 289 .
- the routine gets the next string which could be a number or an arithmetic operator in step 283 .
- if the string is not a valid number or operator, the routine exits with an error indication in step 285 .
- the routine exits in step 289 without an error indication.
- from step 283 , in the event that the character is an arithmetic operator such as +, −, / or * in step 286 , the routine 276 performs the operation in step 287 before it loops back to step 281 to continue the string processing.
- the routine 276 next checks if the character is a NULL character in step 288 . If so, the routine 276 transitions to step 289 where it returns with an OK indication. Alternatively, if the character in step 288 is not a NULL character, the routine loops back to step 281 to continue processing the string in the formula entered.
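The recursive, left-to-right formula evaluation of steps 281 through 289 may be sketched as follows (hypothetical Python; mirroring the flowchart rather than standard arithmetic, operators are applied in the order encountered with no precedence, and a "(" triggers a recursive call on the enclosed substring):

```python
def evaluate(expr, pos=0):
    """Evaluate a formula string left to right. On '(' the routine
    calls itself recursively to resolve the value of the substring
    enclosed within the '( )' pair; on ')' it returns to its caller.
    Returns (value, next_position)."""
    total, op = 0.0, "+"
    i = pos
    while i < len(expr):
        ch = expr[i]
        if ch == "(":
            value, i = evaluate(expr, i + 1)   # recursive substring call
        elif ch == ")":
            return total, i + 1                # return to caller
        elif ch in "+-*/":
            op = ch                            # remember pending operator
            i += 1
            continue
        else:
            j = i                              # scan a number token
            while j < len(expr) and (expr[j].isdigit() or expr[j] == "."):
                j += 1
            value, i = float(expr[i:j]), j
        # apply the pending operator in the order encountered
        if op == "+":   total += value
        elif op == "-": total -= value
        elif op == "*": total *= value
        elif op == "/": total /= value
        op = None
    return total, i

result, _ = evaluate("2*(3+4)")
```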
- in step 209 , the routine checks if the user has selected the “File” menu event in step 291 . If so, the routine 209 transitions to step 292 where selections in the file menu event are processed. These selections include, among others: open, close, save, save as, and quit. Upon completion of the file menu items in step 292 , the routine 209 exits in step 299 .
- from step 291 , in the event that the file menu item is not selected, the routine 209 transitions to step 293 where it checks if the user selected an edit menu. If so, the routine 209 transitions to step 294 to handle the edit menu items.
- the edit menu items include: undo, redo, cut, copy, paste, find, replace, goto, insert, delete, format, and width changes, among others.
- the routine 209 exits in step 299 .
- the routine next checks in step 295 whether a graph menu has been selected. If so, the routine 209 transitions to step 296 where the graph requests are handled.
- the graph operations performed in step 296 include a selection of the graph type (pie, line, bar, area, hi-lo, and scatter) and a layout control (series, axis, grid/tick, title, legend, and label).
- the routine 209 exits in step 299 .
- from step 295 , if the graph menu item is not selected, the routine 209 checks if the view menu item is selected in step 297 . If not, the routine 209 simply exits in step 299 . Alternatively, if the view option has been selected, the routine 209 transitions to step 298 to perform zoom operations.
- the zoom operation is important in a palmtop computer as the display 35 is relatively small and can display in full only small sketches.
- the routine of FIG. 13 causes the display 35 to select a particular magnification factor such as 50%, 75%, 100%, 150%, 200%, or a user selectable scale.
- step 298 centers in on the cell last edited and enlarges or shrinks the spreadsheet display as requested. When the user is done viewing the scaled display, the user can click on a button such as an OK button or a GoBack button to move back to the spreadsheet functionality.
- the present invention also contemplates a second embodiment where, underneath the menu is an outline which fences in a spreadsheet area where the user can enter information or manipulate the spreadsheet. Further, upon selecting the magnifier option from the View menu, a magnifier is displayed whenever the pen input device 33 touches the LCD screen 35 of the computer system. Additionally, a zoomed drawing area is displayed along with the movement of the pen input device and the magnifier. When the user successively and quickly depresses (double clicks) the pen 33 twice when the pen is in the magnifier mode, the sketching system enters into a zoomed viewing mode, as further described below.
- the zoomed drawing area is enlarged into an enlarged zoomed drawing area which occupies the space formerly belonging to the drawing area.
- the enlarged area provides the user with an enlarged, more comfortable view of the object(s) such as the spreadsheet cell(s) or database records being edited.
- a miniaturized illustration of the entire drawing is shown as a floating insert or a draggable window which is shown at the top left corner of the enlarged zoomed drawing area.
- the floating insert or draggable window showing the miniaturized illustration may be freely moved within the enlarged zoomed area.
- the miniaturized illustration thus provides a bird's eye view of the entire drawing and further shows an outline of the zoomed drawing area or the enlarged zoomed drawing area.
- the outline may be moved to scroll around the drawing.
- the display region is also updated to reflect the objects located at the current position of the outline.
- the outline also has four sides which can be adjusted to adjust the zoom ratio of the objects shown on the enlarged zoomed drawing area.
- the user can vary the enlargement view on the area.
- the magnifier and the miniaturized illustration balance the need to show the entire drawing against the limited display region afforded by the portable computing appliance.
- the user can select the full drawing view option from the Edit menu to display the entire drawing on the screen.
- the routine 298 obtains a zoom range in step 311 .
- the routine 298 computes the zoom ratio based on the zoom range, or alternatively, from a zoom ratio input from the user which may range from 50% to 200% to a user selectable ratio.
- the routine 298 performs a rasterization based on the zoom ratio and the display window.
- the routine 298 puts up the rasterized bit map on the display 35 , along with a Goback or Done button.
- the routine waits for the user to select the Goback button. Once the user has indicated that he or she has completed viewing the zoomed image, the routine 298 exits via step 316 .
- the present invention also contemplates that, in a second embodiment, a magnifier icon is displayed whenever the pen 33 touches the LCD screen 35 of the computer system. Further, an outline box is displayed around the magnifier icon to indicate the viewing area available when the magnifier zooms.
- the routine displays an enlarged view of the drawing at the point where the pen 33 touches the screen 35 of the computer system, much like a conventional magnifier would.
- the routine 298 also displays a Bird's Eye (BE) view of the entire drawing in a BE window. Further, a zoom box is displayed inside the BE window to indicate to the user his or her relative position on the drawing. The zoom box has four sides which are selectable by the pen to adjust the zoom scale, as discussed below.
- the second embodiment of the routine 298 checks for a pen event occurring within the BE window. If not, the pen event belongs to the primary window and represents either a draw or an edit event. Thus, if the pen is not in the BE window, the routine calls the cell edit routine (FIG. 9). In the event that the pen is in the BE window, the routine checks for adjustments to the zoom box to change the scaling factor of the main display window. In the event that the zoom box has been adjusted by clicking on the zoom box and adjusting either the horizontal lengths or the vertical lengths, the routine computes the zoom ratio based on the zoom box adjustment.
- the zoom ratio is computed as a function of the ratio of the length of the zoom box to the length of the BE window and further as a function of the ratio of the width of the zoom box to the width of the BE window.
- the alternate routine then applies the newly computed zoom ratio and refreshes the main display by performing the appropriate enlargement or shrinkage on the objects encompassed within the newly adjusted zoom box. Furthermore, if the zoom box has been dragged to a new viewing location, the routine receives the new coordinates of the zoom box relative to the BE window and updates the content and location of the zoom box in the BE window.
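The zoom ratio computation may be sketched as follows (hypothetical Python; combining the length and width ratios by taking the larger is an assumption, since the disclosure states only that the ratio is a function of both):

```python
def zoom_ratio(box_w, box_h, window_w, window_h):
    """Compute the magnification from the zoom box relative to the
    Bird's Eye (BE) window: a smaller box selects a smaller region
    of the drawing and therefore a larger blow-up on the display."""
    # the ratio is a function of both the length ratio and the
    # width ratio; taking the larger keeps the selected region
    # fully visible (an illustrative choice)
    return max(window_w / box_w, window_h / box_h)

# halving the zoom box sides doubles the magnification
ratio = zoom_ratio(box_w=50, box_h=40, window_w=100, window_h=80)
```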
- in FIG. 14 , the routine 204 determines the rows and columns that need to be updated in step 321 , which is discussed in more detail in FIG. 15.
- the routine 204 then checks to see if the cells had been updated earlier during the day in step 322 . If so, there is no need to perform the update and the routine 204 exits in step 330 .
- the routine proceeds from step 322 to step 323 where it creates a query designed to obtain the proper information.
- the routine connects to the server over the Internet or a suitable media.
- the routine submits the query.
- the routine 204 passes the query created in step 323 to the server in a query string which contains the name of a Common Gateway Interface (CGI) script.
- the CGI script sends the search to a database located on the server, receives the result of the query, along with the HTML page created by the database to contain the result, and passes it back to the server to be sent back to the routine 204 .
- in step 326 , the routine 204 waits until the reply is complete. Once the reply is complete, the routine parses the reply into formatted data in step 327 and in step 328 stores the new data in the cells determined in step 321 . Step 328 also copies the formulas from related cells into adjacent cells and recalculates the spreadsheet. The formulas from the related cells represent mathematical relationships to the new information. For instance, in an income statement, the gross profit is determined by obtaining new data on sales and cost of goods and subtracting the cost of goods from the sales information. In step 329 , the routine 204 refreshes the Last_Updated flag to reflect the most recent time that the spreadsheet had been updated to prevent needless updating. From step 329 , the routine 204 exits in step 330 .
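Steps 322 through 329 may be modeled with a short sketch (hypothetical Python; the 'label=value' reply format and the gross-profit formula are illustrative stand-ins for the parsed HTML reply and the copied cell formulas):

```python
def update_cells(sheet, labels, reply, last_updated, today):
    """Sketch of steps 322-329: skip the fetch when the cells were
    already updated today, otherwise parse the server's reply into
    the labelled cells, recalculate derived cells, and refresh the
    Last_Updated flag."""
    if last_updated == today:          # step 322: no update needed
        return last_updated
    # step 327: parse the reply into formatted data
    parsed = dict(pair.split("=") for pair in reply.split(";"))
    for label in labels:               # step 328: store the new data
        if label in parsed:
            sheet[label] = float(parsed[label])
    # derived cells recalculated from the new figures
    if "sales" in sheet and "cogs" in sheet:
        sheet["gross_profit"] = sheet["sales"] - sheet["cogs"]
    return today                       # step 329: refresh Last_Updated

cells = {}
stamp = update_cells(cells, ["sales", "cogs"], "sales=100;cogs=60",
                     last_updated=None, today="2000-03-23")
```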
- in step 321 , the routine prompts the user to click on a row or column identifier.
- in step 340 , the routine checks to see if a label exists for the selected row or column. The label is important as the query of step 323 will be based on the label information. If the label does not exist, the routine 321 prompts the user for a label in step 341 . From steps 340 or 341 , the routine 321 checks if the label has an alias. The use of an alias allows the label to be more descriptive while satisfying specific naming constraints on the remote database.
- if an alias exists, the routine applies the alias to the label in step 343 . From step 343 , or if no alias exists, the routine 321 checks if all rows and columns to be updated have been identified. If not, the routine 321 loops back to step 340 to obtain information on the next row or column to be updated. Alternatively, if all rows and columns to be updated have been identified, the routine 321 exits in step 345 .
- in FIG. 16 , another method to update the spreadsheet is shown.
- the user can activate a browser to view information on the Internet or other suitable network and designate the information to be retrieved into the spreadsheet.
- the routine executes a TCP/IP layer module in step 351 .
- the PPP client layer 352 is then invoked.
- the data from the PPP client layer 352 is provided to a compression/decompression engine 353 .
- the decompressed data is provided to a message manager in step 354 . If the message is a Java based message in step 355 , the routine 350 provides the message to a Java interpreter, a just-in-time Java compiler, or a Java flash compiler in step 356 .
- the routine then checks in step 357 whether the remaining message is in a mark-up language. If so, the routine 350 provides the message to an HTML or HDML interpreter in step 358 . Otherwise, the incoming message is provided to a default custom interpreter in step 359 to handle special protocols supported by the user's application. From step 359 , the browser routine 350 exits in step 360 .
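- The message-manager dispatch of steps 354 through 359 amounts to routing each decompressed message by type. A minimal sketch, with hypothetical handler names standing in for the Java, mark-up, and custom interpreters:

```python
# Illustrative dispatch for steps 354-359: route each message to the
# matching handler. The handlers here are stand-ins, not real interpreters.

def handle_java(msg):   return ("java", msg)     # step 356
def handle_markup(msg): return ("markup", msg)   # step 358
def handle_custom(msg): return ("custom", msg)   # step 359

def message_manager(message, kind):
    if kind == "java":                    # step 355 check
        return handle_java(message)
    if kind in ("html", "hdml"):          # step 357 check
        return handle_markup(message)
    return handle_custom(message)         # default: special protocols
```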
- the user can view the contents of databases located on the Internet and download the data via an appropriate protocol such as file transfer protocol (FTP).
- FTP file transfer protocol
- the incoming packet is executed if it is in Java, interpreted if it is HTML/HDML or custom protocol, and ultimately provided to the spreadsheet of the present invention.
- the browser of the present invention preferably supplies a user interface with a menu bar, a tool bar, a URL bar in addition to the active window displaying the Web page.
- the routine displays a menu bar, a tool bar and a URL bar in step 371 .
- the toolbar preferably allows the user to move backward/forward through various Web pages, reload a page, travel to a home page, print a page, stop the current load, among others.
- an icon is available that, if dragged onto the window of the portable computer's desktop, creates a double-clickable link to that site on the desktop.
- the traditional “http://www” or “ftp://” can be automatically supplied by the browser.
- the browser preferably displays a key which indicates the page's built-in security feature.
- a status line is supplied to indicate the completion rate of the page download.
- the tool bar and the status line are made to be hideable.
- From step 371 , the routine accepts the user's URL, retrieves the HTML file from that URL, and parses the HTML file in step 372 .
- the routine of FIG. 16A adds the locations of the hyperlinks, as indicated in the respective HTML tags, into the event manager for watching. Once the hyperlink locations have been entered, the event manager catches double clickings on the hyperlinks and appropriately processes the requests for the hyperlinks.
- From step 373 , the routine checks for occurrences of menu bar events in step 374 . If so, the routine jumps to the menu event handler in step 375 . Alternatively, if no menu bar event occurred, the routine proceeds from step 374 to step 376 to check for tool bar events.
- If a tool bar event occurred in step 376 , the routine proceeds to step 377 where it handles tool bar events.
- the routine proceeds to step 378 to check if the user has specified a new URL location. If so, the routine loads the new HTML file from the new URL location, parses the new HTML file, and adds the locations of the hyperlinks in the new HTML file in step 379 .
- the routine checks if the user wants to exit the browser in step 382 . If so, the routine exits the browser in step 383 .
- From step 382 , or from steps 375 , 377 , 379 , or 381 , the routine of FIG. 16A loops back to step 374 to continue monitoring events.
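- The event loop of steps 374 through 382 can be sketched as a dispatch over event types; the event names and handler strings below are illustrative stand-ins:

```python
# Illustrative event loop for steps 374-383: poll for menu, tool bar,
# URL, and exit events, dispatch each to its handler, and stop on exit.

def browser_loop(events):
    handled = []
    for event in events:                       # one pass per event
        if event == "menu":
            handled.append("menu handler")     # step 375
        elif event == "toolbar":
            handled.append("toolbar handler")  # step 377
        elif event.startswith("url:"):
            handled.append("load " + event[4:])  # step 379: new URL
        elif event == "exit":                  # step 382 -> step 383
            handled.append("exit")
            break
    return handled
```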
- the browser routine 370 has a cache which improves access performance.
- Two types of cache are provided: a memory cache and a file cache.
- the memory cache buffers short term storage of graphic objects whereas the file cache is for intermediate term storage of data objects, as known to those skilled in the art.
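- A minimal sketch of the two-tier arrangement, assuming an LRU memory tier that demotes evicted objects to the file tier (modeled here as a dictionary rather than actual files):

```python
# Illustrative two-tier cache: a small short-term memory cache backed
# by an intermediate-term "file" cache. Sizes and the dict-based file
# tier are assumptions for the sketch.
from collections import OrderedDict

class TwoTierCache:
    def __init__(self, mem_size=2):
        self.mem = OrderedDict()   # memory cache (LRU, short term)
        self.disk = {}             # file-cache stand-in (intermediate term)
        self.mem_size = mem_size

    def put(self, key, value):
        self.mem[key] = value
        self.mem.move_to_end(key)
        if len(self.mem) > self.mem_size:
            old_key, old_val = self.mem.popitem(last=False)
            self.disk[old_key] = old_val   # demote oldest to file cache

    def get(self, key):
        if key in self.mem:
            self.mem.move_to_end(key)      # refresh recency
            return self.mem[key]
        return self.disk.get(key)          # slower tier

cache = TwoTierCache(mem_size=2)
cache.put("a", 1); cache.put("b", 2); cache.put("c", 3)
```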
- the browser of the present invention also provides an off-line browsing capability to compensate for long delays associated with Web traffic.
- the computer of the present invention instructs a host server to perform searches during off-peak time and save the search result for subsequent viewing at a much faster pace. In this manner, the Web experience is preserved without the frustrating delays typically encountered when accessing the Web at peak hours.
- the browser of the present invention has an integrated front end to Web search engines and directories, allowing users to issue a query using multiple search engines such as Lycos and Yahoo. As the front end generates direct inquiries to the CGI-compatible databases, the front end is relatively compact.
- a data filter is provided to reduce the number of documents to be viewed.
- the browser can be configured to download specific objects, such as text only, so that large graphics files can be discarded if the user does not want to view graphics.
- the browser deploys agents to monitor the Bookmark mentioned above, rerun the search at specified intervals, and notify the user when new results are found.
- the browser of the present invention replicates the Web experience by preserving the URL.
- the browser of the invention supports a news ticker capability which automatically downloads news files at night according to a user-defined schedule. The news ticker is subsequently presented to the user when the computer is idle, in a manner analogous to a screen saver.
- the routine 400 for receiving images via the CCD/CIS unit 27 is shown in more detail. From the scan step 400 , the routine issues a reset instruction to the CCD/CIS unit 27 in step 401 . Next, the routine 400 checks in step 402 if the scan button on the CCD/CIS unit 27 is depressed. If so, the routine 400 acquires the image from the CCD/CIS unit 27 in step 403 and loops back to step 402 to continue the image acquisition until the scan button is released. Once the scan button is released in step 402 , the routine 400 performs an optical character recognition (OCR) process in step 404 .
- OCR optical character recognition
- the optical character recognition may perform a combination of feature detection and template matching methods for recognition of characters, as disclosed in U.S. Pat. No. 5,436,983, or may utilize neural networks as is known in the art.
- step 404 the routine 400 formats the OCR data in step 405 .
- the routine 400 places the formatted data in the cells of the spreadsheet in step 406 .
- the routine 400 also copies related formulas to appropriate cells in the spreadsheet.
- For example, step 407 may copy the formula for computing gross profit below the cost of goods line so that the gross profit formula is applied to the current sales figures.
- step 408 the routine performs a spreadsheet recalculation before it exits in step 409 . In this manner, the computer and scanner may be used to optimize the data acquisition process.
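- Steps 403 through 408 can be sketched as a pipeline from scanned image to recalculated spreadsheet. The OCR step is faked with a stand-in function, and the cell names and total formula are illustrative assumptions:

```python
# Illustrative pipeline for steps 404-408: OCR the acquired image,
# format the recognized text into numbers, place them in cells, and
# recalculate. Real recognition would use feature detection, template
# matching, or a neural network; here it is a stand-in.

def fake_ocr(image):
    return image  # stand-in: pretend the bitmap decodes to this text

def scan_to_spreadsheet(image, cells):
    text = fake_ocr(image)                     # step 404: OCR
    values = [float(v) for v in text.split()]  # step 405: format data
    for i, value in enumerate(values, start=1):
        cells[f"A{i}"] = value                 # step 406: place in cells
    cells["Total"] = sum(values)               # steps 407-408: formula + recalc
    return cells

cells = scan_to_spreadsheet("12.5 7.5 30.0", {})
```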
- the routine issues a reset instruction to the CCD/CIS unit 27 in step 421 .
- the routine 420 checks if the scan button on the CCD/CIS unit 27 is depressed. If so, the routine 420 acquires the image from the CCD/CIS unit 27 in step 423 and loops back to step 422 to continue the image acquisition until the scan button is released. Once the scan button is released in step 422 , the routine 420 performs an optional image enhancement step 424 using known image signal processing routines.
- the routine 420 proceeds to step 425 where, if the raster-to-vector option is selected, the bitmap is vectorized in step 426 before the routine 420 continues. From step 425 or 426 , the routine 420 proceeds to step 427 where it checks if the compression option has been selected. If so, the routine performs the bitmap compression process in step 428 . From step 427 or 428 , the routine 420 exits.
- the computer of the present invention allows the user to scan in images on the fly, digitally enhance the images, and store the images for later use: the images can be printed in the event that the user simply wishes to copy them, included in a document or a file to be transmitted over a suitable network, or sent via facsimile or another medium in the event that the user wishes to fax the image to a remote site for review.
- From step 440 , the routine of FIG. 19 checks if a link with the remote display or CRT device 52 is active in step 441 . This is preferably done by scanning the IR frequency for the presence of a remote display or CRT device 52 . If no remote display or CRT device 52 exists, the routine of FIG. 19 simply exits. Alternatively, once the handshake indicating that a remote device exists is completed in step 441 , the routine of FIG. 19 proceeds to step 442 .
- In step 442 , the routine turns off the LCD screen 35 on the portable computer 10 and further sends an acknowledgment return signal to the remote CRT device 52 .
- the routine traps graphic calls in step 443 so that a custom version of the graphic routines supporting a higher-resolution display is used in place of the original graphic routines supporting the LCD display 35 .
- In step 444 , the routine examines the identification data sent by the remote display 52 to determine whether the display 52 is a high-resolution device. If so, the routine modifies the display range resolution in the graphic routines to support the higher resolution in step 445 .
- Otherwise, the display resolution information is not updated, such that the remote low-resolution display 52 shows the same information as the LCD display 35 , merely with enlarged and brighter images for ease of reading.
- the routine of FIG. 19 sends the trapped graphics primitives to the remote display 52 in step 446 .
- the routine of FIG. 19 receives the high level graphics primitives sent in step 446 and decodes or rasterizes the primitives before displaying them on the remote display 52 .
- the routine of FIG. 19 exits in step 448 .
- FIG. 19A is a flow chart of the process for teleconferencing with a remote user and for visually sharing an electronic chalkboard.
- the chalkboard conferencing process of FIG. 19A requires DSVD modems as well as the software carrying out the process of FIG. 19A to be installed both on the portable computer of the present invention as well as on the remote computer.
- the chalkboard process proceeds to step 491 where a connection with the remote computer having a DSVD modem is established.
- the routine selects a file to be viewed on the chalkboard.
- the file may be a text file as is conventional or may be a graphical document such as a document generated by the graphical drawing tool disclosed in the previously incorporated Ser. No.
- In step 493 , the respective files on both the local and remote computers are synchronized.
- In step 494 , the voice data from the remote end is received, demultiplexed, decompressed, and reconstructed for the user to listen to, as is conventional in the DSVD specification.
- In step 495 , digital data is received.
- The data is decoded in step 495 and checked to determine whether changes to the document have been made by the user at the local end.
- If changes have been made, the routine proceeds from step 495 to step 496 where it captures the changes and transmits the update packet to the remote unit for synchronizing the chalkboard.
- a suitable compression software can be used to minimize the data transmitted.
- the compression software can convert the strokes into vectors and transmit the vector information rather than bitmaps to conserve bandwidth on the digital channel.
- the routine of FIG. 19A further checks if changes have been made at the remote end in step 497 . If so, the routine proceeds from step 497 to step 498 where it receives the incoming packet and updates the chalkboard. From step 494 , 497 or 498 , the routine checks in step 499 if the user has completed the remote conference. If not, the routine loops back to step 495 to continue the chalkboard updates and to transmit voice between the users. Alternatively, if the user has completed the conferencing session in step 499 , the routine of FIG. 19A exits.
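- The chalkboard synchronization of steps 495 through 498 can be sketched as a diff-and-apply exchange; the document model (a dictionary of named strokes) and the packet format are assumptions for illustration:

```python
# Illustrative sync for steps 495-498: capture local changes as a small
# update packet, apply incoming remote packets, and keep both copies of
# the shared chalkboard document identical.

def capture_changes(old, new):
    """Diff two document states into an update packet (step 496)."""
    return {k: v for k, v in new.items() if old.get(k) != v}

def apply_packet(doc, packet):
    """Apply a remote update packet to the local chalkboard (step 498)."""
    doc.update(packet)
    return doc

local = {"stroke1": "line(0,0,5,5)"}
remote = dict(local)                             # both ends start synchronized
edited = dict(local, stroke2="circle(3,3,2)")    # user draws locally
packet = capture_changes(local, edited)          # transmit only this packet
apply_packet(remote, packet)                     # remote side applies it
```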
- the routine to record a voice note using the voice recorder 43 is shown in more detail.
- the routine initializes the voice recorder 43 in step 461 .
- the routine of FIG. 20 checks for the desired action in step 462 .
- the routine sends a record (REC) command to the voice recorder 43 and saves the current address of the memo in a message management record (MMR) in step 463 .
- MMR message management record
- the index to the message management record is also saved by the application such that when the user wishes to replay the previously recorded note, the address can be retrieved from the MMR to send to the voice recorder 43 to play.
- the routine exits in step 470 .
- step 462 if the application does not need to save a memo, the routine of FIG. 20 checks if the user wishes to play a previously recorded memo in step 464 . If so, the routine sends a PLAY command to the voice recorder 43 in step 465 using the address retrieved from the MMR. From step 465 , the routine exits in step 470 .
- step 464 if the user does not wish to play a memo, the routine checks if the user wishes to edit the memo in step 466 . If so, the new message is recorded in part or in whole over the old memo in step 467 before the routine of FIG. 20 exits in step 470 .
- From step 468 , the routine proceeds to step 469 where it removes the current address from the MMR and marks the space as being available for additional messages.
- the routine of FIG. 20 allows the user to quickly record and edit his or her messages in the voice recorder 43 without consuming main memory 22 , as voice messages can be rather memory intensive.
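- The MMR bookkeeping of steps 463, 465, and 469 can be sketched as a small address table; the class and its field names are hypothetical:

```python
# Illustrative message management record (MMR): recording saves the
# memo's start address, playing looks the address up, and deleting
# marks the space available. The recorder is modeled as a flat address
# space rather than real voice-recorder hardware.

class MemoManager:
    def __init__(self):
        self.mmr = {}           # index -> start address in the recorder
        self.next_address = 0

    def record(self, index, length):
        self.mmr[index] = self.next_address   # step 463: save address
        self.next_address += length
        return self.mmr[index]

    def play(self, index):
        return self.mmr[index]                # step 465: retrieve address

    def delete(self, index):
        return self.mmr.pop(index)            # step 469: free the space

mgr = MemoManager()
mgr.record("note1", 10)
mgr.record("note2", 5)
mgr.delete("note1")
```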
- the voice recorder 43 is detachable from the computer 10 , the user can carry only the voice recorder in the event the user needs a reminder/recorder device when available space is very small.
- the data processor of the present invention also provides speech recognition capability as another mode of data entry.
- In FIG. 21 , the routine 480 waits in step 481 for a speech pattern directed at the computer system 10 of FIG. 1. If no speech pattern exists, the routine simply exits in step 487 . Alternatively, if voice is directed at the system, the routine 480 checks if the voice data is a command or an annotation in step 482 . If a command, the routine performs the voice command in step 483 before it exits in step 487 . Alternatively, if the speech pattern does not relate to a voice command, the routine proceeds to step 484 to check on voice annotations.
- If the voice input is a voice data annotation, the routine proceeds from step 484 to step 485 where the voice is converted into computer-readable text. From step 485 , the routine formats the converted data in step 486 such that the spreadsheet can process the dictated spreadsheet data entry. From steps 481 , 483 or 486 , the routine of FIG. 21 exits in step 487 .
- data can be collected into the spreadsheet by scanning and performing OCR on images or by capturing voice and performing a speech to text recognition on the dictation.
- intelligent agents can be used in conjunction with the computer system of FIG. 1 to locate/process information over a network via the two-way communication device 31 .
- Smart agents can automatically route user-specified data from the Web, other on-line services, and E-mail messages and faxes, to the computer of FIG. 1.
- a software entity called an “agent” serves as an independent source of expertise designed to perform particular tasks or sets of tasks. These agents continually process and update requests, even though the user is no longer connected to the network. These agents can also “mine” sites for information and retrieve only data relevant to the user.
- the agents can be activated on demand to serve a particular purpose and then be deactivated after accomplishing solely that purpose.
- the agents navigate through computer networks such as the Internet to search for information and perform tasks for their users.
- the collected data from the search or the results of the execution of the tasks are compressed and delivered to the portable computer system of FIG. 1 the next time a wireless connection is established with the two-way communication device 31 .
- FIG. 22 a flow chart showing the process of specifying an intelligent agent capable of operating with the two-way communication device 31 is shown.
- the routine accepts rules and parameters for the intelligent agent in step 501 .
- the intelligent agent of FIG. 22 is rule driven. Rules can be specified with a simple English-like syntax to set slot values, create objects, among others, and can be organized into rulebases when the rules deal with related conditions and actions. A set of rules activates when its rulebase is loaded and deactivates when the rulebase is flushed. This collection of rulebases is collectively called the “agent.”
- Agents can use rules to inference about the data, create new data or modify existing data.
- the two fundamental search strategies of these agents are forward and backward chaining.
- Forward chaining, which is a data-driven process, proceeds from premises or data to conclusions.
- Backward chaining, or the goal-driven approach, proceeds from a tentative conclusion backward to the premises to determine whether the data supports that conclusion.
- a combination of forward and backward chaining can be used to optimally solve a particular problem. Details of various expert systems suitable for use are discussed in James P. Ignizio's book Introduction to Expert Systems—The Development and Implementation of Rule-Based Expert Systems, hereby incorporated by reference.
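- A forward-chaining pass, for instance, can be sketched in a few lines: rules fire whenever all their premises are known facts, and firing continues until no new conclusions appear. The rulebase below is an illustrative toy, not one from the incorporated reference:

```python
# Illustrative forward chaining: proceed from data to conclusions by
# firing any rule whose premises are all known, until a fixed point.

def forward_chain(facts, rules):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in facts and all(p in facts for p in premises):
                facts.add(conclusion)   # rule fires: new data created
                changed = True
    return facts

rules = [
    ({"has_sales", "has_cogs"}, "can_compute_gross_profit"),
    ({"can_compute_gross_profit"}, "can_update_income_statement"),
]
derived = forward_chain({"has_sales", "has_cogs"}, rules)
```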
- the present invention contemplates that other artificial intelligence constructs, including neural networks, fuzzy logic systems, and others known to those skilled in the art, may be applied in place of the expert system.
- the intelligent agent can be specified using an object oriented language such as Java such that it is free to roam the Internet and other networks with Java compatible browsers.
- the routine trains the agent in step 502 with training data, if necessary in the event that neural networks and the like are to be used in implementing the agent.
- the routine sends the agent over a network such as the Internet in step 503 .
- In step 504 , the agent checks if the data it encountered satisfies the rules and parameters that the agent is looking for. If not, the routine proceeds to search the next database. From step 504 , the routine checks in step 505 if all databases have been mined. If not, the routine moves to the next database in step 506 before it loops back to step 504 . Alternatively, from step 505 , if all databases have been mined and the agent still has not located responsive information, the agent puts itself to sleep in step 507 . The agent periodically wakes up and broadcasts itself once more in step 503 to continue searching for responsive materials.
- From step 504 , in the event that responsive documents have been located, the agent checks in step 508 whether it is instructed to call other agents. If so authorized, the agent invokes other agents in step 509 . When the agent reports back with information, the host computer proceeds to notify the user of the located data in step 510 .
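- The mining loop of steps 504 through 510 can be sketched as follows; the databases are modeled as simple record lists and the matching rule as a predicate, both illustrative assumptions:

```python
# Illustrative mining loop: visit each database, test records against
# the agent's rule, and either report the matches or sleep and retry.

def mine(databases, matches_rule):
    hits = []
    for db in databases:                  # steps 504-506: next database
        hits.extend(r for r in db if matches_rule(r))
    if not hits:
        return "sleep"                    # step 507: sleep, retry later
    return hits                           # step 510: notify the user

dbs = [["widget price 10"], ["gadget price 99", "widget price 12"]]
result = mine(dbs, lambda record: "widget" in record)
```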
- step 511 if the user accepts the data, the routine of FIG. 22 stores the data and updates the spreadsheet with information in step 512 .
- the routine checks if the user has invoked additional agents in response to the information detected by the original agent. If so, the routine proceeds from step 512 to step 513 where additional agents are launched or additional routines are executed. From step 513 , or if the user does not invoke additional agents, the routine of FIG. 22 exits.
- the agent of FIG. 22 can respond to the verbal, handwritten, hand-drawn, or typed command or data from the user and intelligently perform the requested action, be it a data search or various user specified actions.
- these agents are capable of gathering information resourcefully, negotiating deals, and performing transactions on their user's behalf.
- the agent can have contingency plans: the agents are aware of their environment when situated in different places and act accordingly.
- the present invention also provides a database management system which acts in conjunction with the spreadsheet of the present invention to optimally manage data on behalf of the user.
- the purpose of a database management system is to store, maintain, and retrieve database records in files.
- a file collects records of the same format that serve a common purpose.
- general purpose database management systems use four language interfaces between the application programming language and the database manager: a data definition language, a data manipulation language, a query language and a report writer language.
- the data definition language defines the format, or schema, of the database by identifying the files, record formats, and relationship between files.
- the data manipulation language is the applications program interface to the database management system such as the opening, closing of the database or the adding, changing or deleting of records of the database.
- the query language allows the database to be searched according to a search criteria, while the report writer language allows the user to generate a report based on the result of the query.
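- The four interfaces can be illustrated with a toy in-memory model; none of the class or method names below come from the patent:

```python
# Illustrative toy DBMS showing the four language interfaces: a data
# definition step fixing the record schema, data manipulation (add), a
# query predicate, and a report writer formatting the query result.

class TinyDB:
    def __init__(self, schema):          # data definition language
        self.schema = schema
        self.records = []

    def add(self, **record):             # data manipulation language
        assert set(record) == set(self.schema)
        self.records.append(record)

    def query(self, predicate):          # query language
        return [r for r in self.records if predicate(r)]

    def report(self, rows):              # report writer language
        return "\n".join(", ".join(f"{k}={r[k]}" for k in self.schema)
                         for r in rows)

db = TinyDB(["name", "amount"])
db.add(name="sales", amount=120)
db.add(name="cogs", amount=45)
rows = db.query(lambda r: r["amount"] > 50)
```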
- step 515 the routine of FIG. 23 generates one or more database forms in step 516 .
- the database forms are the user interface for the records of data stored in the database.
- the process for creating the form of step 516 is shown in more detail in FIG. 24.
- the routine of FIG. 23 allows the user to enter information into the records of the database in accordance with the data entry format specified in the form creation step 516 .
- the data may be entered by writing, scanning, dictating or by an agent sent over the Internet for hunting down relevant data.
- step 517 the routine of FIG. 23 checks if the user wishes to perform a database search in step 518 . If not, the routine exits. Alternatively, if a search is to be done, the routine prompts the user for a database query in step 519 . From step 519 , a search is carried out in step 520 before the database routine of FIG. 23 exits in step 521 .
- the data definition language stored in the form specification is preferably iconized such that the user can quickly layout a data entry form using a graphical specification.
- the icons displayed include control objects such as text boxes, check boxes, dialogs, option buttons, labels, among others.
- the user can simply pick the control icon and place the control icon on its appropriate position on the LCD display.
- the user selects the appropriate attributes of the control icon, such as caption, font, and dimensions, in a manner similar to the selection and customization of controls in Visual Basic or Visual C++, available from Microsoft Corporation of Redmond, Wash. The user repeats this process until all the data elements have been defined, formatted, and positioned on the form.
- the database can simply be a free-text database without any control icon restrictions. In the event that control icons are appropriate and necessary, a smart agent can be used to help a new user select the right controls and their attributes by asking the user questions about the type of data to be stored by the database, generating the icons, and allowing the user to move and/or adjust the attributes.
- the routine checks in step 530 if the user needs the assistance of an intelligent agent in formulating the form of the database. If the user requires assistance, the intelligent agent guides the user through the form set-up process in step 531 by asking the user about the relevant fields, their formats, among others. Once the fields and their characteristics have been identified, the intelligent agent generates a generic form with the fields required for the user's application and saves the new form in step 538 before it exits.
- the routine of FIG. 24 checks if the user wishes to create the form using a visual format in step 532 . If not, the routine of FIG. 24 proceeds to step 533 where it accepts textual specifications of the database form from the user. Next, the form information is saved in step 538 prior to exiting the routine. In the event that the user wishes to create the form layout graphically in step 532 , the routine of FIG. 24 displays a palette of control objects available for the form in step 534 .
- control objects include a check box, a pop-up menu, a pop list, a text field, a numeric field, a table field, a date/time field, a currency field, an ink field, a formula field, a look-up field, a barcode field, and a GPS field.
- the routine waits until the user selects an object in step 535 .
- the routine transitions to step 536 where it displays the object on the form and requests the user to enter the object parameters, including the object dimensions and formatting characteristics, among others.
- the routine loops back to step 535 to wait for the next object selection.
- the routine checks if the user is done with the form creation process in step 537 .
- If the user has not finished, the routine simply loops back to step 535 to await the next user selection.
- Once the user has completed the graphical form creation process in step 537 , the created form is saved in step 538 before the routine of FIG. 24 exits in step 539 .
- data can be entered into the database by writing data to the fields of each record in the database.
- the data can be imported from raw text, from other dBase files, or from the spreadsheet data files of the present invention.
- the data can be scanned in using the scanner/bar code reader discussed above, dictated in using the speech recognition engine, or delivered from an external source such as the Internet 150 using smart agents as discussed above or live-data databases that respond to data changes and events, such as those discussed in K. C. Hopson and Stephen E. Ingram's book Developing Professional Java Applets (1996) or the live-data product available from Cycle Software, Inc. located in Quincy, Mass.
- the database of the present invention automatically classifies and handles information presented to the database.
- the first barcode data captured will be assigned to the first barcode field, while the subsequent barcode data will be assigned to the second barcode field, even if the barcode fields are not adjacent to each other.
- the same processing is provided to handle GPS data.
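- The in-order assignment of captured barcode (or GPS) data to fields of the matching type can be sketched as follows, with hypothetical field names:

```python
# Illustrative automatic classification: each captured item of a given
# kind is assigned to the next unfilled field of that kind, in field
# order, even when those fields are not adjacent to each other.

def assign_captures(fields, field_types, captures, kind):
    """Fill fields of the given kind, in order, with captured data."""
    targets = [f for f in field_types if field_types[f] == kind]
    for name, value in zip(targets, captures):
        fields[name] = value
    return fields

# "sku" and "lot" are barcode fields separated by a text field:
field_types = {"sku": "barcode", "note": "text", "lot": "barcode"}
record = assign_captures({}, field_types, ["1234", "5678"], "barcode")
```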
- the user can generate reports and/or update the database via the query language, the manipulation language and the report writer.
- the query language interface can simply be standard dBase-like query commands, as known to those skilled in the art.
- Alternatively, the query language interface can be an easy-to-use query language where the user simply enters the field to search and the appropriate search parameters.
- the search is then conducted in accordance with the parameters.
- FIG. 25 the routine to handle the search process 520 of FIG. 23 is shown.
- the routine first searches the database as exactly requested in step 550 .
- the search process can be an indexed search or a binary search for speed reasons, as known in the art.
- the routine checks if the user has designated that non-traditional searches are to be done in step 551 . If so, the routine proceeds to step 552 where it performs an inexact fuzzy search by looking for records with fields that almost, but not exactly, match the query request, in a manner analogous to the fuzzy search done in speech recognition, as discussed in the incorporated-by-reference U.S. patent application Ser. No. 08/461,646. Furthermore, a probabilistic search is performed in step 553 . The probabilistic search looks up equivalent words using a thesaurus and then replaces the keyword with equivalent words according to a probability distribution, in accordance with the context of word usage. The present invention also contemplates that Soundex expansion techniques, as known in the art, may be used to expand the keyword search.
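- The layered search of steps 550 through 553 can be sketched as three passes; the one-character edit tolerance and the tiny thesaurus are deliberately simplified stand-ins for real fuzzy and probabilistic matching:

```python
# Illustrative layered search: an exact pass first (step 550), then an
# inexact fuzzy pass accepting near matches (step 552), then a
# probabilistic pass expanding the keyword through a thesaurus (step 553).

def exact_search(records, keyword):
    return [r for r in records if keyword in r]

def fuzzy_search(records, keyword):
    # Accept a record containing a word differing by at most one character.
    def near(a, b):
        return len(a) == len(b) and sum(x != y for x, y in zip(a, b)) <= 1
    return [r for r in records if any(near(w, keyword) for w in r.split())]

def probabilistic_search(records, keyword, thesaurus):
    hits = []
    for synonym in thesaurus.get(keyword, []):
        hits.extend(exact_search(records, synonym))
    return hits

records = ["annual revenue report", "annual revanue summary", "yearly income table"]
thesaurus = {"revenue": ["income"]}
exact = exact_search(records, "revenue")
fuzzy = fuzzy_search(records, "revenue")
prob = probabilistic_search(records, "revenue", thesaurus)
```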
- step 560 the current coordinate data is queried from the GPS receiver 46 in step 561 .
- From step 561 , the routine of FIG. 26 proceeds to step 562 where it detects whether the portable computer remains within a predetermined proximity of its prior position.
- step 562 if the portable computer has not moved, the routine proceeds to step 563 where the routine puts itself to sleep.
- step 563 after a predetermined period, the routine of FIG. 26 wakes up and proceeds to step 561 to check whether the portable computer has moved.
- step 564 the beginning coordinate and time information are saved.
- step 565 the routine samples the output of the GPS receiver 46 in step 565 .
- In step 566 , the routine checks if the position of the GPS receiver 46 has changed. If not, the routine loops back to step 565 to continue acquiring GPS data. Alternatively, if data from the GPS receiver 46 indicate that the GPS receiver 46 has moved, the routine of FIG. 26 proceeds from step 566 to step 567 where it waits until the GPS receiver 46 has stopped moving, typically by checking if the proximity remains unchanged for a predetermined period of time. When the GPS receiver 46 stops moving, the routine saves the ending coordinate and ending time in step 567 . Furthermore, the routine computes the mileage incurred for the trip in step 567 .
- the routine collects other business data in step 568 .
- the type of data collected varies with the application. For instance, for lawyers, the data collected may simply relate to time and expense tracking and case management applications. For medical practitioners, the data collected may consist of patient information, drug interaction, type of treatment provided, and billing-related information, among others. For a salesperson, the data collected may relate to order taking, inventory checking, creating to-do lists, and pricing, among others.
- the routine collates the data into one or more packets, compresses the packets and transmits the data via a suitable wireless transmitter such as the pager or the wireless transceiver 31 before the routine of FIG. 26 exits in step 570 .
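- The mileage computation of step 567 presumably reduces to the great-circle distance between the saved beginning and ending coordinates; a haversine sketch (coordinates in degrees, result in miles):

```python
# Illustrative mileage computation: haversine great-circle distance
# between the saved beginning and ending GPS coordinates. The specific
# formula choice is an assumption; the patent only calls for mileage.
import math

def haversine_miles(start, end):
    lat1, lon1 = map(math.radians, start)
    lat2, lon2 = map(math.radians, end)
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = (math.sin(dlat / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    return 2 * 3958.8 * math.asin(math.sqrt(a))   # Earth radius in miles

# Hypothetical trip, roughly Houston to Austin:
trip_miles = haversine_miles((29.76, -95.37), (30.27, -97.74))
```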
- routines for supporting a meeting are shown.
- the routines in FIGS. 27 through 29 provide automated support for mobile users and in effect act as an intelligent researcher or agent for the users.
- the agent is necessary to protect the user from the increasing information overload of modern life while allowing the user to maintain control.
- FIG. 27 illustrates the detail of the data search and preparation before a meeting takes place.
- the routine of FIG. 27 proceeds to step 581 where it checks with the calendar engine for meetings scheduled for a particular date. From step 581 , for each meeting calendared, the routine performs a search on the company and individuals scheduled for the meeting in step 582 . Next, the routine checks its internal records for historical data of prior meetings in step 583 . In this step, the routine also attempts to identify areas of agreement and disagreement, as well as the personal information of the people in the meeting, to remind the user of, and prepare the user for, hot-spots to be careful of in the meeting.
- From step 583 , the routine of FIG. 27 proceeds to step 584 where it searches for information relating to the competition as well as other potential stakeholders.
- the search starts with in-house data and sweeps outwardly toward the Internet 150 .
- This step preferably deploys the intelligent agent of FIG. 22.
- the agent of FIG. 22 enters the respective competitor's name into search engines such as Yahoo, AltaVista, HotBot or Infoseek.
- the agent may also check the competitor's financial health by performing a search in Hoover's Online, located at http://www.hoovers.com, and a search at the U.S. Securities & Exchange Commission, located at http://www.sec.gov.
- Other sites with financial information on public and private companies that can be searched by the agent of FIG. 22 include http://www.pathfinder.com, http://www.avetech.com, http://www.dbisna.com.
- the agent of FIG. 22 can search Ecola's 24-hour newsstand, located at http://www.ecola.com, which links to more than 2,000 newspapers, journals, magazines and publications. Additionally, the agent can search CNN Interactive at http://www.cnn.com for archived information going back a few weeks. Furthermore, the agent of FIG. 22 can search the Knowledge Index on CompuServe, and the Electric Library, available at http://www.elibrary.com, for scouring magazines, reference works and news wires. Finally, MediaFinder, located at http://www.mediafinder.com, provides an index and description of thousands of newsletters, catalogs and magazines.
- the agent of FIG. 22 also provides the ability to listen in on conversations regarding a particular company in news groups and discussion groups prevalent in the Usenet section of the Internet 150.
- the agent of FIG. 22 reviews Deja News Research Service, located at http://www.dejanews.com, and Liszt, located at http://www.liszt.com.
- the agent of step 584 checks sites that have compiled good collections of business resources, including John Makulowich's Awesome Lists, located at http://www.clark.net, American Demographics, located at http://www.demographics.com, ProfNet, located at http://www.vyne.com, StartingPoint, located at http://www.stpt.com, Babson College, located at http://babson.edu, and Competitive Intelligence Guide, located at http://www.fuld.com.
- the present invention contemplates that yet other sites can be searched as well for competitive information, including the Lexis/Nexis database, the Westlaw database, various judicial decisions at Villanova University, licensing information from Licensing Executive Society at http://www.les.org, and the patent abstract information database from the U.S. Patent & Trademark Office, or alternatively, abstracts from MicroPatent, located at http://www.micropat.com, among other sites.
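The fan-out of one competitor name across the research sites named above can be sketched as follows; the site list and query templates here are illustrative assumptions, not the engines' actual query syntax:

```python
from urllib.parse import quote_plus

# Hypothetical query templates for a few of the sources named above;
# the real sites' search URLs are not specified in the disclosure.
SEARCH_SITES = {
    "AltaVista": "http://www.altavista.com/?q={q}",
    "Hoover's Online": "http://www.hoovers.com/search?q={q}",
    "Deja News": "http://www.dejanews.com/find?query={q}",
}

def build_competitor_queries(competitor, sites=SEARCH_SITES):
    """Return one search URL per research site, with the competitor's
    name URL-encoded into each site's query template."""
    q = quote_plus(competitor)
    return {name: template.format(q=q) for name, template in sites.items()}
```

The agent would then fetch each URL and collate the results into the report of step 585.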
- From step 584, the routine formats the collected information of steps 583-584 in step 585.
- Next, in step 586, the routine checks whether it is time to meet. If so, the routine proceeds to step 587 where it notifies the user of the meeting and displays the formatted report of step 585 in step 588. Alternatively, if it is not yet meeting time in step 586, the routine proceeds to step 589 where it puts itself to sleep until the next check interval. From step 588 or 589, the routine of FIG. 27 exits in step 590.
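Steps 586 through 589 amount to a poll-and-sleep loop. A minimal sketch follows; the injectable clock and sleep parameters are testing conveniences, not part of the disclosed routine:

```python
import time
from datetime import datetime, timedelta

def meeting_watch(meeting_time, prepare_report, notify,
                  check_interval=60, now=datetime.now, sleep=time.sleep):
    """Sketch of FIG. 27's timing logic: prepare the dossier once
    (steps 582-585), sleep between checks until meeting time arrives
    (steps 586 and 589), then notify and display (steps 587-588)."""
    report = prepare_report()
    while now() < meeting_time:
        sleep(check_interval)
    notify(report)
    return report
```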
- From step 600, the routine proceeds to step 601, where it displays the reports generated in step 585 of FIG. 27. Further, the routine checks in step 602 whether the customer or client has specific questions. If so, the routine proceeds to step 603 where, in the event that the user does not know the answer already, the routine jumps to step 604 where it queries a database and allows the user to electronically mail questions to the technical staff in step 605.
- In step 606, the routine downloads the requisite computer aided design (CAD) file for editing purposes.
- In step 608, the design can be updated using a number of tools, including the tools disclosed in the patent applications incorporated by reference.
- the routine of FIG. 28 also checks if the customer desires alternative pricing in step 609. If so, the routine downloads pricing information from the host to the portable computer in step 610 and applies the spreadsheet discussed above to the data in step 611.
- In step 612, the routine checks if outstanding questions remain. If not, the routine proceeds from step 612 to step 613 where it flags that a standard follow-up letter without questions is to be used. Alternatively, in the event that outstanding questions remain to be answered, the routine proceeds from step 612 to step 614 where it adds to the list of follow-up questions. From step 613 or 614, the routine of FIG. 28 exits.
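The in-meeting question handling of steps 602-605, together with the follow-up bookkeeping of steps 612-614, can be sketched as follows (the function and parameter names are illustrative placeholders):

```python
def handle_question(question, lookup, email_expert, followups):
    """Answer a client question from the database if possible
    (steps 603-604); otherwise e-mail it to the technical staff
    (step 605) and record it on the follow-up list (step 614)."""
    answer = lookup(question)
    if answer is not None:
        return answer
    email_expert(question)
    followups.append(question)
    return None
```

The follow-up list accumulated here is what drives the letter-generation routine of FIG. 29.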
- In step 621, the routine loads a standard letter template which provides the foundational structure for the correspondence.
- Next, in step 622, the routine checks if unresolved questions remain. If so, the routine proceeds to step 623, where it displays the question list to remind the user of the items to be addressed in the correspondence.
- In step 624, in the event that the answer requires an expert, the routine proceeds to step 625, where it forwards the question to the appropriate person.
- Next, the routine proceeds to step 626, where the question is answered.
- In step 627, the routine checks if it is done with all questions. If not, the routine loops back to step 623 to answer the next question in the list. Alternatively, if all questions have been answered in step 627 or step 622, the routine proceeds to step 628 where it applies standard closing paragraphs as well as a signature facsimile. Furthermore, to the extent some personalized compliments or congratulations can be made, as identified in step 582 of FIG. 27, the routine also applies these congratulatory remarks to the correspondence.
- In step 629, the routine prints, e-mails, postal mails, or faxes the correspondence to the client or customer before exiting in step 631 of FIG. 29.
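The letter-assembly flow of FIG. 29 can be sketched as follows; the function name and the template, closing, and signature strings are illustrative placeholders, not elements of the disclosure:

```python
def draft_followup(template, questions, answer, closing, signature):
    """Assemble the follow-up letter: start from the standard template
    (step 621), answer each open question in turn (steps 623-627),
    then append the closing paragraphs and signature (step 628)."""
    body = [template]
    for q in questions:
        body.append(f"Q: {q}")
        body.append(f"A: {answer(q)}")
    body.append(closing)
    body.append(signature)
    return "\n\n".join(body)
```

In the disclosed routine, the `answer` callback would route expert questions to the appropriate person (steps 624-625) before returning a reply.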
- the present invention thus provides a convenient system for accepting and manipulating data using a spreadsheet or a database such that the user can quickly write commands or data on a mobile computer with a relatively compact screen. Further, the present invention integrates speech and typed data entry to provide a user friendly computer system. Further, the spreadsheet or database system of the present invention can be used in two-way messaging systems to support object linking and embedding (OLE)-like capabilities. Data can be imported into the spreadsheet or database by scanning or dictating the information to the computer system. The present invention also supports an intelligent agent operating with the computer to locate responsive information, as specified by the spreadsheet or database system of the present invention.
Abstract
A spreadsheet and a browser on a portable computer accept data from an input recognizer, including a non-cursive handwriting recognizer or a speech recognizer, and communicate data directly with another computer or over the Internet using wireless media such as radio and infrared frequencies or over a landline. The computer is endowed with a plurality of built-in or snap-on expansion accessories to enhance the data capture capability as well as the ease of reading data from the limited screen of the present invention. These accessories include a camera, a scanner, a voice recorder or voice capture unit, and a remote large screen television. The camera and scanner allow visual data to be captured, the voice recorder allows the user to make quick verbal annotations into a solid state memory to minimize the main memory requirements, while the voice capture unit allows the voice to be captured into memory for subsequent transmission over the Internet or for voice recognition purposes. The spreadsheet or database receives data from the Internet or from the accessories and further can graph or manipulate the data entered into the spreadsheet as necessary. Furthermore, the database has a smart search engine interface which performs fuzzy searches such that inexact queries can still result in matches. The smart search engine thus allows users to locate information even though the exact spelling or concept is not known. To minimize the user's work in locating information to analyze, the spreadsheet and database can spawn and train an intelligent agent to capture data from a suitable remote source such as the Internet and transmit the data to the spreadsheet or browser for further analysis. Alternatively, the user can capture data directly by scanning or dictating the information into the spreadsheet or browser. In another aspect of the invention, a pan and zoom capability provides the user with an appropriately scaled view of the data for ease of reading.
Alternatively, when the portable computer is within range of a larger display device such as an appropriately equipped television display or a personal computer with a larger display, the present invention's wireless link transmits the video information to the larger display to allow the user to view data on the larger display unit. Similarly, the present invention provides a remote stereo receiver adapted to receive a sound data stream from the portable computer and drive high quality speakers to support multimedia applications on the portable computer.
Description
- The present invention relates to a data management system, and more particularly, to a data management system for a mobile computer.
- Before the advent of computers, the assimilation and interpretation of information required extensive manual data collection as well as error-prone hand calculations carried out by many individuals. The manual tabulation of large quantities of data typically resulted in a small percentage of errors in the collected data. Furthermore, additional errors had been introduced through the use of laborious manual numerical analyses. Although the adoption of accountant's columnar pads to create paper spreadsheets eased the assimilation of data and reduced the error propagation, the manual preparation of such spreadsheets was rather tedious, prone to calculation errors, and expensive in labor.
- The advent of personal computers brought forth electronic spreadsheets such as VISICALC, LOTUS-1-2-3, EXCEL and QUATTRO-PRO and databases such as D-BASE, VISUAL FOX-PRO and ACCESS which provide convenient systems for quickly organizing information. As discussed in U.S. Pat. No. 5,502,805, entitled “SYSTEM AND METHODS FOR IMPROVED SPREADSHEET INTERFACE WITH USER-FAMILIAR OBJECTS”, typical spreadsheet programs configured the memory of the computer to resemble the column/row or grid format of an accountant's columnar pad, thus providing a visible calculator for a user in each cell of the column/row format. To communicate the location of the cell, a common scheme assigned a number to each row in the spreadsheet and a letter to each column. Thus, the cell represented the basic addressable storage location of the spreadsheet at each intersection of a row with a column. In addition to holding text descriptions and numeric data, each cell can store formulas or special instructions specifying calculations to be performed on the numbers stored in the cells. Upon receipt of new data, the formulas were automatically updated to support “what if” scenarios.
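The cell-addressing and automatic-recalculation behavior described above can be illustrated with a toy model; the class and method names are invented for illustration and are not part of any of the cited products:

```python
import re

class MiniSheet:
    """Toy column/row grid: cells are addressed like 'A1'; a cell may
    hold a number or a formula string such as '=A1+B1', and formulas
    are re-evaluated on read, so edits support 'what if' scenarios."""
    def __init__(self):
        self.cells = {}          # address -> number or '=...' formula

    def set(self, addr, value):
        self.cells[addr] = value

    def get(self, addr):
        value = self.cells.get(addr, 0)
        if isinstance(value, str) and value.startswith("="):
            # Substitute each referenced cell's (recursively evaluated)
            # value into the expression, then evaluate it.
            expr = re.sub(r"[A-Z]+[0-9]+",
                          lambda m: str(self.get(m.group(0))), value[1:])
            return eval(expr)    # toy evaluator; numeric operands only
        return value
```

Because formulas are resolved at read time, changing A1 immediately changes every cell whose formula references it, which is the "what if" recalculation the text describes.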
- Computerized spreadsheets offered many advantages over the old pen-and-paper approach. For one, these spreadsheets were capable of supporting very large spreadsheets that would be unwieldy to maintain by hand. Further, the computerized spreadsheets were capable of supporting scenario calculations where the entered information may be quickly recalculated with different assumptions. Thus, these computerized spreadsheets offered dramatic improvements in ease of creating, editing and applying mathematical models such as financial forecasting. Similarly, databases allowed users to maintain vast quantities of data and to manipulate the information via query commands. Thus, the usefulness of spreadsheets, databases and other business applications made them staple software for data summary, advanced numerical analysis and charting applications.
- Although computerized spreadsheets and databases offered significant productivity gains in modeling complex data, none was as intuitive to use as the old, but familiar, paper and pencil. To use the new technology, the user had to type information into the cells of the spreadsheet. In the hands of inexperienced users, the data entry aspect was unpleasant. Further, the verification of correct data entry was time consuming. Additionally, the user had to master many complex and arbitrary operations. For example, to find the proper commands, the user needed to traverse several nodes of a menu. Advances in computer technology had not simplified life for users, since these advances have been largely employed to build more complex functions and modeling capability into the spreadsheet with even more menus and sub-menus. Since the alternative of perusing a staggering array of incomprehensible icons was also not palatable to users, most users only used a fraction of the available commands and features. Furthermore, conventional computerized spreadsheets and databases still required users to manually enter the information.
- Additionally, applications such as spreadsheets, databases, project planning tools and CAD/CAM systems required large display areas to quickly and conveniently interact with users. However, portable computing appliances must balance the conflicting requirements of the readability of the displayed characters and the size of their display screens. On one hand, the portability requirement implied that the screen be small. On the other hand, the readability requirement pushed in the opposite direction and dictated that the display area be as large as possible. However, as computing appliances with large screens consumed more power and were more fragile, more expensive and bulkier, most portable computers offered only a small display surface. The selection of a small display size restricted the user into making undesirable choices between displaying either larger characters or more information. For busy executives, attorneys, doctors and other professionals, such restrictions were impractical. Thus, the display system needs to be portable, cost effective, and easy to use in comparison with the pen and paper approach before the conventional pen and paper method can be replaced.
- In addition to being as easy to use as the pen and paper approach, the portable computing appliance needed to provide information integration advantages, including the ability to capture data from scanners, barcode readers, or the Internet, over the cheaper pen and paper approach to further justify the expense associated with such electronic computer systems. Furthermore, as portable computers are typically deployed in field applications by service providers where employees are scattered over a wide geographic area, the information advantages arising from integrating data associated with a global positioning system (GPS) are needed in the management and control of field personnel to ensure that the employees are actually at the respective expected locations. Additionally, an ability to link information generated at the client's site with follow-up discussions and letters necessary to close the transaction is needed to enhance the efficiency of field personnel.
- The present invention provides a spreadsheet and a database on a portable computer which accepts data from an input recognizer which includes a non-cursive handwriting recognizer or a speech recognizer. The portable computer can communicate data directly with another computer or over the Internet using wireless media such as radio and infrared frequencies or over a landline. It is endowed with a plurality of built-in or snap-on expansion accessories to enhance the data capture capability as well as the ease of reading data from the limited screen of the present invention. These accessories include a camera, a scanner, a voice recorder or voice capture unit, a global positioning system (GPS) receiver and a remote large screen television. The camera and scanner allow visual data to be captured, the voice recorder allows the user to make quick verbal annotations into a solid state memory to minimize the main memory requirements, while the voice capture unit allows the voice to be captured into memory for subsequent transmission over the Internet or for voice recognition purposes. The spreadsheet or database receives data from the Internet or from the accessories and further can graph or manipulate the data entered into the spreadsheet as necessary. Furthermore, the database has a smart search engine interface which performs fuzzy searches such that inexact queries can still result in matches. The smart search engine thus allows users to locate information even though the exact spelling or concept is not known. To minimize the user's work in locating information to analyze, the spreadsheet and database can spawn and train an intelligent agent to capture data from a suitable remote source such as the Internet and transmit the data to the spreadsheet or database for further analysis. Alternatively, the user can capture data directly by scanning or dictating the information into the spreadsheet or database.
The geographical information can be generated automatically via the GPS receiver. Data from the receiver is communicated via a suitable pager or wireless transceiver back to either a mapping application or other management tools to allow management to monitor the field user's whereabouts. In another aspect of the invention, a pan and zoom capability provides the user with an appropriately scaled view of the data for ease of reading. Alternatively, when the portable computer is within range of a larger display device such as an appropriately equipped television display or a personal computer with a larger display, the present invention's wireless link transmits the video information to the larger display to allow the user to view data on the larger display unit. Similarly, when the portable computer is within range of a suitably equipped stereo receiver, the portable computer transmits MIDI data streams to the receiver such that the MIDI sound generator can produce high quality sound for multimedia applications running on the portable computer, even though the stereo receiver is not tethered to the portable computer of the present invention.
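The fuzzy matching of the smart search engine can be illustrated with a minimal sketch using plain string similarity; this is an assumption for illustration, since the disclosure does not specify the matching algorithm or the 0.6 threshold used here:

```python
from difflib import SequenceMatcher

def fuzzy_search(query, records, threshold=0.6):
    """Inexact-match lookup in the spirit of the smart search engine:
    rank records by string similarity so a misspelled query can still
    find its target; only matches above the threshold are returned."""
    scored = [(SequenceMatcher(None, query.lower(), r.lower()).ratio(), r)
              for r in records]
    return [r for score, r in sorted(scored, reverse=True)
            if score >= threshold]
```

Ranking by similarity rather than requiring exact equality is what lets a user who does not know the exact spelling still locate the record.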
- The accompanying drawings, which are incorporated in and form a part of this specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention:
- FIG. 1 is a block diagram of a portable computer system for providing data management support in accordance with the present invention;
- FIG. 1B is a flowchart illustrating a first embodiment of a file system for the computer of FIG. 1 which is IBM-PC compatible;
- FIG. 1C is a flowchart illustrating a second embodiment of the file system for the computer of FIG. 1;
- FIG. 2A is a block diagram showing in more detail a scanner from FIG. 1;
- FIG. 2B is a block diagram showing in more detail another scanner for the computer system of FIG. 1 having a wireless link;
- FIG. 3 is a block diagram showing a remote display unit with a wireless link which is adapted to communicate with the computer system of FIG. 1;
- FIG. 4 shows a block diagram showing in more detail protocol layers linking a network aware application operating on the computer of FIG. 1 and another application over the Internet;
- FIG. 5 is an illustration of a connectivity architecture of the computer system of FIG. 1 and the Internet as well as the data flow among computers connected to the Internet and the computer system of FIG. 1;
- FIG. 6 is a flowchart illustrating the process for handling events in a spreadsheet data management system in the computer system of FIG. 1;
- FIG. 7 is a flowchart illustrating the process for handling system events in FIG. 6;
- FIG. 8 is a flowchart illustrating in more detail the scroll process of FIG. 7;
- FIG. 9 is a flowchart illustrating the process for editing cell contents of FIG. 7;
- FIG. 10 is a flowchart illustrating the process to save a cell in FIG. 9;
- FIG. 11 is a flowchart illustrating the process for evaluating a formula of FIG. 10;
- FIG. 12 is a flowchart illustrating the process for handling menu events of FIG. 6;
- FIG. 13 is a flowchart illustrating the zoom process of FIG. 12;
- FIG. 14 is a flowchart illustrating the process for updating the spreadsheet cells of FIG. 6 using a remote database;
- FIG. 15 is a flowchart illustrating the process for identifying rows/columns to update in FIG. 14;
- FIG. 16 is a flowchart illustrating the process for retrieving data over a network such as the Internet using a browser;
- FIG. 16A is a flowchart illustrating the process for executing the browser of the present invention;
- FIG. 17 is flow chart of the process for scanning information using the scanner of FIG. 2A and updating the data management system of FIG. 1;
- FIG. 18 is flow chart of the process for copying information using the scanner of FIG. 2A and storing or transmitting the data using the computer system of FIG. 1;
- FIG. 19 is a flow chart of the process for linking and transmitting display information from the computer system of FIG. 1 to a larger display device for ease of reading;
- FIG. 19A is a flow chart of the process for teleconferencing with a remote user and for visually sharing an electronic chalkboard;
- FIG. 20 is a flowchart illustrating the process for capturing voice annotation using a voice recorder shown in FIG. 1;
- FIG. 21 is a flowchart illustrating the process for capturing and processing voice commands and annotations using the microphone and analog to digital converter of FIG. 1;
- FIG. 22 is a flow chart of the process for operating an intelligent agent in conjunction with the computer system of FIG. 1;
- FIG. 23 is a flowchart illustrating the process for operating a database in accordance with the computer system of FIG. 1;
- FIG. 24 is a flowchart illustrating the process for generating a form for the database of FIG. 23;
- FIG. 25 is a flowchart illustrating the process for searching the database of FIG. 23;
- FIG. 26 is a flowchart illustrating the process for using the GPS receiver of FIG. 1;
- FIG. 27 is a flowchart illustrating an agent on the computer of the present invention which prepares information needed for a meeting;
- FIG. 28 is a flowchart illustrating the process for collecting data and interacting with various information networks using the computer of the present invention during the meeting; and
- FIG. 29 is a flowchart illustrating the process for following up on outstanding action items using the data management computer of the present invention.
- FIG. 1 illustrates the computer system of the present invention for managing data. The computer system is preferably housed in a small, rectangular portable enclosure. Referring now to FIG. 1, a general purpose architecture for entering information into the data management by writing or speaking to the computer system is illustrated. In FIG. 1, a
processor 20 or central processing unit (CPU) provides the processing capability for the data management system of the present invention. The processor 20 can be a reduced instruction set computer (RISC) processor or a complex instruction set computer (CISC) processor. Preferably, the processor 20 is a low power CPU such as the MC68328V DragonBall device available from Motorola Inc. - The
processor 20 is connected to a read-only-memory (ROM) 21 for receiving executable instructions as well as certain predefined data and variables. The processor 20 is also connected to a random access memory (RAM) 22 for storing various run-time variables and data arrays, among others. The RAM 22 is sufficient to store user application programs and data. In this instance, the RAM 22 can be provided with a back-up battery to prevent the loss of data even when the computer system is turned off. However, it is generally desirable to have some type of long term storage such as a commercially available miniature hard disk drive, or non-volatile memory such as a programmable ROM (e.g., an electrically erasable programmable ROM or a flash ROM memory), in addition to the ROM 21 for data back-up purposes. The RAM 22 stores a database of the spreadsheet of the present invention, among others. - The
computer system 10 of the present invention has built-in applications stored in the ROM 21 or downloadable to the RAM 22 which include, among others, an appointment book to keep track of meetings and to-do lists, a phone book to store phone numbers and other contact information, a notepad for simple word processing applications, a world time clock which shows time around the world and city locations on a map, a database for storing user specific data, a stopwatch with an alarm clock and a countdown timer, a calculator for basic computations and financial computations, and a spreadsheet for more complex data modeling and analysis. In addition to the built-in applications, add-on applications such as time and expense recording systems taught in U.S. application Ser. No. 08/650,293, entitled “TIME AND EXPENSE LOGGING SYSTEM”, and sketching/drawing tools as filed in U.S. application Ser. No. 08/684,842, entitled “GRAPHICAL DATA ENTRY SYSTEM”, both of which are hereby incorporated by reference, can be added to increase the user's efficiency. Additionally, project planning tools, CAD/CAM systems, and Internet browsers, among others, may be added to increase the functionality of portable computing appliances. Users benefit from this software, as it allows them to be more productive when they travel as well as when they are in their offices. - The
processor 20 is also connected to an optional digital signal processor (DSP) 23 which is dedicated to handling multimedia streams such as voice and video information. The DSP 23 is optimized for video compression using JPEG/MPEG standards known to those skilled in the art. Furthermore, the DSP 23 is equipped to handle the needs of a voice recognition engine. Although the DSP 23 is shown as a separate unit from the CPU 20, the present invention contemplates that the DSP 23 can also be integrated with the CPU 20, whereby the CPU 20 can rapidly execute multiply-accumulate (MAC) instructions in either scalar or vector mode. - The computer system of the present invention receives instructions from the user via one or more switches such as push-button switches in a
keypad 24. The processor 20 is also connected to a real-time clock/timer 25 which tracks time. The clock/timer 25 can be a dedicated integrated circuit for tracking the real-time clock data, or alternatively, the clock/timer 25 can be a software clock where time is tracked based on the clock signal clocking the processor 20. In the event that the clock/timer 25 is software-based, it is preferred that the software clock/timer be interrupt driven to minimize the CPU loading. However, even an interrupt-driven software clock/timer 25 requires certain CPU overhead in tracking time. Thus, the real-time clock/timer integrated circuit 25 is preferable where high processing performance is needed. - Further, the timer portion of the clock/
timer 25 can measure a duration count spanning one or more start times and completion times, as activated by the user. The timer portion has a duration count register which is cleared upon the start of each task to be timed. Further, the timer portion of the clock/timer 25 has an active state and a suspended state. During operation, when the user toggles the timer portion into the active state, the duration count register is incremented. When the user toggles the timer portion into the suspended state, the duration count is preserved but not incremented. Finally, when the user completes a particular task, the value of the duration count register is indicated and stored in a database to track time spent on a particular task. - To provide for expandability, the
processor 20 drives a PCMCIA bus 26 which provides a high speed interface or expansion bus. The acronym PCMCIA represents both the PC Card standard (which specifies both card hardware and system software requirements) and the organization responsible for developing it. Originally, the standard was designed exclusively for memory cards (Release 1.0) used in small handheld and laptop systems in lieu of a floppy disk drive, known as Type I cards. The next PCMCIA standards (Release 2.0 and up) were expanded to include I/O cards, such as modems or network cards. A thicker Type III card and Type IV card were also defined and are often used for hard drives. Each PCMCIA slot or socket is connected to one PCMCIA adapter to control one or more slots. Further, a memory resident driver called Socket Services must be present. Once loaded, Socket Services talks directly to the PCMCIA adapter hardware, and other programs talk to Socket Services to control a PC Card in one of that adapter's slots. The PCMCIA standard also describes a software layer called Card Services, which acts as a librarian of system resources. When Card Services is started, system resources (I/O ranges, interrupts, and memory ranges) are given to it; the resources it is given are usually configurable by running a program provided with the PCMCIA system software. The program that does the resource allocation may be a stand-alone program, or it may be built into an enabler. The resources used may be listed in a file, or might be specified on the command line of the enabler or Card Services. As each PC Card is inserted in the system, Card Services hands out these resources as needed to configure the card; when cards are removed, these resources are returned to Card Services for reuse. This allows any combination of cards to work without conflict. All Socket Services drivers must be loaded before Card Services, because Card Services uses Socket Services to access the adapter hardware. 
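The "librarian" behavior of Card Services described above can be sketched as follows; this is a simplified illustration (real Card Services manages I/O ranges, interrupts, and memory windows through a binary driver API, not a Python object):

```python
class CardServices:
    """Sketch of the Card Services resource librarian: it is seeded
    with system resources (IRQs, I/O ranges), hands them out as PC
    Cards are inserted, and takes them back for reuse on removal."""
    def __init__(self, irqs, io_ranges):
        self.free = {"irq": list(irqs), "io": list(io_ranges)}
        self.assigned = {}       # card name -> {"irq": ..., "io": ...}

    def insert(self, card):
        if not (self.free["irq"] and self.free["io"]):
            raise RuntimeError("no free resources for " + card)
        self.assigned[card] = {"irq": self.free["irq"].pop(0),
                               "io": self.free["io"].pop(0)}
        return self.assigned[card]

    def remove(self, card):
        res = self.assigned.pop(card)
        self.free["irq"].append(res["irq"])
        self.free["io"].append(res["io"])
```

Because removed cards return their resources to the free pool, any combination of cards can share the same small set of IRQs and I/O windows without conflict, as the text describes.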
The PCMCIA port of the present invention can accept hard disk drives such as ATA compatible hard drives. ATA stands for AT Attachment (IBM AT personal computer attachment), and is an interface which is electrically identical to a common hard disk interface. ATA mass storage devices, whether mechanical hard disks or solid-state memory cards which appear as disk drives, require another driver to be loaded in the system. ATA drivers must be loaded after Socket Services and Card Services. When hard disk drives are used, a filing system is provided. In the event that the disk drive is a solid state disk drive employing flash memory or other non-volatile memory, a Flash Filing System (FFS) is provided to handle the peculiarities of flash memory cards, including a limited write cycle, often on the order of 10,000 writes or so before wearing out, as well as the delay associated with erasing and rewriting information on these cards. The FFS driver performs wear-balancing to avoid wearing out the media prematurely, and works to hide performance delays in writing to the card. - Via the
PCMCIA bus 26, the computer system can also acquire visual information via a charge-coupled device (CCD) or a CIS unit 27. The CCD/CIS unit 27 is further connected to a lens assembly 28 for receiving and focusing light beams onto the CCD or CIS for digitization. The CCD/CIS unit 27 thus can be either a digital camera or a page scanner, as shown in more detail in FIG. 2. Images scanned via the CCD/CIS unit 27 can be compressed and transmitted via a suitable network such as the Internet, via cellular telephone channels, or via facsimile to a remote site. - In the event where the CCD/
CIS unit 27 is a camera and where the application is videoconferencing, the CPU 20 and/or the DSP 23 operate to meet the ITU's H.324 standard on multimedia terminals for low-bit-rate visual services over analog telephony. Preferably, the DSP 23 supports H.261 video encoding and decoding at CIF resolution of up to 15 frames per second, H.263 video and G.723 audio, MPEG-1 audio and video play-back, MPEG-1 video encoding, JPEG and motion JPEG, and advanced motion estimation, input and output video scaling, noise-reduction filters and forward error correction. - The
PCMCIA expansion bus 26 is also adapted to receive a radio tuner or a TV tuner 29, which is in turn connected to a built-in antenna. The radio/TV tuner 29 receives radio and/or TV signals and digitizes the information for suitable processing by the CPU 20. In this manner, the user of the computer 10 can listen to the radio or watch television while he or she works. The PCMCIA bus 26 is also adapted to receive a data storage device, or disk 30. Additionally, the PCMCIA bus 26 can receive a wireless transceiver 31, which is connected to an antenna 32. The wireless communication device 31 satisfies the need for access to electronic mail, paging, modem/facsimile services, and remote access to home computers and the Internet. One simple form of wireless communication device 31 is an analog cellular telephone link, where the user simply accesses a cellular channel much as in making a regular voice call. However, the transmission of digital data over an analog cellular telephone network can give rise to data corruption. Digital wireless networks such as cellular digital packet data (CDPD) can be used instead. CDPD provides data services on a non-interfering basis with existing analog cellular telephone services. In addition to CDPD, a communication service called Personal Communication Services (PCS) allows wireless access into the public switched telephone network. - The two-
way communication device 31 can also be a two-way pager, where the user can receive as well as transmit messages. The two-way communication device supports the Telocator Data Protocol of the Personal Communications Industry Association for forwarding binary data to mobile computers. The standard facilitates transmission of images and faxes over paging and narrowband PCS networks. Alternatively, the two-way communication device 31 can be substituted with a cellular telephone, whose block diagram and operation are discussed in detail in co-pending U.S. application Ser. No. 08/461,646, hereby incorporated by reference. - In the event that two-way pagers are used, the present invention contemplates that the two-
way communication device 31 be compatible with pACT (personal Air Communications Technology), an open standard developed by Cirrus Logic—PCSI and AT&T Wireless Services Inc. (Kirkland, Wash.). pACT is a narrowband, 900 MHz range PCS technology derived from the cellular digital packet data transmission standard. pACT is optimized for applications such as acknowledgment paging, mobile e-mail, wireless Internet access, voice paging, personal home security, and dispatch services. Based on the Internet IP standard, pACT provides full data encryption and authentication, enabling efficient delivery of reliable and secure messages. Alternatively, in place of pACT, the REFLEX protocol from Motorola Inc. may be used. The REFLEX protocol is commercially supported by SkyTel-MTel Corporation. - The two-
way communication device 31 has a receiver, a transmitter, and a switch, all of which are controlled by the CPU 20 via the bus of the portable computer system of FIG. 1. The switch receives an input from the antenna 32 and appropriately routes the radio signal from the transmitter to the antenna 32 or, alternatively, the radio signal from the antenna 32 to the receiver in the event the processor 20 is expecting a message. Via the bus 26, the processor 20 controls the receiver, the transmitter, and the switch to coordinate the transmission and receipt of data packets. The receiver and transmitter are standard two-way paging devices or standard portable cellular communication chips available from Motorola, Inc. in Schaumburg, Ill. or Philips Semiconductors in Sunnyvale, Calif. The antenna 32 is preferably a loop antenna using flat-strip conductors such as printed circuit board wiring traces, as flat-strip conductors have lower skin-effect loss than antennas with round-wire conductors. - Discussed next is the structure of the data packet transmitted to and received from the two-
way communication device 31. A plurality of fields are provided, including a header field, a destination address field, a source address field, a date/time stamp field, a cyclic redundancy check field, and a data field. As is known in the art, the header field functions as a synchronization field. In the transmitting direction from the base station to the two-way communication device 31, the destination address field specifies the unique address of the receiving two-way communication device 31. In the same direction, the source address field specifies the base station identification address, which may change to account for base station rerouting in the event that two-way communication device roaming is allowed. In the transmitting direction from the two-way communication device 31 to the base station during a reply session, the source address field contains the address of the two-way communication device 31 to permit the base station to match reply pages to a previously sent page. - The date/time stamp field contains data reflecting the second, minute, hour, day, month and year of the transmitted packet. The date/time stamp information allows the base station to determine whether to send a time-out message to the individual initiating the page request in the event that the two-
way communication device 31 does not timely acknowledge receipt of the page message. The cyclic redundancy check (CRC) field allows the two-way communication device 31 to verify the integrity of the messages it receives and transmits. The data status field includes coherency information enabling the host computer and the computer of FIG. 1 to synchronize data, carrying information such as whether the data has been modified, the staleness of the data, and the replacement status of the data, among others. The data field is a variable-length field which allows variable-length messages to be transmitted to and from the two-way communication device 31. - In the two-way paging embodiment using the two-
way communication device 31, the user can be paged and can reply as well. A page originator wishing to send an alphanumeric page to the user of the computer system of FIG. 1 places a call using a telephone or an originating computer. The telephone interface routes the call to the base station. The computer at the base station will either digitally or verbally query the page originator to enter an identification number, such as the telephone number of the two-way communication device 31. After entry of the identification, the base station computer further prompts the page originator for the message to be sent as well as the reply identification number, or call-back number. Upon receipt of the paging information, the base computer transmits a page message to the computer system with the two-way communication device 31 of FIG. 1, which may reply using predetermined text templates or, more flexibly, using keyboard, voice, handwriting, sketching or drawing, as discussed in the incorporated-by-reference U.S. patent application Ser. No. 08/684,842, entitled "GRAPHICAL DATA ENTRY SYSTEM." - Preferably, the present invention is compatible with Always-On-Always-Connected (AOAC) mobile clients connected to the Internet via wireless communications. The wireless messaging networks based on GSM SMS, pACT, REFLEX, and similar narrowband two-way paging services differ significantly from existing packet networks in that (1) packet sizes are small (typically around 100 bytes); (2) typical latency is much longer (on the order of seconds to minutes); and (3) the connection is much more intermittent. Narrowband Sockets (NBS) was created to overcome these limitations by providing fragmentation and reassembly, reliability, and tolerance for intermittency, complementing circuit-switched connections with low-bandwidth connections over wireless messaging networks.
The NBS enables a new class of mobile usage, AOAC, with applications such as automatic (background) forwarding of e-mail, up-to-date news, weather, traffic, and personal messaging. It enables the existing cellular and wireless messaging infrastructure to send arbitrary data, rather than just alphanumeric pages. Thus, the present invention, in conjunction with NBS, allows data to find the user, rather than the user always having to initiate the retrieval of the information.
- The Datagram Protocol for NBS is a general-purpose, unreliable, connectionless datagram service. It is not intended for applications which require 100% reliable delivery, such as file transfers. NBS datagrams are usually formatted to be readable on devices without NBS; this is useful for sending text-based messages to legacy pagers and voice phones. NBS has a core set of required features that must be supported in order to provide consistent functionality to developers. However, each narrowband network has different features and header formats. The NBS stack will use existing transport features to implement these core requirements where possible. The NBS stack queries the hardware interface (NB-SERIAL NDIS miniport) for available network features. If required features such as ports and fragmentation are not supported by the network, then the NBS stack adds these features to the payload of each message. Datagram packets are transferred using the message services of an underlying network, where typically the bandwidth is small and communication is wireless. The protocol assumes that device addressing (the Destination Address and the Originating Address) is handled at a higher level, and adds only those features necessary (generally port-level addressing and fragmentation). In order to implement an NBS datagram protocol, the following fields are necessary: DstPort—the logical port of the destination application (dd); SrcPort—the logical port of the sender application (as in UDP) (oo); RefNum—the sequence number of the datagram (kk); MaxNum—the total number of fragments in a single datagram (mm); SeqNum—the sequence number of the fragment within the datagram (nn). The datagram protocol headers are part of the data segment of the NBS message fragment. The headers can be presented in a few formats, depending on the environment used. In the Windows environment, the NBS stack queries the hardware driver (through NDIS) for the current network and packet format.
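The role of the five header fields above can be illustrated with a short sketch. The function and field names follow the text; the dictionary representation and the 100-byte fragment limit are illustrative assumptions, not the NBS wire format:

```python
def fragment(payload, dst_port, src_port, ref_num, max_data=100):
    """Split one datagram into fragments. Each fragment carries DstPort,
    SrcPort, the datagram's RefNum, the total fragment count MaxNum, and
    its own SeqNum so the receiver can reassemble out of order."""
    chunks = [payload[i:i + max_data]
              for i in range(0, len(payload), max_data)] or [b""]
    return [{"DstPort": dst_port, "SrcPort": src_port, "RefNum": ref_num,
             "MaxNum": len(chunks), "SeqNum": seq, "Data": chunk}
            for seq, chunk in enumerate(chunks, start=1)]

def reassemble(fragments):
    """Rebuild the datagram once all MaxNum fragments sharing a RefNum
    have arrived, ordering them by SeqNum."""
    fragments = sorted(fragments, key=lambda f: f["SeqNum"])
    if len(fragments) != fragments[0]["MaxNum"]:
        raise ValueError("datagram incomplete")
    return b"".join(f["Data"] for f in fragments)
```

Because each fragment names both its datagram (RefNum) and its position (SeqNum of MaxNum), delivery over an intermittent, out-of-order messaging network can still yield the original payload.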
Messages sent shall primarily use either the binary format or the full text-based header, including the concatenation scheme.
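The two presentations can be pictured as follows. The separator characters and field widths below are assumptions made for illustration; they follow the field order annotated above (dd, oo, kk, mm, nn) but are not the actual NBS header syntax:

```python
import struct

def text_header(dst_port, src_port, ref_num, max_num, seq_num):
    """Render the five datagram fields as a human-readable text header,
    usable even on devices that do not implement NBS (widths assumed)."""
    return "//%04X%04X %02X%02X%02X " % (dst_port, src_port,
                                         ref_num, max_num, seq_num)

def binary_header(dst_port, src_port, ref_num, max_num, seq_num):
    """Pack the same fields compactly: two 16-bit ports followed by
    three 8-bit counters (widths again assumed)."""
    return struct.pack(">HHBBB", dst_port, src_port,
                       ref_num, max_num, seq_num)
```

The text form is legible on a legacy pager display, while the binary form spends only seven of the roughly one hundred payload bytes on the header.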
- The
processor 20 of the preferred embodiment accepts handwriting as an input medium from the user. A digitizer 34, a pen 33, and a display LCD panel 35 are provided to capture the handwriting. Preferably, the digitizer 34 has a character input region and a numeral input region which are adapted to capture the user's handwritten words and numbers, respectively. The LCD panel 35 has a viewing screen exposed along one of the planar sides of the enclosure. The combination of the digitizer 34, the pen 33 and the LCD panel 35 serves as an input/output device. When operating as an output device, the screen 35 displays computer-generated images developed by the CPU 20. The LCD panel 35 also provides visual feedback to the user when one or more application programs execute. When operating as an input device, the digitizer 34 senses the position of the tip of the stylus or pen 33 on the viewing screen 35 and provides this information to the computer's processor 20. In addition to the vector information, the present invention contemplates that display assemblies capable of sensing the pressure of the stylus on the screen can be used to provide further information to the CPU 20. - The preferred embodiment accepts pen strokes from the user using the stylus or
pen 33, which is positioned over the digitizer 34. As the user "writes," the position of the pen 33 is sensed by the digitizer 34 via an electromagnetic field as the user writes information to the data management computer system. The digitizer 34 converts the position information to graphic data that is transferred to the graphic processing software of the data logging computer system. The data entry/display assembly of pen-based computer systems permits the user to operate the data logging computer system as an electronic notepad. For example, graphical images can be input into the pen-based computer by merely moving the stylus over the surface of the screen. As the CPU 20 senses the position and movement of the stylus, it generates a corresponding image on the screen to create the illusion that the pen or stylus is drawing the image directly upon the screen. The data on the position and movement of the stylus is also provided to handwriting recognition software, which is stored in the ROM 21 and/or the RAM 22. The handwriting recognizer suitably converts the written instructions from the user into text data suitable for saving time and expense information. The process of converting the pen strokes into equivalent characters and/or drawing vectors using the handwriting recognizer is described below. - With suitable recognition software, text and numeric information can also be entered into the pen-based computer system in a similar fashion. For example, U.S. Pat. No. 5,463,696, issued on Oct. 31, 1995 to Beernink et al. and hereby incorporated by reference, discloses a technique for analyzing and interpreting cursive user inputs to a computer, such as strokes or key depressions. Inputs to the system are received at a user interface, such as a dual-function display/input screen, from users in the form of pen strokes or gestures. A database stores cursive input data strokes and hypotheses regarding possible interpretations of the strokes.
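The capture step described above can be summarized in a brief sketch. The sample format (x, y, pen-down flag) is a hypothetical simplification of what a digitizer driver delivers:

```python
def strokes_from_samples(samples):
    """Group (x, y, pen_down) samples from the digitizer into strokes.
    A stroke is the run of positions between pen-down and pen-up; these
    runs are what downstream software renders as 'ink' on the screen
    and hands to the handwriting recognizer."""
    strokes, current = [], []
    for x, y, pen_down in samples:
        if pen_down:
            current.append((x, y))
        elif current:            # pen lifted: close the current stroke
            strokes.append(current)
            current = []
    if current:                  # flush a stroke still open at the end
        strokes.append(current)
    return strokes
```

A pressure-sensing display assembly would simply carry one more value per sample; the grouping logic is unchanged.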
Recognition of the input strokes, and recognition of higher-level combinations of strokes forming characters and words, is performed using recognizers, or recognition domains, each of which performs a particular recognition task. A controller is provided for controlling the hypothesis database and for scheduling the recognition tasks in the recognition domains. Arbitration resolves conflicts among competing hypotheses associated with each interpretation. The recognition domains, or recognizers, generate two or more competing interpretations for the same input. The recognizers use a data structure called a unit, where a unit is a set of sub-hypotheses together with all of their interpretations generated by a single recognizer.
- In Beernink, the handwriting recognizer operates at a first level for identifying one or more groups of related sub-hypotheses using grouping knowledge. These grouped sub-hypotheses generate a unit with no interpretations for each group, and the unit is stored in the database in what is called a piece-pool memory. The Beernink recognizer has a second level of operation where each unit generated in the grouping stage is classified to provide the unit with one or more interpretations. The classified units are stored in a unit-pool memory. Two or more interpretations of the input data are combined in a hierarchical structure according to a predetermined scheme in successive steps to form higher-level interpretations.
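The two-level flow just described can be sketched as follows. The fixed group size and the classifier callback are stand-ins for the grouping and classification knowledge in the Beernink recognizer, which is far richer:

```python
class Unit:
    """A unit: a set of sub-hypotheses plus the interpretations that a
    single recognizer has generated for them."""
    def __init__(self, sub_hypotheses):
        self.sub_hypotheses = sub_hypotheses
        self.interpretations = []      # empty until classification

def group(strokes, piece_pool):
    """Level 1: grouping knowledge (here, a toy fixed group size of 2)
    yields units with no interpretations, stored in the piece pool."""
    for i in range(0, len(strokes), 2):
        piece_pool.append(Unit(strokes[i:i + 2]))

def classify(piece_pool, unit_pool, classifier):
    """Level 2: each grouped unit is classified, gaining one or more
    competing interpretations, and is stored in the unit pool."""
    for unit in piece_pool:
        unit.interpretations = classifier(unit.sub_hypotheses)
        unit_pool.append(unit)
```

Successive levels would then combine the classified units' interpretations into character-, word-, and phrase-level hypotheses, with arbitration choosing among the competitors.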
- Although the Beernink recognizer is flexible and does not require that the user learn special gestures, its accuracy is not perfect. Because the letters in a cursive word are connected, the recognizer must guess at how to segment the strokes into individual characters. Since ambiguities exist even in stellar samples of penmanship, cursive handwriting recognizers such as those in Beernink face a challenging task in deciphering handwriting. For example, handwriting recognizers have difficulty determining where a cursive lower-case "n" and "m" begin and end when the two letters, distinguishable from one another only by their number of humps, are strung together in a word. The handwriting recognizer tackles such ambiguities by looking in its dictionary to determine, for instance, which words have "m" before "n" and which have "n" before "m." The user can improve accuracy by writing characters further apart than usual, but that is inconvenient and contrary to the way humans write.
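The "n"/"m" ambiguity can be made concrete with a toy dictionary lookup. The hump counts and the dictionary below are invented for illustration:

```python
HUMPS = {"n": 2, "m": 3}   # humps per cursive letter (illustrative)

def segmentations(total_humps):
    """All letter sequences over {n, m} whose hump counts sum to the
    observed total; each is one possible segmentation of the ink."""
    if total_humps == 0:
        return [""]
    results = []
    for ch, humps in HUMPS.items():
        if humps <= total_humps:
            results += [ch + rest
                        for rest in segmentations(total_humps - humps)]
    return results

def disambiguate(total_humps, dictionary):
    """Keep only segmentations that form words in the dictionary, as the
    recognizer does when deciding between 'mn' and 'nm' orderings."""
    return [w for w in segmentations(total_humps) if w in dictionary]
```

Five humps, for instance, could be "nm" or "mn"; only a dictionary (or surrounding context) can say which string the writer intended.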
- Preferably, the handwriting recognizer of the present invention recognizes non-cursive characters. Unlike the Beernink approach, in which the user can print or write cursively, the non-cursive handwriting recognizer requires the user to learn to print characters in its fixed style using a basic character set, preferably a 36-character alphanumeric character set. In addition to the basic 26 letters and 10 digits, the non-cursive handwriting recognizer includes multi-step pen strokes that can be used for punctuation, diacritical marks, and capitalization. Preferably, the non-cursive handwriting recognizer is a software module called GRAFFITI, commercially available from U.S. Robotics, Palm Computing Division, located in Los Altos, Calif. Each letter in the non-cursive alphabet is a streamlined version of the standard block character—the letter A, for example, looks like a pointy croquet hoop, and the hoop must be started at the dot indicator at the lower right corner—as illustrated and discussed in more detail in the above incorporated-by-reference U.S. patent applications. By restricting the way the user writes, the non-cursive handwriting recognizer achieves higher recognition accuracy and, as with stenography, supports an alphabet consisting of characters that can be written much more quickly than conventional ones.
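A hedged sketch of why a fixed, single-stroke alphabet recognizes so reliably: each character reduces to a short chain code of pen directions, matched against exactly one template per character, so there is no segmentation guesswork. The templates below are invented for illustration and are not the actual GRAFFITI strokes:

```python
# Illustrative single-stroke templates (y increases upward here),
# not the real GRAFFITI alphabet.
TEMPLATES = {"A": "UD", "L": "DR", "T": "RD"}

def chain_code(points):
    """Quantize a stroke into its sequence of dominant pen directions,
    collapsing repeated directions."""
    code = ""
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx, dy = x1 - x0, y1 - y0
        if abs(dx) >= abs(dy):
            step = "R" if dx > 0 else "L"
        else:
            step = "U" if dy > 0 else "D"
        if not code or code[-1] != step:
            code += step
    return code

def recognize(points):
    """Match the stroke's chain code against the fixed templates."""
    code = chain_code(points)
    for char, template in TEMPLATES.items():
        if code == template:
            return char
    return None
```

An "A" written as the prescribed pointy hoop, up then down, always yields the code "UD"; a cursive recognizer, by contrast, must entertain many competing segmentations of the same ink.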
- The computer system is also connected to one or more input/output (I/O)
ports 42 which allow the CPU 20 to communicate with other computers. Each of the I/O ports 42 may be a parallel port, a serial port, or alternatively a proprietary port to enable the computer system to dock with the host computer. In the event that the I/O port 42 is housed in a docking port 84 (FIG. 5), then after docking, the I/O ports 42 and software located on a host computer 82 (FIG. 5) support an automatic synchronization of data between the computer system and the host computer. During operation, the synchronization software runs in background mode on the host computer 82 and listens for a synchronization request or command from the computer system 10 of the present invention. Changes made on the computer system and the host computer will be reflected on both systems after synchronization. Preferably, the synchronization software synchronizes only the portions of the files that have been modified, to reduce updating times. - The I/
O port 42 is preferably a high-speed serial port such as an RS-232 port, a Universal Serial Bus, or a Fibre Channel for cost reasons, but can also be a parallel port for a higher data transfer rate. Preferably, the I/O port 42 has a housing which is adapted to snappably connect to the housing of a Musical Instrument Digital Interface (MIDI) player 37, a fax modem 40, a voice recorder 43, a GPS receiver 46 or a barcode reader 48. When the I/O port 42 connects to the MIDI player 37, the computer system 10 drives high-quality audio speakers via the MIDI player 37 to support multimedia applications on the computer 10. - Originally developed to allow musicians to connect synthesizers together, the MIDI protocol is used in generating sound for games and multimedia applications. One advantage of MIDI is storage space, as MIDI data files are quite small when compared with sampled audio sounds. The reduction in storage space follows from the fact that a MIDI file does not contain sampled audio data, but rather the instructions needed by a synthesizer to play the sound. Other advantages of MIDI are the ability to edit the file or to change the speed, pitch or key of the sound. The output of the
MIDI player 37 is provided to an external multi-timbral MIDI synthesizer which can play many instruments, such as piano, bass and drums, simultaneously. The output of the MIDI player 37 can be connected to the synthesizer by wire or wirelessly, such as by infrared communication. In this manner, the MIDI player 37 generates high-quality sound to enhance the user experience. - Additionally, via the
serial port 42, a fax-modem 40 is adapted to receive information over a telephone 41 via a plain old telephone system (POTS) landline or over radio frequencies, allowing the user to access information untethered. Further, the modem 40 may serve as part of a wide area network to allow the user to access additional information. The fax-modem 40 can receive drawings and text annotations from the user and send the information over a transmission medium, such as the telephone network or the wireless network, to another modem or facsimile receiver, allowing the user to transmit information to the remote site on demand. The fax-modem 40 can be implemented in hardware, or in software with a few additional components such as a DAA, as is known in the art. Preferably, the fax-modem 40 is a 56 kbps modem using Lucent Technologies' DSP1643, a member of the Apollo modem chip set. The Lucent Technologies modem chips are designed to accommodate software upgrades for future enhancements to V.flex2 technology from Lucent, so customers' investments will be protected as standards for 56 kbps modems evolve. Using the device, the present invention can achieve Internet connections at rates up to 56 kbps when both the user and the user's Internet service providers (ISPs) or online service providers (OSPs) use V.flex2-compatible modems. Alternatively, the fax-modem device 40 can be a two-way communication device which can receive text messages and graphics transmitted via radio frequency to the user for on-the-spot receipt of messages. - The fax-
modem 40 can also be a digital simultaneous voice and data (DSVD) modem, also available from Lucent Technologies. The modem, as specified in the ITU G.729 and G.729A specifications, appears as a conventional modem with a downline phone which allows users to place and carry on telephone conversations while exchanging digital data on a single phone line. These modems multiplex voice and data by capturing voice and digitally compressing it. Next, the compressed voice data and the digital data are multiplexed and transmitted to the remote end, which, if compatible, decompresses the voice data and converts it back to sound. Further, the digital data is presented to the remote computer as usual. In this manner, the DSVD modem enables audiographic conferencing systems that rely on modems for data communication. DSVD modems are utilized in the blackboard conferencing system of the present invention, as discussed in more detail below. - The I/
O port 42 can also receive a voice recorder 43. Voice can be captured, digitally processed by the DSP 23 or the CPU 20, and stored internally in the RAM 22 for conversion into text or for inclusion in a document or file to be transmitted via a suitable network such as the Internet to a remote site for review, as discussed below. However, voice data can be stored externally and more economically using the voice recorder 43, which stores audio information in its own storage rather than the RAM 22 and thus can operate independently of the computer system of FIG. 1. Preferably, the voice recorder 43 is an ISD33240 from Information Storage Devices Inc. in San Jose, Calif. The voice recorder 43 captures analog sound data and stores the analog signals directly into solid-state electrically erasable programmable ROM (EEPROM) memory cells which have been adapted to store 256 different voltage levels per cell. As the voice recorder 43 captures voice and audio signals directly into its EEPROM cells, an analog-to-digital conversion process is not needed. The CPU 20 communicates with the voice recorder 43 by sending it an address along with other control signals. In this manner, the CPU 20 can control the location where sound is to be played and/or recorded. Furthermore, as the voice recorder 43 can operate even when it is detached from the computer system of the present invention, the user can simply separate the computer system and carry only the voice recorder 43 when necessary. - The
voice recorder 43 is connected to the processor 20 via the I/O port 42. The I/O port 42 is connected to the CPU 20 via the bus and can forward commands from the processor 20 to the voice recorder 43. Preferably, the voice recorder 43, the microphone 44 and the speaker 45 are located in an external housing which snappably connects to the housing of the computer 10. Through the I/O port 42, the voice recorder 43 can be commanded by the CPU 20 to play or record audio segments at specific cell addresses when particular conditions are met. Furthermore, via a message management record, as known to those skilled in the art, the CPU 20 can edit, delete, or supplement the messages stored by the voice recorder 43 on the fly. - Although the voice recorder is normally controlled by the
CPU 20, the voice recorder 43 also has one or more switches (not shown) to allow the user to manually operate the voice recorder 43 in the event that it has been ejected from the computer system. The switches provide user-selectable options which include: "Goto Begin", "Skip to Next", "Record", "Stop", "Play Next", and "Play Last." In this manner, even when the voice recorder 43 is separated from the computer of the present invention, the user can still use the voice recorder 43 in a stand-alone mode. - Additionally, a global positioning system (GPS)
receiver 46 is connected to the I/O port 42 to sense the physical position of the user. The GPS receiver 46 senses positional data from a constellation of 24 satellites orbiting the earth, arranged such that at least four satellites are visible at a time. The GPS receiver 46 provides a stream of data to the processor 20 which includes latitude, longitude, elevation and time information. The GPS receiver 46 is available from a number of sources, including NavTech Corporation and Rockwell International Corporation. - Furthermore, a wand or a bar-
code reader 48 can be removably attached to the I/O port 42 to allow the data management computer of the present invention to read bar codes. The wand is a pen-type scanner that requires physical contact with the bar code when scanning. In contrast, a laser bar code scanner is a non-contact scanner which uses a laser beam to read a bar code. Due to the active laser power, the laser bar code scanner is better at reading bar codes from a distance or when the bar code itself is poorly printed. The bar code reader 48 is snappably attached to the I/O port 42 such that the barcode reader 48 can be quickly attached and removed, as necessary. The barcode reader 48 captures the bar-code information from a barcoded label and converts the optically encoded information to serial data before it is transmitted to the computer of FIG. 1. Alternatively, the present invention contemplates that the wired link can be replaced by a wireless link such as radio or infrared. In such instances, the barcode reader 48 has an additional transceiver, which may be either radio-based or infrared-based, and which can transmit captured data to the computer of FIG. 1 for subsequent processing. - Additionally, an
infrared transceiver 49 can be connected directly to the bus of the computer 10 or to the I/O port 42 (not shown) to provide an infrared link to a nearby personal computer which is equipped with a corresponding infrared transceiver. The infrared transceiver is available from suppliers such as Hewlett-Packard, IBM, or Siemens. In the event that the IR transceiver 49 is directly connected to the bus of the computer system 10, the transceiver 49 provides the received optical data to a Universal Asynchronous Receiver/Transmitter (UART) which converts the data into a suitable format for the bus. - Additionally, to improve the ease of reading from the
small screen 35 of the computer of the present invention, a remote, large display device 52 is wirelessly linked to the computer 10 via the IR transceiver 49 or a radio transceiver 31. The large display device 52 can be a suitably equipped television receiver with a wireless link and a video generator, as discussed further in FIG. 3, or it can simply be the display of a conventional personal computer having a matching transceiver. The large display device 52 thus enlarges the characters onto an easier-to-read display. Additionally, the large display device 52 can offer higher resolution than is available through the LCD display 35. In such a case, the computer 10 is suitably informed so that software running on the computer 10 can change its display interface to take advantage of the higher resolution, as discussed in FIG. 19. - In addition to the
large display device 52, the present invention also supports a remote stereo amplifier 93 and speakers, since the computer 10 cannot support high-power amplifiers and speakers onboard. In the remote stereo 93, a receiver is provided to receive data transmissions from either the IR transceiver 49 or the wireless transceiver 31. Preferably, the stereo amplifier is a MIDI-compatible synthesizer or sound module. The MIDI protocol provides an efficient format for conveying musical performance data. Due to MIDI's more efficient data storage format, only a portion of the bandwidth of the transceivers is needed. The remote stereo 93 in turn can include a MIDI sequencer, which allows MIDI data sequences to be captured, stored, edited, combined, and replayed. The MIDI data stream is provided to a MIDI sound generator which responds to the MIDI messages by playing sounds. The present invention also contemplates more elaborate remote stereo MIDI setups, where the music can be composed to have different parts for different instruments. In such a setup, a different sound module is used to play each part; however, sound modules that are capable of playing several different parts simultaneously, or multi-timbral modules, can also be used. To allow realistic sounds to be reproduced, the stereo 93 drives a pair of speakers. In this manner, the remote stereo unit 93 receives MIDI commands from the processor 20 and plays high-quality sound on the speakers. - In addition to the handwriting recognizer previously discussed, voice recognition can be used in conjunction with and/or in place of the handwriting recognizer of the present invention. As shown in FIG. 1, a
microphone 51 is connected to an analog-to-digital converter (ADC) 50 which interfaces with the central processing unit (CPU) 20. Furthermore, a speech recognizer is stored in the ROM 21 and/or the RAM 22. The speech recognizer accepts the digitized speech from the ADC 50 and converts the speech into equivalent text. As disclosed in U.S. application Ser. No. 08/461,646, filed Jun. 5, 1995 by the present inventor and hereby incorporated by reference, the user's speech signal is first presented to a voice feature extractor which extracts features using linear predictive coding, the fast Fourier transform, an auditory model, a fractal model, a wavelet model, or combinations thereof. The input speech signal is compared with word models stored in a dictionary using a template matcher, a fuzzy logic matcher, a neural network, a dynamic programming system, a hidden Markov model (HMM), or combinations thereof. The word models are stored in a dictionary with an entry for each word, each entry having word labels and a context guide. Next, a word pre-selector receives the output of the voice feature extractor and queries the dictionary to compile a list of candidate words with the most similar phonetic labels. These candidate words are presented to a syntax checker for selecting a first representative word from the candidate words, as ranked by the context guide and the grammar structure, among others. The user can accept or reject the first representative word via a voice user interface. If rejected, the voice user interface presents the next likely word selected from the candidate words. If all the candidates are rejected by the user, or if the word does not exist in the dictionary, the system can generate a predicted word based on the labels. Finally, the voice recognizer also allows the user to manually enter the word or spell it out for the system.
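The pre-selection and syntax-checking stages above can be sketched briefly. The dictionary entries, phonetic labels, and similarity measure below are illustrative placeholders for the models of the incorporated application:

```python
import difflib

# Dictionary: word -> (phonetic label, context guide), all illustrative.
DICTIONARY = {
    "send":  ("S EH N D",   "verb"),
    "spend": ("S P EH N D", "verb"),
    "sand":  ("S AE N D",   "noun"),
}

def preselect(observed_label, n=3):
    """Word pre-selector: rank dictionary entries by similarity between
    the observed phonetic label and each stored label, and return the
    n best candidate words."""
    scored = [(difflib.SequenceMatcher(None, observed_label, label).ratio(), word)
              for word, (label, _) in DICTIONARY.items()]
    return [word for _, word in sorted(scored, reverse=True)[:n]]

def syntax_check(candidates, expected_part_of_speech):
    """Syntax checker: keep candidates whose context guide fits the
    grammar of the surrounding utterance."""
    return [w for w in candidates
            if DICTIONARY[w][1] == expected_part_of_speech]
```

The first surviving candidate would be offered to the user as the representative word, with the remaining candidates held in reserve should the user reject it.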
In this manner, a robust and efficient human-machine interface is provided for recognizing speaker independent, continuous speech and for converting the verbal instructions from the user into text data suitable for data management purposes. - The casing design and layout of the shell for the computer of FIG. 1 are discussed next. Preferably, the case is a rectangular plastic casing with a major opening on the top of the case to receive the
LCD panel 35 and the digitizer 34. The case has a receptacle which is adapted to receive and store the pen 33. Furthermore, a plurality of push-buttons in the keypad 24 are positioned on the top side of the case. The push-buttons of the keypad 24 preferably allow the user to invoke one or more pre-installed software applications on the computer of FIG. 1. Additionally, the case has an opening on the backside which is adapted to receive a connector carrying the electrical impulses to and from the I/O port 42. To electrically access the I/O port 42, a snap-on block is provided which interlocks with the bottom of the computer and which is electrically connected to the I/O port 42. As noted above, the casing for FIG. 1 resembles the Pilot handheld computer available from Palm Computing—US Robotics. - To better explain the operation of the software processes of the present invention, the operation of typical routines executing on the Pilot computer is discussed next. As detailed in a manual from Pilot entitled “PalmOS Developing Palm OS Application—Part 1: System and User Interface Management,” when an application receives an action code Launch Command, it begins with a startup routine, then goes into an event loop, and finally exits with a stop routine. The Startup Routine is the application's opportunity to perform actions which need to happen once, and only once, at startup. A typical startup routine opens databases, reads saved state information (such as UI preferences) and initializes the application's global data. The Event Loop fetches events from the queue and dispatches them, taking advantage of default system functionality as appropriate. The Stop Routine is the application's opportunity to perform cleanup activities before exiting. Typical activities include closing databases and saving state information. During the startup routine, an application has to follow these steps:
- 1. Get system-wide preferences (for example for numeric or date and time formats) and use them to initialize global variables that will be referenced throughout the application;
- 2. Find the application database by creator type. If none exists, create it and initialize it;
- 3. Get application-specific preferences and initialize related global variables; and
- 4. Initialize any other global variables.
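A minimal sketch of these four startup steps, using hypothetical in-memory stand-ins for the system preference store and the database manager (none of these names are Palm OS calls):

```python
# Hypothetical stand-ins for the system preference store, the database
# manager keyed by creator type, and the application preference store.
system_prefs = {"date_format": "M/D/Y", "number_format": "1,000.5"}
databases = {}
app_prefs_store = {"myAp": {"last_form": "main"}}

globals_state = {}

def startup(creator_type="myAp"):
    # 1. System-wide preferences into globals referenced throughout the app.
    globals_state["date_format"] = system_prefs["date_format"]
    # 2. Find the application database by creator type; create it if missing.
    if creator_type not in databases:
        databases[creator_type] = {"records": []}
    globals_state["db"] = databases[creator_type]
    # 3. Application-specific preferences.
    globals_state["prefs"] = app_prefs_store.get(creator_type, {})
    # 4. Any other global variables.
    globals_state["dirty"] = False
    return globals_state
```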
- When startup is complete, the application enters an event loop. It typically remains in that event loop until the system tells it to shut itself down by sending an appStopEvent. In the process of handling an event, the call to SysHandleEvent may generate new events and put them on the queue. For example, the system handles Graffiti input by translating the pen events to key events. Those, in turn, are put on the event queue and are eventually handled by the application. SysHandleEvent returns TRUE if the event was completely handled, that is, no further processing of the event is required. The application can then pick up the next event from the queue. If SysHandleEvent did not completely handle the event, the application calls MenuHandleEvent. MenuHandleEvent handles two types of events:—If the user has tapped in the area that invokes a menu, MenuHandleEvent brings up the menu.—If the user has tapped inside a menu to invoke a menu command, MenuHandleEvent removes the menu from the screen and puts the events that result from the command onto the event queue. MenuHandleEvent returns TRUE if the event was completely handled. If MenuHandleEvent did not completely handle the event, the application calls ApplicationHandleEvent. ApplicationHandleEvent handles only the frmLoadEvent; it loads and activates application form resources and sets the event handler for the active form. If ApplicationHandleEvent did not completely handle the event, the application calls FrmDispatchEvent. FrmDispatchEvent first sends the event to the application's event handler for the active form. This is the event handler routine that was established in ApplicationHandleEvent. Thus the application's code is given the first opportunity to process events that pertain to the current form. The application's event handler may completely handle the event and return TRUE to FrmDispatchEvent. In that case, FrmDispatchEvent returns to the application's event loop. 
Otherwise, FrmDispatchEvent calls FrmHandleEvent to provide the system's default processing for the event. Further, in the process of handling an event, an application frequently has to first close the current form and then open another one, as follows:—The application calls FrmGotoForm to bring up another form. FrmGotoForm queues a frmCloseEvent for the currently active form, then queues a frmLoadEvent and a frmOpenEvent for the new form.—When the application gets the frmCloseEvent, it closes and erases the currently active form.—When the application gets the frmLoadEvent, it loads and then activates the new form. Normally, the form remains active until it is closed. The application's event handler for the new form is also established.—When the application gets the frmOpenEvent, it does whatever initialization of the form is required, then draws the form on the display. After FrmGotoForm has been called, any further events that come through the main event loop and to FrmDispatchEvent are dispatched to the event handler for the form that is currently active. The event handler knows how a particular dialog box or form should respond to events, for example, opening and closing, among others. FrmHandleEvent invokes the default UI functionality. After the system has done all it can to handle the event for the specified form, the application finally calls the active form's own event handling function.
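The handler chain just described can be modeled with a small simulation. The event types and handler bodies below are simplified stands-ins for the Palm OS routines named above, kept only to show the dispatch order:

```python
# Simplified stand-ins for the Palm OS handler chain; each returns True
# when it has completely handled the event, ending the chain.
def sys_handle_event(event):
    return event["type"] == "penDown"        # system consumes raw pen events

def menu_handle_event(event):
    return event["type"] == "menuTap"

def application_handle_event(event):
    return event["type"] == "frmLoadEvent"   # loads form, sets its handler

def frm_dispatch_event(event):
    return "form:" + event["type"]           # falls through to the active form

def event_loop(queue):
    handled_by = []
    for event in queue:
        if event["type"] == "appStopEvent":
            break                            # system told us to shut down
        if sys_handle_event(event):
            handled_by.append("system")
        elif menu_handle_event(event):
            handled_by.append("menu")
        elif application_handle_event(event):
            handled_by.append("application")
        else:
            handled_by.append(frm_dispatch_event(event))
    return handled_by
```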
- Although the database saving and synchronization capability of the operating system for the Pilot is adequate for specific applications such as the built-in datebook, telephone directory, todo list, memo, and time and expense tracking system, among others, the database oriented storage mechanism of the Pilot is not optimal for a number of data storage intensive applications. For one, the Pilot operating system limits the number of database “files” to sixteen and further requires that the database “files” conform to predetermined formats which are not flexible. Thus, file intensive applications such as CAD require a file management system. FIG. 1B illustrates a flowchart for a process for accessing data in accordance with a first embodiment of a file management system that is compatible with an IBM personal computer file management system. In the IBM PC system, for file management with the IBM Disk Operating System (DOS), the DOS needs to know the beginning of the data storage area. Typically, the disk has defined sections in addition to the boot sector and the partition table: a root directory and a file allocation table (FAT). The root directory starts after the boot sector and the FAT. In the first embodiment compatible with the IBM PC, the root directory holds the necessary information on location, size, date and time of the last change of the files and sub-directories, as well as a directory entry. In this embodiment, the directory entry also contains a start cluster pointer and a file length field. The start cluster entry specifies the beginning of the file or subdirectory, and the file size field provides the length of the file. Additionally, a dirty bit is provided in the directory entry for indicating whether the file has been updated since the last synchronization of the computer of FIG. 1 with the host computer. 
If the file has been updated, the dirty bit is set such that upon synchronization, the copies of the file on both the host computer and the computer of FIG. 1 are made consistent with each other using the same synchronization process performed by the database routines of the Pilot. Thus, upon docking and with the activation of the synchronization button on the docking port of the Pilot, the files on both the host computer and the Pilot handheld are correlated and updated to one coherent copy in both computers. Turning now to the FAT of the first embodiment of the file management system of the present invention, the FAT values in the file management system of the first embodiment conform to the FAT entries in IBM compatible computers. These FAT entries essentially contain pointers to the next cluster in the cluster chain. Although the file management system has cluster chains that are only forward directed (forward chained), the present invention contemplates that bidirectional chains can be supported by using a doubly linked list of cluster chains.
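The forward-chained FAT described above can be sketched as follows, assuming a toy directory and FAT table; the -1 end-of-chain marker and the field names are illustrative (real FATs use reserved cluster values for end-of-chain):

```python
# Toy FAT: each entry maps a cluster to the next cluster in the chain;
# -1 marks end-of-chain. Directory entries carry the start cluster pointer,
# the file length field, and the dirty bit described above.
FAT = {2: 5, 5: 9, 9: -1}
directory = {"NOTES.TXT": {"start": 2, "size": 3, "dirty": False}}

def cluster_chain(filename):
    """Walk the FAT from the file's start cluster to the end of its chain."""
    chain = []
    cluster = directory[filename]["start"]   # from the directory entry
    while cluster != -1:
        chain.append(cluster)
        cluster = FAT[cluster]               # follow the forward pointer
    return chain
```

Setting `directory[name]["dirty"] = True` on each write is then enough for the synchronization pass to know which files to reconcile upon docking.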
- Referring now to FIG. 1B, the process for accessing data from the data storage system using the first embodiment of the file management system is shown in detail. In FIG. 1B, from
step 700, the routine proceeds to step 701 where it reads the directory entry. Next, in step 702, the routine determines the start cluster, as pointed to by the directory entry. From step 702, the routine proceeds to step 703 where it checks if the requested data is located in the present cluster. If not, the routine of FIG. 1B proceeds from step 703 to step 704 where it retrieves the FAT entry and traverses to the next cluster, as based on the cluster pointer. From step 703, in the event that data is located in the current cluster, the routine of FIG. 1B proceeds to step 705 where it determines the sector containing the data and accesses the data from the cluster. Next, in step 706, the routine of FIG. 1B checks if the access has been successful. If not, the routine indicates a failure in step 707. Alternatively, if the access is successful, the routine of FIG. 1B proceeds from step 706 to step 708 where it accesses the sector and transfers data to and from the application, as requested. From step - Referring now to FIG. 2A, a block diagram of the
CCD unit 27 is shown in more detail. In FIG. 2A, a CCD or CIS array 53 is connected to a CCD/CIS processor 55. The CCD/CIS processor 55 is connected to a voltage reference 54 as well as an optional correction/data RAM 56 which can be eliminated for cost saving reasons. The CCD sensor may be a TCD1250D, a MN3610H or a similar CCD. Alternatively, CIS sensors such as those available from Dyna Image Corporation may be used. The CCD/CIS processor 55 is preferably a LM9801 IC from National Semiconductor, Inc. (Santa Clara, Calif.) which linearizes the pixel stream from the CCD/CIS array in the analog domain and further provides correlated double sampling for black level and offset compensation. The output of the CCD/CIS processor 55 is provided to the bus to be presented to the processor 20. - Turning now to FIG. 2B, a
wireless scanner 27′ is shown. In this unit, a wireless transceiver 58 is connected to a Universal Asynchronous Receiver/Transmitter (UART) 57, which is in turn connected to the CCD/CIS processor 55, as previously discussed in FIG. 2A. The UART 57 serializes data regarding the scanned image and presents the data to the wireless transceiver 58 for transmitting back to the computer 10 of FIG. 1. The wireless transceiver 58 can be an infrared unit for communicating with the IR transceiver 49 of the computer 10. Alternatively, the wireless transceiver 58 can be a radio-based unit for communicating with the wireless transceiver 31 of FIG. 1. In this manner, the scanner 27′ does not have to be physically connected to the computer 10, thus providing more convenience and flexibility for the user during use. Although the use of the CCD/CIS processor 55 is disclosed, the present invention contemplates that, to cut cost, an operational amplifier, an analog to digital converter, and software running on the CPU 20 to compensate for scanning related signal noise are all that is needed to implement a low cost scanner system. Furthermore, although gray-scale CCD/CIS devices are preferred for cost reasons, the present invention contemplates that color CCD devices may be used as well. - Referring to FIG. 3, a video driver and a large screen cathode ray tube (CRT) 52 are provided to deliver ease of reading information from the small mobile computer 10. Although the TV is not recommended for computing functions such as CAD/CAM, it is suitable for playing games and browsing the Internet. In FIG. 3, high level primitives of the display data are transmitted using a suitable media such as infrared or radio wave from the computer 10 to the CRT, preferably a television display unit commonly available to consumers. The high level primitive data transmitted, including characters and form definitions, is received by a wireless transceiver 60 and is presented to a UART 61 for conversion into parallel data. The data is presented to a video processor or controller 62 which is connected to a video RAM 63 and a character generator 64 for rasterization into bit-maps. The bit-mapped display data is delivered to triple digital-to-analog converters (DACs) in the video controller 62 which generate suitable color RGB video signals. The video signal is provided to driver electronics for generating a composite video signal to be delivered to the video input of the TV. Furthermore, sound primitives are converted and delivered to an audio amplifier which drives the left and right audio inputs of the TV. In this manner, users of advanced age can have the benefits of reading ease and small form factor portability. Alternatively, in the event that the user is within range of a computer, the high level video and sound primitives can be sent via the wireless network such as the infrared transmission (IrDA) and subsequently rasterized by the processor of the desktop computer to be displayed on the desktop display for ease of reading. Additionally, in the event that the user wishes to drive a VGA monitor directly in place of the TV, a VGA graphics adapter may be used. Furthermore, a scan converter may be attached to the VGA adapter to generate the NTSC/PAL video signal. In performing the conversion from computer video to TV, the VGA frequency is roughly twice the video frequency. 
Furthermore, computer VGA displays tend to be progressively scanned (non-interlaced) while TV video uses interlaced video, a remnant of the NTSC video scheme. However, since a strict translation of VGA to NTSC may produce flicker, the preferred embodiment provides an adaptive finite impulse response filter and highly linear D/A and A/D converters to minimize flicker. - Turning now to FIG. 4, the major protocol layers for connecting the computer of FIG. 1 to a suitable network such as the Internet 150 are shown. In FIG. 4, the user has connected to a suitable Internet service provider (ISP) 100 which in turn is connected to the backbone of the Internet 150, typically via a T1 or a T3 line. The ISP 100 communicates with the computer of the present invention via a protocol such as point to point protocol (PPP) or a serial line Internet protocol (SLIP) 100 over one or more media or telephone networks 102, including landline, wireless line, or a combination thereof. On the portable computer side, a similar PPP or SLIP layer 103 is provided to communicate with the ISP 100 computer. Further, a PPP or SLIP client layer 104 communicates with the PPP or SLIP layer 103. Finally, a network-aware application 105 such as a browser or a spreadsheet with Internet capability of the present invention receives and formats the data received over the Internet 150 in a manner suitable for the user. - Turning now to FIG. 5, a typical Internet system is shown with one or more
portable computers Computers computers mobile support station 71. MSS stations network 151 which relays the messages via stations positioned on a global basis to ensure that the user is connected to the network, regardless of his or her reference to home. The RF network 151 eventually connects to a gateway 72 which is in turn connected to the Internet 150. The gateway 72 provides routing as well as reachability information to the network such as the Internet 150. A plurality of large scale computing resources such as a supercomputer 73 and a mainframe 72 are connected to the Internet 150. The mainframe 52 in turn is connected to a corporate version of the Internet called an Intranet 54 which supplies information to one or more office computers or workstations 55. - The
Internet 150 is a super-network, or a network of networks, interconnecting a number of computers together using predefined protocols to tell the computers how to locate and exchange data with one another. The primary elements of the Internet 150 are host computers that are linked by a backbone telecommunications network and communicate using one or more protocols. The most fundamental of Internet protocols is called Transmission Control Protocol/Internet Protocol (TCP/IP), which is essentially an envelope where data resides. The TCP protocol tells computers what is in the packet, and the IP protocol tells computers where to send the packet. The IP transmits blocks of data called datagrams from sources to destinations throughout the Internet 150. As packets of information travel across the Internet 150, routers throughout the network check the addresses of data packages and determine the best route to send them to their destinations. Furthermore, packets of information are detoured around non-operative computers if necessary until the information finds its way to the proper destination.
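The datagram idea can be illustrated with a toy forwarding loop in which each packet carries its destination (the IP role) and its contents (the TCP role), and routers detour around non-operative neighbors. The topology and names are invented for illustration:

```python
# Invented four-node topology: each router lists its forwarding neighbors.
LINKS = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}

def route(packet, node, down):
    """Forward a packet hop by hop, detouring around down routers."""
    path = [node]
    while node != packet["dest"]:
        # Candidate next hops: up, and not already visited (no loops).
        choices = [n for n in LINKS[node]
                   if n not in down and n not in path]
        if not choices:
            raise RuntimeError("destination unreachable")
        node = choices[0]
        path.append(node)
    return path
```

With all routers up, a packet from A to D takes the first link (via B); if B is down, the same call detours via C without the sender doing anything differently.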
Internet 150 provides a pathway for users to communicate and share information, the original user interface had been rather unfriendly. Eventually, a system was developed to link documents stored on different computers on the Internet 150. Known as the World Wide Web, the system is an elaborate distributed database of documents, graphics, and other multimedia content. Like other distributed applications, the Web is based on a client/server model where Web pages reside on host computers that “serve up” pages when the user's computer (client computer) requests them. As the user “surfs” the Web, a browser can request data from the database on a server computer, which processes and returns the desired data to the computer system of FIG. 1 for display when the request is fulfilled by the server. The client computer runs browser software which asks for specific information by sending an HTTP request across the Internet 150 connection to the host computer. When the host computer receives the HTTP request, it responds by sending the data back to the client. - The browser commonly features a graphical user interface with icons and menus across the top along with a field to supply the URL for retrieval purposes. Navigational buttons guide the users through cyberspace in a linear manner, either one page forward or backward at a time. Pull down menus provide a history of sites accessed so that the user can revisit previous pages. A stop button is typically provided to cancel the loading of a page. To preserve favorite sites, a bookmark is provided to hold the user's favorite URLs in a list such as a directory tree. Furthermore, the browser typically provides a temporary cache on the data storage device or in RAM. The cache allows more efficient Internet access as it saves bandwidth and improves access performance significantly. 
In the present invention, each entry in the bookmark has a list of links typically accessed by the user while he or she accesses the Web site represented by the bookmark entry. When the user clicks on the bookmark entry, the entry's Web page is displayed first. While the user is reading the page represented by the bookmark entry, the browser retrieves pages of additional links associated with the bookmark entry in the background. In this manner, the browser prefetches pages likely to be accessed by the user when the bookmark entry page is clicked, thus avoiding delays when the user actually clicks on the links of the bookmark entry Web page. The use of the cache and the prefetcher enhances the Web viewing experience, as the user is not hampered by delays on-line.
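The bookmark prefetcher described above might be sketched as follows; fetch() is a placeholder for a real HTTP client, and the background thread is joined immediately only to keep the sketch deterministic:

```python
# Sketch of the bookmark prefetcher: opening a bookmark shows its page
# immediately and fetches the links recorded for that entry into the cache
# in the background. All URLs and names are illustrative.
import threading

cache = {}
bookmarks = {"news": {"url": "http://news.example/",
                      "links": ["http://news.example/a",
                                "http://news.example/b"]}}

def fetch(url):
    return "<html>%s</html>" % url         # placeholder for an HTTP GET

def open_bookmark(name):
    entry = bookmarks[name]
    page = fetch(entry["url"])             # shown to the user first
    def prefetch():
        for link in entry["links"]:
            cache.setdefault(link, fetch(link))
    t = threading.Thread(target=prefetch)
    t.start()
    t.join()                               # joined here only to simplify the sketch
    return page
```

A later click on one of the entry's links is then served from `cache` rather than over the wire.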
- The browser also interprets HyperText Markup Language (HTML) which allows web site creators to specify a display format accessible by HTML compatible browsers. Typically, when the user types in the URL or clicks on a hyperlink, TCP/IP opens a connection between the host and client computers. The browser then generates a request header to ask for a specific HTML document. The server responds by sending the HTML document as text to the client via the TCP/IP pipeline. The client computer acknowledges receipt of the page and the connection is closed. The HTML document is stored in the browser's cache. The browser then parses the HTML document for text and tags. If the browser runs across tags that link to images/pictures and sounds, the browser makes separate requests for these files to the server and displays or generates sounds to the user.
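The request/parse cycle can be sketched as a toy HTTP GET header builder plus a scan for tags that trigger separate image requests; the parsing below is deliberately simplified and is not a full HTML parser:

```python
# Build a minimal HTTP GET request header, then scan returned HTML for
# <img> tags, each of which the browser fetches in its own request.
import re

def build_request(host, path):
    return "GET %s HTTP/1.0\r\nHost: %s\r\n\r\n" % (path, host)

def embedded_resources(html):
    """Return src attributes of <img> tags found in the document."""
    return re.findall(r'<img\s+src="([^"]+)"', html, re.IGNORECASE)
```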
- To supply more intelligent processing of information over the
Internet 150, a language such as Java may be utilized. Java was developed originally by Sun Microsystems of Mountain View, Calif. The specification for the Java language is stored at the Java web site http://java.sun.com/. The web site contains the Java development software, a HotJava web browser, and on-line documentation for all aspects of the Java language, hereby incorporated by reference. Designed to be small, simple and portable across processor platforms and operating systems, Java can download and play applets on a browser system of the receiver, or reader. Applets are Java programs that are downloaded over the Internet World Wide Web, as dictated by a tag such as the <applet> tag, and executed by a Web browser on the reader's machine. In Java, the compiler takes the instructions and generates bytecodes, which are system independent machine codes. A bytecode interpreter executes the bytecodes. The bytecode interpreter can execute stand-alone, or in the case of applets, the bytecode interpreter is built into Java compatible browsers. Thus, with a Java compatible client-server, the Internet 150 is transformed from a passive giant book of information into an active network capable of supporting electronic commerce and virtual ecosystems. - Although the
supercomputer 51, the mainframe computer 52 and the gateway 59 are shown in FIG. 4 as being connected to the Internet 150 via landlines such as T1 and T3 lines, the Internet may be connected to a satellite transmission system 56 which transmits and receives high bandwidth data over a satellite 57. The satellite 57 in turn relays the information to one or more local stations 58 which are connected to one or more servers 57. Thus, as shown in FIG. 3, the portable computer 10 can easily request information from a variety of sources which may exist locally or on the other side of the world via the Internet 150. - An important goal of the
personal computer 10 is its ability to allow users to move about freely within and between cells while transparently maintaining all connections, particularly with the Internet 150. However, the Internet 150 suite of protocols had been designed with an assumption that each user is assigned a fixed Internet 150 address associated with a fixed location. Thus, for mobile computers with a wireless physical link, the movement or migration of users in the wireless network violates the implicit Internet 150 protocol assumption. As wireless bandwidth is at a premium, particularly when voice and video data are involved, it is inefficient to require end-to-end retransmission of packets as done in TCP. Furthermore, due to the unpredictable movements of mobile computers with wireless links, large variations exist in the available bandwidths in each cell and affect the transmission characteristics between the mobile computer 10 and the Internet 150. - In the preferred embodiment, a number of virtual circuits are used within the mobile network to route connections to mobile computers 10-13 via
MSS gateway 72 or the MSS MSS MSS Internet 150 to the mobile computer 10 terminates at the fixed IP of the MSS MSS mobile computer 10 moves from one MSS 70 to another MSS 71, the address mapping table is updated. During the table update, all packets destined to the mobile computer 10 continue to be sent to the old MSS 70. These packets are returned to the sender, who forwards the returned message to the new MSS 71. Thus, based on the address mapping table, the sender and the MSS mobile computer 10. - To illustrate the operation of the VIP, events associated with the delivery of a packet of data to the
mobile computer 10 are discussed next. The MSS 70 buffers the incoming packet and forwards the packet to the known or predicted cell covering the mobile computer 10. Once received, the mobile computer 10 acknowledges receipt and requests the MSS 60 to discard the packet. Alternatively, in the event that the mobile computer 10 has moved to another cell which is covered by the MSS 71, it is assigned a new IP address. The mobile computer 10 sends the new IP address and the VIP address to its home gateway 72, which in turn sends the new IP address to intermediate gateways to update their address mapping tables as well. The MSS 70 continues to send packets to the mobile computer 10 until either the connections are closed or until the MSS 71 sets up its own connection via the address mapping table with the remote endpoint having the open connection with the mobile computer 10. - To address the problems associated with bandwidth variations caused by the wireless environment, the
MSS MSS mobile computers MSS mobile computer - As indicated earlier, a number of software modules may reside in the
RAM 22 to provide added functionality to the portable computing appliance. For instance, the portable computing appliance may provide a sketching system as disclosed in the incorporated-by-reference patent application entitled “GRAPHICAL DATA ENTRY SYSTEM” to support fast, convenient and accurate annotated drawings in the field. Additionally, a spreadsheet and database engine may be used to support the analysis of data captured from a number of sources over the Internet 150. FIG. 6 illustrates in more detail the spreadsheet of the present invention. The spreadsheet of the present invention is essentially a list of memory locations or data storage cells that are related, or linked together. Preferably, the data storage cells are organized using a linked list for ease of traversal. Further, the data storage cells can be specified using row and column identifiers. Preferably, the cells of the spreadsheet are linked using dynamic rows and columns. The ability to offer both dynamic rows and columns simplifies and reduces the data storage requirement on the system RAM 22. The spreadsheet provides the user of the handheld computer with on-the-fly data processing capability. The spreadsheet can also acquire data via the barcode scanner 48, the CCD unit 27 and OCR software, or alternatively via the microphone 51, ADC 50 and a speech recognizer. Furthermore, the spreadsheet can deploy intelligent agents to seamlessly hunt for information relevant to the user's needs. In this manner, the spreadsheet of the present invention turns the handheld computer system of the present invention into an intelligent data management system which can acquire and process data on the fly. - Turning now to FIG. 6, a
spreadsheet handler 200 is shown. In FIG. 6, from step 200, the spreadsheet handler initializes the spreadsheet in step 201. Such initialization includes the clearing of the spreadsheet memory and the setting of the current row to “1” and column to “A”. Next, in step 202, the spreadsheet handler 200 of FIG. 6 draws the spreadsheet cells and displays the row/column labels as well as the menu. The spreadsheet handler 200 then proceeds to check in step 203 whether certain cells of the spreadsheet need to be updated using a remote data source such as an Internet database. Step 203 is illustrated in more detail in FIG. 14. - From step 203, the routine of FIG. 6 proceeds to set an event handler to the spreadsheet form in step 204. From step 204, the routine waits for an event in step 205. Once the event is received, the routine of FIG. 6 tests to see if the event is a system event in step 206. If so, the routine processes the system event in step 207 before it loops back to step 205 to process the next event. - Alternatively, if the event is not a system event in
step 206, the routine of FIG. 6 tests whether the event is a menu event in step 208. If so, the routine of FIG. 6 processes the menu event in step 209 before it loops back to step 205. If the event is not a menu event in step 208, the routine 200 tests whether the event is a form load event in step 210. If so, the new form is loaded in step 211. From step 211, the routine of FIG. 6 loops back to step 205 to process the next event. If the event is not a form load event, the routine 200 of FIG. 6 checks to see if the application handler for the spreadsheet has completed operation in step 212. If a quit event had been generated, the routine 200 provides default processing for the application in step 212 before it exits in step 214. Alternatively, if the application handler has not completed operation in step 212, the routine dispatches the event as necessary in step 213 before the routine 200 loops back to step 205 to process the next event in the spreadsheet handler 200. - The
system event handler 207 of FIG. 6 is shown in more detail in FIG. 7. From step 207, the routine of FIG. 7 checks to see if the user is actuating the pen in step 218. If the pen 33 is pressed down on the LCD screen 35 in step 218, the routine updates the active cell in step 220 and recalculates values of the spreadsheet in step 220. Alternatively, if the pen 33 is not down in step 218, the routine 207 checks if the pen 33 is in a scroll bar region in step 222. If so, the routine 207 performs the scroll operation in step 223 before it exits. - If not, the routine 207 checks if the pen is dragged down in step 224. If so, the user has selected a particular block of cells for purposes such as cutting, pasting, or generating graphs, among others. In such case, the routine 207 highlights and selects the blocked region in step 225 before it exits. Alternatively, if the pen 33 is not dragged in step 224, the cell is being edited in step 226. From step 226, the routine checks if the user has selected menu items in step 227. If so, the menu event is dispatched in step 228 before the routine 207 exits in step 229. From step 227, if no menu events have been generated, the routine 207 exits FIG. 7. - Referring now to FIG. 8, the scroll routine 223 of FIG. 7 is shown in more detail. From step 223, the scroll routine checks to see if the user requested a scroll-up operation in step 241. If so, the routine 223 then checks if the spreadsheet is already at the top of the page in step 242. If not, the scroll up operation is performed to show the previous page in step 243. Furthermore, the page pointer is updated in step 243 before the routine 223 is completed in step 253. From step 242, if the spreadsheet is already at the top of the page, the routine 223 is simply exited. - Alternatively, if the scroll operation is not scroll-up in step 241, the routine 223 checks if the user is requesting a scroll down in
step 244. If so, the routine 223 further checks if the spreadsheet is at the bottom most page instep 245. If not, the routine 223 exits. Alternatively, if the spreadsheet is not at the bottom most page, the routine 223 shows the next page and updates the page pointer instep 246 before it exits instep 253. - From
step 244, if the operation is not scroll-down, the routine 223 checks if the user has requested a scroll left operation in step 247. If so, the routine 223 checks to see if the user is already at the left most page in step 248. If the current spreadsheet page is not the left most page in step 248, the routine 223 proceeds to step 249 where it displays the left page and updates the pointer to point one page to the left. Alternatively, from step 248 if the user is already at the left most page, or from step 249, the routine 223 exits in step 253. - From step 247, in the event that the user is not requesting a scroll left operation, the routine 223 checks if the user has requested a scroll right operation in
step 250. If so, the routine checks if the current page is the right most page in step 251. If not, the routine 223 transitions from step 251 to step 252 where it shows the page on the right of the current page and updates the page pointer appropriately. From step 252, or from step 251 in the event that the user is already at the right most page, or from step 250 where the user did not request a scroll right operation, the routine 223 exits in step 253. - Although not shown, the present invention contemplates that scroll panels are supported. Scroll panels are used to lock the display of the spreadsheet in particular horizontal and/or vertical directions. For instance, scroll panels are useful for constantly displaying period information such as the months on the spreadsheet, regardless of the user's scrolls. In such a case, the routine 223 of FIG. 8 checks if the user has specified a row and a column relating to scroll panels. If the user has specified that a particular row and/or column be the scroll panels, the routine locks the row and/or column such that the locked row/column defines the top and left most pages of the display. - Referring now to FIG. 9, the
edit cell step 226 of FIG. 7 is shown in more detail. From step 226, the routine examines if the function button has been selected in step 261. If so, the routine 226 proceeds to step 262 where it displays a function list before it proceeds to step 263. In step 263, the routine 226 checks if the cancel button has been pressed. If so, the routine loops back to step 261. If not, the routine 226 checks if the user has selected a function from the displayed list in step 264. If not, the routine 226 loops back to step 262 where it awaits an action from the user. Alternatively, if the user selected a function in step 264, the routine 226 checks if the user has provided the correct parameter in step 265. If the parameters are incorrect, the routine displays error messages in step 266 before it loops back to step 262. On the other hand, if the correct parameters are provided, the routine 226 loops from step 265 back to step 261 to continue processing the cell edit operation. - Alternatively, from
step 261, if the function button has not been selected, the routine 226 checks to see if the user has completed entering the formula in step 267. If not, the routine loops from step 267 back to step 261 to continue processing user requests. Alternatively, if the user has completed entering the formula into the cell in step 267, the routine 226 transitions from step 267 to step 268 where it saves the cell contents. Upon completing step 268, the routine 226 of FIG. 9 exits in step 269. - Thus, in the present invention, when the
pen 33 is double-clicked on a particular spreadsheet cell, a cell edit window is brought up. The cell edit window shows on the left hand side the row and column index. Further, the content of the cell, which is editable, is displayed. Additionally, an enter (done) button, a functions button, and a cancel button are provided. The enter or done button is used to indicate that the user has completed his or her editing operations and that the user wishes to accept the changes and go back to the display of the rest of the spreadsheet. The functions button, when actuated, shows a scrollable list of the functions supported by the spreadsheet system to support number intensive data analysis, and further awaits selection of one of the functions displayed. When a function has been selected, the function is entered into the space and may be edited if necessary. Finally, the cancel button allows the user to terminate his or her cell editing function and return to the spreadsheet without affecting anything. - Turning now to FIG. 10, more detail on the save
cell step 268 is shown. From step 268, the routine of FIG. 10 first checks to see if the current cell exists already in step 271. If not, the routine transitions to step 272 where it checks if sufficient memory exists for the new cell. If the remaining memory is insufficient, the routine 268 transitions from step 272 to step 274 where it displays one or more error messages before transferring to step 280. Alternatively, from step 272, in the event that sufficient memory exists to support another spreadsheet cell, the routine 268 creates a new cell in step 273 and links the new cell to the prior cells. - From
step 271, if the cell exists already, or from step 273 where the new cell had been created and if the result of the formula evaluation is successful, the routine 268 displays the result in step 278 and saves the formula in the current cell in step 279 before the routine 268 exits in step 280. Alternatively, if the result of the formula evaluation in step 276 is not successful, the routine 268 transitions from step 277 to step 274 where it displays an error message before it moves to step 279 to save the formula, even if the formula has an error of some type, such that the user can subsequently edit the formula. From step 279, the routine 268 exits in step 280. - Referring now to FIG. 11, a formula evaluation routine is discussed in more detail. This routine is executed by a pen down event handler to process the data input. In FIG. 11, the routine 276 first checks if the current character in the string under evaluation is an “(” in
step 281. If so, the routine recursively calls itself in step 276 to resolve the value of the string enclosed within the “( )” pair. This method is known in the art as “substringing.” From step 281, the routine 276 next checks if the character is a “)” in step 282. If so, the routine returns to its caller in step 289. Alternatively, in the event that the character is not a “)”, the routine gets the next string, which could be a number or an arithmetic operator, in step 283. From step 283, in the event that the number evaluator fails to reduce the string to a number in step 284, the routine exits with an error indication in step 285. Alternatively, if the number evaluated in step 284 results in a valid number, the routine exits in step 289 without an error indication. - From
step 283, in the event that the character is an arithmetic operator such as +, −, / or * in step 286, the routine 276 performs the operation in step 287 before it loops back to step 281 to continue the string processing. Alternatively, in the event that the character is not an arithmetic operator, the routine 276 next checks if the character is a NULL character in step 288. If so, the routine 276 transitions to step 289 where it returns with an OK indication. Alternatively, if the character in step 288 is not a NULL character, the routine loops back to step 281 to continue processing the string in the formula entered. - Turning now to FIG. 12, the routine to process menu events of FIG. 6 is shown in more detail. From
step 209, the routine checks if the user has selected the “File” menu event in step 291. If so, the routine 209 transitions to step 292 where selections in the file menu event are processed. These selections include, among others: open, close, save, save as, and quit. Upon completion of the file menu items in step 292, the routine 209 exits in step 299. - From
step 291, in the event that the file menu item is not selected, the routine 209 transitions to step 293 where it checks if the user selected an edit menu. If so, the routine 209 transitions to step 294 to handle the edit menu items. The edit menu items include: undo, redo, cut, copy, paste, find, replace, goto, insert, delete, format, and width changes, among others. Upon completion of the edit menu items in step 294, the routine 209 exits in step 299. - From
step 293, if the edit menu item is not selected, the routine checks in step 295 whether a graph menu has been selected. If so, the routine 209 transitions to step 296 where the graph requests are handled. The graph operations performed in step 296 include a selection of the graph type (pie, line, bar, area, hi-lo, and scatter) and a layout control (series, axis, grid/tick, title, legend, and label). Upon completion of the graph menu items in step 296, the routine 209 exits in step 299. - Alternatively, from
step 295, if the graph menu item is not selected, the routine 209 checks if the view menu item is selected in step 297. If not, the routine 209 simply exits in step 299. Alternatively, if the view option has been selected, the routine 209 transitions to step 298 to perform zoom operations. The zoom operation is important in a palmtop computer as the display 35 is relatively small and can display in full only small sketches. To resolve this issue, the routine of FIG. 13 allows the user to select a particular magnification factor such as 50%, 75%, 100%, 150%, 200%, or a user selectable scale. Upon receipt of the magnification factor, step 298 centers in on the cell last edited and enlarges or shrinks the spreadsheet display as requested. When the user is done viewing the scaled display, the user can click on a button such as an OK button or a GoBack button to move back to the spreadsheet functionality. - The present invention also contemplates a second embodiment where, underneath the menu, is an outline which fences in a spreadsheet area where the user can enter information or manipulate the spreadsheet. Further, upon selecting the magnifier option from the View menu, a magnifier is displayed whenever the
pen input device 33 touches the LCD screen 35 of the computer system. Additionally, a zoomed drawing area is displayed along with the movement of the pen input device and the magnifier. When the user successively and quickly depresses (double clicks) the pen 33 twice when the pen is in the magnifier mode, the sketching system enters into a zoomed viewing mode, as further described below. - Upon depressing the pen twice (double clicking) when it is in the magnifier view mode, the zoomed drawing area is enlarged into an enlarged zoomed drawing area which occupies the space formerly belonging to the drawing area. The enlarged area provides the user with an enlarged, more comfortable view of the object(s) such as the spreadsheet cell(s) or database records being edited. To provide the user with a perspective of the particular portion of the drawing which he or she is working on, a miniaturized illustration of the entire drawing is shown as a floating insert or a draggable window at the top left corner of the enlarged zoomed drawing area. The floating insert or draggable window showing the miniaturized illustration may be freely moved within the enlarged zoomed area. The miniaturized illustration thus provides a bird's eye view of the entire drawing and further shows an outline of the zoomed drawing area or the enlarged zoomed drawing area. The outline may be moved to scroll around the drawing. When the outline is moved, the display region is also updated to reflect the objects located at the current position of the outline.
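The outline-driven scrolling just described amounts to a coordinate mapping from the miniaturized bird's eye illustration to the full drawing. The sketch below shows one way such a mapping could work; the function name, the rectangle representation, and the pixel geometry are illustrative assumptions rather than details taken from the patent.

```python
def outline_to_drawing_region(outline_x, outline_y, outline_w, outline_h,
                              be_w, be_h, drawing_w, drawing_h):
    """Map the outline's position inside the bird's eye (BE) illustration
    to the region of the full drawing shown in the zoomed drawing area.
    Because the BE illustration shows the whole drawing scaled down, a
    position ratio in the BE illustration is the same ratio in the drawing.
    """
    sx = drawing_w / be_w   # horizontal scale from BE illustration to drawing
    sy = drawing_h / be_h   # vertical scale
    return (outline_x * sx, outline_y * sy, outline_w * sx, outline_h * sy)
```

Moving the outline then simply re-invokes this mapping and redraws the objects that fall inside the returned region.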
- The outline also has four sides which can be adjusted to adjust the zoom ratio of the objects shown on the enlarged zoomed drawing area. Thus, by adjusting the dimensions of the outline, the user can vary the enlargement of the view on the area. The magnifier and the miniaturized illustration thus balance the need to show the entire drawing against the limited display region afforded by the portable computing appliance. Upon completion of the zoomed viewing sequence, the user can select the full drawing view option from the Edit menu to display the entire drawing on the screen.
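Varying the enlargement by resizing the outline can be sketched as below; treating the zoom ratio as the average of the horizontal and vertical size ratios is an assumption made for illustration, since the patent does not fix the exact formula.

```python
def zoom_ratio_from_outline(outline_w, outline_h, be_w, be_h):
    """Derive a magnification factor from the outline's dimensions:
    the smaller the outline relative to the bird's eye window, the
    larger the enlargement shown in the zoomed drawing area."""
    return ((be_w / outline_w) + (be_h / outline_h)) / 2
```

Shrinking either side of the outline therefore raises the ratio, and restoring the outline to the full bird's eye window returns the ratio to 1.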
- Referring now to FIG. 13, the process for handling zoom requests in the
view menu 297 of FIG. 12 is shown in more detail. In FIG. 13, the routine 298 obtains a zoom range in step 311. Next, in step 312, the routine 298 computes the zoom ratio based on the zoom range, or alternatively, from a zoom ratio input from the user which may range from 50% to 200% or be a user selectable ratio. In step 313, the routine 298 performs a rasterization based on the zoom ratio and the display window. Next, in step 314, the routine 298 puts up the rasterized bit map on the display 30, along with a Goback or Done button. In step 315, the routine waits for the user to select the Goback button. Once the user has indicated that he or she has completed viewing the zoomed image, the routine 298 exits via step 316. - The present invention also contemplates that, in a second embodiment, a magnifier icon is displayed whenever the
pen 33 touches the LCD screen 35 of the computer system. Further, an outline box is displayed around the magnifier icon to indicate the viewing area available when the magnifier zooms. Thus, when activated, the routine displays an enlarged view of the drawing at the point where the pen 33 touches the screen 35 of the computer system, much like a conventional magnifier would. In this embodiment, the routine 298 also displays a Bird's Eye (BE) view of the entire drawing in a BE window. Further, a zoom box is displayed inside the BE window to indicate to the user his or her relative position on the drawing. The zoom box has four sides which are selectable by the pen to adjust the zoom scale, as discussed below. The second embodiment of the routine 298 checks for a pen event occurring within the BE window. If not, the pen event belongs to the primary window and represents either a draw or an edit event. Thus, if the pen is not in the BE window, the routine calls the cell edit routine (FIG. 9). In the event that the pen is in the BE window, the routine checks for adjustments to the zoom box to change the scaling factor of the main display window. In the event that the zoom box has been adjusted by clicking on the zoom box and adjusting either the horizontal lengths or the vertical lengths, the routine computes the zoom ratio based on the zoom box adjustment. Preferably, the zoom ratio is computed as a function of the ratio of the length of the zoom box to the length of the BE window and further as a function of the ratio of the width of the zoom box to the width of the BE window. The alternate routine then applies the newly computed zoom ratio and refreshes the main display by performing the appropriate enlargement or shrinkage on the objects encompassed within the newly adjusted zoom box.
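The pen-event routing of this second embodiment can be sketched as follows; the rectangle representation, and the rule that a tap inside the zoom box drags it while a tap elsewhere in the BE window adjusts its sides, are illustrative assumptions.

```python
def route_pen_event(px, py, be_window, zoom_box):
    """Route a pen event: events inside the BE window manipulate the
    zoom box, while all other events go to the cell edit routine of
    FIG. 9. Windows are (x, y, w, h) rectangles in screen pixels."""
    def inside(x, y, rect):
        rx, ry, rw, rh = rect
        return rx <= x < rx + rw and ry <= y < ry + rh
    if not inside(px, py, be_window):
        return "edit_cell"        # primary-window draw or edit event
    if inside(px, py, zoom_box):
        return "drag_zoom_box"    # move the viewing location
    return "adjust_zoom_box"      # change the scaling factor
```

A real implementation would also distinguish taps on the four sides of the zoom box from taps on its interior, but this suffices to show the two-window dispatch.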
Furthermore, if the zoom box has been dragged to a new viewing location, the routine receives the new coordinates of the zoom box relative to the BE window and updates the content and location of the zoom box in the BE window. - Turning to FIG. 14, the routine to update the cells of the spreadsheet of the present invention is illustrated. In FIG. 14, from
step 204, the routine determines the rows and columns that need to be updated in step 321, which is discussed in more detail in FIG. 15. Next, the routine 204 checks to see if the cells had been updated earlier during the day in step 322. If so, there is no need to perform the update and the routine 204 exits in step 330. - Alternatively, if the cells had not been updated in the day, certain cells need to be updated. In that case, the routine proceeds from
step 322 to step 323 where it creates a query designed to obtain the proper information. Next, in step 324, the routine connects to the server over the Internet or a suitable medium. In step 325, after establishing the connection, the routine submits the query. The routine 204 passes the query created in step 323 to the server in a query string which contains the name of a Common Gateway Interface (CGI) script. The CGI script sends the search to a database located on the server, receives the result of the query, along with the HTML page created by the database to contain the result, and passes it back to the server to be sent back to the routine 204. - Next, in
step 326, the routine 204 waits until the reply is complete. Once the reply is complete, the routine parses the reply into formatted data in step 327 and in step 328 stores the new data in the cells determined in step 321. Step 328 also copies the formulas from related cells into adjacent cells and recalculates the spreadsheet. The formulas from the related cells represent mathematical relationships to the new information. For instance, in an income statement, the gross profit is determined by obtaining new data on sales and cost of goods and subtracting the cost of goods from the sales information. From step 328, the routine 204 refreshes the Last_Updated flag in step 329 to reflect the most recent time that the spreadsheet had been updated to prevent needless updating. From step 329, the routine 204 exits in step 330. - Turning now to FIG. 15, the process to identify the rows and columns to be updated is disclosed in more detail. From
step 321, the routine prompts the user to click on a row or column identifier. Next, in step 340, the routine checks to see if a label exists for the selected row or column. The label is important as the query of step 323 will be based on the label information. If the label does not exist, the routine 321 prompts the user for a label in step 341. From steps 340 or 341, the routine 321 exits in step 345. - Although cell updates to and from the
Internet 150 have been discussed so far, the present invention also contemplates supporting the updates of specific objects such as an Object Linking and Embedding (OLE) container to enable the Windows program to cut, copy, and paste entire objects, such as charts, spreadsheets or word processing documents, directly between applications. Such sharing of OLE container classes is discussed in Chapter 14 of Gregory et al., Building Internet Applications with Visual C++ (1996), hereby incorporated by reference. - Turning now to FIG. 16, another method to update the spreadsheet is shown. In FIG. 16, the user can activate a browser to view information on the Internet or other suitable network and designate the information to be retrieved into the spreadsheet. In this process, the routine executes a TCP/IP layer module in
step 351. The PPP client layer 352 is then invoked. The data from the PPP client layer 352 is provided to a compression/decompression engine 353. Next, the decompressed data is provided to a message manager in step 354. If the message is a Java based message in step 355, the routine 350 provides the message to a Java interpreter, a just-in-time Java compiler, or a Java flash compiler in step 356. - From
steps 355 or 356, the routine then checks the remaining messages to see if the message is in a Mark-up Language in step 357. If so, the routine 350 provides the message to a HTML or a HDML interpreter in step 358. From steps 357 or 358, the incoming message is provided to a default custom interpreter in step 359 to handle special protocols supported by the user's application. From step 359, the browser routine 350 exits in step 360. Thus, by using the browser of FIG. 16, the user can view the contents of databases located on the Internet and download the data via an appropriate protocol such as file transfer protocol (FTP). The incoming packet is executed if it is in Java, interpreted if it is HTML/HDML or a custom protocol, and ultimately provided to the spreadsheet of the present invention. - In addition to the Java and HTML/HDML interpreter framework, the browser of the present invention preferably supplies a user interface with a menu bar, a tool bar, and a URL bar in addition to the active window displaying the Web page. Turning now to FIG. 16A, the operation of the browser of the present invention is disclosed in more detail. From
step 370, the routine displays a menu bar, a tool bar and a URL bar in step 371. The toolbar preferably allows the user to move backward/forward through various Web pages, reload a page, travel to a home page, print a page, and stop the current load, among others. Furthermore, on the URL bar, an icon is available that, if dragged onto the window of the portable computer's desktop, creates a double-clickable link to that site on the desktop. Additionally, when entering data on the URL, the traditional “http://www” or “ftp://” can be automatically supplied by the browser. At the bottom of the active window displaying the Web page, the browser preferably displays a key which indicates the page's built-in security feature. Further, a status line is supplied to indicate the completion rate of the page download. Preferably, to conserve display area associated with the small LCD display 35, the tool bar and the status line are made to be hideable. - From
step 371, the routine accepts the user's URL, retrieves the HTML file from that URL, and parses the HTML file in step 372. Next, the routine of FIG. 16A adds the locations of the hyperlinks, as indicated in the respective HTML tags, into the event manager for watching. Once the hyperlink locations have been entered, the event manager catches double clickings on the hyperlinks and appropriately processes the requests for the hyperlinks. From step 373, the routine checks for occurrences of menu bar events in step 374. If so, the routine jumps to the menu event handler in step 375. Alternatively, if no menu bar event occurred, the routine proceeds from step 374 to step 376 to check for tool bar events. If tool bar items have been selected, the routine proceeds from step 376 to step 377 where it handles tool bar events. Alternatively, if no tool bar events occurred in step 376, the routine proceeds to step 378 to check if the user has specified a new URL location. If so, the routine loads the new HTML file from the new URL location, parses the new HTML file, and adds the locations of the hyperlinks in the new HTML file in step 379. Alternatively, in the event that no new URL location was specified, the routine checks if the user wants to exit the browser in step 382. If so, the routine exits the browser in step 383. Alternatively, from step 382, or from the preceding steps, the routine loops back to continue monitoring events. The browser routine 370 has a cache which improves access performance. Two types of cache are provided: a memory cache and a file cache. The memory cache buffers short term storage of graphic objects whereas the file cache is for intermediate term storage of data objects, as known to those skilled in the art. - The browser of the present invention also provides an off-line browsing capability to compensate for long delays associated with Web traffic.
In the off-line mode, the computer of the present invention instructs a host server to perform searches during off-peak time and save the search result for subsequent viewing at a much faster pace. In this manner, the Web experience is preserved without the frustrating delays typically encountered when accessing the Web at peak hours. Preferably, the browser of the present invention has an integrated front end to Web search engines and directories, allowing users to issue a query using multiple search engines such as Lycos and Yahoo. As the front end generates direct inquiries to the CGI compatible databases, the front-end is relatively compact. To prevent the search engines from presenting the user with a large number of useless results, a data filter is provided to reduce the number of documents to be viewed. The browser can be configured to download specific objects, such as text only, so that large graphics files can be discarded if the user does not want to view graphics. Furthermore, the browser deploys agents to monitor the Bookmark mentioned above, rerun the search at specified intervals, and notify the user when new results are found. Preferably, even in the off-line browsing mode, the browser of the present invention replicates the Web experience by preserving the URL. In one embodiment, the browser of the invention supports a news ticker capability which automatically downloads news files at night according to a user defined schedule. The news ticker is subsequently presented to the user when the computer is idle in a manner analogous to a screen saver.
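The bookmark-monitoring agent described above, which reruns a search at intervals and notifies the user only of new results, could be sketched as a simple difference against the previously saved result list; the URL-list representation is an assumption for illustration.

```python
def find_new_results(previous, current):
    """Return the entries in the rerun search result that were not in
    the previously saved result, preserving the new result's order."""
    seen = set(previous)
    return [url for url in current if url not in seen]
```

Only when this list is non-empty would the agent raise a notification, keeping the off-line browsing session free of redundant alerts.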
- Turning now to FIG. 17, the routine for receiving images via the CCD/CIS unit 27 is shown in more detail. From the scan step 400, the routine issues a reset instruction to the CCD/CIS unit 27 in step 401. Next, the routine 400 checks in step 402 if the scan button on the CCD/CIS unit 27 is depressed. If so, the routine 400 acquires the image from the CCD/CIS system 27 in step 403 and loops back to step 402 to continue the image acquisition until the scan button is released in step 402. Once the scan button is released in step 402, the routine 400 performs an optical character recognition (OCR) process in step 404. The optical character recognition may perform a combination of feature detection and template matching methods for recognition of characters, as disclosed in U.S. Pat. No. 5,436,983, or may utilize neural networks as is known in the art. - From
step 404, the routine 400 formats the OCR data in step 405. Next, the routine 400 places the formatted data in the cells of the spreadsheet in step 406. From step 406, the routine 400 also copies related formulas to appropriate cells in the spreadsheet in step 407. As an example, in the event that the routine 400 had scanned, OCRed, and converted sales and cost of goods figures into the spreadsheet, step 407 may copy the formula for computing gross profit below the cost of goods line if the gross profit formula is used for current sales figures. Next, in step 408, the routine performs a spreadsheet recalculation before it exits in step 409. In this manner, the computer and scanner may be used to optimize the data acquisition process. - Turning now to FIG. 18, the corresponding process for copying images using the CCD/
CIS unit 27 is disclosed. From the copy step 420, the routine issues a reset instruction to the CCD/CIS unit 27 in step 421. Next, the routine 420 checks in step 422 if the scan button on the CCD/CIS unit 27 is depressed. If so, the routine 420 acquires the image from the CCD/CIS system 27 in step 423 and loops back to step 422 to continue the image acquisition until the scan button is released in step 422. Once the scan button is released in step 422, the routine 420 performs an optional image enhancement step 424 using known image signal processing routines. Next, the routine 420 proceeds to step 425 where, if the raster to vector option is picked, the bitmap is vectorized in step 426 before the routine 420 continues. From steps 425 or 426, the routine 420 proceeds to step 427 where it checks if the compression option has been selected. If the compression option has been selected in step 427, the routine performs the bitmap compression process in step 428. From steps 427 or 428, the routine 420 exits. Thus, in the manner discussed, the computer of the present invention allows the user to scan in images on the fly, digitally enhance the images, and store the images for subsequent printing in the event that the user simply wishes to copy the images, or include the images in a document or a file to be transmitted via a suitable network, or send the images via facsimile or other medium in the event that the user wishes to fax the image to a remote site for review. - Turning now to FIG. 19, the process for viewing data normally shown on the
LCD display 35 on the larger remote display device 52 is detailed. In FIG. 19, from step 440, the routine of FIG. 19 checks if a link with the remote display or CRT device 52 is active in step 441. This is preferably done by scanning the IR frequency for the presence of a remote display or CRT device 52. Once the handshake indicating that a remote device exists is completed in step 441, the portable computer proceeds to step 442. Otherwise, the routine of FIG. 19 simply exits if no remote display or CRT device 52 exists. Alternatively, if an active link to the remote display device 52 exists, the routine of FIG. 19 proceeds to step 442 where it turns off the LCD screen 35 on the portable computer 10 and further sends an acknowledgment return signal to the remote CRT device 52. Next, the routine traps graphic calls in step 443 so that a custom version of the graphic routines supporting a higher resolution display is used in place of the original graphic routines supporting the LCD display 35. In step 444, the routine examines the identification data sent by the remote display 52 to determine whether the display 52 is a high resolution device or not. If so, the routine modifies the display range resolution in the graphic routines to support the higher resolution in step 445. Alternatively, if the remote display device 52 is a conventional low resolution TV, the display resolution information is not updated such that the display on the remote low resolution display 52 shows the same information as on the LCD display 35, but merely with enlarged and brighter images for ease of reading. From step 445, or in the event that the display 52 is a conventional low resolution TV in step 444, the routine of FIG. 19 sends the trapped graphics primitives to the remote display 52 in step 446. Next, in step 447, the routine of FIG. 19 receives the high level graphics primitives sent in step 446 and decodes or rasterizes the primitives before displaying them on the remote display 52.
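The resolution negotiation of steps 444 and 445 might be sketched as below; the identification strings and the concrete resolutions are illustrative assumptions, since the patent does not specify them.

```python
def select_display_mode(remote_id, lcd_resolution=(320, 240),
                        hires_resolution=(1024, 768)):
    """Pick the display range used by the trapped graphic routines from
    the identification data sent by the remote display (step 444): a
    high resolution device raises the range (step 445), while a low
    resolution TV keeps the LCD resolution so the image is merely
    enlarged."""
    if remote_id == "hires":
        return hires_resolution
    return lcd_resolution
```

Whatever resolution is chosen, the same trapped high-level primitives are then sent over the link and rasterized on the remote side.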
When the remote link is ended, the routine of FIG. 19 exits in step 448. - FIG. 19A is a flow chart of the process for teleconferencing with a remote user and for visually sharing an electronic chalkboard. The chalkboard conferencing process of FIG. 19A requires DSVD modems, as well as the software carrying out the process of FIG. 19A, to be installed both on the portable computer of the present invention and on the remote computer. In FIG. 19A, from
step 490, the chalkboard process proceeds to step 491 where a connection with the remote computer having a DSVD modem is established. Next, in step 492, the routine selects a file to be viewed on the blackboard. The file may be a text file as is conventional or may be a graphical document such as a document generated by the graphical drawing tool disclosed in the previously incorporated Ser. No. 08/684,842, entitled “GRAPHICAL DATA ENTRY SYSTEM.” The selection of the file automatically invokes the tool to edit the file such as a text editor or the graphics editor of Ser. No. 08/684,842. From step 492, the respective files on both the local and remote computers are synchronized in step 493. Next, the voice data from the remote end is received, demultiplexed, decompressed, and reconstructed for the user to listen to in step 494, as is conventional in the DSVD specification. Simultaneously, digital data is received. In step 495, the data is decoded and checked to determine if changes to the document have been made by the user at the local end. If so, the routine proceeds from step 495 to step 496 where it captures the changes and transmits the update packet to the remote unit for synchronizing the blackboard. Although not shown, the present invention contemplates that a suitable compression software can be used to minimize the data transmitted. Thus, in the case of drawings, the compression software can convert the strokes into vectors and transmit the vector information rather than bitmaps to conserve bandwidth on the digital channel. - From
steps 495 or 496, the routine checks if an incoming packet has been received from the remote end in step 497. If so, the routine proceeds from step 497 to step 498 where it receives the incoming packet and updates the chalkboard. From step 498, the routine continues the conferencing session. - Turning now to FIG. 20, the routine to record a voice note using the
voice recorder 43 is shown in more detail. After entering the routine of FIG. 20 in step 460, the routine initializes the voice recorder 43 in step 461. Next, the routine of FIG. 20 checks for the desired action in step 462. In the event that the user wishes to save a memo in an application, the routine sends a record (REC) command to the voice recorder 43 and saves the current address of the memo in a message management record (MMR) in step 463. The index to the message management record is also saved by the application such that when the user wishes to replay the previously recorded note, the address can be retrieved from the MMR to send to the voice recorder 43 to play. From step 463, the routine exits in step 470. - From
step 462, if the application does not need to save a memo, the routine of FIG. 20 checks in step 464 whether the user wishes to play a previously recorded memo. If so, the routine sends a PLAY command to the voice recorder 43 in step 465 using the address retrieved from the MMR. From step 465, the routine exits in step 470. Alternatively, from step 464, if the user does not wish to play a memo, the routine checks in step 466 whether the user wishes to edit the memo. If so, the new message is recorded in part or in whole over the old memo in step 467 before the routine of FIG. 20 exits in step 470. Additionally, in the event that the application or the user decides to erase a previously recorded memo in step 468, the routine proceeds to step 469, where it removes the current address from the MMR and marks the space as being available for additional messages. In this manner, the routine of FIG. 20 allows the user to quickly record and edit his or her messages in the voice recorder 43 without consuming main memory 22, as voice messages can be rather memory intensive. Furthermore, as the voice recorder 43 is detachable from the computer 10, the user can carry only the voice recorder when a reminder/recorder device is needed and carrying space is very limited. - In addition to receiving commands by the pen or the keyboard, the data processor of the present invention also provides speech recognition capability as another mode of data entry. Turning now to FIG. 21, the routine to process voice commands and data is illustrated in more detail. In FIG. 21, the routine waits for a speech pattern directed at the
computer system 10 of FIG. 1. If no speech pattern exists, the routine simply exits in step 480. Alternatively, if voice is directed at the system, the routine checks in step 482 whether the voice data is a command or an annotation. If it is a command, the routine performs the voice command in step 483 before it exits in step 487. Alternatively, if the speech pattern does not relate to a voice command, the routine proceeds to step 484 to check for voice annotations. If the voice input is a voice data annotation, the routine proceeds from step 484 to step 485, where the voice is converted into computer readable text. From step 485, the routine formats the converted data in step 486 such that the spreadsheet can process the dictated spreadsheet data entry. From steps 483 and 486, the routine exits in step 487. - As discussed above, data can be collected into the spreadsheet by scanning and performing OCR on images or by capturing voice and performing speech to text recognition on the dictation. Additionally, intelligent agents can be used in conjunction with the computer system of FIG. 1 to locate and process information over a network via the two-
way communication device 31. Smart agents can automatically route user-specified data from the Web, other on-line services, E-mail messages, and faxes to the computer of FIG. 1. In FIG. 22, a software entity called an “agent” serves as an independent source of expertise designed to perform particular tasks or sets of tasks. These agents continually process and update requests, even when the user is no longer connected to the network. These agents can also “mine” sites for information and retrieve only data relevant to the user. Further, the agents can be activated on demand to serve a particular purpose and then be deactivated after accomplishing solely that purpose. The agents navigate through computer networks such as the Internet to search for information and perform tasks for their users. The collected data from the search, or the results of the execution of the tasks, are compressed and delivered to the portable computer system of FIG. 1 the next time a wireless connection is established with the two-way communication device 31. - Turning now to FIG. 22, a flow chart showing the process of specifying an intelligent agent capable of operating with the two-
way communication device 31 is shown. In FIG. 22, the routine accepts rules and parameters for the intelligent agent in step 501. The intelligent agent of FIG. 22 is rule driven. Rules can be specified with a simple English-like syntax to set slot values and create objects, among other actions, and can be organized into rulebases when the rules deal with related conditions and actions. A set of rules activates when its rulebase is loaded and deactivates when the rulebase is flushed. This collection of rulebases is collectively called the “agent.” - Agents can use rules to inference about the data, create new data, or modify existing data. The two fundamental search strategies of these agents include forward and backward chaining. Forward chaining, which is a data driven process, proceeds from premises or data to conclusions. Alternatively, backward chaining, or the goal-driven approach, proceeds from a tentative conclusion backward to the premises to determine whether the data supports that conclusion. Further, a combination of forward and backward chaining can be used to optimally solve a particular problem. Details of various expert systems suitable for use are discussed in James P. Ignizio's book Introduction to Expert Systems—The Development and Implementation of Rule-Based Expert Systems, hereby incorporated by reference. Additionally, the present invention contemplates that other artificial intelligence constructs, including neural networks, fuzzy logic systems, and others known to those skilled in the art, may be applied in place of the expert system. The present invention also contemplates that the intelligent agent can be specified using an object oriented language such as Java such that it is free to roam the Internet and other networks with Java compatible browsers.
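The data-driven forward-chaining strategy described above can be sketched as follows. This is an illustrative sketch only; the rule and fact names are hypothetical and the sketch does not reproduce the syntax of any particular expert-system shell:

```python
# Minimal forward-chaining inference: repeatedly fire rules whose
# premises are all satisfied by the known facts, adding each rule's
# conclusion as a new fact, until no further facts can be derived.
def forward_chain(facts, rules):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in facts and all(p in facts for p in premises):
                facts.add(conclusion)  # data-driven: premises -> conclusion
                changed = True
    return facts

# Hypothetical rulebase: each rule is (premises, conclusion).
rules = [
    (("stock_price_drop", "high_volume"), "sell_signal"),
    (("sell_signal",), "notify_user"),
]

derived = forward_chain({"stock_price_drop", "high_volume"}, rules)
```

Backward chaining would instead start from a tentative conclusion such as "notify_user" and recurse over the premises of any rule that could establish it.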
- After the rules, strategies and parameters have been specified for the agent in
step 501, the routine trains the agent in step 502 with training data, if necessary, in the event that neural networks and the like are to be used in implementing the agent. Next, the routine sends the agent over a network such as the Internet in step 503. In step 504, the agent checks whether the data it encounters satisfies the rules and parameters that the agent is looking for. If not, the routine proceeds to search the next database. From step 504, the routine checks in step 505 whether all databases have been mined. If not, the routine moves to the next database in step 506 before it loops back to step 504. Alternatively, from step 505, if all databases have been mined and the agent still has not located responsive information, it puts itself to sleep in step 507. The agent periodically wakes up and broadcasts itself once more in step 503 to continue searching for responsive materials. - From
step 504, in the event that responsive documents have been located, the agent checks in step 508 whether it is instructed to call other agents. If so authorized, the agent invokes other agents in step 509. When the agent reports back with information, the host computer proceeds to notify the user of the located data in step 510. In step 511, if the user accepts the data, the routine of FIG. 22 stores the data and updates the spreadsheet with the information in step 512. Next, the routine checks whether the user has invoked additional agents in response to the information detected by the original agent. If so, the routine proceeds from step 512 to step 513, where additional agents are launched or additional routines are executed. From step 513, or if the user does not invoke additional agents, the routine of FIG. 22 simply exits. Thus, the agent of FIG. 22 can respond to the verbal, handwritten, hand-drawn, or typed command or data from the user and intelligently perform the requested action, be it a data search or various user specified actions. In this manner, these agents are capable of gathering information resourcefully, negotiating deals, and performing transactions on their user's behalf. Furthermore, the agents can have contingency plans such that they are aware of their environment when situated in different places and act accordingly. - Furthermore, the present invention also provides a database management system which acts in conjunction with the spreadsheet of the present invention to optimally manage data on behalf of the user. The purpose of a database management system is to store, maintain, and retrieve database records in files. A file collects records of the same format that serve a common purpose. Traditionally, general purpose database management systems use four language interfaces between the application programming language and the database manager: a data definition language, a data manipulation language, a query language, and a report writer language.
The data definition language defines the format, or schema, of the database by identifying the files, record formats, and relationships between files. The data manipulation language is the application's programming interface to the database management system, covering operations such as the opening and closing of the database or the adding, changing, or deleting of records in the database. The query language allows the database to be searched according to search criteria, while the report writer language allows the user to generate a report based on the result of the query.
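The division of labor among the four language interfaces can be sketched as follows. This is an illustrative sketch; the class and method names are hypothetical and do not correspond to the actual implementation:

```python
# Minimal in-memory database manager illustrating the four interfaces:
# data definition (the schema), data manipulation (adding records),
# query (exact-match search), and a rudimentary report writer.
class DatabaseManager:
    def __init__(self, schema):
        # Data definition language: the schema is the list of field names.
        self.schema = schema
        self.records = []

    def add_record(self, record):
        # Data manipulation language: add a record matching the schema.
        if set(record) != set(self.schema):
            raise ValueError("record does not match schema")
        self.records.append(record)

    def query(self, **criteria):
        # Query language: return records matching all given criteria.
        return [r for r in self.records
                if all(r[k] == v for k, v in criteria.items())]

    def report(self, fields):
        # Report writer: tabulate the chosen fields of every record.
        return [[r[f] for f in fields] for r in self.records]

db = DatabaseManager(["name", "city"])
db.add_record({"name": "Acme", "city": "Quincy"})
db.add_record({"name": "Beta", "city": "Redmond"})
matches = db.query(city="Quincy")
```

A real database manager would additionally persist records to files and maintain indexes, but the four-interface split remains the same.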
- Turning now to FIG. 23, the process for creating and using the database in conjunction with the non-cursive handwriting recognizer and the speech recognizer is shown. From
step 515, the routine of FIG. 23 generates one or more database forms in step 516. The database forms are the user interface for the records of data stored in the database. The process for creating the form of step 516 is shown in more detail in FIG. 24. - From
step 516, the routine of FIG. 23 allows the user to enter information into the records of the database in accordance with the data entry format specified in the form creation step 516. The data may be entered by writing, scanning, or dictating, or by an agent sent over the Internet to hunt down relevant data. - From
step 517, once data has been entered into the database of the present invention, the routine of FIG. 23 checks in step 518 whether the user wishes to perform a database search. If not, the routine exits. Alternatively, if a search is to be done, the routine prompts the user for a database query in step 519. From step 519, a search is carried out in step 520 before the database routine of FIG. 23 exits in step 521. - In the present invention, the data definition language stored in the form specification is preferably iconized such that the user can quickly lay out a data entry form using a graphical specification. Preferably, the icons displayed include control objects such as text boxes, check boxes, dialogs, option buttons, and labels, among others. The user can simply pick a control icon and place the control icon at its appropriate position on the LCD display. Next, the user selects the appropriate attributes of the control icon, such as caption, font, and dimensions, in a manner similar to the selection and customization of controls in Visual Basic or Visual C++, available from Microsoft Corporation of Redmond, Wash. The user repeats this process until all the data elements have been defined, formatted, and positioned on the form. The present invention also contemplates that the database can simply be a free-text database without any control icon restrictions, or, in the event that control icons are appropriate and necessary, a smart agent can be used to help a new user select the right controls and their attributes by asking the user questions about the type of data to be stored by the database, generating the icons, and allowing the user to move and/or adjust the attributes.
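The iconized form specification described above can be modeled as follows. This is an illustrative sketch; the control types, attributes, and field names are hypothetical:

```python
from dataclasses import dataclass, field

# A control object placed on a data entry form: its type, caption,
# and its position and dimensions on the display.
@dataclass
class Control:
    kind: str        # e.g. "text_box", "check_box", "option_button"
    caption: str
    x: int = 0
    y: int = 0
    width: int = 100
    height: int = 20

# A form is an ordered collection of controls; the captions double as
# the field names of the underlying database records.
@dataclass
class Form:
    name: str
    controls: list = field(default_factory=list)

    def add(self, control):
        self.controls.append(control)

    def field_names(self):
        return [c.caption for c in self.controls]

form = Form("contact")
form.add(Control("text_box", "Name", x=10, y=10))
form.add(Control("check_box", "Active", x=10, y=40))
```

Saving the form then amounts to serializing this structure, and the data entry routine walks `field_names()` to build each record.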
- Turning now to the process for creating forms on the database, shown in FIG. 24, from
step 516, the routine checks in step 530 whether the user needs the assistance of an intelligent agent in formulating the form of the database. If the user requires assistance, the intelligent agent guides the user through the form set-up process in step 531 by asking the user for information about the relevant fields and their formats, among other details. Once the fields and their characteristics have been identified, the intelligent agent generates a generic form with the fields required for the user's application and saves the new form in step 538 before it exits. - Alternatively, if the user does not require the assistance of the agent in
step 530, the routine of FIG. 24 checks in step 532 whether the user wishes to create the form using a visual format. If not, the routine of FIG. 24 proceeds to step 533, where it accepts textual specifications of the database form from the user. Next, the form information is saved in step 538 prior to exiting the routine. In the event that the user wishes to create the form layout graphically in step 532, the routine of FIG. 24 displays a palette of control objects available for the form in step 534. These control objects include a check box, a pop-up menu, a pop list, a text field, a numeric field, a table field, a date/time field, a currency field, an ink field, a formula field, a look-up field, a barcode field, and a GPS field. Next, the routine waits until the user selects an object in step 535. Once the control object has been selected, the routine transitions to step 536, where it displays the object on the form and requests the user to enter the object parameters, including the object dimensions and formatting characteristics, among others. From step 536, the routine loops back to step 535 to wait for the next object selection. From step 535, the routine checks in step 537 whether the user is done with the form creation process. If not, the routine simply loops back to step 535 to await the next user selection. Alternatively, if the user has completed the graphical form creation process in step 537, the form created is saved in step 538 before the routine of FIG. 24 exits in step 539. - Once the database forms have been generated, data can be entered into the database by writing data to the fields of each record in the database. Alternatively, the data can be imported from raw text, from other dBase files, or from the spreadsheet data files of the present invention. Additionally, the data can be scanned in using the scanner/bar code reader discussed above, dictated in using the speech recognition engine, or delivered from an external source such as the
Internet 150 using smart agents as discussed above, or live-data databases that respond to data changes and events, such as those discussed in K. C. Hopson and Stephen E. Ingram's book Developing Professional Java Applets (1996) or the live-data product available from Cycle Software, Inc. located in Quincy, Mass. Furthermore, the database of the present invention automatically classifies and handles information presented to the database. For instance, if the field definition generated in step 534 of FIG. 24 specifies two barcode fields, the first barcode data captured will be assigned to the first barcode field, while the subsequent barcode data will be assigned to the second barcode field, even if the barcode fields are not adjacent to each other. The same processing is provided to handle GPS data. - After the creation of the database forms and the entry of data by the user or the agent, the user can generate reports and/or update the database via the query language, the manipulation language, and the report writer. The query language interface can simply be standard dBase-like query commands, as known to those skilled in the art. Alternatively, the query language interface can be an easy-to-use query language where the user simply enters the field to search and the appropriate search parameters. The search is then conducted in accordance with the parameters. Turning now to FIG. 25, the routine to handle the
search process 520 of FIG. 23 is shown. In FIG. 25, the routine first searches the database exactly as requested in step 550. The search process can be an indexed search or a binary search for speed reasons, as known in the art. From step 550, the routine checks in step 551 whether the user has designated that non-traditional searches are to be done. If so, the routine proceeds to step 552, where it performs an inexact fuzzy search by looking for records with fields that almost, but not exactly, match the query request, in a manner analogous to the fuzzy search done in speech recognition, as discussed in the incorporated-by-reference U.S. patent application Ser. No. 08/461,646. Furthermore, a probabilistic search is performed in step 553. The probabilistic search looks up equivalent words using a thesaurus and then replaces the keyword with equivalent words according to a probability distribution, in accordance with the context of word usage. The present invention also contemplates that Soundex expansion techniques, as known in the art, may be used to expand the keyword search. - When records matching the search criteria are located, they are displayed on the screen on a record by record basis, ready for inspection by the user in
step 554 before the routine exits in step 555. However, such manual inspection is appropriate only if the user wishes to edit the records. When the user wishes to summarize or tabulate the results rather than examine the responsive records, he or she can use the report writer to generate reports. In the preferred embodiment, the report writer accepts dBase-compatible report generator requests. Alternatively, for users who are not familiar with dBase, the report generator displays a report form with an option list consisting of available field captions, as generated during the data definition language phase. In addition, the option list includes operator buttons such as column and row summations, among others. Using the option list, the user can select and place the fields to be displayed. - Turning now to FIG. 26, the process for handling GPS data is shown. In FIG. 26, from
step 560, the current coordinate data is queried from the GPS receiver 46 in step 561. From step 561, the routine of FIG. 26 proceeds to step 562, where the routine detects whether the portable computer has moved beyond a predetermined proximity. In step 562, if the portable computer has not moved, the routine proceeds to step 563, where the routine puts itself to sleep. From step 563, after a predetermined period, the routine of FIG. 26 wakes up and proceeds to step 561 to check whether the portable computer has moved. From step 562, in the event that the portable computer has moved, the routine proceeds to step 564, where the beginning coordinate and time information are saved. Next, the routine samples the output of the GPS receiver 46 in step 565. In step 566, the routine checks whether the position of the GPS receiver 46 has changed. If not, the routine loops back to step 565 to continue acquiring GPS data. Alternatively, if data from the GPS receiver 46 indicates that the GPS receiver 46 has altered its position, the routine of FIG. 26 proceeds from step 566 to step 567, where it waits until the GPS receiver 46 has stopped moving, typically by checking whether the proximity remains unchanged for a predetermined period of time. When the GPS receiver 46 stops moving, the routine saves the ending coordinate and ending time. Furthermore, the routine computes the mileage incurred for the trip in step 567. - From
step 567, the routine collects other business data in step 568. The type of data collected varies with the application. For instance, for lawyers, the data collected may simply relate to time and expense tracking and case management. For medical practitioners, the data collected may consist of patient information, drug interactions, type of treatment provided, and billing related information, among others. For salespersons, the data collected may relate to order taking, inventory checking, creating to-do lists, and pricing, among others. From step 568, the routine collates the data into one or more packets, compresses the packets, and transmits the data via a suitable wireless transmitter such as the pager or the wireless transceiver 31 before the routine of FIG. 26 exits in step 570. - Referring to FIGS. 27 through 29, routines for supporting a meeting are shown. The routines in FIGS. 27 through 29 provide automated support for mobile users and in effect act as an intelligent researcher or agent for the users. The agent is necessary to protect the user from an increasing information overload in modern life while allowing the user to maintain control.
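The trip-logging flow of FIG. 26 (saving a beginning and ending coordinate and computing the mileage incurred) can be sketched as follows. This is an illustrative sketch only; the haversine mileage computation and the function names are assumptions, not the original implementation:

```python
import math

# Approximate great-circle distance between two (lat, lon) coordinates
# in miles, used here to compute the mileage incurred for a trip.
def miles_between(begin, end):
    lat1, lon1, lat2, lon2 = map(math.radians, (*begin, *end))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 3959.0 * 2 * math.asin(math.sqrt(a))  # Earth radius ~3959 mi

# Record the beginning and ending coordinates/times of a trip and
# compute mileage, as in steps 564 through 567 of FIG. 26.
def log_trip(begin_coord, begin_time, end_coord, end_time):
    return {
        "begin": (begin_coord, begin_time),
        "end": (end_coord, end_time),
        "miles": miles_between(begin_coord, end_coord),
    }

trip = log_trip((37.77, -122.42), "09:00", (37.87, -122.27), "09:40")
```

The resulting trip record could then be collated into a packet and transmitted as in step 568 and following.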
- FIG. 27 illustrates the detail of the data search and preparation before a meeting takes place. From
step 580, the routine of FIG. 27 proceeds to step 581, where the routine checks with the calendar engine for meetings scheduled for a particular date. From step 581, for each meeting calendared, the routine performs a search on the company and individuals scheduled for the meeting in step 582. Next, the routine checks its internal records for historical data of prior meetings in step 583. In this step, the routine also attempts to identify areas of agreement and disagreement, as well as the personal information of the people in the meeting, to remind the user of, and prepare the user for, hot-spots to be careful about in the meeting. - From
step 583, the routine of FIG. 27 proceeds to step 584, where it searches for information relating to the competition as well as other potential stakeholders. The search starts with in-house data and sweeps outwardly toward the Internet 150. This step preferably deploys the intelligent agent of FIG. 22. As a first step, the agent of FIG. 22 enters the respective competitor's name into search engines such as Yahoo, AltaVista, HotBot, or Infoseek. The agent may also check the competitor's financial health by performing a search in Hoover's Online, located at http://www.hoovers.com, and a search at the U.S. Securities & Exchange Commission, located at http://www.sec.gov. Other sites with financial information on public and private companies that can be searched by the agent of FIG. 22 include http://www.pathfinder.com, http://www.avetech.com, and http://www.dbisna.com. - For general news regarding a particular company, the agent of FIG. 22 can search Ecola's 24-hour newsstand, located at http://www.ecola.com, which links to more than 2,000 newspapers, journals, magazines, and publications. Additionally, the agent can search CNN Interactive at http://www.cnn.com for archived information going back a few weeks. Furthermore, the agent of FIG. 22 can search the Knowledge Index on CompuServe, and the Electric Library, available at http://www.elibrary.com, for scouring magazines, reference works, and news wires. Furthermore, MediaFinder, located at http://www.mediafinder.com, provides an index and description of thousands of newsletters, catalogs, and magazines.
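The fan-out of a competitor's name across several search sources, as described above, can be sketched as follows. This is an illustrative sketch: the `fetch` callback, the source list, and the query URL formats are hypothetical stand-ins for the actual network retrieval:

```python
# Fan a competitor's name out to several search sources and collect
# whatever each source returns; sources that fail are skipped so one
# unreachable site does not abort the whole sweep.
def sweep_sources(company, sources, fetch):
    results = {}
    for name, url_template in sources.items():
        try:
            results[name] = fetch(url_template.format(query=company))
        except OSError:
            continue  # unreachable source: move on to the next one
    return results

# Hypothetical source list; real query URL formats would differ.
SOURCES = {
    "hoovers": "http://www.hoovers.com/search?q={query}",
    "sec": "http://www.sec.gov/search?q={query}",
}

# A stub fetch for demonstration; a real agent would issue an HTTP GET.
def stub_fetch(url):
    return "results for " + url

report = sweep_sources("Acme Corp", SOURCES, stub_fetch)
```

Passing the retrieval function in as a parameter keeps the sweep logic separate from the transport, so the same sweep works against Web sites, news archives, or in-house databases.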
- The agent of FIG. 22 also provides the ability to listen in on conversations regarding a particular company in news groups and discussion groups prevalent in the Usenet section of the
Internet 150. For a searchable directory of E-mail discussion groups, the agent of FIG. 22 reviews Deja News Research Service, located at http://www.dejanews.com, and Liszt, located at http://www.liszt.com. - As a last resort when the above searches turn up empty, the agent of
step 584 checks sites that have compiled good collections of business resources, including John Makulowich's Awesome Lists, located at http://www.clark.net, American Demographics, located at http://www.demographics.com, ProfNet, located at http://www.vyne.com, StartingPoint, located at http://www.stpt.com, Babson College, located at http://babson.edu, and Competitive Intelligence Guide, located at http://www.fuld.com. Additionally, the present invention contemplates that yet other sites can be searched as well for competitive information, including the Lexis/Nexis database, the Westlaw database, various judicial decisions at Villanova University, licensing information from Licensing Executive Society at http://www.les.org, and the patent abstract information database from the U.S. Patent & Trademark Office, or alternatively, abstracts from MicroPatent, located at http://www.micropat.com, among other sites. - From
step 584, the routine formats the collected information of steps 583-584 in step 585. Next, the routine proceeds to step 586, where it checks to see whether it is time to meet. If so, the routine proceeds to step 587, where it notifies the user of the meeting and displays the formatted report of step 585 in step 588. Alternatively, if it is not yet meeting time in step 586, the routine proceeds to step 589, where it puts itself to sleep until the next check interval. From the preceding steps, the routine of FIG. 27 exits in step 590. - Turning now to FIG. 28, the processes occurring during the meeting scheduled in FIG. 27 are shown in more detail. From
step 600, the routine proceeds to step 601, where it displays the reports generated in step 585 of FIG. 27. Further, the routine checks in step 602 whether the customer or client has specific questions. If so, the routine proceeds to step 603 where, in the event that the user does not already know the answer, the routine jumps to step 604, where it queries a database and allows the user to electronically mail questions to the technical staff in step 605. - From
the preceding steps, if the condition checked in step 606 is satisfied, the routine proceeds from step 606 to step 607. In step 607, the routine downloads the requisite computer aided design (CAD) file for editing purposes. From step 607, the routine proceeds to step 608, where the design can be updated using a number of tools, including the tools disclosed in the incorporated-by-reference patent applications. - From
the preceding steps, the routine checks in step 609 whether pricing information is requested. If so, the routine downloads pricing information from the host to the portable computer in step 610 and applies the spreadsheet discussed above to the data in step 611. - From
the preceding steps, if no outstanding questions remain, the routine proceeds from step 612 to step 613, where it flags that a standard follow-up letter without questions is to be used. Alternatively, in the event that outstanding questions remain to be answered, the routine proceeds from step 612 to step 614, where it adds to the list of follow-up questions. - Referring now to FIG. 29, the routine to process events after the meeting is shown. From
step 620 of FIG. 29, the routine proceeds to step 621, where it loads a standard letter template which provides the foundational structure for the correspondence. From step 621, the routine checks in step 622 whether unresolved questions remain. If so, the routine proceeds to step 623, where it displays the question list to remind the user of the items to be addressed in the correspondence. Next, in step 624, in the event that the answer requires an expert, the routine proceeds from step 624 to step 625, where the routine forwards the question to the appropriate person. - From
step 624 or step 625, the routine proceeds to step 626, where the question is answered. Next, in step 627, the routine checks whether it is done with all questions. If not, the routine loops back to step 623 to answer the next question in the list. Alternatively, if all questions have been answered in step 627 or step 622, the routine proceeds to step 628, where it applies standard closing paragraphs as well as a signature facsimile. Furthermore, to the extent some personalized compliments or congratulations can be made, as identified in step 582 of FIG. 27, the routine also applies these congratulatory remarks to the correspondence. Next, in step 629, the routine prints, e-mails, postal mails, or faxes the correspondence to the client or customer before exiting in step 631 of FIG. 29. - The present invention thus provides a convenient system for accepting and manipulating data using a spreadsheet or a database such that the user can quickly write commands or data on a mobile computer with a relatively compact screen. Further, the present invention integrates speech and typed data entry to provide a user friendly computer system. Further, the spreadsheet or database system of the present invention can be used in two-way messaging systems to support object linking and embedding-like capabilities. Data can be imported into the spreadsheet or database by scanning or dictating the information to the computer system. The present invention also supports an intelligent agent operating with the computer to locate responsive information, as specified by the spreadsheet or database system of the present invention.
- Although specific embodiments of the present invention have been illustrated in the accompanying drawings and described in the foregoing detailed description, it will be understood that the invention is not limited to the particular embodiments described herein, but is capable of numerous rearrangements, modifications, and substitutions without departing from the scope of the invention. The following claims are intended to encompass all such modifications.
Claims (18)
1. A portable computer system for managing data for a user, comprising:
a processor;
an input recognizer embodied in said program storage device, said input recognizer adapted to receive non-cursive handwritings from said user and convert said non-cursive handwritings into text data;
a program storage device coupled to said processor;
a computer readable code embodied in said program storage device and coupled to said input recognizer for receiving said non-cursive handwritings, said computer readable code storing said data and allowing said user to process said data.
2. The portable computer system of claim 1 , wherein said computer readable code is a spreadsheet.
3. The portable computer system of claim 2 , further comprising a magnifier code coupled to said spreadsheet code and said input recognizer code, said magnifier code zooming in or out of portions of said spreadsheet upon demand by said user to improve the readability of said spreadsheet.
4. The portable computer system of claim 1 , wherein said computer readable code is a spreadsheet, further comprising:
a scanner removably coupled to said processor; and
an optical character recognition (OCR) engine coupled to said processor, said scanner and said spreadsheet.
7. The portable computer system of claim 1 , wherein said computer readable code is a spreadsheet, further comprising:
an analog to digital converter coupled to said processor;
a microphone coupled to said voice recorder; and
a speech recognizer coupled to said processor, said analog to digital converter and to said spreadsheet.
8. The portable computer system of claim 1 , wherein said computer readable code is a spreadsheet, further comprising a communications unit coupled to said processor and to said spreadsheet.
9. The portable computer system of claim 8 , wherein said computer readable code is a spreadsheet, further comprising a peripheral device selected from the group consisting of a camera, a scanner, a tuner, an audio system, a modem, a voice recorder, and a display, said peripheral device coupled to said communications unit to provide data to said spreadsheet in a non-contact manner.
10. The portable computer system of claim 1 , wherein said computer readable code is a spreadsheet, further comprising a browser coupled to said spreadsheet.
11. The portable computer of claim 1 , wherein said program storage device coupled to said processor is a browser.
12. The portable computer of claim 1 , wherein said browser is adapted to operate with an intermediate language, said browser further comprising:
an intermediate language parser; and
an intermediate language interpreter coupled to said parser, said interpreter performing actions in accordance with said intermediate language.
13. The portable computer system of claim 1 , further comprising:
a digital simultaneous voice data (DSVD) modem coupled to said processor, said DSVD modem adapted to communicate with a remote DSVD modem; and
a blackboard code for receiving data from said remote DSVD modem and displaying said data to a user.
14. The portable computer system of claim 1 , further comprising:
a data storage device coupled to said processor; and
a file system code for linking said processor to said data storage device, said file system allowing applications to open and close files stored on said data storage device.
15. The portable computer of claim 11 , further comprising:
an application executing on said computer, said application generating one or more messages having one or more instructions;
a data compression engine coupled to said application, said data compression engine generating a compressed message from said application message; and
a message manager coupled to said data compression engine, said message manager adapted to transmit said compressed message over a medium.
16. The portable computer of claim 15 , further comprising:
a resource;
a decompression engine adapted to receive said compressed message and to decode said application message from said compressed message;
an event manager coupled to said decompression engine, said event manager queuing said application message; and
an event handler coupled to said event manager and to said resource, said handler accessing said resource in accordance with said message instructions.
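Claims 15 and 16 together describe a compress-transmit-decompress-queue pipeline. The round trip can be sketched as below, using `zlib` for the compression engine and a plain queue standing in for the transmission medium; all function names are illustrative assumptions, not the patent's terminology.

```python
# Hedged sketch of the claim 15/16 message pipeline.
import json
import zlib
from collections import deque

def compress_message(instructions):
    """Data compression engine: pack application instructions into bytes."""
    return zlib.compress(json.dumps(instructions).encode())

def transmit(compressed, channel):
    """Message manager: place the compressed message on the medium (a queue here)."""
    channel.append(compressed)

def receive(channel):
    """Decompression engine + event manager: decode and dequeue the message."""
    return json.loads(zlib.decompress(channel.popleft()).decode())

channel = deque()
transmit(compress_message([{"op": "open", "resource": "report.xls"}]), channel)
events = receive(channel)
print(events[0]["op"])  # open
```

The event handler of claim 16 would then dispatch on each decoded instruction (`events[0]["op"]`) to access the named resource.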
17. The portable computer system of claim 11 , further comprising a Java compiler or interpreter coupled to said processor.
18. The portable computer system of claim 11 , further comprising an intelligent agent adapted to be sent by said processor to a remote network for data collection purposes.
19. The portable computer system of claim 18 , wherein said intelligent agent is sent over an Internet for data collection purposes.
20. The portable computer of claim 18 , wherein said intelligent agent comprises:
a first code for specifying a goal for said intelligent agent;
a second code coupled to said first code for sending said agent over a network;
a third code coupled to said second code, said third code mining databases existing on said network in search of data satisfying said goal; and
a fourth code coupled to said third code for reporting results back to the portable computer.
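The four "codes" of claim 20 form a simple pipeline: specify a goal, send the agent over a network, mine the network's databases for matching data, and report results back. A minimal sketch follows, with the network simulated as in-memory databases; every name here is an illustrative assumption rather than anything defined in the patent.

```python
# Hedged sketch of claim 20's four-part intelligent agent.

def specify_goal(keyword):                      # first code: the agent's goal
    return {"keyword": keyword, "results": []}

def mine_database(agent, database):             # third code: mine for matching data
    agent["results"].extend(
        record for record in database if agent["keyword"] in record
    )

def report_results(agent):                      # fourth code: report back
    return sorted(agent["results"])

def send_agent(agent, network):                 # second code: dispatch over the network
    for database in network.values():
        mine_database(agent, database)
    return report_results(agent)

network = {
    "db1": ["q3 sales report", "staff roster"],
    "db2": ["sales forecast", "travel policy"],
}
print(send_agent(specify_goal("sales"), network))  # ['q3 sales report', 'sales forecast']
```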
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US09/533,564 US20020069220A1 (en) | 1996-12-17 | 2000-03-22 | Remote data access and management system utilizing handwriting input |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US08/767,833 US6157935A (en) | 1996-12-17 | 1996-12-17 | Remote data access and management system |
US09/533,564 US20020069220A1 (en) | 1996-12-17 | 2000-03-22 | Remote data access and management system utilizing handwriting input |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US08/767,833 Continuation US6157935A (en) | 1996-12-17 | 1996-12-17 | Remote data access and management system |
Publications (1)
Publication Number | Publication Date |
---|---|
US20020069220A1 true US20020069220A1 (en) | 2002-06-06 |
Family
ID=25080732
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US08/767,833 Expired - Lifetime US6157935A (en) | 1996-12-17 | 1996-12-17 | Remote data access and management system |
US09/533,564 Abandoned US20020069220A1 (en) | 1996-12-17 | 2000-03-22 | Remote data access and management system utilizing handwriting input |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US08/767,833 Expired - Lifetime US6157935A (en) | 1996-12-17 | 1996-12-17 | Remote data access and management system |
Country Status (1)
Country | Link |
---|---|
US (2) | US6157935A (en) |
Cited By (216)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20010005857A1 (en) * | 1998-05-29 | 2001-06-28 | Mihal Lazaridis | System and method for pushing information from a host system to a mobile data communication device |
US20020044149A1 (en) * | 2000-08-31 | 2002-04-18 | Mccarthy Kevin | Handset personalisation |
US20020165873A1 (en) * | 2001-02-22 | 2002-11-07 | International Business Machines Corporation | Retrieving handwritten documents using multiple document recognizers and techniques allowing both typed and handwritten queries |
US6587846B1 (en) * | 1999-10-01 | 2003-07-01 | Lamuth John E. | Inductive inference affective language analyzer simulating artificial intelligence |
US20040027487A1 (en) * | 2002-08-09 | 2004-02-12 | Rzadzki Robert J. | System to provide custom text and graphic information to a television system infrastructure |
US20040078756A1 (en) * | 2002-10-15 | 2004-04-22 | Napper Jonathon Leigh | Method of improving recognition accuracy in form-based data entry systems |
US20040085286A1 (en) * | 2002-10-31 | 2004-05-06 | Microsoft Corporation | Universal computing device |
US20040104920A1 (en) * | 2002-09-30 | 2004-06-03 | Tsuyoshi Kawabe | Image display method for mobile terminal in image distribution system, and image conversion apparatus and mobile terminal using the method |
US20040128142A1 (en) * | 2001-01-05 | 2004-07-01 | Whitham Charles Lamont | Interactive multimedia book |
US20040131230A1 (en) * | 1998-07-22 | 2004-07-08 | Paraskevakos Theodore George | Intelligent currency validation network |
US20040143788A1 (en) * | 2001-02-27 | 2004-07-22 | Jean-Jacques Aureglia | Method and system in an electronic spreadsheet for handling graphical objects referring to working ranges of cells in a copy/cut and paste operation |
US6778834B2 (en) * | 2001-02-27 | 2004-08-17 | Nokia Corporation | Push content filtering |
US20040210854A1 (en) * | 2001-12-10 | 2004-10-21 | Mentor Graphics Corporation | Parellel electronic design automation: shared simultaneous editing |
US20040225988A1 (en) * | 2001-12-10 | 2004-11-11 | Mentor Graphics Corporation | Protection boundaries in a parallel printed circuit board design environment |
US20050044518A1 (en) * | 2001-12-10 | 2005-02-24 | Mentor Graphics Corporation | Reservation of design elements in a parallel printed circuit board design environment |
US20050114821A1 (en) * | 2003-11-21 | 2005-05-26 | Mentor Graphics Corporation | Distributed autorouting of conductive paths |
US20050114865A1 (en) * | 2003-11-21 | 2005-05-26 | Mentor Graphics Corporation | Integrating multiple electronic design applications |
US20050198563A1 (en) * | 2004-03-03 | 2005-09-08 | Kristjansson Trausti T. | Assisted form filling |
US20050277406A1 (en) * | 2004-06-14 | 2005-12-15 | Sbc Knowledge Ventures, L.P. | System and method for electronic message notification |
WO2005125029A2 (en) * | 2004-06-14 | 2005-12-29 | Sbc Knowledge Ventures, L.P. | System and method for electronic message notification |
US20060095504A1 (en) * | 2004-08-24 | 2006-05-04 | Gelsey Jonathan I | System and method for optical character information retrieval (OCR) via a thin-client user interface |
US20060095882A1 (en) * | 2004-09-08 | 2006-05-04 | Mentor Graphics Corporation | Distributed electronic design automation environment |
US20060101368A1 (en) * | 2004-09-08 | 2006-05-11 | Mentor Graphics Corporation | Distributed electronic design automation environment |
US20060161893A1 (en) * | 2004-12-22 | 2006-07-20 | Lg Electronics Inc. | Method and apparatus interfacing between an application and a library of a master for network managing |
US20060184476A1 (en) * | 2001-02-28 | 2006-08-17 | Voice-Insight | Natural language query system for accessing an information system |
US7159037B1 (en) * | 1998-09-11 | 2007-01-02 | Lv Partners, Lp | Method and apparatus for utilizing an existing product code to issue a match to a predetermined location on a global network |
US20070073809A1 (en) * | 2005-09-13 | 2007-03-29 | Mentor Graphics Corporation | Distributed electronic design automation architecture |
US20070116349A1 (en) * | 2005-11-23 | 2007-05-24 | Pitney Bowes Incorporated | Method for detecting perforations on the edge of an image of a form |
US20070152961A1 (en) * | 2005-12-30 | 2007-07-05 | Dunton Randy R | User interface for a media device |
US20070159498A1 (en) * | 2006-01-10 | 2007-07-12 | Jung-Yi Yang | Display apparatus adapted for a display wall, image adjustment method therefor and display wall therewith |
US20070280535A1 (en) * | 2006-05-30 | 2007-12-06 | Microsoft Corporation Microsoft Patent Group | Cursive handwriting recognition with hierarchical prototype search |
US7337193B1 (en) * | 2002-05-02 | 2008-02-26 | Palmsource, Inc. | Determining priority between data items |
US7376752B1 (en) | 2003-10-28 | 2008-05-20 | David Chudnovsky | Method to resolve an incorrectly entered uniform resource locator (URL) |
US20080118141A1 (en) * | 2006-11-14 | 2008-05-22 | Codonics, Inc. | Assembling multiple medical images into a single film image |
US7383299B1 (en) * | 2000-05-05 | 2008-06-03 | International Business Machines Corporation | System and method for providing service for searching web site addresses |
US20080165142A1 (en) * | 2006-10-26 | 2008-07-10 | Kenneth Kocienda | Portable Multifunction Device, Method, and Graphical User Interface for Adjusting an Insertion Point Marker |
US20080263067A1 (en) * | 2005-10-27 | 2008-10-23 | Koninklijke Philips Electronics, N.V. | Method and System for Entering and Retrieving Content from an Electronic Diary |
US20080266131A1 (en) * | 2007-02-13 | 2008-10-30 | Wherenet Corp. | System, apparatus and method for locating and/or tracking assets |
US20080313686A1 (en) * | 2007-06-13 | 2008-12-18 | Matvey Thomas R | Handheld camcorder accessory with pre-programmed wireless internet access for simplified webcasting and handheld camcorder with built-in pre-programmed wireless internet access for simplified webcasting and method of commercially supplying and supporting same |
US20090055179A1 (en) * | 2007-08-24 | 2009-02-26 | Samsung Electronics Co., Ltd. | Method, medium and apparatus for providing mobile voice web service |
US20090228842A1 (en) * | 2008-03-04 | 2009-09-10 | Apple Inc. | Selecting of text using gestures |
US20090225365A1 (en) * | 2008-03-07 | 2009-09-10 | Canon Kabushiki Kaisha | Information processing apparatus, image processing apparatus, method for controlling information processing apparatus, method for controlling image processing apparatus, and program |
WO2009124197A2 (en) * | 2008-04-03 | 2009-10-08 | Livescribe, Inc. | Quick record function in a smart pen computing system |
US20090320076A1 (en) * | 2008-06-20 | 2009-12-24 | At&T Intellectual Property I, L.P. | System and Method for Processing an Interactive Advertisement |
US20090319276A1 (en) * | 2008-06-20 | 2009-12-24 | At&T Intellectual Property I, L.P. | Voice Enabled Remote Control for a Set-Top Box |
US7685510B2 (en) | 2004-12-23 | 2010-03-23 | Sap Ag | System and method for grouping data |
US7684618B2 (en) | 2002-10-31 | 2010-03-23 | Microsoft Corporation | Passive embedded interaction coding |
US7716714B2 (en) | 2004-12-01 | 2010-05-11 | At&T Intellectual Property I, L.P. | System and method for recording television content at a set top box |
US7729539B2 (en) | 2005-05-31 | 2010-06-01 | Microsoft Corporation | Fast error-correcting of embedded interaction codes |
US20100201455A1 (en) * | 2008-09-23 | 2010-08-12 | Aerovironment, Inc. | Predictive pulse width modulation for an open delta h-bridge driven high efficiency ironless permanent magnet machine |
US20100211627A1 (en) * | 2009-02-13 | 2010-08-19 | Mobitv, Inc. | Reprogrammable client using a uniform bytecode model |
US20100235734A1 (en) * | 2009-03-16 | 2010-09-16 | Bas Ording | Methods and Graphical User Interfaces for Editing on a Multifunction Device with a Touch Screen Display |
US7817816B2 (en) | 2005-08-17 | 2010-10-19 | Microsoft Corporation | Embedded interaction code enabled surface type identification |
US7826074B1 (en) | 2005-02-25 | 2010-11-02 | Microsoft Corporation | Fast embedded interaction code printing with custom postscript commands |
US7873102B2 (en) | 2005-07-27 | 2011-01-18 | At&T Intellectual Property I, Lp | Video quality testing by encoding aggregated clips |
US20110055681A1 (en) * | 2001-08-16 | 2011-03-03 | Knowledge Dynamics, Inc. | Parser, code generator, and data calculation and transformation engine for spreadsheet calculations |
US7908627B2 (en) | 2005-06-22 | 2011-03-15 | At&T Intellectual Property I, L.P. | System and method to provide a unified video signal for diverse receiving platforms |
US7908621B2 (en) | 2003-10-29 | 2011-03-15 | At&T Intellectual Property I, L.P. | System and apparatus for local video distribution |
US7908467B2 (en) | 1998-09-11 | 2011-03-15 | RPX-LV Acquistion LLC | Automatic configuration of equipment software |
US7912961B2 (en) | 1998-09-11 | 2011-03-22 | Rpx-Lv Acquisition Llc | Input device for allowing input of unique digital code to a user's computer to control access thereof to a web site |
US7920753B2 (en) | 2005-05-25 | 2011-04-05 | Microsoft Corporation | Preprocessing for information pattern analysis |
US20110080364A1 (en) * | 2006-10-26 | 2011-04-07 | Bas Ording | Method, System, and Graphical User Interface for Positioning an Insertion Marker in a Touch Screen Display |
US8001081B1 (en) | 2002-05-31 | 2011-08-16 | Access Co., Ltd. | Determining priority between data items in shared environments |
US8054849B2 (en) | 2005-05-27 | 2011-11-08 | At&T Intellectual Property I, L.P. | System and method of managing video content streams |
US8086261B2 (en) | 2004-10-07 | 2011-12-27 | At&T Intellectual Property I, L.P. | System and method for providing digital network access and digital broadcast services using combined channels on a single physical medium to the customer premises |
US20120023414A1 (en) * | 2010-07-23 | 2012-01-26 | Samsung Electronics Co., Ltd. | Method and apparatus for processing e-mail |
WO2012021659A2 (en) * | 2010-08-12 | 2012-02-16 | Brightedge Technologies, Inc. | Operationalizing search engine optimization |
WO2011153006A3 (en) * | 2010-06-04 | 2012-04-05 | Microsoft Corporation | Generating text manipulation programs using input-output examples |
US8156153B2 (en) | 2005-04-22 | 2012-04-10 | Microsoft Corporation | Global metadata embedding and decoding |
US8190688B2 (en) | 2005-07-11 | 2012-05-29 | At&T Intellectual Property I, Lp | System and method of transmitting photographs from a set top box |
US8201109B2 (en) | 2008-03-04 | 2012-06-12 | Apple Inc. | Methods and graphical user interfaces for editing on a portable multifunction device |
US8214859B2 (en) | 2005-02-14 | 2012-07-03 | At&T Intellectual Property I, L.P. | Automatic switching between high definition and standard definition IP television signals |
US8228224B2 (en) | 2005-02-02 | 2012-07-24 | At&T Intellectual Property I, L.P. | System and method of using a remote control and apparatus |
US8282476B2 (en) | 2005-06-24 | 2012-10-09 | At&T Intellectual Property I, L.P. | Multimedia-based video game distribution |
US8365218B2 (en) | 2005-06-24 | 2013-01-29 | At&T Intellectual Property I, L.P. | Networked television and method thereof |
US8390744B2 (en) | 2004-12-06 | 2013-03-05 | At&T Intellectual Property I, L.P. | System and method of displaying a video stream |
US8427445B2 (en) | 2004-07-30 | 2013-04-23 | Apple Inc. | Visual expander |
US8434116B2 (en) | 2004-12-01 | 2013-04-30 | At&T Intellectual Property I, L.P. | Device, system, and method for managing television tuners |
US20130138399A1 (en) * | 2011-11-29 | 2013-05-30 | Garrick EVANS | Generating an Analytically Accurate Model From an Abstract Representation Created Via a Mobile Device |
WO2013132309A1 (en) * | 2012-03-05 | 2013-09-12 | Tammel Eric Kamel | Systems and methods for processing unstructured numerical data |
US8584257B2 (en) | 2004-08-10 | 2013-11-12 | At&T Intellectual Property I, L.P. | Method and interface for video content acquisition security on a set-top box |
US8635659B2 (en) | 2005-06-24 | 2014-01-21 | At&T Intellectual Property I, L.P. | Audio receiver modular card and method thereof |
US8661339B2 (en) | 2011-05-31 | 2014-02-25 | Apple Inc. | Devices, methods, and graphical user interfaces for document manipulation |
US8713067B1 (en) * | 2010-07-09 | 2014-04-29 | Open Invention Network, Llc | Stable file system |
US8744852B1 (en) | 2004-10-01 | 2014-06-03 | Apple Inc. | Spoken interfaces |
US20140164907A1 (en) * | 2012-12-12 | 2014-06-12 | Lg Electronics Inc. | Mobile terminal and method of controlling the mobile terminal |
CN103870019A (en) * | 2012-12-17 | 2014-06-18 | 鸿富锦精密工业(深圳)有限公司 | Digital pen and digital writing module |
US8893199B2 (en) | 2005-06-22 | 2014-11-18 | At&T Intellectual Property I, L.P. | System and method of managing video content delivery |
US8892446B2 (en) | 2010-01-18 | 2014-11-18 | Apple Inc. | Service orchestration for intelligent automated assistant |
US8904458B2 (en) | 2004-07-29 | 2014-12-02 | At&T Intellectual Property I, L.P. | System and method for pre-caching a first portion of a video file on a set-top box |
US8949125B1 (en) * | 2010-06-16 | 2015-02-03 | Google Inc. | Annotating maps with user-contributed pronunciations |
US8977584B2 (en) | 2010-01-25 | 2015-03-10 | Newvaluexchange Global Ai Llp | Apparatuses, methods and systems for a digital conversation management platform |
US20150095294A1 (en) * | 2013-10-02 | 2015-04-02 | International Business Machines Corporation | Elimination of Fragmentation of Files in Storage Medium by Utilizing Head Movement Time |
US9024864B2 (en) | 2007-06-12 | 2015-05-05 | Intel Corporation | User interface with software lensing for very long lists of content |
US20150332492A1 (en) * | 2014-05-13 | 2015-11-19 | Masaaki Igarashi | Image processing system, image processing apparatus, and method for image processing |
US9262612B2 (en) | 2011-03-21 | 2016-02-16 | Apple Inc. | Device access using voice authentication |
US9300784B2 (en) | 2013-06-13 | 2016-03-29 | Apple Inc. | System and method for emergency calls initiated by voice command |
US9330720B2 (en) | 2008-01-03 | 2016-05-03 | Apple Inc. | Methods and apparatus for altering audio output signals |
US9338493B2 (en) | 2014-06-30 | 2016-05-10 | Apple Inc. | Intelligent automated assistant for TV user interactions |
US9368114B2 (en) | 2013-03-14 | 2016-06-14 | Apple Inc. | Context-sensitive handling of interruptions |
US9374435B2 (en) | 1998-05-29 | 2016-06-21 | Blackberry Limited | System and method for using trigger events and a redirector flag to redirect messages |
US9430463B2 (en) | 2014-05-30 | 2016-08-30 | Apple Inc. | Exemplar-based natural language processing |
US9483461B2 (en) | 2012-03-06 | 2016-11-01 | Apple Inc. | Handling speech synthesis of content for multiple languages |
US9495129B2 (en) | 2012-06-29 | 2016-11-15 | Apple Inc. | Device, method, and user interface for voice-activated navigation and browsing of a document |
US9502031B2 (en) | 2014-05-27 | 2016-11-22 | Apple Inc. | Method for supporting dynamic grammars in WFST-based ASR |
US9535906B2 (en) | 2008-07-31 | 2017-01-03 | Apple Inc. | Mobile device having human language translation capability with positional feedback |
US9552335B2 (en) | 2012-06-04 | 2017-01-24 | Microsoft Technology Licensing, Llc | Expedited techniques for generating string manipulation programs |
US9576574B2 (en) | 2012-09-10 | 2017-02-21 | Apple Inc. | Context-sensitive handling of interruptions by intelligent digital assistant |
US9582608B2 (en) | 2013-06-07 | 2017-02-28 | Apple Inc. | Unified ranking with entropy-weighted information for phrase-based semantic auto-completion |
US9613115B2 (en) | 2010-07-12 | 2017-04-04 | Microsoft Technology Licensing, Llc | Generating programs based on input-output examples using converter modules |
US9620104B2 (en) | 2013-06-07 | 2017-04-11 | Apple Inc. | System and method for user-specified pronunciation of words for speech synthesis and recognition |
US9620105B2 (en) | 2014-05-15 | 2017-04-11 | Apple Inc. | Analyzing audio input for efficient speech and music recognition |
US9626955B2 (en) | 2008-04-05 | 2017-04-18 | Apple Inc. | Intelligent text-to-speech conversion |
US9633004B2 (en) | 2014-05-30 | 2017-04-25 | Apple Inc. | Better resolution when referencing to concepts |
US9633660B2 (en) | 2010-02-25 | 2017-04-25 | Apple Inc. | User profiling for voice input processing |
US9633674B2 (en) | 2013-06-07 | 2017-04-25 | Apple Inc. | System and method for detecting errors in interactions with a voice-based digital assistant |
US9646609B2 (en) | 2014-09-30 | 2017-05-09 | Apple Inc. | Caching apparatus for serving phonetic pronunciations |
US9646614B2 (en) | 2000-03-16 | 2017-05-09 | Apple Inc. | Fast, language-independent method for user authentication by voice |
US20170147195A1 (en) * | 2015-11-20 | 2017-05-25 | Tomer Alpert | Automove smart transcription |
US9668121B2 (en) | 2014-09-30 | 2017-05-30 | Apple Inc. | Social reminders |
US9697822B1 (en) | 2013-03-15 | 2017-07-04 | Apple Inc. | System and method for updating an adaptive speech recognition model |
US9697820B2 (en) | 2015-09-24 | 2017-07-04 | Apple Inc. | Unit-selection text-to-speech synthesis using concatenation-sensitive neural networks |
US9711141B2 (en) | 2014-12-09 | 2017-07-18 | Apple Inc. | Disambiguating heteronyms in speech synthesis |
US9715875B2 (en) | 2014-05-30 | 2017-07-25 | Apple Inc. | Reducing the need for manual start/end-pointing and trigger phrases |
US9721566B2 (en) | 2015-03-08 | 2017-08-01 | Apple Inc. | Competing devices responding to voice triggers |
US9734193B2 (en) | 2014-05-30 | 2017-08-15 | Apple Inc. | Determining domain salience ranking from ambiguous words in natural speech |
US9760559B2 (en) | 2014-05-30 | 2017-09-12 | Apple Inc. | Predictive text input |
US9785630B2 (en) | 2014-05-30 | 2017-10-10 | Apple Inc. | Text prediction using combined word N-gram and unigram language models |
US9798393B2 (en) | 2011-08-29 | 2017-10-24 | Apple Inc. | Text correction processing |
US9818400B2 (en) | 2014-09-11 | 2017-11-14 | Apple Inc. | Method and apparatus for discovering trending terms in speech requests |
US9842105B2 (en) | 2015-04-16 | 2017-12-12 | Apple Inc. | Parsimonious continuous-space phrase representations for natural language processing |
US9842101B2 (en) | 2014-05-30 | 2017-12-12 | Apple Inc. | Predictive conversion of language input |
US9858925B2 (en) | 2009-06-05 | 2018-01-02 | Apple Inc. | Using context information to facilitate processing of commands in a virtual assistant |
US9865280B2 (en) | 2015-03-06 | 2018-01-09 | Apple Inc. | Structured dictation using intelligent automated assistants |
US9886432B2 (en) | 2014-09-30 | 2018-02-06 | Apple Inc. | Parsimonious handling of word inflection via categorical stem + suffix N-gram language models |
US9886953B2 (en) | 2015-03-08 | 2018-02-06 | Apple Inc. | Virtual assistant activation |
US9899019B2 (en) | 2015-03-18 | 2018-02-20 | Apple Inc. | Systems and methods for structured stem and suffix language models |
US9922642B2 (en) | 2013-03-15 | 2018-03-20 | Apple Inc. | Training an at least partial voice command system |
US9934775B2 (en) | 2016-05-26 | 2018-04-03 | Apple Inc. | Unit-selection text-to-speech synthesis based on predicted concatenation parameters |
US9953088B2 (en) | 2012-05-14 | 2018-04-24 | Apple Inc. | Crowd sourcing information to fulfill user requests |
US9959870B2 (en) | 2008-12-11 | 2018-05-01 | Apple Inc. | Speech recognition involving a mobile device |
US9966065B2 (en) | 2014-05-30 | 2018-05-08 | Apple Inc. | Multi-command single utterance input method |
US9966068B2 (en) | 2013-06-08 | 2018-05-08 | Apple Inc. | Interpreting and acting upon commands that involve sharing information with remote devices |
US9972304B2 (en) | 2016-06-03 | 2018-05-15 | Apple Inc. | Privacy preserving distributed evaluation framework for embedded personalized systems |
US9971774B2 (en) | 2012-09-19 | 2018-05-15 | Apple Inc. | Voice-based media searching |
US10043516B2 (en) | 2016-09-23 | 2018-08-07 | Apple Inc. | Intelligent automated assistant |
US10049668B2 (en) | 2015-12-02 | 2018-08-14 | Apple Inc. | Applying neural network language models to weighted finite state transducers for automatic speech recognition |
US10049663B2 (en) | 2016-06-08 | 2018-08-14 | Apple, Inc. | Intelligent automated assistant for media exploration |
US10057736B2 (en) | 2011-06-03 | 2018-08-21 | Apple Inc. | Active transport based notifications |
US10067938B2 (en) | 2016-06-10 | 2018-09-04 | Apple Inc. | Multilingual word prediction |
US10074360B2 (en) | 2014-09-30 | 2018-09-11 | Apple Inc. | Providing an indication of the suitability of speech recognition |
US10078631B2 (en) | 2014-05-30 | 2018-09-18 | Apple Inc. | Entropy-guided text prediction using combined word and character n-gram language models |
US10079014B2 (en) | 2012-06-08 | 2018-09-18 | Apple Inc. | Name recognition system |
US10083688B2 (en) | 2015-05-27 | 2018-09-25 | Apple Inc. | Device voice control for selecting a displayed affordance |
US10089072B2 (en) | 2016-06-11 | 2018-10-02 | Apple Inc. | Intelligent device arbitration and control |
US10101822B2 (en) | 2015-06-05 | 2018-10-16 | Apple Inc. | Language input correction |
US10127911B2 (en) | 2014-09-30 | 2018-11-13 | Apple Inc. | Speaker identification and unsupervised speaker adaptation techniques |
US10127220B2 (en) | 2015-06-04 | 2018-11-13 | Apple Inc. | Language identification from short strings |
US10134385B2 (en) | 2012-03-02 | 2018-11-20 | Apple Inc. | Systems and methods for name pronunciation |
US10170123B2 (en) | 2014-05-30 | 2019-01-01 | Apple Inc. | Intelligent assistant for home automation |
US10176167B2 (en) | 2013-06-09 | 2019-01-08 | Apple Inc. | System and method for inferring user intent from speech inputs |
US10185542B2 (en) | 2013-06-09 | 2019-01-22 | Apple Inc. | Device, method, and graphical user interface for enabling conversation persistence across two or more instances of a digital assistant |
US10186254B2 (en) | 2015-06-07 | 2019-01-22 | Apple Inc. | Context-based endpoint detection |
US10192552B2 (en) | 2016-06-10 | 2019-01-29 | Apple Inc. | Digital assistant providing whispered speech |
US10199051B2 (en) | 2013-02-07 | 2019-02-05 | Apple Inc. | Voice trigger for a digital assistant |
US10223066B2 (en) | 2015-12-23 | 2019-03-05 | Apple Inc. | Proactive assistance based on dialog communication between devices |
US10241752B2 (en) | 2011-09-30 | 2019-03-26 | Apple Inc. | Interface for a virtual digital assistant |
US10241644B2 (en) | 2011-06-03 | 2019-03-26 | Apple Inc. | Actionable reminder entries |
US10249300B2 (en) | 2016-06-06 | 2019-04-02 | Apple Inc. | Intelligent list reading |
US10255907B2 (en) | 2015-06-07 | 2019-04-09 | Apple Inc. | Automatic accent detection using acoustic models |
US10269345B2 (en) | 2016-06-11 | 2019-04-23 | Apple Inc. | Intelligent task discovery |
US10276170B2 (en) | 2010-01-18 | 2019-04-30 | Apple Inc. | Intelligent automated assistant |
US10283110B2 (en) | 2009-07-02 | 2019-05-07 | Apple Inc. | Methods and apparatuses for automatic speech recognition |
US10289433B2 (en) | 2014-05-30 | 2019-05-14 | Apple Inc. | Domain specific language for encoding assistant dialog |
US10297253B2 (en) | 2016-06-11 | 2019-05-21 | Apple Inc. | Application integration with a digital assistant |
US10318871B2 (en) | 2005-09-08 | 2019-06-11 | Apple Inc. | Method and apparatus for building an intelligent automated assistant |
US10354011B2 (en) | 2016-06-09 | 2019-07-16 | Apple Inc. | Intelligent automated assistant in a home environment |
US10356243B2 (en) | 2015-06-05 | 2019-07-16 | Apple Inc. | Virtual assistant aided communication with 3rd party service in a communication session |
US10366158B2 (en) | 2015-09-29 | 2019-07-30 | Apple Inc. | Efficient word encoding for recurrent neural network language models |
CN110221760A (en) * | 2019-06-24 | 2019-09-10 | 梁舒云 | A method of generating towed picture voice label |
US10410637B2 (en) | 2017-05-12 | 2019-09-10 | Apple Inc. | User-specific acoustic models |
US10446143B2 (en) | 2016-03-14 | 2019-10-15 | Apple Inc. | Identification of voice inputs providing credentials |
US10446141B2 (en) | 2014-08-28 | 2019-10-15 | Apple Inc. | Automatic speech recognition based on user feedback |
US10482874B2 (en) | 2017-05-15 | 2019-11-19 | Apple Inc. | Hierarchical belief states for digital assistants |
US10490187B2 (en) | 2016-06-10 | 2019-11-26 | Apple Inc. | Digital assistant providing automated status report |
US10496753B2 (en) | 2010-01-18 | 2019-12-03 | Apple Inc. | Automatically adapting user interfaces for hands-free interaction |
US10509862B2 (en) | 2016-06-10 | 2019-12-17 | Apple Inc. | Dynamic phrase expansion of language input |
US10521466B2 (en) | 2016-06-11 | 2019-12-31 | Apple Inc. | Data driven natural language event detection and classification |
US10552013B2 (en) | 2014-12-02 | 2020-02-04 | Apple Inc. | Data detection |
US10553209B2 (en) | 2010-01-18 | 2020-02-04 | Apple Inc. | Systems and methods for hands-free notification summaries |
US10568032B2 (en) | 2007-04-03 | 2020-02-18 | Apple Inc. | Method and system for operating a multi-function portable electronic device using voice-activation |
US10567477B2 (en) | 2015-03-08 | 2020-02-18 | Apple Inc. | Virtual assistant continuity |
US10593346B2 (en) | 2016-12-22 | 2020-03-17 | Apple Inc. | Rank-reduced token representation for automatic speech recognition |
US10592095B2 (en) | 2014-05-23 | 2020-03-17 | Apple Inc. | Instantaneous speaking of content on touch devices |
US10659851B2 (en) | 2014-06-30 | 2020-05-19 | Apple Inc. | Real-time digital assistant knowledge updates |
US10671353B2 (en) | 2018-01-31 | 2020-06-02 | Microsoft Technology Licensing, Llc | Programming-by-example using disjunctive programs |
US10671428B2 (en) | 2015-09-08 | 2020-06-02 | Apple Inc. | Distributed personal assistant |
US10679605B2 (en) | 2010-01-18 | 2020-06-09 | Apple Inc. | Hands-free list-reading by intelligent automated assistant |
US10691473B2 (en) | 2015-11-06 | 2020-06-23 | Apple Inc. | Intelligent automated assistant in a messaging environment |
US10706373B2 (en) | 2011-06-03 | 2020-07-07 | Apple Inc. | Performing actions associated with task items that represent tasks to perform |
US10705794B2 (en) | 2010-01-18 | 2020-07-07 | Apple Inc. | Automatically adapting user interfaces for hands-free interaction |
US10733993B2 (en) | 2016-06-10 | 2020-08-04 | Apple Inc. | Intelligent digital assistant in a multi-tasking environment |
US10747498B2 (en) | 2015-09-08 | 2020-08-18 | Apple Inc. | Zero latency digital assistant |
US10755703B2 (en) | 2017-05-11 | 2020-08-25 | Apple Inc. | Offline personal assistant |
US10762293B2 (en) | 2010-12-22 | 2020-09-01 | Apple Inc. | Using parts-of-speech tagging and named entity recognition for spelling correction |
US10789041B2 (en) | 2014-09-12 | 2020-09-29 | Apple Inc. | Dynamic thresholds for always listening speech trigger |
US10791216B2 (en) | 2013-08-06 | 2020-09-29 | Apple Inc. | Auto-activating smart responses based on activities from remote devices |
US10791176B2 (en) | 2017-05-12 | 2020-09-29 | Apple Inc. | Synchronization and task delegation of a digital assistant |
US10810274B2 (en) | 2017-05-15 | 2020-10-20 | Apple Inc. | Optimizing dialogue policy decisions for digital assistants using implicit feedback |
US10846298B2 (en) | 2016-10-28 | 2020-11-24 | Microsoft Technology Licensing, Llc | Record profiling for dataset sampling |
US11010550B2 (en) | 2015-09-29 | 2021-05-18 | Apple Inc. | Unified language modeling framework for word prediction, auto-completion and auto-correction |
US11025565B2 (en) | 2015-06-07 | 2021-06-01 | Apple Inc. | Personalized prediction of responses for instant messaging |
US11217255B2 (en) | 2017-05-16 | 2022-01-04 | Apple Inc. | Far-field extension for digital assistant services |
US11256710B2 (en) | 2016-10-20 | 2022-02-22 | Microsoft Technology Licensing, Llc | String transformation sub-program suggestion |
US11587559B2 (en) | 2015-09-30 | 2023-02-21 | Apple Inc. | Intelligent device identification |
US11620304B2 (en) | 2016-10-20 | 2023-04-04 | Microsoft Technology Licensing, Llc | Example management for string transformation |
Families Citing this family (344)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8352400B2 (en) | 1991-12-23 | 2013-01-08 | Hoffberg Steven M | Adaptive pattern recognition based controller apparatus and method and human-factored interface therefore |
US6344791B1 (en) | 1998-07-24 | 2002-02-05 | Brad A. Armstrong | Variable sensor with tactile feedback |
US6347997B1 (en) * | 1997-10-01 | 2002-02-19 | Brad A. Armstrong | Analog controls housed with electronic displays |
US6222525B1 (en) | 1992-03-05 | 2001-04-24 | Brad A. Armstrong | Image controllers with sheet connected sensors |
ATE188793T1 (en) | 1994-10-12 | 2000-01-15 | Touchtunes Music Corp | INTELLIGENT SYSTEM FOR NUMERICAL AUDIOVISUAL REPRODUCTION |
US7424731B1 (en) | 1994-10-12 | 2008-09-09 | Touchtunes Music Corporation | Home digital audiovisual information recording and playback system |
US7188352B2 (en) | 1995-07-11 | 2007-03-06 | Touchtunes Music Corporation | Intelligent digital audiovisual playback system |
US8661477B2 (en) | 1994-10-12 | 2014-02-25 | Touchtunes Music Corporation | System for distributing and selecting audio and video information and method implemented by said system |
US8674932B2 (en) | 1996-07-05 | 2014-03-18 | Anascape, Ltd. | Image controller |
US6351205B1 (en) | 1996-07-05 | 2002-02-26 | Brad A. Armstrong | Variable-conductance sensor |
US6272457B1 (en) * | 1996-09-16 | 2001-08-07 | Datria Systems, Inc. | Spatial asset management system that time-tags and combines captured speech data and captured location data using a predefined reference grammar with a semantic relationship structure |
FR2753868A1 (en) | 1996-09-25 | 1998-03-27 | Technical Maintenance Corp | METHOD FOR SELECTING A RECORDING ON AN AUDIOVISUAL DIGITAL REPRODUCTION SYSTEM AND SYSTEM FOR IMPLEMENTING THE METHOD |
FR2769165B1 (en) | 1997-09-26 | 2002-11-29 | Technical Maintenance Corp | WIRELESS SYSTEM WITH DIGITAL TRANSMISSION FOR SPEAKERS |
US6456778B2 (en) | 1997-10-01 | 2002-09-24 | Brad A. Armstrong | Analog controls housed with electronic displays for video recorders and cameras |
US6404584B2 (en) | 1997-10-01 | 2002-06-11 | Brad A. Armstrong | Analog controls housed with electronic displays for voice recorders |
US6532000B2 (en) | 1997-10-01 | 2003-03-11 | Brad A. Armstrong | Analog controls housed with electronic displays for global positioning systems |
US6415707B1 (en) | 1997-10-01 | 2002-07-09 | Brad A. Armstrong | Analog controls housed with electronic displays for coffee makers |
US7885822B2 (en) * | 2001-05-09 | 2011-02-08 | William Rex Akers | System and method for electronic medical file management |
US7956894B2 (en) * | 1997-10-14 | 2011-06-07 | William Rex Akers | Apparatus and method for computerized multi-media medical and pharmaceutical data organization and transmission |
US6597392B1 (en) * | 1997-10-14 | 2003-07-22 | Healthcare Vision, Inc. | Apparatus and method for computerized multi-media data organization and transmission |
GB9722766D0 (en) | 1997-10-28 | 1997-12-24 | British Telecomm | Portable computers |
JPH11213054A (en) * | 1998-01-23 | 1999-08-06 | Hitachi Ltd | Method for distributing and withdrawing electronic work sheet |
US6295391B1 (en) * | 1998-02-19 | 2001-09-25 | Hewlett-Packard Company | Automatic data routing via voice command annotation |
US6519763B1 (en) * | 1998-03-30 | 2003-02-11 | Compuware Corporation | Time management and task completion and prediction software |
US6381745B1 (en) * | 1998-05-21 | 2002-04-30 | Avaya Technology Corp. | Signal distribution system |
FR2781582B1 (en) | 1998-07-21 | 2001-01-12 | Technical Maintenance Corp | SYSTEM FOR DOWNLOADING OBJECTS OR FILES FOR SOFTWARE UPDATE |
US8028318B2 (en) | 1999-07-21 | 2011-09-27 | Touchtunes Music Corporation | Remote control unit for activating and deactivating means for payment and for displaying payment status |
FR2781591B1 (en) | 1998-07-22 | 2000-09-22 | Technical Maintenance Corp | AUDIOVISUAL REPRODUCTION SYSTEM |
FR2781580B1 (en) | 1998-07-22 | 2000-09-22 | Technical Maintenance Corp | SOUND CONTROL CIRCUIT FOR INTELLIGENT DIGITAL AUDIOVISUAL REPRODUCTION SYSTEM |
US7966078B2 (en) | 1999-02-01 | 2011-06-21 | Steven Hoffberg | Network media appliance system and method |
JP2002536722A (en) * | 1999-02-01 | 2002-10-29 | Barpoint.com Incorporated | An interactive system for looking up products on a network |
US8726330B2 (en) | 1999-02-22 | 2014-05-13 | Touchtunes Music Corporation | Intelligent digital audiovisual playback system |
US6606611B1 (en) | 1999-02-27 | 2003-08-12 | Emdadur Khan | System and method for audio-only internet browsing using a standard telephone |
US6469716B1 (en) * | 1999-03-19 | 2002-10-22 | Corel Inc. | System and method for processing data for a graphical object |
US6469715B1 (en) * | 1999-03-19 | 2002-10-22 | Corel Inc. | System and method for controlling the operation of a graphical object using a project |
GB2351884B (en) * | 1999-04-10 | 2002-07-31 | Peter Strong | Data transmission method |
US6535949B1 (en) * | 1999-04-19 | 2003-03-18 | Research In Motion Limited | Portable electronic device having a log-structured file system in flash memory |
US6396481B1 (en) | 1999-04-19 | 2002-05-28 | Ecrio Inc. | Apparatus and method for portable handwriting capture |
US6832717B1 (en) * | 1999-05-25 | 2004-12-21 | Silverbrook Research Pty Ltd | Computer system interface surface |
US6825945B1 (en) * | 1999-05-25 | 2004-11-30 | Silverbrook Research Pty Ltd | Method and system for delivery of a brochure |
US7286115B2 (en) | 2000-05-26 | 2007-10-23 | Tegic Communications, Inc. | Directional input system with automatic correction |
US7030863B2 (en) | 2000-05-26 | 2006-04-18 | America Online, Incorporated | Virtual keyboard system with automatic correction |
US6626956B1 (en) * | 1999-06-15 | 2003-09-30 | Microsoft Corporation | Edit-time redirect for HTML documents |
FR2796482B1 (en) | 1999-07-16 | 2002-09-06 | Touchtunes Music Corp | REMOTE MANAGEMENT SYSTEM FOR AT LEAST ONE AUDIOVISUAL INFORMATION REPRODUCING DEVICE |
US6401103B1 (en) * | 1999-08-06 | 2002-06-04 | International Business Machines Corporation | Apparatus, method, and article of manufacture for client-side optimistic locking in a stateless environment |
IL132619A (en) * | 1999-10-27 | 2003-07-06 | Eci Telecom Ltd | Linked lists |
US20060259321A1 (en) * | 1999-11-05 | 2006-11-16 | Mindmatters Technologies, Inc. | System for automating and managing an enterprise IP environment |
WO2001035277A1 (en) * | 1999-11-12 | 2001-05-17 | Mindmatters Technologies, Inc. | System for automating and managing an enterprise ip environment |
US6978475B1 (en) * | 1999-11-24 | 2005-12-20 | Ecable, Llc | Method and apparatus for internet TV |
US7412478B1 (en) * | 2000-01-27 | 2008-08-12 | Marger Johnson & Mccollom, P.C. | Rich media file format and delivery methods |
US8050495B2 (en) * | 2000-01-27 | 2011-11-01 | Marger Johnson & Mccollom, P.C. | Rich media file format and delivery methods |
US6753884B1 (en) * | 2000-01-31 | 2004-06-22 | Journyx, Inc. | Method and apparatus for wireless web time and expense entry via time keeping and expense tracking server access |
ATE282272T1 (en) * | 2000-01-31 | 2004-11-15 | Aeptec Microsystems Inc | ACCESS DEVICE FOR BROADBAND COMMUNICATIONS |
US7069498B1 (en) * | 2000-01-31 | 2006-06-27 | Journyx, Inc. | Method and apparatus for a web based punch clock/time clock |
US7990985B2 (en) * | 2000-01-31 | 2011-08-02 | 3E Technologies International, Inc. | Broadband communications access device |
US7382786B2 (en) * | 2000-01-31 | 2008-06-03 | 3E Technologies International, Inc. | Integrated phone-based home gateway system with a broadband communication device |
US7047033B2 (en) * | 2000-02-01 | 2006-05-16 | Infogin Ltd | Methods and apparatus for analyzing, processing and formatting network information such as web-pages |
US7289244B2 (en) | 2000-02-02 | 2007-10-30 | Raja Singh Tuli | Portable high speed internet access device |
US7356570B1 (en) | 2000-08-29 | 2008-04-08 | Raja Tuli | Portable high speed communication device |
US20020115477A1 (en) * | 2001-02-13 | 2002-08-22 | Raja Singh | Portable high speed internet access device with scrolling |
US6633314B1 (en) * | 2000-02-02 | 2003-10-14 | Raja Tuli | Portable high speed internet device integrating cellular telephone and palm top computer |
FR2805377B1 (en) | 2000-02-23 | 2003-09-12 | Touchtunes Music Corp | EARLY ORDERING PROCESS FOR A SELECTION, DIGITAL SYSTEM AND JUKE-BOX FOR IMPLEMENTING THE METHOD |
US6941382B1 (en) | 2000-02-07 | 2005-09-06 | Raja Tuli | Portable high speed internet or desktop device |
FR2805072B1 (en) | 2000-02-16 | 2002-04-05 | Touchtunes Music Corp | METHOD FOR ADJUSTING THE SOUND VOLUME OF A DIGITAL SOUND RECORDING |
FR2805060B1 (en) | 2000-02-16 | 2005-04-08 | Touchtunes Music Corp | METHOD FOR RECEIVING FILES DURING DOWNLOAD |
US6874009B1 (en) | 2000-02-16 | 2005-03-29 | Raja Tuli | Portable high speed internet device with user fees |
US6700589B1 (en) * | 2000-02-17 | 2004-03-02 | International Business Machines Corporation | Method, system, and program for magnifying content downloaded from a server over a network |
US20010034659A1 (en) * | 2000-02-18 | 2001-10-25 | Mitsubishi International Corporation | Simplified method and system for e-commerce operable in on-line and off-line modes |
US20060245741A1 (en) * | 2000-03-09 | 2006-11-02 | Cynthia Lakhansingh | Digital entertainment recorder |
US7072569B2 (en) * | 2001-03-19 | 2006-07-04 | Cynthia Lakhansingh | Portable entertainment device |
US6784899B1 (en) * | 2000-03-31 | 2004-08-31 | Ricoh Company, Ltd. | Systems and methods for providing rich multimedia messages to remote users using telephones and facsimile machines |
US6742038B2 (en) | 2000-04-07 | 2004-05-25 | Danger, Inc. | System and method of linking user identification to a subscriber identification module |
US6735624B1 (en) * | 2000-04-07 | 2004-05-11 | Danger, Inc. | Method for configuring and authenticating newly delivered portal device |
US6721804B1 (en) | 2000-04-07 | 2004-04-13 | Danger, Inc. | Portal system for converting requested data into a bytecode format based on portal device's graphical capabilities |
US6701522B1 (en) | 2000-04-07 | 2004-03-02 | Danger, Inc. | Apparatus and method for portal device authentication |
JP4295894B2 (en) * | 2000-04-14 | 2009-07-15 | Advantest Corporation | Semiconductor device test apparatus and test method |
US7143338B2 (en) * | 2000-04-14 | 2006-11-28 | International Business Machines Corporation | Method and system in an electronic spreadsheet for handling absolute references in a copy/cut and paste operation according to different modes |
US6643641B1 (en) * | 2000-04-27 | 2003-11-04 | Russell Snyder | Web search engine with graphic snapshots |
FR2808906B1 (en) | 2000-05-10 | 2005-02-11 | Touchtunes Music Corp | DEVICE AND METHOD FOR REMOTELY MANAGING A NETWORK OF AUDIOVISUAL INFORMATION REPRODUCTION SYSTEMS |
WO2001091438A1 (en) * | 2000-05-19 | 2001-11-29 | Synapse Wireless, Inc. | Method and apparatus for generating dynamic graphical representations and real-time notification of the status of a remotely monitored system |
US6671700B1 (en) * | 2000-05-23 | 2003-12-30 | Palm Source, Inc. | Method and apparatus for parallel execution of conduits during simultaneous synchronization of databases |
US7328275B1 (en) * | 2000-05-30 | 2008-02-05 | Perttunen Cary D | Wirelessly retrieving and locally caching child and sibling items in a browsing session |
US6625503B1 (en) * | 2000-06-09 | 2003-09-23 | Motorola, Inc. | Personal preference information communication method and apparatus |
US20020016727A1 (en) * | 2000-06-16 | 2002-02-07 | Thoughtbank, Inc. | Systems and methods for interactive innovation marketplace |
FR2811175B1 (en) | 2000-06-29 | 2002-12-27 | Touchtunes Music Corp | AUDIOVISUAL INFORMATION DISTRIBUTION METHOD AND AUDIOVISUAL INFORMATION DISTRIBUTION SYSTEM |
FR2811114B1 (en) | 2000-06-29 | 2002-12-27 | Touchtunes Music Corp | DEVICE AND METHOD FOR COMMUNICATION BETWEEN A SYSTEM FOR REPRODUCING AUDIOVISUAL INFORMATION AND AN ELECTRONIC ENTERTAINMENT MACHINE |
JP2002027544A (en) * | 2000-07-04 | 2002-01-25 | Fujitsu Ltd | Data storing system |
US20020007382A1 (en) * | 2000-07-06 | 2002-01-17 | Shinichi Nojima | Computer having character input function, method of carrying out process depending on input characters, and storage medium |
US9189069B2 (en) | 2000-07-17 | 2015-11-17 | Microsoft Technology Licensing, Llc | Throwing gestures for mobile devices |
AU8879601A (en) * | 2000-09-07 | 2002-03-22 | A2Q Inc | Method and system for high speed wireless data transmission and reception |
US20020035571A1 (en) * | 2000-09-15 | 2002-03-21 | Coult John H | Digital patent marking method |
FR2814085B1 (en) | 2000-09-15 | 2005-02-11 | Touchtunes Music Corp | ENTERTAINMENT METHOD BASED ON MULTIPLE CHOICE COMPETITION GAMES |
US7593751B2 (en) | 2000-09-18 | 2009-09-22 | Field Data Management Solutions, Llc | Conducting field operations using handheld data management devices |
US6523037B1 (en) | 2000-09-22 | 2003-02-18 | Ebay Inc. | Method and system for communicating selected search results between first and second entities over a network |
US8392552B2 (en) | 2000-09-28 | 2013-03-05 | Vig Acquisitions Ltd., L.L.C. | System and method for providing configurable security monitoring utilizing an integrated information system |
CA2422519A1 (en) * | 2000-09-28 | 2002-04-04 | Vigilos, Inc. | System and method for dynamic interaction with remote devices |
US7627665B2 (en) | 2000-09-28 | 2009-12-01 | Barker Geoffrey T | System and method for providing configurable security monitoring utilizing an integrated information system |
AU2001296925A1 (en) | 2000-09-28 | 2002-04-08 | Vigilos, Inc. | Method and process for configuring a premises for monitoring |
US7191211B2 (en) | 2000-10-03 | 2007-03-13 | Raja Tuli | Portable high speed internet access device priority protocol |
US20020095378A1 (en) * | 2000-10-31 | 2002-07-18 | Cauchon Mark P. | Service provider network for legal services with direct browser delivery of rich text format documents |
US20070276675A1 (en) * | 2000-11-10 | 2007-11-29 | Gabrick John J | Innovation management system, apparatus, and method |
US20050240428A1 (en) * | 2000-11-10 | 2005-10-27 | Gabrick John J | System for automating and managing an IP environment |
US20040073443A1 (en) * | 2000-11-10 | 2004-04-15 | Gabrick John J. | System for automating and managing an IP environment |
US20090115739A1 (en) * | 2000-11-23 | 2009-05-07 | Samsung Electronics Co., Ltd. | Method of providing user interface in a portable terminal |
US7222184B2 (en) * | 2000-11-29 | 2007-05-22 | Ncr Corporation | Method of downloading web content to a network kiosk in advance |
EP1217541A1 (en) * | 2000-11-29 | 2002-06-26 | Lafayette Software Inc. | Method of processing queries in a database system, and database system and software product for implementing such method |
US7925703B2 (en) * | 2000-12-26 | 2011-04-12 | Numedeon, Inc. | Graphical interactive interface for immersive online communities |
US20020087628A1 (en) * | 2000-12-29 | 2002-07-04 | Andrew Rouse | System and method for providing wireless device access to e-mail applications |
US6983310B2 (en) * | 2000-12-29 | 2006-01-03 | International Business Machines Corporation | System and method for providing search capabilities on a wireless device |
US20050159136A1 (en) * | 2000-12-29 | 2005-07-21 | Andrew Rouse | System and method for providing wireless device access |
US7142883B2 (en) * | 2000-12-29 | 2006-11-28 | International Business Machines Corporation | System and method for providing search capabilities and storing functions on a wireless access device |
US7616971B2 (en) * | 2000-12-29 | 2009-11-10 | International Business Machines Corporation | System and method for providing access to forms for displaying information on a wireless access device |
US8112544B2 (en) * | 2000-12-29 | 2012-02-07 | International Business Machines Corporation | System and method for providing customizable options on a wireless device |
DE10100648A1 (en) * | 2001-01-09 | 2002-07-11 | Alcatel Sa | Information output apparatus |
US6928461B2 (en) | 2001-01-24 | 2005-08-09 | Raja Singh Tuli | Portable high speed internet access device with encryption |
AU2002255568B8 (en) | 2001-02-20 | 2014-01-09 | Adidas Ag | Modular personal network systems and methods |
US8452259B2 (en) | 2001-02-20 | 2013-05-28 | Adidas Ag | Modular personal network systems and methods |
US20020162112A1 (en) * | 2001-02-21 | 2002-10-31 | Vesta Broadband Services, Inc. | PC-based virtual set-top box for internet-based distribution of video and other data |
US6928465B2 (en) * | 2001-03-16 | 2005-08-09 | Wells Fargo Bank, N.A. | Redundant email address detection and capture system |
CA2441120C (en) * | 2001-03-16 | 2010-06-22 | Netomat, Inc. | Sharing, managing and communicating information over a computer network |
USH2201H1 (en) * | 2001-03-19 | 2007-09-04 | The United States Of America As Represented By The Secretary Of The Air Force | Software architecture and design for facilitating prototyping in distributed virtual environments |
US8054971B2 (en) * | 2001-04-27 | 2011-11-08 | Comverse Ltd | Free-hand mobile messaging-method and device |
US6711543B2 (en) * | 2001-05-30 | 2004-03-23 | Cameronsound, Inc. | Language independent and voice operated information management system |
JPWO2002099656A1 (en) * | 2001-05-31 | 2004-09-16 | Fujitsu Limited | Electronics and programs |
US8294552B2 (en) * | 2001-07-10 | 2012-10-23 | Xatra Fund Mx, Llc | Facial scan biometrics on a payment device |
US7334000B2 (en) | 2001-07-16 | 2008-02-19 | Aol Llc | Method and apparatus for calendaring reminders |
AU2002355799A1 (en) * | 2001-07-30 | 2003-02-17 | Markport Limited | Improved management of broadcast content for a mobile handset |
US6937154B2 (en) * | 2001-08-21 | 2005-08-30 | Tabula Rasa, Inc. | Method and apparatus for facilitating personal attention via wireless links |
FR2830153B1 (en) * | 2001-09-21 | 2004-07-02 | France Telecom | DIGITAL IMAGE TRANSMISSION ASSEMBLY, METHODS IMPLEMENTED IN SUCH AN ASSEMBLY, DIGITAL IMAGE TRANSMISSION DEVICE, AND DIGITAL IMAGE DISPLAY DEVICE |
US7089257B2 (en) * | 2001-09-27 | 2006-08-08 | Qualcomm, Inc. | Method and system for providing a unified data exchange and storage format |
US20040103046A1 (en) * | 2001-10-08 | 2004-05-27 | Christoph Fryer Jaskie, Inc. | Handheld ERP system |
US20030105535A1 (en) * | 2001-11-05 | 2003-06-05 | Roman Rammler | Unit controller with integral full-featured human-machine interface |
US20030093381A1 (en) * | 2001-11-09 | 2003-05-15 | David Hohl | Systems and methods for authorization of data strings |
US7054811B2 (en) * | 2002-11-06 | 2006-05-30 | Cellmax Systems Ltd. | Method and system for verifying and enabling user access based on voice parameters |
US20070069975A1 (en) * | 2001-11-28 | 2007-03-29 | Palm, Inc. | Detachable expandable flexible display |
US20030160755A1 (en) * | 2002-02-28 | 2003-08-28 | Palm, Inc. | Detachable expandable flexible display |
US6710754B2 (en) * | 2001-11-29 | 2004-03-23 | Palm, Inc. | Moveable output device |
US8135609B2 (en) * | 2002-01-08 | 2012-03-13 | Microsoft Corporation | Identifying and surveying subscribers |
US7752135B2 (en) * | 2002-01-16 | 2010-07-06 | International Business Machines Corporation | Credit authorization system and method |
US7480715B1 (en) | 2002-01-25 | 2009-01-20 | Vig Acquisitions Ltd., L.L.C. | System and method for performing a predictive threat assessment based on risk factors |
EP1333373B1 (en) * | 2002-01-30 | 2011-03-09 | Hewlett-Packard Company (a Delaware Corporation) | Computer and base station |
US20030163311A1 (en) * | 2002-02-26 | 2003-08-28 | Li Gong | Intelligent social agents |
US7162513B1 (en) | 2002-03-27 | 2007-01-09 | Danger, Inc. | Apparatus and method for distributing electronic messages to a wireless data processing device using a multi-tiered queuing architecture |
US7155725B1 (en) | 2002-03-27 | 2006-12-26 | Danger, Inc. | Apparatus and method for coordinating multiple e-mail accounts |
US7321769B2 (en) * | 2002-04-12 | 2008-01-22 | Intel Corporation | Method and apparatus for managing personal cache in a wireless network |
FI112119B (en) * | 2002-06-25 | 2003-10-31 | Nokia Corp | Touch screen control command interpreting method for electronic device e.g. mobile station, involves interpreting contact area larger than area before touch, as same area when area has been touched for release of touch |
US7260714B2 (en) * | 2002-08-20 | 2007-08-21 | Sony Corporation | System and method for authenticating wireless component |
US8151304B2 (en) | 2002-09-16 | 2012-04-03 | Touchtunes Music Corporation | Digital downloading jukebox system with user-tailored music management, communications, and other tools |
US9646339B2 (en) | 2002-09-16 | 2017-05-09 | Touchtunes Music Corporation | Digital downloading jukebox system with central and local music servers |
US11029823B2 (en) | 2002-09-16 | 2021-06-08 | Touchtunes Music Corporation | Jukebox with customizable avatar |
US7822687B2 (en) | 2002-09-16 | 2010-10-26 | Francois Brillon | Jukebox with customizable avatar |
US8332895B2 (en) | 2002-09-16 | 2012-12-11 | Touchtunes Music Corporation | Digital downloading jukebox system with user-tailored music management, communications, and other tools |
US10373420B2 (en) | 2002-09-16 | 2019-08-06 | Touchtunes Music Corporation | Digital downloading jukebox with enhanced communication features |
US8103589B2 (en) | 2002-09-16 | 2012-01-24 | Touchtunes Music Corporation | Digital downloading jukebox system with central and local music servers |
US8584175B2 (en) | 2002-09-16 | 2013-11-12 | Touchtunes Music Corporation | Digital downloading jukebox system with user-tailored music management, communications, and other tools |
US7062512B1 (en) | 2002-09-27 | 2006-06-13 | Danger, Inc. | System and method for processing identification codes |
US7069326B1 (en) | 2002-09-27 | 2006-06-27 | Danger, Inc. | System and method for efficiently managing data transports |
US7373144B1 (en) | 2002-09-30 | 2008-05-13 | Danger, Inc. | System and method for automatically providing user status in a messaging service |
US7107349B2 (en) * | 2002-09-30 | 2006-09-12 | Danger, Inc. | System and method for disabling and providing a notification for a data processing device |
US20090125591A1 (en) * | 2002-09-30 | 2009-05-14 | Ficus Kirkpatrick | Instant messaging proxy apparatus and method |
US7383303B1 (en) | 2002-09-30 | 2008-06-03 | Danger, Inc. | System and method for integrating personal information management and messaging applications |
US20070283047A1 (en) * | 2002-10-01 | 2007-12-06 | Theis Ronald L A | System and method for processing alphanumeric characters for display on a data processing device |
US7437405B1 (en) | 2002-10-01 | 2008-10-14 | Danger, Inc. | System and method for managing data objects in a wireless device |
US7360156B1 (en) * | 2002-10-09 | 2008-04-15 | Microsoft Corporation | Method and system for performing actions on content in a region within a free form two-dimensional workspace |
US7116840B2 (en) | 2002-10-31 | 2006-10-03 | Microsoft Corporation | Decoding and error correction in 2-D arrays |
CA2411203A1 (en) * | 2002-11-05 | 2004-05-05 | Alphaglobal It Inc. | Intelligent data management system and method |
EP1561159A4 (en) | 2002-11-12 | 2007-08-29 | Zetera Corp | Electrical devices with improved communication |
US8005918B2 (en) | 2002-11-12 | 2011-08-23 | Rateze Remote Mgmt. L.L.C. | Data storage devices having IP capable partitions |
US7170890B2 (en) | 2002-12-16 | 2007-01-30 | Zetera Corporation | Electrical devices with improved communication |
US7649880B2 (en) | 2002-11-12 | 2010-01-19 | Mark Adams | Systems and methods for deriving storage area commands |
US8176428B2 (en) | 2002-12-03 | 2012-05-08 | Datawind Net Access Corporation | Portable internet access device back page cache |
US7143391B1 (en) * | 2002-12-11 | 2006-11-28 | Oracle International Corporation | Method and apparatus for globalization testing computer software |
US20040160975A1 (en) * | 2003-01-21 | 2004-08-19 | Charles Frank | Multicast communication protocols, systems and methods |
US7426329B2 (en) | 2003-03-06 | 2008-09-16 | Microsoft Corporation | Systems and methods for receiving, storing, and rendering digital video, music, and pictures on a personal media player |
US7124134B2 (en) | 2003-05-08 | 2006-10-17 | Eugene Buzzeo | Distributed, multi-user, multi-threaded application development system and method |
US7045377B2 (en) * | 2003-06-26 | 2006-05-16 | Rj Mears, Llc | Method for making a semiconductor device including a superlattice and adjacent semiconductor layer with doped regions defining a semiconductor junction |
US20040267887A1 (en) * | 2003-06-30 | 2004-12-30 | Berger Kelly D. | System and method for dynamically managing presence and contact information |
US7117445B2 (en) * | 2003-06-30 | 2006-10-03 | Danger, Inc. | Multi-mode communication apparatus and interface for contacting a user |
US20050015316A1 (en) * | 2003-07-02 | 2005-01-20 | Vincenzo Salluzzo | Methods for calendaring, tracking, and expense reporting, and devices and systems employing same |
GB2404803A (en) * | 2003-07-16 | 2005-02-09 | Empics Ltd | Image editing and distribution system |
US20050210391A1 (en) * | 2003-08-11 | 2005-09-22 | Core Mobility, Inc. | Systems and methods for navigating content in an interactive ticker |
US7441203B2 (en) | 2003-08-11 | 2008-10-21 | Core Mobility, Inc. | Interactive user interface presentation attributes for location-based content |
US7430724B2 (en) * | 2003-08-11 | 2008-09-30 | Core Mobility, Inc. | Systems and methods for displaying content in a ticker |
US7343564B2 (en) * | 2003-08-11 | 2008-03-11 | Core Mobility, Inc. | Systems and methods for displaying location-based maps on communication devices |
US20050039135A1 (en) * | 2003-08-11 | 2005-02-17 | Konstantin Othmer | Systems and methods for navigating content in an interactive ticker |
US7370283B2 (en) * | 2003-08-11 | 2008-05-06 | Core Mobility, Inc. | Systems and methods for populating a ticker using multiple data transmission modes |
US7343179B1 (en) | 2003-08-13 | 2008-03-11 | Danger Research | System and method for previewing and purchasing ring tones for a mobile device |
US7532196B2 (en) * | 2003-10-30 | 2009-05-12 | Microsoft Corporation | Distributed sensing techniques for mobile devices |
US7583842B2 (en) | 2004-01-06 | 2009-09-01 | Microsoft Corporation | Enhanced approach of m-array decoding and error correction |
US7263224B2 (en) | 2004-01-16 | 2007-08-28 | Microsoft Corporation | Strokes localization by m-array decoding and fast image matching |
US8442331B2 (en) | 2004-02-15 | 2013-05-14 | Google Inc. | Capturing text from rendered documents using supplemental information |
US7707039B2 (en) | 2004-02-15 | 2010-04-27 | Exbiblio B.V. | Automatic modification of web pages |
US10635723B2 (en) | 2004-02-15 | 2020-04-28 | Google Llc | Search engines and systems with handheld document data capture devices |
US7812860B2 (en) | 2004-04-01 | 2010-10-12 | Exbiblio B.V. | Handheld device for capturing text from both a document printed on paper and a document displayed on a dynamic display device |
US7383238B1 (en) * | 2004-02-24 | 2008-06-03 | The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration | Inductive monitoring system constructed from nominal system data and its use in real-time system monitoring |
US7552630B2 (en) * | 2004-02-27 | 2009-06-30 | Akron Special Machinery, Inc. | Load wheel drive |
US7949726B2 (en) * | 2004-03-12 | 2011-05-24 | Ocean And Coastal Environmental Sensing, Inc. | System and method for delivering information on demand |
US20060098900A1 (en) | 2004-09-27 | 2006-05-11 | King Martin T | Secure data gathering from rendered documents |
US20060081714A1 (en) | 2004-08-23 | 2006-04-20 | King Martin T | Portable scanning device |
US9116890B2 (en) | 2004-04-01 | 2015-08-25 | Google Inc. | Triggering actions in response to optically or acoustically capturing keywords from a rendered document |
US8146156B2 (en) | 2004-04-01 | 2012-03-27 | Google Inc. | Archive of text captures from rendered documents |
US9008447B2 (en) | 2004-04-01 | 2015-04-14 | Google Inc. | Method and system for character recognition |
US7894670B2 (en) | 2004-04-01 | 2011-02-22 | Exbiblio B.V. | Triggering actions in response to optically or acoustically capturing keywords from a rendered document |
WO2008028674A2 (en) | 2006-09-08 | 2008-03-13 | Exbiblio B.V. | Optical scanners, such as hand-held optical scanners |
US8081849B2 (en) | 2004-12-03 | 2011-12-20 | Google Inc. | Portable scanning and memory device |
US7990556B2 (en) | 2004-12-03 | 2011-08-02 | Google Inc. | Association of a portable scanner with input/output and storage devices |
US9143638B2 (en) | 2004-04-01 | 2015-09-22 | Google Inc. | Data capture from rendered documents using handheld device |
US8713418B2 (en) | 2004-04-12 | 2014-04-29 | Google Inc. | Adding value to a rendered document |
US7827139B2 (en) | 2004-04-15 | 2010-11-02 | Citrix Systems, Inc. | Methods and apparatus for sharing graphical screen data in a bandwidth-adaptive manner |
US7680885B2 (en) | 2004-04-15 | 2010-03-16 | Citrix Systems, Inc. | Methods and apparatus for synchronization of data set representations in a bandwidth-adaptive manner |
US8874504B2 (en) | 2004-12-03 | 2014-10-28 | Google Inc. | Processing techniques for visual capture data from a rendered document |
US8489624B2 (en) | 2004-05-17 | 2013-07-16 | Google, Inc. | Processing techniques for text capture from a rendered document |
US8620083B2 (en) | 2004-12-03 | 2013-12-31 | Google Inc. | Method and system for character recognition |
US7693880B1 (en) * | 2004-05-06 | 2010-04-06 | Symantec Operating Corporation | Mirrored storage at the file system level |
US8346620B2 (en) | 2004-07-19 | 2013-01-01 | Google Inc. | Automatic modification of web pages |
US7797724B2 (en) | 2004-08-31 | 2010-09-14 | Citrix Systems, Inc. | Methods and apparatus for secure online access on a client device |
US7599838B2 (en) | 2004-09-01 | 2009-10-06 | Sap Aktiengesellschaft | Speech animation with behavioral contexts for application scenarios |
US20060055281A1 (en) * | 2004-09-16 | 2006-03-16 | Com Dev Ltd. | Microelectromechanical electrostatic actuator assembly |
US8903760B2 (en) * | 2004-11-12 | 2014-12-02 | International Business Machines Corporation | Method and system for information workflows |
US20060106868A1 (en) * | 2004-11-17 | 2006-05-18 | Youngtack Shim | Information processing systems and methods thereof |
WO2006076498A2 (en) | 2005-01-13 | 2006-07-20 | Welch Allyn, Inc. | Vital signs monitor |
US7920169B2 (en) | 2005-01-31 | 2011-04-05 | Invention Science Fund I, Llc | Proximity of shared image devices |
US20060174203A1 (en) | 2005-01-31 | 2006-08-03 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Viewfinder for shared image device |
US20060221197A1 (en) * | 2005-03-30 | 2006-10-05 | Jung Edward K | Image transformation estimator of an imaging device |
US7876357B2 (en) | 2005-01-31 | 2011-01-25 | The Invention Science Fund I, Llc | Estimating shared image device operational capabilities or resources |
US9489717B2 (en) | 2005-01-31 | 2016-11-08 | Invention Science Fund I, Llc | Shared image device |
US9325781B2 (en) | 2005-01-31 | 2016-04-26 | Invention Science Fund I, Llc | Audio sharing |
US8902320B2 (en) | 2005-01-31 | 2014-12-02 | The Invention Science Fund I, Llc | Shared image device synchronization or designation |
US9910341B2 (en) | 2005-01-31 | 2018-03-06 | The Invention Science Fund I, Llc | Shared image device designation |
US8606383B2 (en) | 2005-01-31 | 2013-12-10 | The Invention Science Fund I, Llc | Audio sharing |
US9124729B2 (en) | 2005-01-31 | 2015-09-01 | The Invention Science Fund I, Llc | Shared image device synchronization or designation |
US20060170956A1 (en) | 2005-01-31 | 2006-08-03 | Jung Edward K | Shared image devices |
US9082456B2 (en) | 2005-01-31 | 2015-07-14 | The Invention Science Fund I Llc | Shared image device designation |
US20090144167A1 (en) * | 2005-02-10 | 2009-06-04 | Pablo Calamera | System and method for managing data and voice connectivity for wireless devices |
US7607076B2 (en) | 2005-02-18 | 2009-10-20 | Microsoft Corporation | Embedded interaction code document |
US7702850B2 (en) | 2005-03-14 | 2010-04-20 | Thomas Earl Ludwig | Topology independent storage arrays and methods |
US7882122B2 (en) * | 2005-03-18 | 2011-02-01 | Capital Source Far East Limited | Remote access of heterogeneous data |
US20060217110A1 (en) * | 2005-03-25 | 2006-09-28 | Core Mobility, Inc. | Prioritizing the display of non-intrusive content on a mobile communication device |
US7353034B2 (en) | 2005-04-04 | 2008-04-01 | X One, Inc. | Location sharing and tracking using mobile phones or other wireless devices |
US7599560B2 (en) | 2005-04-22 | 2009-10-06 | Microsoft Corporation | Embedded interaction code recognition |
US8253821B2 (en) | 2005-10-31 | 2012-08-28 | The Invention Science Fund I, Llc | Degradation/preservation management of captured data |
US10003762B2 (en) | 2005-04-26 | 2018-06-19 | Invention Science Fund I, Llc | Shared image devices |
US9191611B2 (en) | 2005-06-02 | 2015-11-17 | Invention Science Fund I, Llc | Conditional alteration of a saved image |
US9167195B2 (en) | 2005-10-31 | 2015-10-20 | Invention Science Fund I, Llc | Preservation/degradation of video/audio aspects of a data stream |
US7782365B2 (en) | 2005-06-02 | 2010-08-24 | Searete Llc | Enhanced video/still image correlation |
US9076208B2 (en) | 2006-02-28 | 2015-07-07 | The Invention Science Fund I, Llc | Imagery processing |
US8681225B2 (en) | 2005-06-02 | 2014-03-25 | Royce A. Levien | Storage access technique for captured data |
US8964054B2 (en) | 2006-08-18 | 2015-02-24 | The Invention Science Fund I, Llc | Capturing selected image objects |
US7872675B2 (en) | 2005-06-02 | 2011-01-18 | The Invention Science Fund I, Llc | Saved-image management |
US8233042B2 (en) | 2005-10-31 | 2012-07-31 | The Invention Science Fund I, Llc | Preservation and/or degradation of a video/audio data stream |
US8072501B2 (en) | 2005-10-31 | 2011-12-06 | The Invention Science Fund I, Llc | Preservation and/or degradation of a video/audio data stream |
US9451200B2 (en) | 2005-06-02 | 2016-09-20 | Invention Science Fund I, Llc | Storage access technique for captured data |
US9942511B2 (en) | 2005-10-31 | 2018-04-10 | Invention Science Fund I, Llc | Preservation/degradation of video/audio aspects of a data stream |
US9001215B2 (en) | 2005-06-02 | 2015-04-07 | The Invention Science Fund I, Llc | Estimating shared image device operational capabilities or resources |
US9621749B2 (en) | 2005-06-02 | 2017-04-11 | Invention Science Fund I, Llc | Capturing selected image objects |
US20070222865A1 (en) | 2006-03-15 | 2007-09-27 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Enhanced video/still image correlation |
US9819490B2 (en) | 2005-05-04 | 2017-11-14 | Invention Science Fund I, Llc | Regional proximity for shared image device(s) |
US9967424B2 (en) | 2005-06-02 | 2018-05-08 | Invention Science Fund I, Llc | Data storage usage protocol |
US8443040B2 (en) | 2005-05-26 | 2013-05-14 | Citrix Systems Inc. | Method and system for synchronizing presentation of a dynamic data set to a plurality of nodes |
US7620981B2 (en) * | 2005-05-26 | 2009-11-17 | Charles William Frank | Virtual devices and virtual bus tunnels, modules and methods |
US20060271550A1 (en) * | 2005-05-26 | 2006-11-30 | Siemens Communications, Inc. | Method and system for remote document editing using a wireless communication device |
US7580576B2 (en) | 2005-06-02 | 2009-08-25 | Microsoft Corporation | Stroke localization and binding to electronic document |
US7822917B2 (en) * | 2005-06-10 | 2010-10-26 | Hewlett-Packard Development Company, L.P. | Mass storage system with user interface |
US7619607B2 (en) | 2005-06-30 | 2009-11-17 | Microsoft Corporation | Embedding a pattern design onto a liquid crystal display |
US7710912B1 (en) | 2005-07-11 | 2010-05-04 | Microsoft Corporation | Managing content synchronization between a data service and a data processing device |
US8849752B2 (en) * | 2005-07-21 | 2014-09-30 | Google Inc. | Overloaded communication session |
US8819092B2 (en) | 2005-08-16 | 2014-08-26 | Rateze Remote Mgmt. L.L.C. | Disaggregated resources and access methods |
US7743214B2 (en) | 2005-08-16 | 2010-06-22 | Mark Adams | Generating storage system commands |
US7622182B2 (en) | 2005-08-17 | 2009-11-24 | Microsoft Corporation | Embedded interaction code enabled display |
US9270532B2 (en) | 2005-10-06 | 2016-02-23 | Rateze Remote Mgmt. L.L.C. | Resource command messages and methods |
US7756890B2 (en) * | 2005-10-28 | 2010-07-13 | Novell, Inc. | Semantic identities |
US7636794B2 (en) | 2005-10-31 | 2009-12-22 | Microsoft Corporation | Distributed sensing techniques for mobile devices |
US20070120980A1 (en) | 2005-10-31 | 2007-05-31 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Preservation/degradation of video/audio aspects of a data stream |
US7664067B2 (en) * | 2005-12-15 | 2010-02-16 | Microsoft Corporation | Preserving socket connections over a wireless network |
US7613955B2 (en) * | 2006-01-06 | 2009-11-03 | Microsoft Corporation | Collecting debug data from a wireless device |
US7817991B2 (en) * | 2006-02-14 | 2010-10-19 | Microsoft Corporation | Dynamic interconnection of mobile devices |
US7877677B2 (en) * | 2006-03-01 | 2011-01-25 | Infogin Ltd. | Methods and apparatus for enabling use of web content on various types of devices |
US20070213094A1 (en) * | 2006-03-09 | 2007-09-13 | Intel Corporation | Method and apparatus for a configurable processing and storage device |
US7924881B2 (en) | 2006-04-10 | 2011-04-12 | Rateze Remote Mgmt. L.L.C. | Datagram identifier management |
JP5028022B2 (en) * | 2006-04-25 | 2012-09-19 | キヤノン株式会社 | Printing apparatus and document printing method |
US20090143059A1 (en) * | 2006-05-02 | 2009-06-04 | Danger, Inc. | System and method remote servicing of a wireless data processing device |
US8054241B2 (en) | 2006-09-14 | 2011-11-08 | Citrix Systems, Inc. | Systems and methods for multiple display support in remote access software |
US7791559B2 (en) * | 2006-09-14 | 2010-09-07 | Citrix Systems, Inc. | System and method for multiple display support in remote access software |
US7730478B2 (en) * | 2006-10-04 | 2010-06-01 | Salesforce.Com, Inc. | Method and system for allowing access to developed applications via a multi-tenant on-demand database service |
US9171419B2 (en) | 2007-01-17 | 2015-10-27 | Touchtunes Music Corporation | Coin operated entertainment system |
US9330529B2 (en) | 2007-01-17 | 2016-05-03 | Touchtunes Music Corporation | Game terminal configured for interaction with jukebox device systems including same, and/or associated methods |
US8141049B2 (en) * | 2007-03-14 | 2012-03-20 | Nec Laboratories America, Inc. | System and method for scalable flow and context-sensitive pointer alias analysis |
US9953481B2 (en) | 2007-03-26 | 2018-04-24 | Touchtunes Music Corporation | Jukebox with associated video server |
US7664932B2 (en) * | 2007-04-13 | 2010-02-16 | Microsoft Corporation | Scalable and configurable execution pipeline of handlers having policy information for selectively acting on payload |
US8638363B2 (en) | 2009-02-18 | 2014-01-28 | Google Inc. | Automatically capturing information, such as capturing information using a document-aware device |
US10290006B2 (en) | 2008-08-15 | 2019-05-14 | Touchtunes Music Corporation | Digital signage and gaming services to comply with federal and state alcohol and beverage laws and regulations |
US8332887B2 (en) | 2008-01-10 | 2012-12-11 | Touchtunes Music Corporation | System and/or methods for distributing advertisements from a central advertisement network to a peripheral device via a local advertisement server |
US20090176481A1 (en) * | 2008-01-04 | 2009-07-09 | Palm, Inc. | Providing Location-Based Services (LBS) Through Remote Display |
WO2009114710A2 (en) | 2008-03-14 | 2009-09-17 | Neomedia Technologies, Inc. | Messaging interchange system |
WO2010005569A1 (en) | 2008-07-09 | 2010-01-14 | Touchtunes Music Corporation | Digital downloading jukebox with revenue-enhancing features |
US20100030609A1 (en) * | 2008-07-31 | 2010-02-04 | International Business Machines Corporation | Intelligent system and fuzzy logic based method to determine project risk |
JP4376952B1 (en) * | 2008-08-11 | 2009-12-02 | 株式会社東芝 | Content transmission device and content display system |
TW201009698A (en) * | 2008-08-19 | 2010-03-01 | Arcadyan Technology Corp | Method for improving the accessing efficiency of embedded web page |
US8290971B2 (en) | 2008-09-09 | 2012-10-16 | Applied Systems, Inc. | Method and apparatus for remotely displaying a list by determining a quantity of data to send based on the list size and the display control size |
EP2169570A1 (en) | 2008-09-25 | 2010-03-31 | Infogin LTD | Mobile sites detection and handling |
WO2010105246A2 (en) | 2009-03-12 | 2010-09-16 | Exbiblio B.V. | Accessing resources based on capturing information from a rendered document |
US8447066B2 (en) | 2009-03-12 | 2013-05-21 | Google Inc. | Performing actions based on capturing information from rendered documents, such as documents under copyright |
US9292166B2 (en) | 2009-03-18 | 2016-03-22 | Touchtunes Music Corporation | Digital jukebox device with improved karaoke-related user interfaces, and associated methods |
KR101748448B1 (en) | 2009-03-18 | 2017-06-16 | 터치튠즈 뮤직 코포레이션 | Entertainment server and associated social networking services |
US10719149B2 (en) | 2009-03-18 | 2020-07-21 | Touchtunes Music Corporation | Digital jukebox device with improved user interfaces, and associated methods |
US10564804B2 (en) | 2009-03-18 | 2020-02-18 | Touchtunes Music Corporation | Digital jukebox device with improved user interfaces, and associated methods |
US9081799B2 (en) | 2009-12-04 | 2015-07-14 | Google Inc. | Using gestalt information to identify locations in printed information |
US9323784B2 (en) | 2009-12-09 | 2016-04-26 | Google Inc. | Image search using text-based elements within the contents of images |
US8713597B2 (en) * | 2010-01-05 | 2014-04-29 | Alcatel Lucent | Authenticating and off-loading IPTV operations from mobile devices to fixed rendering viewing devices |
CA2881456A1 (en) | 2010-01-26 | 2011-08-04 | Touchtunes Music Corporation | Digital jukebox device with improved user interfaces, and associated methods |
US8612700B1 (en) | 2010-10-29 | 2013-12-17 | Symantec Corporation | Method and system of performing block level duplications of cataloged backup data |
US8589509B2 (en) | 2011-01-05 | 2013-11-19 | Cloudium Systems Limited | Controlling and optimizing system latency |
US9286299B2 (en) | 2011-03-17 | 2016-03-15 | Red Hat, Inc. | Backup of data items |
US9824159B2 (en) * | 2011-03-17 | 2017-11-21 | Red Hat, Inc. | Assigning labels to desktop items |
US9588644B2 (en) | 2011-03-17 | 2017-03-07 | Red Hat, Inc. | Time-based organization of desktop items |
US8538679B1 (en) | 2011-04-08 | 2013-09-17 | Oberweis Dairy, Inc. | Enhanced geocoding |
GB2522772B (en) | 2011-09-18 | 2016-01-13 | Touchtunes Music Corp | Digital jukebox device with karaoke and/or photo booth features, and associated methods |
US9588953B2 (en) * | 2011-10-25 | 2017-03-07 | Microsoft Technology Licensing, Llc | Drag and drop always sum formulas |
US11151224B2 (en) | 2012-01-09 | 2021-10-19 | Touchtunes Music Corporation | Systems and/or methods for monitoring audio inputs to jukebox devices |
CN102662690B (en) * | 2012-03-14 | 2014-06-11 | 腾讯科技(深圳)有限公司 | Method and apparatus for starting application program |
US9294539B2 (en) | 2013-03-14 | 2016-03-22 | Microsoft Technology Licensing, Llc | Cooperative federation of digital devices via proxemics and device micro-mobility |
US10360297B2 (en) | 2013-06-14 | 2019-07-23 | Microsoft Technology Licensing, Llc | Simplified data input in electronic documents |
US9942396B2 (en) * | 2013-11-01 | 2018-04-10 | Adobe Systems Incorporated | Document distribution and interaction |
WO2015070070A1 (en) | 2013-11-07 | 2015-05-14 | Touchtunes Music Corporation | Techniques for generating electronic menu graphical user interface layouts for use in connection with electronic devices |
US9544149B2 (en) | 2013-12-16 | 2017-01-10 | Adobe Systems Incorporated | Automatic E-signatures in response to conditions and/or events |
EP3123293A4 (en) | 2014-03-25 | 2017-09-27 | Touchtunes Music Corporation | Digital jukebox device with improved user interfaces, and associated methods |
CN104951361B (en) * | 2014-03-27 | 2018-10-09 | 阿里巴巴集团控股有限公司 | A kind of triggering method and device of timed task |
US10013412B2 (en) | 2014-08-25 | 2018-07-03 | Purple Robot Software, Inc. | Peer to peer spreadsheet processing |
US9418056B2 (en) * | 2014-10-09 | 2016-08-16 | Wrap Media, LLC | Authoring tool for the authoring of wrap packages of cards |
US9442906B2 (en) * | 2014-10-09 | 2016-09-13 | Wrap Media, LLC | Wrap descriptor for defining a wrap package of cards including a global component |
US20160104210A1 (en) | 2014-10-09 | 2016-04-14 | Wrap Media, LLC | Authoring tool for the authoring of wrap packages of cards |
US9703982B2 (en) | 2014-11-06 | 2017-07-11 | Adobe Systems Incorporated | Document distribution and interaction |
US9531545B2 (en) | 2014-11-24 | 2016-12-27 | Adobe Systems Incorporated | Tracking and notification of fulfillment events |
US9432368B1 (en) | 2015-02-19 | 2016-08-30 | Adobe Systems Incorporated | Document distribution and interaction |
US9600803B2 (en) | 2015-03-26 | 2017-03-21 | Wrap Media, LLC | Mobile-first authoring tool for the authoring of wrap packages |
US9582917B2 (en) * | 2015-03-26 | 2017-02-28 | Wrap Media, LLC | Authoring tool for the mixing of cards of wrap packages |
US9935777B2 (en) | 2015-08-31 | 2018-04-03 | Adobe Systems Incorporated | Electronic signature framework with enhanced security |
US9626653B2 (en) | 2015-09-21 | 2017-04-18 | Adobe Systems Incorporated | Document distribution and interaction with delegation of signature authority |
US10347215B2 (en) | 2016-05-27 | 2019-07-09 | Adobe Inc. | Multi-device electronic signature framework |
US10503919B2 (en) | 2017-04-10 | 2019-12-10 | Adobe Inc. | Electronic signature framework with keystroke biometric authentication |
US10657326B2 (en) * | 2017-05-23 | 2020-05-19 | International Business Machines Corporation | Removable spell checker device |
CN111367454A (en) * | 2018-12-25 | 2020-07-03 | 中兴通讯股份有限公司 | Screen display control method and device |
US10594899B1 (en) | 2019-02-15 | 2020-03-17 | Kyocera Document Solutions Inc. | Methods and system for generating a confidential document |
CN110851097B (en) * | 2019-10-18 | 2023-09-29 | 北京字节跳动网络技术有限公司 | Control method, device, medium and electronic equipment for consistency of handwriting data |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5514861A (en) * | 1988-05-11 | 1996-05-07 | Symbol Technologies, Inc. | Computer and/or scanner system mounted on a glove |
US5446891A (en) * | 1992-02-26 | 1995-08-29 | International Business Machines Corporation | System for adjusting hypertext links with weighed user goals and activities |
US5416895A (en) * | 1992-04-08 | 1995-05-16 | Borland International, Inc. | System and methods for improved spreadsheet interface with user-familiar objects |
US5537586A (en) * | 1992-04-30 | 1996-07-16 | Individual, Inc. | Enhanced apparatus and methods for retrieving and selecting profiled textural information records from a database of defined category structures |
US5463696A (en) * | 1992-05-27 | 1995-10-31 | Apple Computer, Inc. | Recognition system and method for user inputs to a computer system |
US5870492A (en) * | 1992-06-04 | 1999-02-09 | Wacom Co., Ltd. | Hand-written character entry apparatus |
US5510606A (en) * | 1993-03-16 | 1996-04-23 | Worthington; Hall V. | Data collection system including a portable data collection terminal with voice prompts |
JPH06311119A (en) * | 1993-04-20 | 1994-11-04 | Sony Corp | Data broadcasting system |
US5699456A (en) * | 1994-01-21 | 1997-12-16 | Lucent Technologies Inc. | Large vocabulary connected speech recognition system and method of language representation using evolutional grammar to represent context free grammars |
US5572643A (en) * | 1995-10-19 | 1996-11-05 | Judson; David H. | Web browser with dynamic display of information objects during linking |
US5838819A (en) * | 1995-11-14 | 1998-11-17 | Lucent Technologies Inc. | System and method for processing and managing electronic copies of handwritten notes |
- 1996
  - 1996-12-17 US US08/767,833 patent/US6157935A/en not_active Expired - Lifetime
- 2000
  - 2000-03-22 US US09/533,564 patent/US20020069220A1/en not_active Abandoned
Cited By (354)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20010005857A1 (en) * | 1998-05-29 | 2001-06-28 | Mihal Lazaridis | System and method for pushing information from a host system to a mobile data communication device |
US9344839B2 (en) * | 1998-05-29 | 2016-05-17 | Blackberry Limited | System and method for pushing information from a host system to a mobile communication device |
US9374435B2 (en) | 1998-05-29 | 2016-06-21 | Blackberry Limited | System and method for using trigger events and a redirector flag to redirect messages |
US20040131230A1 (en) * | 1998-07-22 | 2004-07-08 | Paraskevakos Theodore George | Intelligent currency validation network |
US7006664B2 (en) * | 1998-07-22 | 2006-02-28 | Theodore George Paraskevakos | Intelligent currency validation network |
US7912961B2 (en) | 1998-09-11 | 2011-03-22 | Rpx-Lv Acquisition Llc | Input device for allowing input of unique digital code to a user's computer to control access thereof to a web site |
US7908467B2 (en) | 1998-09-11 | 2011-03-15 | RPX-LV Acquisition LLC | Automatic configuration of equipment software
US7159037B1 (en) * | 1998-09-11 | 2007-01-02 | Lv Partners, Lp | Method and apparatus for utilizing an existing product code to issue a match to a predetermined location on a global network |
US6587846B1 (en) * | 1999-10-01 | 2003-07-01 | Lamuth John E. | Inductive inference affective language analyzer simulating artificial intelligence |
US9646614B2 (en) | 2000-03-16 | 2017-05-09 | Apple Inc. | Fast, language-independent method for user authentication by voice |
US7383299B1 (en) * | 2000-05-05 | 2008-06-03 | International Business Machines Corporation | System and method for providing service for searching web site addresses |
US7979061B2 (en) * | 2000-08-31 | 2011-07-12 | Nokia Corporation | Handset personalisation |
US20020044149A1 (en) * | 2000-08-31 | 2002-04-18 | Mccarthy Kevin | Handset personalisation |
US20040128142A1 (en) * | 2001-01-05 | 2004-07-01 | Whitham Charles Lamont | Interactive multimedia book |
US7039589B2 (en) * | 2001-01-05 | 2006-05-02 | Charles Lamont Whitham | Interactive multimedia book |
US7627596B2 (en) * | 2001-02-22 | 2009-12-01 | International Business Machines Corporation | Retrieving handwritten documents using multiple document recognizers and techniques allowing both typed and handwritten queries |
US20020165873A1 (en) * | 2001-02-22 | 2002-11-07 | International Business Machines Corporation | Retrieving handwritten documents using multiple document recognizers and techniques allowing both typed and handwritten queries |
US7295862B2 (en) | 2001-02-27 | 2007-11-13 | Nokia Corporation | Push content filtering for broadcast communication |
US20040237109A1 (en) * | 2001-02-27 | 2004-11-25 | Laitinen Timo M. | Push content filtering for broadcast communication |
US20040219882A1 (en) * | 2001-02-27 | 2004-11-04 | Laitinen Timo M. | Push content filtering for short range communication |
US7895511B2 (en) | 2001-02-27 | 2011-02-22 | International Business Machines Corporation | Copy and paste of cells in a multi-dimensional spreadsheet |
US7308269B2 (en) | 2001-02-27 | 2007-12-11 | Nokia Corporation | Push content filtering for short range communication |
US6778834B2 (en) * | 2001-02-27 | 2004-08-17 | Nokia Corporation | Push content filtering |
US20040143788A1 (en) * | 2001-02-27 | 2004-07-22 | Jean-Jacques Aureglia | Method and system in an electronic spreadsheet for handling graphical objects referring to working ranges of cells in a copy/cut and paste operation |
US7392478B2 (en) * | 2001-02-27 | 2008-06-24 | International Business Machines Corporation | Method and system in an electronic spreadsheet for handling graphical objects referring to working ranges of cells in a copy/cut and paste operation |
US7653604B2 (en) * | 2001-02-28 | 2010-01-26 | Voice-Insight | Natural language query system for accessing an information system |
US20060184476A1 (en) * | 2001-02-28 | 2006-08-17 | Voice-Insight | Natural language query system for accessing an information system |
US8209661B2 (en) * | 2001-08-16 | 2012-06-26 | Knowledge Dynamics, Inc. | Parser, code generator, and data calculation and transformation engine for spreadsheet calculations |
US20120222000A1 (en) * | 2001-08-16 | 2012-08-30 | Smialek Michael R | Parser, Code Generator, and Data Calculation and Transformation Engine for Spreadsheet Calculations |
US8656348B2 (en) * | 2001-08-16 | 2014-02-18 | Knowledge Dynamics, Inc. | Parser, code generator, and data calculation and transformation engine for spreadsheet calculations |
US20110055681A1 (en) * | 2001-08-16 | 2011-03-03 | Knowledge Dynamics, Inc. | Parser, code generator, and data calculation and transformation engine for spreadsheet calculations |
US20080059932A1 (en) * | 2001-12-10 | 2008-03-06 | Mentor Graphics Corporation | Parallel Electronic Design Automation: Shared Simultaneous Editing |
US20040225988A1 (en) * | 2001-12-10 | 2004-11-11 | Mentor Graphics Corporation | Protection boundaries in a parallel printed circuit board design environment |
US7587695B2 (en) | 2001-12-10 | 2009-09-08 | Mentor Graphics Corporation | Protection boundaries in a parallel printed circuit board design environment |
US7516435B2 (en) | 2001-12-10 | 2009-04-07 | Mentor Graphics Corporation | Reservation of design elements in a parallel printed circuit board design environment |
US7949990B2 (en) * | 2001-12-10 | 2011-05-24 | Mentor Graphics Corporation | Parallel electronic design automation: shared simultaneous editing |
US20040210854A1 (en) * | 2001-12-10 | 2004-10-21 | Mentor Graphics Corporation | Parellel electronic design automation: shared simultaneous editing |
US20100199240A1 (en) * | 2001-12-10 | 2010-08-05 | Mentor Graphics Corporation | Parallel Electronic Design Automation: Shared Simultaneous Editing |
US20050044518A1 (en) * | 2001-12-10 | 2005-02-24 | Mentor Graphics Corporation | Reservation of design elements in a parallel printed circuit board design environment |
US20080104136A1 (en) * | 2002-05-02 | 2008-05-01 | Palmsource, Inc. | Determining priority between data items |
US7337193B1 (en) * | 2002-05-02 | 2008-02-26 | Palmsource, Inc. | Determining priority between data items |
US8001081B1 (en) | 2002-05-31 | 2011-08-16 | Access Co., Ltd. | Determining priority between data items in shared environments |
US20040027487A1 (en) * | 2002-08-09 | 2004-02-12 | Rzadzki Robert J. | System to provide custom text and graphic information to a television system infrastructure |
US20040104920A1 (en) * | 2002-09-30 | 2004-06-03 | Tsuyoshi Kawabe | Image display method for mobile terminal in image distribution system, and image conversion apparatus and mobile terminal using the method |
US20060106610A1 (en) * | 2002-10-15 | 2006-05-18 | Napper Jonathon L | Method of improving recognition accuracy in form-based data entry systems |
US20040078756A1 (en) * | 2002-10-15 | 2004-04-22 | Napper Jonathon Leigh | Method of improving recognition accuracy in form-based data entry systems |
US7684618B2 (en) | 2002-10-31 | 2010-03-23 | Microsoft Corporation | Passive embedded interaction coding |
US20060109263A1 (en) * | 2002-10-31 | 2006-05-25 | Microsoft Corporation | Universal computing device |
US20040085286A1 (en) * | 2002-10-31 | 2004-05-06 | Microsoft Corporation | Universal computing device |
US7009594B2 (en) * | 2002-10-31 | 2006-03-07 | Microsoft Corporation | Universal computing device |
US7376752B1 (en) | 2003-10-28 | 2008-05-20 | David Chudnovsky | Method to resolve an incorrectly entered uniform resource locator (URL) |
US7908621B2 (en) | 2003-10-29 | 2011-03-15 | At&T Intellectual Property I, L.P. | System and apparatus for local video distribution |
US8843970B2 (en) | 2003-10-29 | 2014-09-23 | Chanyu Holdings, Llc | Video distribution systems and methods for multiple users |
US20050114865A1 (en) * | 2003-11-21 | 2005-05-26 | Mentor Graphics Corporation | Integrating multiple electronic design applications |
US20050114821A1 (en) * | 2003-11-21 | 2005-05-26 | Mentor Graphics Corporation | Distributed autorouting of conductive paths |
US7788622B2 (en) | 2003-11-21 | 2010-08-31 | Mentor Graphics Corporation | Distributed autorouting of conductive paths |
US7590963B2 (en) | 2003-11-21 | 2009-09-15 | Mentor Graphics Corporation | Integrating multiple electronic design applications |
US7305648B2 (en) | 2003-11-21 | 2007-12-04 | Mentor Graphics Corporation | Distributed autorouting of conductive paths in printed circuit boards |
KR101114194B1 (en) * | 2004-03-03 | 2012-02-22 | 마이크로소프트 코포레이션 | Assisted form filling |
US7426496B2 (en) * | 2004-03-03 | 2008-09-16 | Microsoft Corporation | Assisted form filling |
US20050198563A1 (en) * | 2004-03-03 | 2005-09-08 | Kristjansson Trausti T. | Assisted form filling |
US8320528B2 (en) | 2004-06-14 | 2012-11-27 | At&T Intellectual Property I, L.P. | System and method for electronic message notification |
WO2005125029A3 (en) * | 2004-06-14 | 2006-03-16 | Sbc Knowledge Ventures Lp | System and method for electronic message notification |
WO2005125029A2 (en) * | 2004-06-14 | 2005-12-29 | Sbc Knowledge Ventures, L.P. | System and method for electronic message notification |
US8660242B2 (en) | 2004-06-14 | 2014-02-25 | At&T Intellectual Property I, L.P. | System and method for electronic message notification |
US20050277406A1 (en) * | 2004-06-14 | 2005-12-15 | Sbc Knowledge Ventures, L.P. | System and method for electronic message notification |
US8904458B2 (en) | 2004-07-29 | 2014-12-02 | At&T Intellectual Property I, L.P. | System and method for pre-caching a first portion of a video file on a set-top box |
US9521452B2 (en) | 2004-07-29 | 2016-12-13 | At&T Intellectual Property I, L.P. | System and method for pre-caching a first portion of a video file on a media device |
US8427445B2 (en) | 2004-07-30 | 2013-04-23 | Apple Inc. | Visual expander |
US8584257B2 (en) | 2004-08-10 | 2013-11-12 | At&T Intellectual Property I, L.P. | Method and interface for video content acquisition security on a set-top box |
US20060095504A1 (en) * | 2004-08-24 | 2006-05-04 | Gelsey Jonathan I | System and method for optical character information retrieval (OCR) via a thin-client user interface |
US7546571B2 (en) | 2004-09-08 | 2009-06-09 | Mentor Graphics Corporation | Distributed electronic design automation environment |
US20060095882A1 (en) * | 2004-09-08 | 2006-05-04 | Mentor Graphics Corporation | Distributed electronic design automation environment |
US20060101368A1 (en) * | 2004-09-08 | 2006-05-11 | Mentor Graphics Corporation | Distributed electronic design automation environment |
US8744852B1 (en) | 2004-10-01 | 2014-06-03 | Apple Inc. | Spoken interfaces |
US8086261B2 (en) | 2004-10-07 | 2011-12-27 | At&T Intellectual Property I, L.P. | System and method for providing digital network access and digital broadcast services using combined channels on a single physical medium to the customer premises |
US7716714B2 (en) | 2004-12-01 | 2010-05-11 | At&T Intellectual Property I, L.P. | System and method for recording television content at a set top box |
US8839314B2 (en) | 2004-12-01 | 2014-09-16 | At&T Intellectual Property I, L.P. | Device, system, and method for managing television tuners |
US8434116B2 (en) | 2004-12-01 | 2013-04-30 | At&T Intellectual Property I, L.P. | Device, system, and method for managing television tuners |
US9571702B2 (en) | 2004-12-06 | 2017-02-14 | At&T Intellectual Property I, L.P. | System and method of displaying a video stream |
US8390744B2 (en) | 2004-12-06 | 2013-03-05 | At&T Intellectual Property I, L.P. | System and method of displaying a video stream |
US8019855B2 (en) * | 2004-12-22 | 2011-09-13 | Lg Electronics Inc. | Method and apparatus interfacing between an application and a library of a master for network managing |
US20060161893A1 (en) * | 2004-12-22 | 2006-07-20 | Lg Electronics Inc. | Method and apparatus interfacing between an application and a library of a master for network managing |
US7685510B2 (en) | 2004-12-23 | 2010-03-23 | Sap Ag | System and method for grouping data |
US8228224B2 (en) | 2005-02-02 | 2012-07-24 | At&T Intellectual Property I, L.P. | System and method of using a remote control and apparatus |
US8214859B2 (en) | 2005-02-14 | 2012-07-03 | At&T Intellectual Property I, L.P. | Automatic switching between high definition and standard definition IP television signals |
US7826074B1 (en) | 2005-02-25 | 2010-11-02 | Microsoft Corporation | Fast embedded interaction code printing with custom postscript commands |
US8156153B2 (en) | 2005-04-22 | 2012-04-10 | Microsoft Corporation | Global metadata embedding and decoding |
US7920753B2 (en) | 2005-05-25 | 2011-04-05 | Microsoft Corporation | Preprocessing for information pattern analysis |
US9178743B2 (en) | 2005-05-27 | 2015-11-03 | At&T Intellectual Property I, L.P. | System and method of managing video content streams |
US8054849B2 (en) | 2005-05-27 | 2011-11-08 | At&T Intellectual Property I, L.P. | System and method of managing video content streams |
US7729539B2 (en) | 2005-05-31 | 2010-06-01 | Microsoft Corporation | Fast error-correcting of embedded interaction codes |
US8893199B2 (en) | 2005-06-22 | 2014-11-18 | At&T Intellectual Property I, L.P. | System and method of managing video content delivery |
US7908627B2 (en) | 2005-06-22 | 2011-03-15 | At&T Intellectual Property I, L.P. | System and method to provide a unified video signal for diverse receiving platforms |
US9338490B2 (en) | 2005-06-22 | 2016-05-10 | At&T Intellectual Property I, L.P. | System and method to provide a unified video signal for diverse receiving platforms |
US8966563B2 (en) | 2005-06-22 | 2015-02-24 | At&T Intellectual Property, I, L.P. | System and method to provide a unified video signal for diverse receiving platforms |
US10085054B2 (en) | 2005-06-22 | 2018-09-25 | At&T Intellectual Property | System and method to provide a unified video signal for diverse receiving platforms |
US8535151B2 (en) | 2005-06-24 | 2013-09-17 | At&T Intellectual Property I, L.P. | Multimedia-based video game distribution |
US9278283B2 (en) | 2005-06-24 | 2016-03-08 | At&T Intellectual Property I, L.P. | Networked television and method thereof |
US8635659B2 (en) | 2005-06-24 | 2014-01-21 | At&T Intellectual Property I, L.P. | Audio receiver modular card and method thereof |
US8365218B2 (en) | 2005-06-24 | 2013-01-29 | At&T Intellectual Property I, L.P. | Networked television and method thereof |
US8282476B2 (en) | 2005-06-24 | 2012-10-09 | At&T Intellectual Property I, L.P. | Multimedia-based video game distribution |
US8190688B2 (en) | 2005-07-11 | 2012-05-29 | At&T Intellectual Property I, Lp | System and method of transmitting photographs from a set top box |
US9167241B2 (en) | 2005-07-27 | 2015-10-20 | At&T Intellectual Property I, L.P. | Video quality testing by encoding aggregated clips |
US7873102B2 (en) | 2005-07-27 | 2011-01-18 | At&T Intellectual Property I, Lp | Video quality testing by encoding aggregated clips |
US7817816B2 (en) | 2005-08-17 | 2010-10-19 | Microsoft Corporation | Embedded interaction code enabled surface type identification |
US10318871B2 (en) | 2005-09-08 | 2019-06-11 | Apple Inc. | Method and apparatus for building an intelligent automated assistant |
US20070073809A1 (en) * | 2005-09-13 | 2007-03-29 | Mentor Graphics Corporation | Distributed electronic design automation architecture |
US8326926B2 (en) | 2005-09-13 | 2012-12-04 | Mentor Graphics Corporation | Distributed electronic design automation architecture |
US20080263067A1 (en) * | 2005-10-27 | 2008-10-23 | Koninklijke Philips Electronics, N.V. | Method and System for Entering and Retrieving Content from an Electronic Diary |
US20070116349A1 (en) * | 2005-11-23 | 2007-05-24 | Pitney Bowes Incorporated | Method for detecting perforations on the edge of an image of a form |
US7889885B2 (en) * | 2005-11-23 | 2011-02-15 | Pitney Bowes Inc. | Method for detecting perforations on the edge of an image of a form |
US20070152961A1 (en) * | 2005-12-30 | 2007-07-05 | Dunton Randy R | User interface for a media device |
US7489323B2 (en) * | 2006-01-10 | 2009-02-10 | Delta Electronics, Inc. | Display apparatus adapted for a display wall, image adjustment method therefor and display wall therewith |
US20070159498A1 (en) * | 2006-01-10 | 2007-07-12 | Jung-Yi Yang | Display apparatus adapted for a display wall, image adjustment method therefor and display wall therewith |
US20070280535A1 (en) * | 2006-05-30 | 2007-12-06 | Microsoft Corporation Microsoft Patent Group | Cursive handwriting recognition with hierarchical prototype search |
US7929768B2 (en) | 2006-05-30 | 2011-04-19 | Microsoft Corporation | Cursive handwriting recognition with hierarchical prototype search |
US8265377B2 (en) | 2006-05-30 | 2012-09-11 | Microsoft Corporation | Cursive handwriting recognition with hierarchical prototype search |
US20100027889A1 (en) * | 2006-05-30 | 2010-02-04 | Microsoft Corporation | Curvise handwriting recognition with hierarchical prototype search |
US7620245B2 (en) | 2006-05-30 | 2009-11-17 | Microsoft Corporation | Cursive handwriting recognition with hierarchical prototype search |
US8930191B2 (en) | 2006-09-08 | 2015-01-06 | Apple Inc. | Paraphrasing of user requests and results by automated digital assistant |
US9117447B2 (en) | 2006-09-08 | 2015-08-25 | Apple Inc. | Using event alert text as input to an automated assistant |
US8942986B2 (en) | 2006-09-08 | 2015-01-27 | Apple Inc. | Determining user intent based on ontologies of domains |
US8570278B2 (en) | 2006-10-26 | 2013-10-29 | Apple Inc. | Portable multifunction device, method, and graphical user interface for adjusting an insertion point marker |
US9348511B2 (en) | 2006-10-26 | 2016-05-24 | Apple Inc. | Method, system, and graphical user interface for positioning an insertion marker in a touch screen display |
US20080165142A1 (en) * | 2006-10-26 | 2008-07-10 | Kenneth Kocienda | Portable Multifunction Device, Method, and Graphical User Interface for Adjusting an Insertion Point Marker |
US9632695B2 (en) | 2006-10-26 | 2017-04-25 | Apple Inc. | Portable multifunction device, method, and graphical user interface for adjusting an insertion point marker |
US20110080364A1 (en) * | 2006-10-26 | 2011-04-07 | Bas Ording | Method, System, and Graphical User Interface for Positioning an Insertion Marker in a Touch Screen Display |
US9207855B2 (en) | 2006-10-26 | 2015-12-08 | Apple Inc. | Portable multifunction device, method, and graphical user interface for adjusting an insertion point marker |
US8565552B2 (en) * | 2006-11-14 | 2013-10-22 | Codonics, Inc. | Assembling multiple medical images into a single film image |
US20080118141A1 (en) * | 2006-11-14 | 2008-05-22 | Codonics, Inc. | Assembling multiple medical images into a single film image |
US20080266131A1 (en) * | 2007-02-13 | 2008-10-30 | Wherenet Corp. | System, apparatus and method for locating and/or tracking assets |
US9880283B2 (en) * | 2007-02-13 | 2018-01-30 | Zih Corp. | System, apparatus and method for locating and/or tracking assets |
US10568032B2 (en) | 2007-04-03 | 2020-02-18 | Apple Inc. | Method and system for operating a multi-function portable electronic device using voice-activation |
US9024864B2 (en) | 2007-06-12 | 2015-05-05 | Intel Corporation | User interface with software lensing for very long lists of content |
US20080313686A1 (en) * | 2007-06-13 | 2008-12-18 | Matvey Thomas R | Handheld camcorder accessory with pre-programmed wireless internet access for simplified webcasting and handheld camcorder with built-in pre-programmed wireless internet access for simplified webcasting and method of commercially supplying and supporting same |
US20090055179A1 (en) * | 2007-08-24 | 2009-02-26 | Samsung Electronics Co., Ltd. | Method, medium and apparatus for providing mobile voice web service |
US9251786B2 (en) * | 2007-08-24 | 2016-02-02 | Samsung Electronics Co., Ltd. | Method, medium and apparatus for providing mobile voice web service |
US9330720B2 (en) | 2008-01-03 | 2016-05-03 | Apple Inc. | Methods and apparatus for altering audio output signals |
US10381016B2 (en) | 2008-01-03 | 2019-08-13 | Apple Inc. | Methods and apparatus for altering audio output signals |
US8201109B2 (en) | 2008-03-04 | 2012-06-12 | Apple Inc. | Methods and graphical user interfaces for editing on a portable multifunction device |
US8650507B2 (en) | 2008-03-04 | 2014-02-11 | Apple Inc. | Selecting of text using gestures |
US9529524B2 (en) | 2008-03-04 | 2016-12-27 | Apple Inc. | Methods and graphical user interfaces for editing on a portable multifunction device |
US20090228842A1 (en) * | 2008-03-04 | 2009-09-10 | Apple Inc. | Selecting of text using gestures |
US20090225365A1 (en) * | 2008-03-07 | 2009-09-10 | Canon Kabushiki Kaisha | Information processing apparatus, image processing apparatus, method for controlling information processing apparatus, method for controlling image processing apparatus, and program |
US8446298B2 (en) | 2008-04-03 | 2013-05-21 | Livescribe, Inc. | Quick record function in a smart pen computing system |
US20090251336A1 (en) * | 2008-04-03 | 2009-10-08 | Livescribe, Inc. | Quick Record Function In A Smart Pen Computing System |
WO2009124197A3 (en) * | 2008-04-03 | 2009-12-30 | Livescribe, Inc. | Quick record function in a smart pen computing system |
WO2009124197A2 (en) * | 2008-04-03 | 2009-10-08 | Livescribe, Inc. | Quick record function in a smart pen computing system |
US9626955B2 (en) | 2008-04-05 | 2017-04-18 | Apple Inc. | Intelligent text-to-speech conversion |
US9865248B2 (en) | 2008-04-05 | 2018-01-09 | Apple Inc. | Intelligent text-to-speech conversion |
US20090319276A1 (en) * | 2008-06-20 | 2009-12-24 | At&T Intellectual Property I, L.P. | Voice Enabled Remote Control for a Set-Top Box |
US11568736B2 (en) | 2008-06-20 | 2023-01-31 | Nuance Communications, Inc. | Voice enabled remote control for a set-top box |
US20090320076A1 (en) * | 2008-06-20 | 2009-12-24 | At&T Intellectual Property I, L.P. | System and Method for Processing an Interactive Advertisement |
US9852614B2 (en) | 2008-06-20 | 2017-12-26 | Nuance Communications, Inc. | Voice enabled remote control for a set-top box |
US9135809B2 (en) | 2008-06-20 | 2015-09-15 | At&T Intellectual Property I, Lp | Voice enabled remote control for a set-top box |
US9535906B2 (en) | 2008-07-31 | 2017-01-03 | Apple Inc. | Mobile device having human language translation capability with positional feedback |
US10108612B2 (en) | 2008-07-31 | 2018-10-23 | Apple Inc. | Mobile device having human language translation capability with positional feedback |
US20100201455A1 (en) * | 2008-09-23 | 2010-08-12 | Aerovironment, Inc. | Predictive pulse width modulation for an open delta h-bridge driven high efficiency ironless permanent magnet machine |
US9959870B2 (en) | 2008-12-11 | 2018-05-01 | Apple Inc. | Speech recognition involving a mobile device |
WO2010093492A1 (en) * | 2009-02-13 | 2010-08-19 | Mobitv, Inc. | A reprogrammable client using a uniform bytecode model |
GB2479512A (en) * | 2009-02-13 | 2011-10-12 | Mobitv Inc | A reprogrammable client using a uniform bytecode model |
US20100211627A1 (en) * | 2009-02-13 | 2010-08-19 | Mobitv, Inc. | Reprogrammable client using a uniform bytecode model |
US8584050B2 (en) | 2009-03-16 | 2013-11-12 | Apple Inc. | Methods and graphical user interfaces for editing on a multifunction device with a touch screen display |
US9875013B2 (en) | 2009-03-16 | 2018-01-23 | Apple Inc. | Methods and graphical user interfaces for editing on a multifunction device with a touch screen display |
US20100235734A1 (en) * | 2009-03-16 | 2010-09-16 | Bas Ording | Methods and Graphical User Interfaces for Editing on a Multifunction Device with a Touch Screen Display |
US8255830B2 (en) | 2009-03-16 | 2012-08-28 | Apple Inc. | Methods and graphical user interfaces for editing on a multifunction device with a touch screen display |
US20100235729A1 (en) * | 2009-03-16 | 2010-09-16 | Kocienda Kenneth L | Methods and Graphical User Interfaces for Editing on a Multifunction Device with a Touch Screen Display |
US20100235785A1 (en) * | 2009-03-16 | 2010-09-16 | Bas Ording | Methods and Graphical User Interfaces for Editing on a Multifunction Device with a Touch Screen Display |
US10761716B2 (en) | 2009-03-16 | 2020-09-01 | Apple, Inc. | Methods and graphical user interfaces for editing on a multifunction device with a touch screen display |
US9846533B2 (en) | 2009-03-16 | 2017-12-19 | Apple Inc. | Methods and graphical user interfaces for editing on a multifunction device with a touch screen display |
US8756534B2 (en) | 2009-03-16 | 2014-06-17 | Apple Inc. | Methods and graphical user interfaces for editing on a multifunction device with a touch screen display |
US8661362B2 (en) | 2009-03-16 | 2014-02-25 | Apple Inc. | Methods and graphical user interfaces for editing on a multifunction device with a touch screen display |
US20100235726A1 (en) * | 2009-03-16 | 2010-09-16 | Bas Ording | Methods and Graphical User Interfaces for Editing on a Multifunction Device with a Touch Screen Display |
US8370736B2 (en) | 2009-03-16 | 2013-02-05 | Apple Inc. | Methods and graphical user interfaces for editing on a multifunction device with a touch screen display |
US20100235735A1 (en) * | 2009-03-16 | 2010-09-16 | Bas Ording | Methods and Graphical User Interfaces for Editing on a Multifunction Device with a Touch Screen Display |
US20100235778A1 (en) * | 2009-03-16 | 2010-09-16 | Kocienda Kenneth L | Methods and Graphical User Interfaces for Editing on a Multifunction Device with a Touch Screen Display |
US8510665B2 (en) | 2009-03-16 | 2013-08-13 | Apple Inc. | Methods and graphical user interfaces for editing on a multifunction device with a touch screen display |
US10475446B2 (en) | 2009-06-05 | 2019-11-12 | Apple Inc. | Using context information to facilitate processing of commands in a virtual assistant |
US9858925B2 (en) | 2009-06-05 | 2018-01-02 | Apple Inc. | Using context information to facilitate processing of commands in a virtual assistant |
US11080012B2 (en) | 2009-06-05 | 2021-08-03 | Apple Inc. | Interface for a virtual digital assistant |
US10795541B2 (en) | 2009-06-05 | 2020-10-06 | Apple Inc. | Intelligent organization of tasks items |
US10283110B2 (en) | 2009-07-02 | 2019-05-07 | Apple Inc. | Methods and apparatuses for automatic speech recognition |
US9318108B2 (en) | 2010-01-18 | 2016-04-19 | Apple Inc. | Intelligent automated assistant |
US9548050B2 (en) | 2010-01-18 | 2017-01-17 | Apple Inc. | Intelligent automated assistant |
US10705794B2 (en) | 2010-01-18 | 2020-07-07 | Apple Inc. | Automatically adapting user interfaces for hands-free interaction |
US10679605B2 (en) | 2010-01-18 | 2020-06-09 | Apple Inc. | Hands-free list-reading by intelligent automated assistant |
US10553209B2 (en) | 2010-01-18 | 2020-02-04 | Apple Inc. | Systems and methods for hands-free notification summaries |
US10496753B2 (en) | 2010-01-18 | 2019-12-03 | Apple Inc. | Automatically adapting user interfaces for hands-free interaction |
US10276170B2 (en) | 2010-01-18 | 2019-04-30 | Apple Inc. | Intelligent automated assistant |
US10706841B2 (en) | 2010-01-18 | 2020-07-07 | Apple Inc. | Task flow identification based on user intent |
US8892446B2 (en) | 2010-01-18 | 2014-11-18 | Apple Inc. | Service orchestration for intelligent automated assistant |
US11423886B2 (en) | 2010-01-18 | 2022-08-23 | Apple Inc. | Task flow identification based on user intent |
US8903716B2 (en) | 2010-01-18 | 2014-12-02 | Apple Inc. | Personalized vocabulary for digital assistant |
US8977584B2 (en) | 2010-01-25 | 2015-03-10 | Newvaluexchange Global Ai Llp | Apparatuses, methods and systems for a digital conversation management platform |
US9424861B2 (en) | 2010-01-25 | 2016-08-23 | Newvaluexchange Ltd | Apparatuses, methods and systems for a digital conversation management platform |
US9424862B2 (en) | 2010-01-25 | 2016-08-23 | Newvaluexchange Ltd | Apparatuses, methods and systems for a digital conversation management platform |
US9431028B2 (en) | 2010-01-25 | 2016-08-30 | Newvaluexchange Ltd | Apparatuses, methods and systems for a digital conversation management platform |
US10049675B2 (en) | 2010-02-25 | 2018-08-14 | Apple Inc. | User profiling for voice input processing |
US9633660B2 (en) | 2010-02-25 | 2017-04-25 | Apple Inc. | User profiling for voice input processing |
US8972930B2 (en) | 2010-06-04 | 2015-03-03 | Microsoft Corporation | Generating text manipulation programs using input-output examples |
WO2011153006A3 (en) * | 2010-06-04 | 2012-04-05 | Microsoft Corporation | Generating text manipulation programs using input-output examples |
US9672816B1 (en) | 2010-06-16 | 2017-06-06 | Google Inc. | Annotating maps with user-contributed pronunciations |
US8949125B1 (en) * | 2010-06-16 | 2015-02-03 | Google Inc. | Annotating maps with user-contributed pronunciations |
US8713067B1 (en) * | 2010-07-09 | 2014-04-29 | Open Invention Network, Llc | Stable file system |
US11775477B1 (en) | 2010-07-09 | 2023-10-03 | Philips North America Llc | Stable file system |
US9613115B2 (en) | 2010-07-12 | 2017-04-04 | Microsoft Technology Licensing, Llc | Generating programs based on input-output examples using converter modules |
US20120023414A1 (en) * | 2010-07-23 | 2012-01-26 | Samsung Electronics Co., Ltd. | Method and apparatus for processing e-mail |
WO2012021659A2 (en) * | 2010-08-12 | 2012-02-16 | Brightedge Technologies, Inc. | Operationalizing search engine optimization |
WO2012021659A3 (en) * | 2010-08-12 | 2012-04-19 | Brightedge Technologies, Inc. | Operationalizing search engine optimization |
US10762293B2 (en) | 2010-12-22 | 2020-09-01 | Apple Inc. | Using parts-of-speech tagging and named entity recognition for spelling correction |
US10102359B2 (en) | 2011-03-21 | 2018-10-16 | Apple Inc. | Device access using voice authentication |
US9262612B2 (en) | 2011-03-21 | 2016-02-16 | Apple Inc. | Device access using voice authentication |
US10664144B2 (en) | 2011-05-31 | 2020-05-26 | Apple Inc. | Devices, methods, and graphical user interfaces for document manipulation |
US8661339B2 (en) | 2011-05-31 | 2014-02-25 | Apple Inc. | Devices, methods, and graphical user interfaces for document manipulation |
US9092130B2 (en) | 2011-05-31 | 2015-07-28 | Apple Inc. | Devices, methods, and graphical user interfaces for document manipulation |
US8719695B2 (en) | 2011-05-31 | 2014-05-06 | Apple Inc. | Devices, methods, and graphical user interfaces for document manipulation |
US8677232B2 (en) | 2011-05-31 | 2014-03-18 | Apple Inc. | Devices, methods, and graphical user interfaces for document manipulation |
US9244605B2 (en) | 2011-05-31 | 2016-01-26 | Apple Inc. | Devices, methods, and graphical user interfaces for document manipulation |
US11256401B2 (en) | 2011-05-31 | 2022-02-22 | Apple Inc. | Devices, methods, and graphical user interfaces for document manipulation |
US10706373B2 (en) | 2011-06-03 | 2020-07-07 | Apple Inc. | Performing actions associated with task items that represent tasks to perform |
US10057736B2 (en) | 2011-06-03 | 2018-08-21 | Apple Inc. | Active transport based notifications |
US11120372B2 (en) | 2011-06-03 | 2021-09-14 | Apple Inc. | Performing actions associated with task items that represent tasks to perform |
US10241644B2 (en) | 2011-06-03 | 2019-03-26 | Apple Inc. | Actionable reminder entries |
US9798393B2 (en) | 2011-08-29 | 2017-10-24 | Apple Inc. | Text correction processing |
US10241752B2 (en) | 2011-09-30 | 2019-03-26 | Apple Inc. | Interface for a virtual digital assistant |
US20130138399A1 (en) * | 2011-11-29 | 2013-05-30 | Garrick EVANS | Generating an Analytically Accurate Model From an Abstract Representation Created Via a Mobile Device |
US10134385B2 (en) | 2012-03-02 | 2018-11-20 | Apple Inc. | Systems and methods for name pronunciation |
WO2013132309A1 (en) * | 2012-03-05 | 2013-09-12 | Tammel Eric Kamel | Systems and methods for processing unstructured numerical data |
US9483461B2 (en) | 2012-03-06 | 2016-11-01 | Apple Inc. | Handling speech synthesis of content for multiple languages |
US9953088B2 (en) | 2012-05-14 | 2018-04-24 | Apple Inc. | Crowd sourcing information to fulfill user requests |
US9552335B2 (en) | 2012-06-04 | 2017-01-24 | Microsoft Technology Licensing, Llc | Expedited techniques for generating string manipulation programs |
US10079014B2 (en) | 2012-06-08 | 2018-09-18 | Apple Inc. | Name recognition system |
US9495129B2 (en) | 2012-06-29 | 2016-11-15 | Apple Inc. | Device, method, and user interface for voice-activated navigation and browsing of a document |
US9576574B2 (en) | 2012-09-10 | 2017-02-21 | Apple Inc. | Context-sensitive handling of interruptions by intelligent digital assistant |
US9971774B2 (en) | 2012-09-19 | 2018-05-15 | Apple Inc. | Voice-based media searching |
US20140164907A1 (en) * | 2012-12-12 | 2014-06-12 | Lg Electronics Inc. | Mobile terminal and method of controlling the mobile terminal |
CN103870019A (en) * | 2012-12-17 | 2014-06-18 | 鸿富锦精密工业(深圳)有限公司 | Digital pen and digital writing module |
US10199051B2 (en) | 2013-02-07 | 2019-02-05 | Apple Inc. | Voice trigger for a digital assistant |
US10978090B2 (en) | 2013-02-07 | 2021-04-13 | Apple Inc. | Voice trigger for a digital assistant |
US9368114B2 (en) | 2013-03-14 | 2016-06-14 | Apple Inc. | Context-sensitive handling of interruptions |
US9697822B1 (en) | 2013-03-15 | 2017-07-04 | Apple Inc. | System and method for updating an adaptive speech recognition model |
US9922642B2 (en) | 2013-03-15 | 2018-03-20 | Apple Inc. | Training an at least partial voice command system |
US9966060B2 (en) | 2013-06-07 | 2018-05-08 | Apple Inc. | System and method for user-specified pronunciation of words for speech synthesis and recognition |
US9582608B2 (en) | 2013-06-07 | 2017-02-28 | Apple Inc. | Unified ranking with entropy-weighted information for phrase-based semantic auto-completion |
US9620104B2 (en) | 2013-06-07 | 2017-04-11 | Apple Inc. | System and method for user-specified pronunciation of words for speech synthesis and recognition |
US9633674B2 (en) | 2013-06-07 | 2017-04-25 | Apple Inc. | System and method for detecting errors in interactions with a voice-based digital assistant |
US10657961B2 (en) | 2013-06-08 | 2020-05-19 | Apple Inc. | Interpreting and acting upon commands that involve sharing information with remote devices |
US9966068B2 (en) | 2013-06-08 | 2018-05-08 | Apple Inc. | Interpreting and acting upon commands that involve sharing information with remote devices |
US10185542B2 (en) | 2013-06-09 | 2019-01-22 | Apple Inc. | Device, method, and graphical user interface for enabling conversation persistence across two or more instances of a digital assistant |
US10176167B2 (en) | 2013-06-09 | 2019-01-08 | Apple Inc. | System and method for inferring user intent from speech inputs |
US9300784B2 (en) | 2013-06-13 | 2016-03-29 | Apple Inc. | System and method for emergency calls initiated by voice command |
US10791216B2 (en) | 2013-08-06 | 2020-09-29 | Apple Inc. | Auto-activating smart responses based on activities from remote devices |
US20150095294A1 (en) * | 2013-10-02 | 2015-04-02 | International Business Machines Corporation | Elimination of Fragmentation of Files in Storage Medium by Utilizing Head Movement Time |
US9684664B2 (en) * | 2013-10-02 | 2017-06-20 | International Business Machines Corporation | Elimination of fragmentation of files in storage medium by utilizing head movement time |
US20150332492A1 (en) * | 2014-05-13 | 2015-11-19 | Masaaki Igarashi | Image processing system, image processing apparatus, and method for image processing |
US9779317B2 (en) * | 2014-05-13 | 2017-10-03 | Ricoh Company, Ltd. | Image processing system, image processing apparatus, and method for image processing |
US9620105B2 (en) | 2014-05-15 | 2017-04-11 | Apple Inc. | Analyzing audio input for efficient speech and music recognition |
US10592095B2 (en) | 2014-05-23 | 2020-03-17 | Apple Inc. | Instantaneous speaking of content on touch devices |
US9502031B2 (en) | 2014-05-27 | 2016-11-22 | Apple Inc. | Method for supporting dynamic grammars in WFST-based ASR |
US10083690B2 (en) | 2014-05-30 | 2018-09-25 | Apple Inc. | Better resolution when referencing to concepts |
US10289433B2 (en) | 2014-05-30 | 2019-05-14 | Apple Inc. | Domain specific language for encoding assistant dialog |
US9966065B2 (en) | 2014-05-30 | 2018-05-08 | Apple Inc. | Multi-command single utterance input method |
US9842101B2 (en) | 2014-05-30 | 2017-12-12 | Apple Inc. | Predictive conversion of language input |
US9715875B2 (en) | 2014-05-30 | 2017-07-25 | Apple Inc. | Reducing the need for manual start/end-pointing and trigger phrases |
US11257504B2 (en) | 2014-05-30 | 2022-02-22 | Apple Inc. | Intelligent assistant for home automation |
US10169329B2 (en) | 2014-05-30 | 2019-01-01 | Apple Inc. | Exemplar-based natural language processing |
US10170123B2 (en) | 2014-05-30 | 2019-01-01 | Apple Inc. | Intelligent assistant for home automation |
US9430463B2 (en) | 2014-05-30 | 2016-08-30 | Apple Inc. | Exemplar-based natural language processing |
US9785630B2 (en) | 2014-05-30 | 2017-10-10 | Apple Inc. | Text prediction using combined word N-gram and unigram language models |
US11133008B2 (en) | 2014-05-30 | 2021-09-28 | Apple Inc. | Reducing the need for manual start/end-pointing and trigger phrases |
US10497365B2 (en) | 2014-05-30 | 2019-12-03 | Apple Inc. | Multi-command single utterance input method |
US9633004B2 (en) | 2014-05-30 | 2017-04-25 | Apple Inc. | Better resolution when referencing to concepts |
US9760559B2 (en) | 2014-05-30 | 2017-09-12 | Apple Inc. | Predictive text input |
US9734193B2 (en) | 2014-05-30 | 2017-08-15 | Apple Inc. | Determining domain salience ranking from ambiguous words in natural speech |
US10078631B2 (en) | 2014-05-30 | 2018-09-18 | Apple Inc. | Entropy-guided text prediction using combined word and character n-gram language models |
US9668024B2 (en) | 2014-06-30 | 2017-05-30 | Apple Inc. | Intelligent automated assistant for TV user interactions |
US10904611B2 (en) | 2014-06-30 | 2021-01-26 | Apple Inc. | Intelligent automated assistant for TV user interactions |
US10659851B2 (en) | 2014-06-30 | 2020-05-19 | Apple Inc. | Real-time digital assistant knowledge updates |
US9338493B2 (en) | 2014-06-30 | 2016-05-10 | Apple Inc. | Intelligent automated assistant for TV user interactions |
US10446141B2 (en) | 2014-08-28 | 2019-10-15 | Apple Inc. | Automatic speech recognition based on user feedback |
US9818400B2 (en) | 2014-09-11 | 2017-11-14 | Apple Inc. | Method and apparatus for discovering trending terms in speech requests |
US10431204B2 (en) | 2014-09-11 | 2019-10-01 | Apple Inc. | Method and apparatus for discovering trending terms in speech requests |
US10789041B2 (en) | 2014-09-12 | 2020-09-29 | Apple Inc. | Dynamic thresholds for always listening speech trigger |
US10127911B2 (en) | 2014-09-30 | 2018-11-13 | Apple Inc. | Speaker identification and unsupervised speaker adaptation techniques |
US9986419B2 (en) | 2014-09-30 | 2018-05-29 | Apple Inc. | Social reminders |
US10074360B2 (en) | 2014-09-30 | 2018-09-11 | Apple Inc. | Providing an indication of the suitability of speech recognition |
US9646609B2 (en) | 2014-09-30 | 2017-05-09 | Apple Inc. | Caching apparatus for serving phonetic pronunciations |
US9886432B2 (en) | 2014-09-30 | 2018-02-06 | Apple Inc. | Parsimonious handling of word inflection via categorical stem + suffix N-gram language models |
US9668121B2 (en) | 2014-09-30 | 2017-05-30 | Apple Inc. | Social reminders |
US11556230B2 (en) | 2014-12-02 | 2023-01-17 | Apple Inc. | Data detection |
US10552013B2 (en) | 2014-12-02 | 2020-02-04 | Apple Inc. | Data detection |
US9711141B2 (en) | 2014-12-09 | 2017-07-18 | Apple Inc. | Disambiguating heteronyms in speech synthesis |
US9865280B2 (en) | 2015-03-06 | 2018-01-09 | Apple Inc. | Structured dictation using intelligent automated assistants |
US9721566B2 (en) | 2015-03-08 | 2017-08-01 | Apple Inc. | Competing devices responding to voice triggers |
US10567477B2 (en) | 2015-03-08 | 2020-02-18 | Apple Inc. | Virtual assistant continuity |
US9886953B2 (en) | 2015-03-08 | 2018-02-06 | Apple Inc. | Virtual assistant activation |
US11087759B2 (en) | 2015-03-08 | 2021-08-10 | Apple Inc. | Virtual assistant activation |
US10311871B2 (en) | 2015-03-08 | 2019-06-04 | Apple Inc. | Competing devices responding to voice triggers |
US9899019B2 (en) | 2015-03-18 | 2018-02-20 | Apple Inc. | Systems and methods for structured stem and suffix language models |
US9842105B2 (en) | 2015-04-16 | 2017-12-12 | Apple Inc. | Parsimonious continuous-space phrase representations for natural language processing |
US10083688B2 (en) | 2015-05-27 | 2018-09-25 | Apple Inc. | Device voice control for selecting a displayed affordance |
US10127220B2 (en) | 2015-06-04 | 2018-11-13 | Apple Inc. | Language identification from short strings |
US10101822B2 (en) | 2015-06-05 | 2018-10-16 | Apple Inc. | Language input correction |
US10356243B2 (en) | 2015-06-05 | 2019-07-16 | Apple Inc. | Virtual assistant aided communication with 3rd party service in a communication session |
US10255907B2 (en) | 2015-06-07 | 2019-04-09 | Apple Inc. | Automatic accent detection using acoustic models |
US10186254B2 (en) | 2015-06-07 | 2019-01-22 | Apple Inc. | Context-based endpoint detection |
US11025565B2 (en) | 2015-06-07 | 2021-06-01 | Apple Inc. | Personalized prediction of responses for instant messaging |
US11500672B2 (en) | 2015-09-08 | 2022-11-15 | Apple Inc. | Distributed personal assistant |
US10671428B2 (en) | 2015-09-08 | 2020-06-02 | Apple Inc. | Distributed personal assistant |
US10747498B2 (en) | 2015-09-08 | 2020-08-18 | Apple Inc. | Zero latency digital assistant |
US9697820B2 (en) | 2015-09-24 | 2017-07-04 | Apple Inc. | Unit-selection text-to-speech synthesis using concatenation-sensitive neural networks |
US11010550B2 (en) | 2015-09-29 | 2021-05-18 | Apple Inc. | Unified language modeling framework for word prediction, auto-completion and auto-correction |
US10366158B2 (en) | 2015-09-29 | 2019-07-30 | Apple Inc. | Efficient word encoding for recurrent neural network language models |
US11587559B2 (en) | 2015-09-30 | 2023-02-21 | Apple Inc. | Intelligent device identification |
US11526368B2 (en) | 2015-11-06 | 2022-12-13 | Apple Inc. | Intelligent automated assistant in a messaging environment |
US10691473B2 (en) | 2015-11-06 | 2020-06-23 | Apple Inc. | Intelligent automated assistant in a messaging environment |
US20170147195A1 (en) * | 2015-11-20 | 2017-05-25 | Tomer Alpert | Automove smart transcription |
US11157166B2 (en) * | 2015-11-20 | 2021-10-26 | Felt, Inc. | Automove smart transcription |
US10049668B2 (en) | 2015-12-02 | 2018-08-14 | Apple Inc. | Applying neural network language models to weighted finite state transducers for automatic speech recognition |
US10223066B2 (en) | 2015-12-23 | 2019-03-05 | Apple Inc. | Proactive assistance based on dialog communication between devices |
US10446143B2 (en) | 2016-03-14 | 2019-10-15 | Apple Inc. | Identification of voice inputs providing credentials |
US9934775B2 (en) | 2016-05-26 | 2018-04-03 | Apple Inc. | Unit-selection text-to-speech synthesis based on predicted concatenation parameters |
US9972304B2 (en) | 2016-06-03 | 2018-05-15 | Apple Inc. | Privacy preserving distributed evaluation framework for embedded personalized systems |
US10249300B2 (en) | 2016-06-06 | 2019-04-02 | Apple Inc. | Intelligent list reading |
US10049663B2 (en) | 2016-06-08 | 2018-08-14 | Apple, Inc. | Intelligent automated assistant for media exploration |
US11069347B2 (en) | 2016-06-08 | 2021-07-20 | Apple Inc. | Intelligent automated assistant for media exploration |
US10354011B2 (en) | 2016-06-09 | 2019-07-16 | Apple Inc. | Intelligent automated assistant in a home environment |
US10490187B2 (en) | 2016-06-10 | 2019-11-26 | Apple Inc. | Digital assistant providing automated status report |
US11037565B2 (en) | 2016-06-10 | 2021-06-15 | Apple Inc. | Intelligent digital assistant in a multi-tasking environment |
US10067938B2 (en) | 2016-06-10 | 2018-09-04 | Apple Inc. | Multilingual word prediction |
US10509862B2 (en) | 2016-06-10 | 2019-12-17 | Apple Inc. | Dynamic phrase expansion of language input |
US10192552B2 (en) | 2016-06-10 | 2019-01-29 | Apple Inc. | Digital assistant providing whispered speech |
US10733993B2 (en) | 2016-06-10 | 2020-08-04 | Apple Inc. | Intelligent digital assistant in a multi-tasking environment |
US11152002B2 (en) | 2016-06-11 | 2021-10-19 | Apple Inc. | Application integration with a digital assistant |
US10089072B2 (en) | 2016-06-11 | 2018-10-02 | Apple Inc. | Intelligent device arbitration and control |
US10521466B2 (en) | 2016-06-11 | 2019-12-31 | Apple Inc. | Data driven natural language event detection and classification |
US10297253B2 (en) | 2016-06-11 | 2019-05-21 | Apple Inc. | Application integration with a digital assistant |
US10269345B2 (en) | 2016-06-11 | 2019-04-23 | Apple Inc. | Intelligent task discovery |
US10043516B2 (en) | 2016-09-23 | 2018-08-07 | Apple Inc. | Intelligent automated assistant |
US10553215B2 (en) | 2016-09-23 | 2020-02-04 | Apple Inc. | Intelligent automated assistant |
US11256710B2 (en) | 2016-10-20 | 2022-02-22 | Microsoft Technology Licensing, Llc | String transformation sub-program suggestion |
US11620304B2 (en) | 2016-10-20 | 2023-04-04 | Microsoft Technology Licensing, Llc | Example management for string transformation |
US10846298B2 (en) | 2016-10-28 | 2020-11-24 | Microsoft Technology Licensing, Llc | Record profiling for dataset sampling |
US10593346B2 (en) | 2016-12-22 | 2020-03-17 | Apple Inc. | Rank-reduced token representation for automatic speech recognition |
US10755703B2 (en) | 2017-05-11 | 2020-08-25 | Apple Inc. | Offline personal assistant |
US11405466B2 (en) | 2017-05-12 | 2022-08-02 | Apple Inc. | Synchronization and task delegation of a digital assistant |
US10791176B2 (en) | 2017-05-12 | 2020-09-29 | Apple Inc. | Synchronization and task delegation of a digital assistant |
US10410637B2 (en) | 2017-05-12 | 2019-09-10 | Apple Inc. | User-specific acoustic models |
US10810274B2 (en) | 2017-05-15 | 2020-10-20 | Apple Inc. | Optimizing dialogue policy decisions for digital assistants using implicit feedback |
US10482874B2 (en) | 2017-05-15 | 2019-11-19 | Apple Inc. | Hierarchical belief states for digital assistants |
US11217255B2 (en) | 2017-05-16 | 2022-01-04 | Apple Inc. | Far-field extension for digital assistant services |
US10671353B2 (en) | 2018-01-31 | 2020-06-02 | Microsoft Technology Licensing, Llc | Programming-by-example using disjunctive programs |
CN110221760A (en) * | 2019-06-24 | 2019-09-10 | 梁舒云 | A method of generating towed picture voice label |
Also Published As
Publication number | Publication date |
---|---|
US6157935A (en) | 2000-12-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6157935A (en) | 2000-12-05 | Remote data access and management system |
US6202060B1 (en) | | Data management system |
US8131718B2 (en) | | Intelligent data retrieval system |
JP3423413B2 (en) | | Handwritten information recognition apparatus and method |
US6178426B1 (en) | | Apparatus with extended markup language data capture capability |
US5704029A (en) | | System and method for completing an electronic form |
Want et al. | | An overview of the PARCTAB ubiquitous computing experiment |
US6054990A (en) | | Computer system with handwriting annotation |
CN101334774B (en) | | Character input method and input method system |
EP1014254B1 (en) | | Multi-moded scanning pen with feedback |
US6557004B1 (en) | | Method and apparatus for fast searching of hand-held contacts lists |
US5148522A (en) | | Information retrieval apparatus and interface for retrieval of mapping information utilizing hand-drawn retrieval requests |
US7216266B2 (en) | | Change request form annotation |
US6343148B2 (en) | | Process for utilizing external handwriting recognition for personal data assistants |
US6342901B1 (en) | | Interactive device for displaying information from multiple sources |
US10152301B2 (en) | | Providing interface controls based on voice commands |
EP1395025B1 (en) | | Interactive animation mailing system |
US20080282160A1 (en) | | Designated screen capturing and automatic image exporting |
US20030069716A1 (en) | | System & method for performing field inspection |
US20030033329A1 (en) | | Method and apparatus for entry and editing of spreadsheet formulas |
US5586317A (en) | | Method and apparatus for implementing I/O in a frame-based computer system |
WO2000026792A1 (en) | | System and method for specifying www site |
MXPA04002158A (en) | | Presentation of data based on user input |
JPH07222248A (en) | | System for utilizing speech information for portable information terminal |
US5588141A (en) | | System for executing different functions associated with different contexts corresponding to different screen events based upon information stored in unified data structure |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |