A PLATFORM INDEPENDENT SYSTEM FOR SPECIFYING AN
EMBEDDED USER INTERFACE
BACKGROUND
Electronic equipment of all kinds requires some form of human interface.
Whether it is a stereo, VCR, dishwasher, or cellular phone, the user interface is
critical to the success or failure of the product in the market. In the past two
decades, user interfaces have progressed from clumsy mechanical knobs with
poor user feedback to sophisticated digital displays with softkeys that guide a
user through all aspects of interfacing with the electronic device. As the
sophistication of the user interface has increased, so too has the complexity of
designing and engineering the interface.
Current user interface design forces hardware manufacturers to design
user interfaces integrally with the design of the associated hardware. Today,
user interfaces are so tied to the product that as the product changes the user
interface must be completely redesigned and implemented. With the short
life cycles of most consumer and industrial devices today, user interface
engineering that is dependent on the product is wasteful. Because the designers
of the user interface must work constantly with the engineers of the functional
aspects of the product, these additional lines of communication can actually
slow down the product development cycle.
Therefore, a system is needed that will allow for independent
development of an embedded user interface.
A system is also needed that will allow the user interface to be platform
independent, so that short product life cycles will not result in wasted time and
effort.
Also, a system is needed that allows a user interface developer to specify
the display layout, user interaction, and process of the user interface
independent of the eventual hardware implementation.
Also, a system is needed that provides for a user interface application
that is self-contained within the product and does not need to rely on any
external servers for execution.
SUMMARY
The present invention is directed towards a platform independent system,
method and protocol for specifying an embedded user interface. The system
includes source code that comprises elements and parameters. These elements
and parameters describe the user interface, including presentation to the user
and response to events. The source code includes elements for text fields, input
fields, selection fields, buttons, timers, images, branches, and functions.
In addition, the system has a compiler for tokenizing the elements within
the source file; parsing the tokenized elements; and generating one or more
databases. Typically, a number of databases are generated for eventual transfer
to the target platform, including: a token database containing tokenized code;
a string database containing the user interface text in one or more languages; a
function database having a listing of function calls; an icon database containing
graphics to be displayed for the user on the platform; and an event group
database for conveniently grouping and labeling events.
A user interface engine resides within the target platform and serves to
interpret the interface instructions from the databases. The user interface
engine monitors for system level events, such as key presses, as well as internal
events like timers. The user interface engine will interpret the code in the token
database for reacting to the various events. A function module within the user
interface engine links the interpretation of the code to the target platform
functions to be performed. After the target platform performs the appropriate
function, control returns to the user interface engine. The user interface engine
also has a window module for displaying messages from the user interface
engine to the user of the platform.
Therefore, it can be seen that the present invention provides a platform
independent system, method, and protocol for specifying an embedded user
interface. These and other aspects, features, and advantages of the present
invention will be set forth in the description that follows and possible
embodiments thereof, and by reference to the appended drawings and claims.
BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 is a block diagram that illustrates the major elements of an
exemplary embodiment of the system of the present invention.
Fig. 2 is a block diagram that illustrates the major elements of the
decks/cards of an exemplary embodiment of the present invention.
Fig. 3 is a diagram illustrating the card/deck layout of an exemplary
embodiment of the present invention.
Fig. 4 is a system diagram that illustrates an exemplary environment
suitable for implementing various embodiments of the present invention.
Fig. 5 is an exemplary user interface display created by the user interface
engine of the present invention.
Fig. 6 is a block diagram of the operation of the card compiler of the
present invention.
Fig. 7 is a block diagram illustrating the function of the User Interface
Engine (UIE) of an exemplary embodiment of the present invention.
DESCRIPTION
Referring now to the drawings, in which like numerals represent like
elements throughout the several figures, aspects of the present invention and
exemplary operating environments and embodiments will be described.
The User Interface Engine (UIE) subsystem provides a platform
independent method of specifying an embedded application user interface using
template based cards and a target resident interpreter. The platform may be a
cellular telephone, pager, or any other form of electronic equipment where only
limited memory storage is available. In addition, the advantages of the present
invention may be applied to any electronic device that would benefit from the
development of a user interface independent of the hardware implementation,
including, but not limited to, televisions, VCR's, DVD players/recorders, all
forms of stereo equipment at home, mobile, or in a car; household appliances;
industrial equipment, measurement and testing devices, etc.
The cards are grouped into decks, and there may be multiple decks
within a given application. Each card contains HTML-like instructions, known
as elements, that drive the user interface. The cards allow a user interface
developer to specify the display layout, interaction with the user, and life cycle
of the user interface without having to have platform specific knowledge of the
platform's user interface. The UIE is optimized for embedded applications by
minimizing storage requirements. The UIE is loosely based on the WAP WML
specification as described in the Wireless Application Protocol - Wireless
Markup Language Specification Version 30-Apr-1998 published by the
Wireless Application Protocol Forum. While the WAP WML specification is
intended for use in delivering content over the air to wireless, narrowband
devices much the same way that content is delivered over the wired Internet in
HTML format, the present invention is directed to providing an embedded user
interface for those wireless devices.
The UIE 110 consists of three logical components as shown in Figure 1:
Tag Definitions 120; Card Compiler 130; and UIE Interpreter 140. The Tag
Definitions 120 are loosely based on the WAP WML tag specification with
changes made to deal with the specific requirements of an embedded user
interface such as unique text formatting and font handling, conditional
processing, multiple language support, unique event handling and life cycle
definition. The Card Compiler 130 analyzes the elements for correctness;
converts the textual command elements into symbolic, space-saving tokens,
also known as tokenizing; and builds any related data structures such as text
tables and device function references. The Card Compiler 130 also converts
card references from names to offsets within the tokenized decks. The
tokenized elements contain encoded attribute information rather than
independent tokens for each attribute. The UIE Interpreter 140 is the target
resident component that handles tag processing, screen rendering, event
processing and input control. The UIE Interpreter 140 utilizes the compiled
cards provided by the Card Compiler 130 as well as local services resident in
the platform application.
As further shown in Figure 2, the Tag Definitions 120 comprise a series
of elements that will be used to build the cards and decks that define the user
interface. The Tag Definitions 120 can be grouped into four major categories:
General Declaration Elements 210; Active Card Elements 220: Display
Elements 230; and Miscellaneous Definitions and Attributes 240.
To better understand the function of the various elements, it is useful to
understand the card and deck paradigm used to configure the UIE. UIE data is
structured as a collection of cards. A single collection of cards is referred to as
a deck. Each card contains structured content and navigation specifications.
Any given application can have multiple decks. Any deck-level navigation
specifications and/or event elements are declared within the deck elements but
outside of the card elements. Unlike HTML, the UIE only supports a single
address card type represented by a CARD element. Cards can contain a
combination of elements. The UIE supports navigation to cards within a deck
or to cards in another deck. Navigating to a card within the current deck only
requires specification of the card name. Navigating to a card outside of the
current deck requires specification of the deck and card names. Figure 3
graphically shows the card and deck paradigm of the UIE. An application 305
created to drive a user interface comprises one or more decks 310-a and 310-b
with each deck having one or more cards 320-a - 320-d and 330-a - 330-d. As
shown in Figure 3, application 305 has two decks: Deckl 310-a and Deck2
310-b. Deckl 310-a has a series of cards represented by Cardl 320-a, Card2
320-b, Card3 320-c through Cardn 320-d. Deck2 310-b has a series of cards
represented by Cardl 330-a, Card2 330-b, Card3 330-c through Cardn 330-d.
Elements may be placed at the application level, within a deck or within a card.
General Declaration Elements 210 are used to delimit cards and decks
and include the following elements: CARD and DECK. Table 1 below
summarizes the General Declaration Elements 210.
<DECK NAME="deck name">   Delimits the beginning of a deck definition
</DECK>                   Delimits the end of a deck definition
<CARD NAME="card name">   Delimits the beginning of a card definition
</CARD>                   Delimits the end of a card definition
Table 1
Table 1 shows that the structure to begin delimiting a deck is the element
<DECK NAME="deck name"> where "deck name" is the name of the deck.
Decks may include any of the elements shown in the table prior to the end of
deck statement </DECK>. Notice that CARD elements are declared within a
DECK. Table 1 further shows that the structure to begin delimiting a card is
the element <CARD NAME="card name"> where "card name" is the name of
the card. Cards may include any of the elements shown in the table prior to the
end of card statement </CARD>.
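Following the conventions of Table 1, a minimal deck containing two cards might be specified as follows; the deck name, card names, and display text are illustrative only and are not taken from the specification:

```html
<DECK NAME="main">
    <CARD NAME="welcome">
        "Welcome"
    </CARD>
    <CARD NAME="settings">
        "Settings"
    </CARD>
</DECK>
```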
Active Card Elements 220 are used to perform control features within the
application, deck or card level. The Active Card Elements include the
following: BRANCH, DO, FUNCTION, HOME, INCLUDE, INPUT,
ONEVENT, OPTION, SELECT, TIMER, VAR. Table 2 below summarizes
the Active Card Elements 220.
Table 2

BRANCH elements allow for conditional transfer of control to another
card based on a test against a specified variable. Branch elements may include
the following parameters: test type (equal, not equal, greater than, less than,
unconditional); optional target variable; and optional variable or constant to be
tested against target variable. They also specify the task type which indicates
the action to be taken if the test is true. Four tasks are possible. First, go to the
indicated card which implicitly executes a push onto the history stack. Second,
go to the previous card which implicitly executes a pop from the history stack.
Third, go to the home card specified by the HOME element. Fourth, go back to
a card in the history stack.
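As a sketch, a card might contain branches such as the following; the attribute spellings (TEST, VAR, VALUE, TASK, TARGET) are hypothetical, since the specification names the parameters but not their exact syntax:

```html
<!-- If the variable "status" equals "locked", go to the card "unlock";
     this implicitly pushes the current card onto the history stack. -->
<BRANCH TEST="EQUAL" VAR="status" VALUE="locked" TASK="GO" TARGET="unlock">
<!-- Otherwise return unconditionally to the previous card, which
     implicitly pops the history stack. -->
<BRANCH TEST="UNCONDITIONAL" TASK="PREV">
```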
The DO element is used to specify unconditional action to be taken. The
DO element can consist of the following information: the task type which
indicates the action to be taken; an identifier of the object to be acted upon; and
the state or value that the object should be set to. One task type is "Set Soft
Key", which would then be followed by the soft key number and the text to be
associated with that soft key.
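A DO element using the "Set Soft Key" task type might then look as follows; the attribute spellings shown are hypothetical:

```html
<!-- Unconditionally assign the label "Menu" to soft key number 1. -->
<DO TASK="SETSOFTKEY" KEY="1" VALUE="Menu">
```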
The UIE will support device functions which are functions that exist
within the target platform application but outside of the UIE. They are
supported using the FUNCTION element with parameters specified as
attributes and the single return value returned in a variable. There will be one
common interface to all device functions which carries the function ID and
arguments as parameters. The UIE will convert a call to a device function to a
function pointer during tokenization. The function ID will be a member of a
function list which is maintained by the target platform application. Device
function calls can return strings, numbers, or strings containing dynamic
element content, with the strings utilized anywhere a variable can be used,
including substitution in display strings.
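As a sketch, a device function call and subsequent variable substitution might appear as follows; the function name and attribute spellings are hypothetical:

```html
<!-- Invoke the device function "GetBatteryLevel" through the common
     interface and place its single return value in the variable "batt". -->
<FUNCTION NAME="GetBatteryLevel" RESULT="batt">
<!-- The returned value may be substituted anywhere a variable is allowed. -->
"Battery: $batt"
```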
The HOME element allows the user to specify the card to be loaded
whenever the home card is specified in a BRANCH element. The HOME
element must be defined before a reference to the home card is made.
Execution of the HOME element or any branch to the specified HOME card
clears the history stack.
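A sketch of the HOME element and a later branch to the home card, with hypothetical attribute spellings:

```html
<!-- Declare the home card before any reference to it is made. -->
<HOME CARD="main_menu">
<!-- Any branch to the home card clears the history stack. -->
<BRANCH TEST="UNCONDITIONAL" TASK="HOME">
```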
The INCLUDE element allows the user to specify a card that is to be
included within the current card as if it is a part of the current card. The
INCLUDE element can directly specify a card name or a device function call
that returns a pointer to dynamic element content.
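A sketch of the two forms of INCLUDE described above; the attribute spellings and names are hypothetical:

```html
<!-- Include the card "status_bar" as if it were part of the current card. -->
<INCLUDE CARD="status_bar">
<!-- Alternatively, include dynamic element content returned by a
     device function. -->
<INCLUDE FUNCTION="GetStatusCard">
```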
The MODE element allows the user to configure the characters to
generate from the various hard keys on the device. For instance, the keys may
be 1, 2, 3, 4, 5, 6, 7, 8, 9, 0, #, *. Pressing the "2" key repeatedly may yield the
sequence A -> B -> C -> 2. The FORMAT attribute allows the user to define an
input format for the keys. The VALUE attribute allows the user to set the value
for the keys. The LANG attribute allows the user to specify the language.
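A MODE element for the multi-tap behavior described above might be sketched as follows; only the FORMAT, VALUE, and LANG attributes are named in the specification, so the KEY attribute shown here is hypothetical:

```html
<!-- Configure key "2" to cycle A -> B -> C -> 2 on repeated presses. -->
<MODE KEY="2" VALUE="ABC2" LANG="EN">
```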
The INPUT element is quite extensive and is used to receive input data
from the user of the target platform; interpret the input; and perform various
tasks in response to the input. The INPUT element supports the following
features:
Key Map Table Support
The UIE editor supports a key map table which defines the mapping of
physical keys of a target platform to their associated values. Often, the target
platform will have a limited number of physical keys because of the dimensions
of the platform. For example, in a cellular telephone, the physical key '2' may
be mapped to '2', 'A', 'B', and 'C'; the 'SND' key just maps to 'SND'. In another
example, a touchpad on a small keyboard may have the physical key 'AD'
mapped to 'A', 'B', 'C', and 'D'. This table is primarily used in text entry mode
to define the values a given key cycles through when the key is repeatedly
pressed within a fixed time period.
Data entry field can span one or more lines
The data entry field can span one or more lines of the display. In the
case of multiple lines, data is automatically split between the lines based on the
entry mode.
Left-to-Right and Right-to-Left Input Modes
Left to Right
In the left to right mode, the first character entered appears in the upper
left hand corner of the data entry field with the cursor to the right of the
character. As additional characters are entered, they are inserted at the current
cursor position and the cursor is moved to the right. Once the first line of a
multiple line input field is full, additional characters are entered starting from
the left side of the following line. Once the cursor is at the end of the data entry
field, it remains under the last character entered and no data entry is allowed
unless input field 'rolling' is enabled.
The target platform may have a button designated as a clear or delete
button. For simplicity throughout the rest of this discussion, this clear or delete
button will be known as the CLR button. If the CLR button is pressed, the
character to the left of the cursor is deleted and the cursor is moved one
position to the left. In addition, any characters to the right of the deleted
character could be moved or shifted to the left one character position. If the CLR
key is pressed and no data exists to the left of the current cursor position, no
action is taken by the editor. In addition, the CLR button may be defined so that
if a long CLR keypress is detected, the current data entry field is cleared.
Right to Left
In the right to left mode, the first character entered appears in the lower
right hand corner of the data entry field with the cursor under the current
character. As additional characters are entered, they are inserted at the current
cursor position and the originally entered characters are moved to the left.
Once the first line of a multiple line input field is full, the oldest characters are
moved to the right side of the previous line. Once the entry field is full, no data
entry is allowed unless input field 'rolling' is enabled. If the CLR button is
pressed, the character under the cursor is deleted and the remaining characters
are shifted to the right. If the CLR key is pressed and no data exists at the
current cursor position, no action is taken by the editor. If a long CLR keypress
is detected, the current data entry field is cleared.
Cursor Movement
The cursor can be moved left or right within the specified data entry field
as well as up and down for a multiple line data entry field.
In some instances, a data entry field is larger than the physical display
can show at once. In that case, the left and right cursor buttons can be defined
to move or scroll the data within the visible area left/right in order to keep it in
the visible region of the display. An attempt to move the cursor left/right outside
of the data entry field results in an error tone with no action taken. This is
similar to many "windowing" operating systems and applications in that a view
port is created into a larger, logical display area.
If the cursor is moved up or down such that it leaves the current entry
field, it is moved to the next valid tab field.
Data Entry Modes
The UIE editor supports multiple data entry modes. All entry modes can
be specified in a format string but only selected modes are available to the user
for selection via a softkey. The editor supports the following data entry modes:
Upper Case Alphanumeric, Lower Case Alphanumeric, Any Case
Alphanumeric, Numeric only, Dialed Number, Hex, Binary, and Symbols. The
user of the UIE editor may not select the following modes: Any Case
Alphanumeric, Dialed Number, Hex, and Binary.
The text entered at the current cursor position when a key is pressed is
determined by the entry mode. If the current mode supports multiple characters
per key (such as alphanumeric entry), the first keypress inserts the first
character associated with the key. If the key is hit again within a timeout
period, the current character is replaced with the next character associated with
the key without moving the cursor. After the timeout period expires, the cursor
is moved to the next character position.
Once an entry mode other than alphanumeric is specified, the user cannot
change entry modes (the mode softkey is not displayed). Otherwise, the entry
mode may be changed by using a softkey dedicated to data entry mode selection.
Support Entry Field Format Specification
The editor interprets format strings associated with a data entry field
which specify how the data should be entered and/or displayed. Some of the
formatting supported is as follows: Data entry mode required (upper/lower
case alpha, numeric, hex, binary, etc.); Data conversion (none, to binary); Font
specification (specific size or auto-scaling); Interspersed display string
specification (such as parentheses and dashes for telephone numbers, or decimal
points for IP addresses); Input data echoing mode (echo, no-echo, substitute
fixed character for display); Entry field confirmation tone control (key
confirmation tone enabled/disabled); Minimum entry length; and Maximum
entry length.
The INPUT element will also support display of any data associated with
the key variable and placement of the cursor at the end of the data whenever the
INPUT element is interpreted. The INPUT element also dynamically records
any changes to the input data field into the designated key variable. The
INPUT element will generate an internal event whenever the input data field
transitions from input data present, i.e., the user has entered data, to input data
absent, i.e., the user has cleared all data in the input data field. In addition,
whenever the input data field changes and the input criteria have been met
(minimum length, input type, etc.), the INPUT element will execute the body of
the element.
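Drawing the INPUT features above together, a simple numeric entry field might be sketched as follows; the attribute spellings and format syntax are hypothetical:

```html
<!-- Numeric-only entry of 3 to 10 digits, echoed to the display and
     recorded dynamically in the key variable "number". -->
<INPUT VAR="number" FORMAT="NUMERIC" MIN="3" MAX="10">
```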
ONEVENT elements are used to specify action to be taken when a target
platform event occurs. The ONEVENT element includes two parameters:
event type and optional event ID. The event or event group identifies the
external events for which action is being specified. The event type supports, at
least, the following event classes: keypad keys; limit events (such as bottom of
screen reached during scrolling, top/bottom of select list, input buffer cleared);
device information and alarm conditions (low battery, recoverable internal
error, etc.); air protocol interface information (incoming call, new system ID,
etc.); and expired timer notifications. Other event classes would also be present
depending on the nature of the target platform, i.e., a device with an internal
hard disk may prompt an alarm condition upon a disk error. The optional event
ID further specifies the expected device event. For instance, in a timer event,
the optional event ID would be the variable name associated with the timer.
The body of the ONEVENT element contains elements that are executed if the
specified event occurs.
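An ONEVENT element handling an expired timer might be sketched as follows, with hypothetical attribute spellings:

```html
<!-- When the timer tied to the variable "t1" expires, go to the card
     "timeout"; the body elements execute only if the event occurs. -->
<ONEVENT TYPE="TIMER" ID="t1">
    <BRANCH TEST="UNCONDITIONAL" TASK="GO" TARGET="timeout">
</ONEVENT>
```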
The SELECT element lets users pick from a list of options. The
SELECT element identifies both the current (default) selection and the
selection currently under the cursor. Also, SELECT allows the specification of
a default option which is uniquely identified on the display and also highlighted
as the current selection when the card is displayed. In addition, the display of
a bitmap or index number along with the options is supported. The UIE
Interpreter will highlight the selection under the cursor as the user moves the
cursor between selections, and generate an internal event whenever the user
attempts to move the cursor before the first option or after the last option.
When the user makes a selection, the unique identifier for the default option is
updated.
If the user attempts to move the cursor past the first or last option of a
select element, the cursor is moved to the next available tab stop defined in the
card. If no additional tab stops are defined, the cursor remains on the current
option. The user can alternately select an option by pressing the key
corresponding to the displayed index number (if any). In addition, options with
no title during runtime (title specification is NULL) are ignored (not displayed
or selectable) by the UIE interpreter.
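A SELECT element offering two options might be sketched as follows; the attribute spellings and option syntax are hypothetical:

```html
<!-- Present a two-item list with index numbers displayed; the default
     selection is identified by the variable "ring_style". -->
<SELECT VAR="ring_style" DEFAULT="1">
    <OPTION VALUE="1">"Standard"</OPTION>
    <OPTION VALUE="2">"Silent"</OPTION>
</SELECT>
```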
TIMER elements are used to start or stop application timers. They are
not necessarily associated with a particular card and must be explicitly started
and stopped. The timer is actually associated with a variable which contains
the remaining time on the timer. If the variable goes out of scope, the timer is
stopped and deleted. Once the timer expires, the variable is set to 0 and a timer
event is generated. The TIMER element consists of the following information:
the variable associated with the specified timer; and the timeout period.
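A TIMER element starting a 30-second timer might be sketched as follows, with hypothetical attribute spellings:

```html
<!-- Tie a 30-second timer to the variable "t1"; on expiry "t1" is set
     to 0 and a timer event is generated. -->
<TIMER VAR="t1" VALUE="30">
```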
The UIE implementation allows variables to be used in the place of
strings, which are substituted at run-time with their current value. Variables
must be assigned an initial value using the VAR element before being used and
take on the type of the value being assigned. Variables are scoped by the level
at which they are defined. Any variables declared outside of a deck are global.
Any variables declared within a deck only exist while the deck is active. Any
variables declared within a card only exist while the card is active. The
contents of a variable are preserved as long as the variable remains in scope.
Variables include the following types: text string, number, and pointer to
dynamic element content. Variable substitution is specified by preceding the
variable name with a dollar sign. Because of this, a literal dollar sign must be
encoded with a pair of dollar signs. The VAR element specifies the variable
name and the value to be assigned to the variable. Variables can be substituted
anyplace where free-form text is allowed, such as DISPLAY elements, INPUT
element prompts, SELECT element option titles, and OPTION element titles.
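A VAR declaration and subsequent substitution might be sketched as follows, with hypothetical attribute spellings:

```html
<!-- Assign an initial string value; the variable takes on that type. -->
<VAR NAME="greeting" VALUE="Hello">
<!-- "$greeting" is replaced at run-time with the current value;
     "$$" would display a literal dollar sign. -->
"$greeting, world"
```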
Display Elements 230 are used to format text on the device display. Any
text not associated with a tag delimited field is assumed to be information to be
displayed on the device display. The text must be enclosed in double quotes.
Table 3 below summarizes the elements in the Display Elements 230 group.
Table 3
The DATTR element sets the display attributes. The display attributes
are only active until the next line break element. When the line break is
detected, the displayable characters revert to the default display style with
possible attributes associated with the new line break. In addition, several
special elements can be inserted into the display strings: line breaks, tabs, and
bitmaps (static and animated).
The last group of elements is the Miscellaneous Definitions and
Attributes 240. Table 4 below summarizes these elements. Use of the elements
is as described in the previous three tables.
Table 4
As mentioned above, the tag elements are used to build applications
comprising cards and decks that define a user interface. The UIE Interpreter
renders these as a series of screens on the display device of the target platform,
with each screen being defined by a card.
The hardware aspects of the target platform are more fully explained in
the following paragraphs. Fig. 4 is a system diagram that illustrates an
exemplary environment suitable for implementing various embodiments of the
present invention. Fig. 4 and the following discussion provide a general
overview of a platform onto which the invention may be integrated or
implemented. Although in the context of the exemplary environment the
invention will be described as consisting of instructions within a software
program being executed by a processing unit, those skilled in the art will
understand that portions of the invention, or the entire invention itself, may also
be implemented by using hardware components, state machines, or a
combination of any of these techniques. In addition, a software program
implementing an embodiment of the invention may run as a stand-alone
program or as a software module, routine, or function call, operating in
conjunction with an operating system, another program, system call, interrupt
routine, library routine, or the like. The term program module will be used to
refer to software programs, routines, functions, macros, data, data structures, or
any set of machine readable instructions or object code, or software instructions
that can be compiled into such, and executed by a processing unit.
Those skilled in the art will appreciate that the system illustrated in Fig. 4
may take on many forms and may be directed towards performing a variety of
functions within a range of consumer devices, any of which may serve as an
exemplary environment for embodiments of the present invention.
The exemplary system illustrated in Fig. 4 includes a target platform 410
that is made up of various components including, but not limited to, a
processing unit 412, non-volatile memory 414, volatile memory 416, and a
system bus 418 that couples the non-volatile memory 414 and volatile memory
416 to the processing unit 412. The non-volatile memory 414 may include a
variety of memory types including, but not limited to, read only memory
(ROM), electronically erasable read only memory (EEROM), electronically
erasable and programmable read only memory (EEPROM), electronically
programmable read only memory (EPROM), electronically alterable read only
memory (EAROM), and battery backed random access memory (RAM). The
non-volatile memory 414 provides storage for power on and reset routines
(bootstrap routines) that are invoked upon applying power or resetting the
target platform 410. In some configurations the non-volatile memory 414
provides the basic input/output system (BIOS) routines that are utilized to
perform the transfer of information between the various components of the
target platform 410.
The volatile memory 416 may include a variety of memory types and
devices including, but not limited to, random access memory (RAM), dynamic
random access memory (DRAM), FLASH memory, EEROM, bubble memory,
registers, or the like. The volatile memory 416 provides temporary storage for
program modules or data that are being or may be executed by, or are being
accessed or modified by the processing unit 412. In general, the distinction
between non-volatile memory 414 and volatile memory 416 is that when power
is removed from the target platform 410 and then reapplied, the contents of the
non-volatile memory 414 is not lost, whereas the contents of the volatile
memory 416 is lost, corrupted, or erased.
The target platform 410 may access one or more internal or external
display devices 430 such as a CRT monitor, LCD panel, LED panel, electro-
luminescent panel, or other display device, for the purpose of providing
information or computing results to a user. The processing unit 412 interfaces
to each display device 430 through a video interface 420 coupled to the
processing unit over system bus 418.
The target platform 410 may have access to one or more external storage
devices 432 such as a hard disk drive, a magnetic disk drive for the purpose of
reading from or writing to a removable disk, and an optical disk drive for the
purpose of reading a CD-ROM disk or to read from or write to other optical
media, as well as devices for reading from and or writing to other media types
including but not limited to, FLASH memory cards, Bernoulli drives, magnetic
cassettes, magnetic tapes, or the like. The processing unit 412 interfaces to
each storage device 432 through a storage interface 422 coupled to the
processing unit 412 over system bus 418. The storage devices 432 provide
non-volatile storage for the target platform 410.
The target platform 410 may receive input or commands from one or
more input devices 434 such as a keyboard, pointing device, mouse, modem,
RF or infrared receiver, microphone, joystick, track ball, light pen, game pad,
scanner, camera, or the like. The processing unit 412 interfaces to each input
device 434 through an input interface 424 coupled to the processing unit 412
over system bus 418. The input interface may include one or more of a variety
of interfaces, including but not limited to, an RS-232 serial port interface or
other serial port interface, a parallel port interface, a universal serial bus (USB),
an optical interface such as infrared or IrDA, an RF or wireless interface such
as Bluetooth, or other interface.
The target platform 410 may send output information, in addition to the
display 430, to one or more output devices 436 such as a speaker, modem,
printer, plotter, facsimile machine, RF or infrared transmitter, or any other of a
variety of devices that can be controlled by the target platform 410. The
processing unit 412 interfaces to each output device 436 through an output
interface 426 coupled to the processing unit 412 over system bus 418. The
output interface may include one or more of a variety of interfaces, including
but not limited to, an RS-232 serial port interface or other serial port interface,
a parallel port interface, a universal serial bus (USB), an optical interface such
as infrared or IrDA, an RF or wireless interface such as Bluetooth, or other
interface.
The target platform 410 may operate in a networked environment using
logical connections to one or more remote systems, such as a remote computer
438. The remote computer 438 may be a server, a router, a peer device or other
common network node, and typically includes many or all of the components
described relative to the target platform 410. When used in a networking
environment, the target platform 410 is connected to the remote system 438
over a network interface 428. The connection between the remote computer
438 and the network interface 428 depicted in Fig. 4 may include a local area
network (LAN), a wide area network (WAN), a telephone connection, or the
like. These types of networking environments are commonplace in offices,
enterprise-wide computer networks, intranets and the Internet.
It will be appreciated that program modules implementing various
embodiments of the present invention may be stored in the storage device 432,
the non-volatile memory 414, the volatile memory 416, or, in a networked
environment, in a remote memory storage device of the remote system 438.
The program modules may include an operating system, application programs,
other program modules, and program data. The processing unit 412 may access
various portions of the program modules in response to the various instructions
contained therein, as well as under the direction of events occurring or being
received over the input interface 424 and the network interface 428.
To better understand the UIE, the following is an exemplary embodiment
of the present invention where the target platform is a cellular telephone. A
typical cellular telephone display is shown in Figure 5. This would be the
screen displayed to a user who wished to "lock" a target platform such as a
cellular telephone to prevent unauthorized use of the telephone. Screen 510
displays in the center of the screen the message "Enter Lock Code" followed by
a four digit input field. Softkey 1 is associated with the displayed command
OK 520, and Softkey 2 is associated with the displayed command CANCEL
530. A user would enter his lock code using an associated keypad 560 and then
press the OK 520 softkey.
The source code below is used to generate this display screen and
associated actions. Line numbers are used for reference purposes only and
would not appear in the actual ASCII source code file.
1  <VAR NAME=Ierror VALUE=0/>              //global variable to hold input errors
2  <VAR NAME=Tmp VALUE=0/>                 //global temporary variable
3  <DECK NAME="PowerUp">
4  <VAR NAME=Code VALUE=""/>               //Deck variable to hold lock code
5  <CARD NAME="LockCode">
6  <VAR NAME=Code VALUE=""/>               //Make sure Code=0
7  <DO TYPE=SKEY ID=1 STATE="OK"/>         //Put "OK" on softkey 1
8  <DO TYPE=SKEY ID=2 STATE="CANCEL"/>     //Put "CANCEL" on softkey 2
9  <DATTR ALIGN=CENTER BR=YES/>"Enter"     //Put first line of text on line 1
10 <DATTR ALIGN=CENTER BR=YES/>"Lock Code" //Put 2nd line of text on line 2
11 <INPUT KEY=Code MINLENGTH=4             //Get user input into Code
       MAXLENGTH=4 ECHO=CHAR ECHAR=" "
       ALIGN=CENTER PHOLDER=YES PCHAR="_"
       KTONE=OFF MODE=NUMERIC CURSOR=OFF
       ERROR=Ierror/>
12 <ONEVENT TYPE=SKEY1>                    //Softkey 1 is pressed
13 <BRANCH TYPE=NE KEY=Ierror VALUE=0      //If input syntax is not valid,
       DEST="BadCode"/>                    //load card indicating error
14 <FUNCTION KEY=Tmp NAME="CheckLock Code" //See if code is valid
       ARG=Code/>
15 <BRANCH TYPE=EQ KEY=Tmp VALUE=0         //If the lock code is valid,
       DEST="LockOK"/>                     //go to confirmation screen
16 <BRANCH DEST="BadCode"/>                //If invalid, load error screen
17 </ONEVENT>
18 </CARD>                                 //end of card definition
19 <CARD NAME="BadCode">
20 </CARD>
21 <CARD NAME="LockOK">
22 </CARD>
23 <CARD NAME="Idle">
24 </CARD>
25 </DECK>
Structurally, the application comprises a single deck, "PowerUp," and
four cards, "LockCode," "BadCode," "LockOK," and "Idle." Lines 1 and 2
declare the variables Ierror and Tmp and initialize them to zero. In
addition, these are global variables available to all decks and cards because
they are declared outside of a deck or card element. Line 3 declares the start of
the definition of the deck which continues until the end of deck command on
line 25. In line 4, the variable Code is declared and given the initial value of a
null string. Code will be available throughout this deck because it is declared
within the deck but outside of any card definitions.
The definition of the first card, "LockCode," begins in line 5 and
continues to line 18. The variable Code is cleared in line 6. Lines 7 and 8 are
used to establish the labels for softkey 1 and softkey 2 respectively as "OK"
and "CANCEL." If the user activates softkey 1 or softkey 2 an event is
triggered. Lines 9 and 10 place the text "Enter Lock Code" in the center of the
user display. A four character input field is designated in Line 11. Lines 12
through 17 define an ONEVENT element that is activated if the user presses
softkey 1, labeled "OK." The BRANCH element in Line 13 states that if the
Ierror variable is not equal to 0, indicating an error, then go to the "BadCode"
card. Because the "BadCode" card is within the current deck, the use of a deck
name is not required. The FUNCTION element in Line 14 calls the built-in
function of the target platform, in this case a cellular telephone, and requests
the function named "CheckLock Code" to compare the variable Code to the
lock code of the platform. CheckLock Code will return the validation result in
the Tmp variable. The BRANCH element of line 15 compares the result of the
CheckLock Code function, stored in variable Tmp, against the value 0 to see if
it is equal. If Tmp equals zero, then control will transfer to the LockOK card.
If Tmp is not equal to zero, then the BRANCH instruction of line 16 will cause
control to jump to the "BadCode" card.
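The softkey-1 event logic walked through above can be restated as ordinary procedural code. The sketch below is illustrative only: check_lock_code is a hypothetical stand-in for the platform's built-in "CheckLock Code" function, and the hard-coded lock code "1234" is an assumption.

```python
def check_lock_code(code, platform_lock_code="1234"):
    # Hypothetical stand-in for the platform's built-in "CheckLock Code"
    # function: returns 0 when the entered code is valid, non-zero otherwise.
    return 0 if code == platform_lock_code else 1

def on_softkey1(code, ierror):
    # Line 13: if the input syntax was invalid, branch to "BadCode".
    if ierror != 0:
        return "BadCode"
    # Line 14: call the platform function; the result goes into Tmp.
    tmp = check_lock_code(code)
    # Line 15: a result of 0 means the code was valid, so go to "LockOK".
    if tmp == 0:
        return "LockOK"
    # Line 16: otherwise branch unconditionally to the error card.
    return "BadCode"
```

For example, on_softkey1("1234", 0) transfers control to the "LockOK" card, while any input error or wrong code lands on "BadCode".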
While the above application is a simple application using the UIE, it
demonstrates the invention's versatility in constructing a wide array of user
interfaces and functions for portable platforms. Further cards and decks of
cards could be defined to fully program a complete user interface.
Figure 6 illustrates the operation of the card compiler. After source code
containing deck and card information is complete, the card compiler reads in
the deck/card information from the source code and performs a series of
compilational passes to generate compiled data into a format that can be used
by the User Interface Engine. The source code is contained in one or more
ASCII based text files 605 whose names contain the .PML extension in the
illustrated embodiment. A file labeled MakeFile 610 contains a listing of the
card/deck files 605 that will be used by the card compiler for a particular
application. The Event File, events.ini 615, defines the user events required for
the system to interact with the target product. These events are things such as:
low battery, incoming call, etc. The Event File 615 also may contain the
definitions for an aggregation of events, called events groups, which allows a
single label to represent a series of defined events in a logical OR fashion. A
typical Event File 615 may be constructed as follows:
[UserEvents]
ev_1=LOWBATT
ev_2=INCOMING CALL
ev_3=KEY_0
ev_4=KEY_1

[EventGroups]
gr_1=gANYKEY

[gANYKEY]
ev_1=KEY_0
ev_2=KEY_1
In the above example of an Event File 615, any reference in the decks/cards to
ev_1 would be a reference to the LOWBATT event; similarly, any reference to
gr_1 would be a reference to either the KEY_0 or KEY_1 event.
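Because the Event File follows a conventional INI layout, its structure can be read with stock tooling. The sketch below is an assumption about how such a file could be consumed, not the card compiler's actual implementation; the section and key names follow the example above.

```python
import configparser

def load_events(text):
    # Parse an events.ini of the form shown above into two lookups:
    # events maps ev_N labels to event names, and groups maps each
    # event-group name to the set of events it ORs together.
    cp = configparser.ConfigParser()
    cp.read_string(text)
    events = dict(cp["UserEvents"])            # e.g. ev_1 -> LOWBATT
    groups = {}
    for group_name in cp["EventGroups"].values():
        # Each group's own section lists its member events.
        groups[group_name] = set(cp[group_name].values())
    return events, groups

EXAMPLE = """\
[UserEvents]
ev_1=LOWBATT
ev_2=INCOMING CALL
ev_3=KEY_0
ev_4=KEY_1

[EventGroups]
gr_1=gANYKEY

[gANYKEY]
ev_1=KEY_0
ev_2=KEY_1
"""
```

With the example file, load_events yields an event table where ev_1 resolves to LOWBATT and a gANYKEY group containing KEY_0 and KEY_1, matching the logical-OR semantics described above.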
Once the above series of files, 605 and 610, is presented to the card
compiler 130, the card compiler 130 begins the compilation process. Initially,
a language syntax check is performed in the Lexical 620 portion of the
compiler. If there are no syntax errors, compilation proceeds to the Token
Identifier 625 where the commands are tokenized and then parsed in the Parser
630 along with the Event File 615. The tokenization of the cards and decks
allows the final databases to be very compact, thereby facilitating the use of the
user interface engine in platforms with limited memory. The Parser 630
generates five database files that will eventually be downloaded to the target
platform for use with the User Interface Engine. EventGroupDB 635 contains
an array of the user defined events. IconDB 640 is an image table containing
indexes to the images and image data used by the cards and decks. StringDB
645 is the string table containing string data for use by the cards/decks.
TokenDB 650 contains the tokenized decks/cards. FunctionDB 655 contains
an enumeration equivalent of the functions found in the cards.
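The compactness that tokenization buys can be illustrated with a minimal sketch: each element name collapses to a one-byte token, and string literals are interned once in a shared string table that the token stream indexes. The token values and table layout below are assumptions for illustration, not the Parser 630's actual encoding.

```python
# Hypothetical one-byte tokens for the PML element names.
TOKENS = {"DECK": 0x01, "CARD": 0x02, "VAR": 0x03, "DO": 0x04,
          "DATTR": 0x05, "INPUT": 0x06, "ONEVENT": 0x07,
          "BRANCH": 0x08, "FUNCTION": 0x09}

def tokenize(elements, string_db):
    """Emit a compact byte stream for (element, literal) pairs.

    Each element name becomes its byte token; each string literal is
    interned in string_db (the StringDB analogue) and referenced by index.
    """
    out = bytearray()
    for name, literal in elements:
        out.append(TOKENS[name])
        if literal is not None:
            if literal not in string_db:
                string_db.append(literal)
            out.append(string_db.index(literal))  # index into StringDB
    return bytes(out)
```

Two elements with string attributes thus shrink to four bytes plus shared string storage, which is why the final databases stay small enough for memory-limited platforms.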
Figure 7 is a block diagram illustrating the function of the User Interface
Engine (UIE) 140 of an exemplary embodiment of the present invention. The
UIE uses the card compiler's DB files 635-655 in conjunction with the
system's local functions to generate an executable. The UIE calls the target
platform's functions and performs the requested operations. When the target
platform has finished the requested operations, control returns to the UIE,
which executes the next compiled command.
Event Server 710 monitors itself for internal events that may occur from
within objects and monitors the platform's operating system 705 for events that
occur in the target platform's environment. Events are placed in a first in / first
out (FIFO) event queue for processing in the order in which they are received.
Events may include such things as keystrokes, timer expirations, or various
internal events. The Event Server 710 will fetch events from the event queue
and send the event to the target element that has registered itself as the event
target. For example, an element on the card may be looking for a keystroke.
The Event Server 710 will first check the currently active card to see if the
event can be processed by elements on that card; if not, the Event Server will
see if the deck can process that event. If the event is not processed by the deck,
it is passed to the application or global event handler. If the event can be
processed by a card, deck or application, control will pass to the Token Server
715 for execution of the appropriate element. In addition, events may trigger
the need to render a display for the user in which case control will pass to the
Window 720 block.
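The queueing and dispatch order just described can be sketched as follows. The handler-table representation and class shape are assumptions for illustration; only the FIFO queue and the card-then-deck-then-application search order come from the description above.

```python
from collections import deque

class EventServer:
    # Illustrative sketch of Event Server 710's dispatch behavior.
    def __init__(self, card_handlers, deck_handlers, app_handlers):
        self.queue = deque()  # FIFO: events processed in arrival order
        # The active card is tried first, then the deck, then the
        # application (global) event handler.
        self.levels = [card_handlers, deck_handlers, app_handlers]

    def post(self, event):
        self.queue.append(event)

    def dispatch_one(self):
        event = self.queue.popleft()
        for handlers in self.levels:
            if event in handlers:
                # First level registered for this event processes it.
                return handlers[event](event)
        return None  # no card, deck, or application handler: dropped
```

For instance, a KEY_0 event registered on both the card and the application is delivered to the card, while a LOWBATT event only the deck registered falls through to the deck handler.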
Token Server 715 is in charge of extracting the next token and delivering
it to the target token element, so that it may execute the token. The Token
Server 715 fetches tokens from the token database 650, which resides in some form
of RAM or ROM within the target platform. The tokens are delivered to the
current token target where the tokens are interpreted and actions are performed.
The actions usually involve the creation of associated user interface objects such
as an input data field, a selection list, or a text field. The user interface object
will be passed to the Window block 720 for display to, or input from, the user.
The History block 725 keeps track of the Deck/Cards that have been
pushed or popped. The History block 725 is a first in / last out (FILO) module.
Deck/Card pointers, the state of the token server, and the state of the event
server are all pushed onto the History stack as the Token Server 715 executes.
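The History block's first-in / last-out discipline can be sketched as a simple stack: each entry bundles the Deck/Card pointer with the token server and event server state, and popping restores the most recently pushed entry first. The field layout here is an assumption.

```python
class History:
    # Illustrative sketch of History block 725: a FILO stack of
    # (card pointer, token server state, event server state) entries.
    def __init__(self):
        self._stack = []

    def push(self, card, token_state, event_state):
        # Pushed as the Token Server 715 executes into a new card.
        self._stack.append((card, token_state, event_state))

    def pop(self):
        # The most recently pushed entry is restored first (FILO).
        return self._stack.pop()
```

Pushing "LockCode" and then "LockOK" means a subsequent pop restores "LockOK" first, exactly the backtracking order a user expects from a history mechanism.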
Function Interface 730 is the gateway that allows the UIE 140 to make
function calls to the target platform's functions. The FunctionDB file 655 is
used to match the target platform's functions with the UIE. The Function
Interface 730 knows which function is being called and the parameter details.
Once the target platform's functions 735 are finished being executed, control
will return to the UIE 140.
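One plausible realization of this gateway is a dispatch table: FunctionDB supplies the enumeration for each function name found in the cards, and the target platform supplies a callable for each enumerated index. All names and the table shape below are illustrative assumptions.

```python
# Name -> enumeration, as the FunctionDB file 655 would provide.
FUNCTION_DB = {"CheckLock Code": 0}

def make_platform_table(check_lock_code):
    # The target platform registers its native function for each index.
    return {0: check_lock_code}

def call_function(name, arg, platform_table):
    index = FUNCTION_DB[name]            # resolve name via FunctionDB
    result = platform_table[index](arg)  # platform executes the request
    return result                        # control returns to the UIE
```

A cellular-telephone platform might register a lock-code checker here, so that the FUNCTION element in the earlier listing resolves to that native routine by enumeration rather than by string lookup at run time.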
As mentioned previously, Window block 720 is a group of objects that
are responsible for rendering the Deck/Cards objects into visible objects to be
displayed on the target platform's display. For example, the Card may instruct
the UIE 140 to create a Selection object to be displayed. A user can then select
an option from a list included in the selection object. During the creation
of the Select/Option objects, the Window 720 knows how to interpret this
information on the display, taking into consideration the physical limitations of
the target platform's display.
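Fitting a Selection object to a small display can be sketched as simple paging: only as many options as the screen has text rows are visible at once. The paging scheme below is an assumption for illustration, not the Window block's actual rendering logic.

```python
def render_selection(options, display_rows):
    # Split a selection list into pages that fit a display_rows-line
    # screen, so a physically small display can still present all options.
    return [options[i:i + display_rows]
            for i in range(0, len(options), display_rows)]
```

On a two-line display, a three-option list would render as two pages: the first two options, then the third.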
From the foregoing description, it will be appreciated that the present
invention provides a platform independent method of specifying an embedded
user interface. The present invention has been described in relation to particular
embodiments which are intended in all respects to be illustrative rather than
restrictive. Those skilled in the art will understand that the principles of the
present invention may be applied to, and embodied in, various combinations of
hardware and software with various types of interfaces and transmission
technology. Alternative embodiments will become apparent to those skilled in
the art to which the present invention pertains without departing from its spirit
and scope. Accordingly, the scope of the present invention is described by the
appended claims and supported by the foregoing description.