US20070085908A1 - A method and apparatus for enhancing the broadcast of a live event - Google Patents
- Publication number
- US20070085908A1 (application US 11/560,237)
- Authority
- US
- United States
- Prior art keywords
- target
- video
- camera
- image
- field
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/30—Transforming light or analogous information into electric information
- H04N5/32—Transforming X-rays
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B6/00—Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
- G02B6/44—Mechanical structures for providing tensile strength and external protection for fibres, e.g. optical transmission cables
- G02B6/4439—Auxiliary devices
- G02B6/444—Systems or boxes with surplus lengths
- G02B6/4452—Distribution frames
- G02B6/44524—Distribution frames with frame parts or auxiliary devices mounted on the frame and collectively not covering a whole width of the frame or rack
- G02B6/44528—Patch-cords; Connector arrangements in the system or in the box
Definitions
- the present invention is directed to a method and apparatus for enhancing a television broadcast of a live event.
- the television presentation of live events could be improved by enhancing the video in real time to make the presentation more interesting to the viewer.
- television viewers cannot see the entire playing field during a sporting event; therefore, the viewer may lose perspective as to where one of the players or objects is on the field in relation to the rest of the field, players or objects.
- During the telecast of football games, cameras tend to zoom in on the players, allowing the viewer to see only a small portion of the field. As a result, a viewer may not know where a particular player is in relation to the pertinent locations on the field.
- For instance, when a player is carrying the football, the television viewer may not know how far that player must run for a first down.
- One enhancement that would be helpful to television viewers of football games is to highlight the field at the point where a player must advance in order to obtain a first down.
- An enhancement that would be helpful to viewers of golf tournaments is to highlight those portions of a golf course that have been notorious trouble spots for golfers. While the professional golfer is aware of these trouble spots and hits the ball to avoid them, the television viewer may not be aware of them and may wonder why a particular golfer is hitting the ball in a certain direction. If the golf course were highlighted to show these trouble spots, a television viewer would understand the strategy that the golfer is using and get more enjoyment out of viewing the golf tournament. Another useful enhancement would be showing the contours of the green. Similar enhancements to the playing field would be useful in other sports as well.
- live events do not take advantage of the scope of the television audience with respect to advertising.
- advertisements on display at a stadium can be televised; however, many of those advertisements are not applicable to the television audience.
- a particular sporting event may be played in San Francisco and televised around the world.
- a local store may pay for a billboard at the stadium.
- viewers in other parts of the United States or in other countries receiving the broadcast may not have access to that store and, thus, the broadcast of the advertisement is not effective.
- some of the space at a stadium is not used because such use would interfere with the view of the players or the spectators at the stadium. However, using that space for advertisement would be very effective for the television audience.
- the glass around the perimeter of a hockey rink would provide an effective place for advertisements to the television audience.
- advertisements would block the view of spectators at the stadium.
- some advertisements would be more effective if their exposure is limited to particular times when customers are thinking of that type of product. For example, an advertisement for an umbrella would be more effective while it was raining.
- Another solution included digitizing a frame of video and using a computer with pattern recognition software to locate the target image to be replaced in the frame of video. When the target image is found, a replacement image is inserted in its place.
- the problem with this solution is that the software is too slow and cannot be effectively used in conjunction with a live event. Such systems are even slower when they account for occlusions.
- An occlusion is something that blocks the target. For example, if the target is a billboard on the boards around a hockey rink, one example of an occlusion is a player standing in front of the billboard. When that billboard is replaced, the new billboard image must be inserted into the video such that the player appears to be in front of the replacement billboard.
- the present invention is directed to a system for enhancing the broadcast of a live event.
- a target at a live event is selected to be enhanced.
- targets include advertisements at a stadium, portions of the playing field (e.g., football field, baseball field, soccer field, basketball court, etc.), locations at or near the stadium, or a monochrome background (e.g. for chroma-key) positioned at or near the stadium.
- the system of the present invention, roughly described, captures video using a camera, senses field of view data for that camera, determines a position and orientation of a video image of the target in the captured video and modifies the captured video by enhancing at least a portion of the video image of the target.
- Alternative embodiments of the present invention include determining the perspective of the video image of the target and/or preparing an occlusion for the video image of the target.
- One embodiment of the present invention includes one or more field of view sensors coupled to a camera such that the sensors can detect data from which the field of view of the camera can be determined.
- the field of view sensors could include pan, tilt and/or zoom sensors.
- the system also includes a processor, a memory and a video modification unit.
- the memory stores a location of the target and, optionally, data representing at least a portion of the video image of the target.
- the processor, which is in communication with the memory and the field of view sensors, is programmed to determine whether the target is within the field of view of the camera and, if so, the position of the target within a frame of video of the camera.
- Alternate embodiments allow for the processor to determine the position of the target in the frame of video using field of view data, pattern (or image) recognition technology, electromagnetic signals and/or other appropriate means.
- One exemplary embodiment uses field of view data to find a rough location of the target and then uses pattern recognition to find the exact location.
- combining field of view data with pattern recognition technology provides for faster resolution of the target's location than using pattern recognition alone.
- the video modification unit, which is in communication with the processor, modifies the frame of video to enhance at least a portion of the video image of the target. That is, a target can be edited, highlighted, overlaid or replaced with a replacement image.
- a video modification unit can be used to highlight a portion of a football field (or other playing field) or replace a first billboard in a stadium with a second billboard. Because the system can be configured to use pattern recognition technology and field of view sensors, the system can be used with multiple broadcast cameras simultaneously. Therefore, a producer of a live event is free to switch between the various broadcast cameras at the stadium and the television viewer will see the enhancement regardless of which camera is selected by the producer.
- An alternate embodiment contemplates replacing either the field of view sensors and/or the pattern recognition technology with electromagnetic transmitters and sensors. That is, the target can be made to emit an electromagnetic signal.
- a sensor can be placed at the camera, or the camera can be used as a sensor, to detect the signal from the target in order to locate the target. Once the target is located within the video frame, the system can enhance the video image of the target.
- a further alternative includes treating the target with spectral coatings so that the target will reflect (or emit) a distinct signal which can be detected by a camera with a filter or other sensor.
- FIG. 1 depicts a perspective view of part of a football stadium.
- FIG. 2 depicts a perspective view of the football stadium of FIG. 1 as seen by a television viewer after the video has been enhanced.
- FIG. 3 depicts a block diagram of a subset of the components that make up the present invention.
- FIG. 4 depicts a block diagram of a subset of the components that make up the present invention.
- FIG. 5 is a flow chart describing the operation of the present invention.
- FIG. 6 is a flow chart which provides more detail of how the present invention accounts for occlusions.
- FIG. 7 is a partial block diagram of an alternate embodiment of the present invention.
- FIG. 8 is a partial flow chart describing the operation of the alternate embodiment depicted in FIG. 7 .
- FIG. 1 is a partial view of football stadium 100 .
- Stadium 100 includes a football field 102 . Surrounding football field 102 are the seats 104 for the fans. Between seats 104 and playing field 102 is a retaining wall 106 .
- On retaining wall 106 is an advertisement AD 1 .
- the first target is an advertisement AD 1 to be replaced by another advertisement.
- the second target is a portion of the playing field which is to receive an advertisement.
- a third target is an area above the stadium.
- a fourth target is a location on the playing field 102 representing where a team must cross in order to get a first down.
- Although the television broadcaster may be enhancing the video image as discussed above, the spectators and players at the stadium would not see any of these enhancements; rather, they would view the stadium as depicted in FIG. 1 .
- FIG. 2 shows the view of FIG. 1 , as seen by viewers watching the broadcast on television, after enhancements are made to the video.
- Advertisement AD 2 is in the same location as advertisement AD 1 was in FIG. 1 . Thus, advertisement AD 2 has replaced advertisement AD 1 .
- Advertisement AD 3 is shown in end zone 108 . Advertisement AD 3 does not replace another advertisement because there was no advertisement in end zone 108 prior to the enhancement.
- FIG. 2 also shows advertisement AD 4 , which to the television viewer appears to be suspended above stadium 100 .
- A thick line 110 represents the highlighting of the portion of the field that the team on offense must cross in order to get a first down at a particular moment during the game.
- the highlighting of the field consists of a bold thick line.
- Alternatives include different color lines, shading, using a blinking line, varying the brightness, etc.
- the enhancement need not be a line.
- the enhancement may also be any other shape or graphic that is appropriate.
- an enhancement includes editing an image, replacing part of an image with another image, overlaying all or part of an image, highlighting an image using any appropriate method of highlighting, or replacing an image with video.
- FIG. 3 is a block diagram of a subset of the components that make up the present invention.
- the components shown on FIG. 3 are typically located at a camera bay in the stadium; however, they can be located in other suitable locations.
- Broadcast camera 140 captures a frame of video which is sent to a production center as shown by the signal BC 1 .
- Broadcast camera 140 has a zoom lens, including a 2X Expander (range extender).
- a 2X Expander/zoom/focus sensor 152 (collectively a “zoom sensor”) senses the zoom of the camera, the focal distance of the camera lens, and whether the 2X Expander is being used.
- the analog output of sensor 152 is sent to an analog to digital converter 154 , which converts the analog signal to a digital signal, and transmits the digital signal to processor 156 .
- One alternative includes using a zoom sensor with a digital output, which would remove the need for analog to digital converter 154 .
- Broadcast camera 140 is mounted on tripod 144 which includes pan and tilt heads that enable broadcast camera 140 to pan and tilt. Attached to tripod 144 are pan sensor 146 and tilt sensor 148 , both of which are connected to pan-tilt electronics 150 .
- broadcast camera 140 can include a built-in pan and tilt unit. In either configuration, pan sensor 146 , tilt sensor 148 and zoom sensor 152 are considered to be coupled to broadcast camera 140 because they can sense data representing the pan, tilt and zoom of broadcast camera 140 .
- Processor 156 is an Intel Pentium processor with supporting electronics; however, various other processors can be substituted. Processor 156 also includes memory and a disk drive to store data and software. In addition to being in communication with pan-tilt electronics 150 and analog to digital converter 154 , processor 156 is in communication (via signal CB 1 ) with a production center which is described below.
- pan sensor 146 and tilt sensor 148 are optical encoders that output a signal, measured as a number of clicks, indicating the rotation of a shaft. Forty thousand (40,000) clicks represent a full 360° rotation. Thus, a processor can divide the number of measured clicks by 40,000 and multiply by 360 to determine the pan or tilt angle in degrees.
- the pan and tilt sensors use standard technology known in the art and can be replaced by other suitable pan and tilt sensors known by those skilled in the relevant art.
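The click-to-angle arithmetic described above is simple enough to sketch. The helper name below is ours, not the patent's; it only illustrates the stated 40,000-clicks-per-revolution conversion.

```python
# Sketch of the optical-encoder conversion: 40,000 clicks = one full
# 360-degree rotation of the pan or tilt shaft.
CLICKS_PER_REV = 40_000

def clicks_to_degrees(clicks: int) -> float:
    """Convert an encoder click count to a pan or tilt angle in degrees."""
    return (clicks % CLICKS_PER_REV) / CLICKS_PER_REV * 360.0
```

For example, a reading of 10,000 clicks corresponds to a quarter turn, i.e. 90 degrees.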
- Pan/tilt electronics 150 receives the output of pan sensor 146 and tilt sensor 148 , converts the output to a digital signal (representing pan and tilt) and transmits the digital signal to processor 156 .
- the pan, tilt and zoom sensors are used to determine the field of view of the broadcast camera. Thus, one or more of the pan, tilt or zoom sensors can be labeled as a field of view sensor(s). For example, if a camera cannot zoom or tilt, the field of view sensor would only include a pan sensor.
- An alternative field of view sensor includes placing marks in various known locations in the stadium such that each mark looks different and at least one mark will always be visible to the camera while the camera is pointed at the relevant portions of the stadium.
- a computer using pattern recognition technology can find the mark in a frame of video and, based on the mark's size and position in the frame of video, determine more precisely the field of view and/or pan, tilt or zoom of the camera.
- a system can also be set up to use pan/tilt/zoom sensors in combination with the marks described above so that the pan/tilt/zoom can be used to make a rough estimate of where the camera is pointing and the mark is used to achieve a more accurate estimate. In such a combination system the marks need not look different if the placement of the marks is predetermined.
- Another alternative includes placing infrared emitters or beacons along the perimeter of the playing field or other portions of the stadium.
- a computer can determine an infrared sensor's field of view based on the location of the signal in the infrared sensor's frame of data. If the infrared sensor is mounted on a broadcast camera, determining the pan and tilt of the infrared sensor determines the pan and tilt of the broadcast camera plus a known offset.
- FIG. 3 shows a second and optional camera labeled as dedicated camera 142 .
- Dedicated camera 142 is mounted on a tripod 157 .
- tripod 157 includes an optional pan sensor 158 and an optional tilt sensor 160 , both of which are in communication with pan-tilt electronics 150 .
- the dedicated camera is set to one pan and tilt position; therefore, pan and tilt sensors are not needed.
- the output of dedicated camera 142 is the camera signal DC 1 , which is communicated to the production center described below.
- the present invention will perform its function without the use of dedicated camera 142 ; however, dedicated camera 142 improves the ability of the system to account for occlusions.
- Dedicated camera 142 should be located substantially adjacent to broadcast camera 140 .
- each broadcast camera could be associated with more than one dedicated camera.
- each broadcast camera would include a plurality of dedicated cameras, one dedicated camera for each potential target the broadcast camera will view.
- FIG. 4 is a block diagram of the production center.
- the production center is housed in a truck parked outside of the stadium.
- the production center can be at a central office or the components of the production center can be spread out in multiple locations.
- the heart of the production center is processor 200 .
- the preferred processor 200 is an Onyx computer from Silicon Graphics; however, various other suitable processors or combinations of processors can perform the necessary functions of the present invention.
- Processor 200 is in communication with video control 202 , video mixer 204 and multiplexer 206 .
- processor 200 includes more than one processor.
- processor 200 could include two Onyx computers, one for locating the target and one for determining occlusions.
- Broadcasters use many broadcast cameras at the stadium to televise a sporting event.
- the video signals from the various cameras are sent to video control 202 which is used to select one broadcast camera for transmission to viewers.
- video control 202 includes a plurality of monitors (one monitor for each video signal) and a selection circuit.
- a director or manager, producer, etc.
- the choice would be communicated to the selection circuit which selects one camera signal to broadcast.
- the choice is also communicated to processor 200 , video mixer 204 and multiplexer 206 via signal 208 .
- the selected video signal is sent to delay 210 and processor 200 via analog to digital converter 212 . If the broadcast camera is a digital camera, then there would be no need for analog to digital converter 212 .
- the output of delay 210 is sent to video modification unit 214 .
- the purpose of delay 210 is to delay the broadcast video signal a fixed number of frames to allow time for processor 200 to receive data, determine the position of the target in the frame of video and prepare any enhancements.
- because the video is delayed only a small number of frames, the television signal is still considered live.
- the delay introduced by the system is a small delay (under one second) which does not accumulate. That is, different frames of video are enhanced with the same small delay. For example, a ten frame delay is equivalent to one-third of a second, which is not considered a significant delay for television.
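The delay arithmetic can be sketched directly (assuming an NTSC-style rate of 30 frames per second, which is our assumption rather than a figure stated in the patent):

```python
# Hypothetical sketch of the fixed-frame delay arithmetic.
FPS = 30  # assumed broadcast frame rate (frames per second)

def delay_seconds(delay_frames: int, fps: int = FPS) -> float:
    """Seconds of latency introduced by a fixed delay of N frames."""
    return delay_frames / fps
```

At 30 frames per second, a ten-frame delay works out to one third of a second, matching the example in the text.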
- Video mixer 204 receives the video signals from all of the dedicated cameras.
- FIG. 4 shows signals DC 1 and DC 2 .
- Signal DC 1 is from the dedicated camera associated with broadcast camera BC 1 . If video control 202 selects BC 1 , then that selection is communicated to video mixer 204 , which selects DC 1 .
- some alternatives include having many dedicated cameras for one broadcast camera. For example, one broadcast camera may have four dedicated cameras. In that case, the dedicated cameras would be labeled DC 1 a, DC 1 b, DC 1 c and DC 1 d.
- video mixer 204 would select up to all four dedicated cameras: DC 1 a, DC 1 b, DC 1 c and DC 1 d.
- the selected signal(s) from video mixer 204 is sent to analog to digital converter 216 which digitizes the video signal(s) and sends the digital signal(s) to processor 200 .
- Multiplexer 206 receives signals from the processors at each of the camera locations.
- FIG. 4 shows multiplexer 206 receiving signal CB 1 from processor 156 of FIG. 3 .
- Each of the processor signals (CB 1 , CB 2 , . . . ) is associated with a broadcast camera.
- the selection by video control 202 is communicated to multiplexer 206 so that multiplexer 206 can send the corresponding signal to processor 200 .
- the signal sent by multiplexer 206 to processor 200 includes the information from the field of view sensors.
- processor 156 calculates the field of view and sends the resulting information, via multiplexer 206 , to processor 200 .
- processor 200 receives the data via multiplexer 206 and determines the field of view. Either alternative is suitable for the present invention.
- Processor 200 is connected to memory 220 which stores the locations of the targets and images of the targets (or at least partial images). Memory 220 also stores images of the replacement graphics, instructions for creating replacement graphics and/or instructions for highlighting, editing, etc. Memory 220 is loaded with its data and maintained by processor 222 . The inventors contemplate that during operation of this system, processor 200 will be too busy to use compute time for loading and maintaining memory 220 . Thus, a separate processor 222 is used to load and maintain the memory during operation. If cost is a factor, processor 222 can be eliminated and processor 200 will be used to load and maintain memory 220 ; however, for optimal performance memory 220 should be loaded, if possible, prior to the broadcast.
- the images and locations of targets can be loaded into memory 220 either manually or automatically. For example, if the target's image and location are known in advance (e.g. an advertisement at the stadium) then prior to real-time operation of the system an operator can input the location of the target and scan in (or otherwise download) an image of the target. Alternatively, the operator can point one or more cameras at the target and use a mouse, light pen or other pointing device to select the target's image for storing in memory 220 . The location of the target can be determined by physical measurement, using pan/tilt/zoom sensors, etc.
- the operator can select the target during operation using a pointing device and the system will download the image of the target and its location (using pan/tilt/zoom data) to memory 220 .
- the system can be programmed to know that the target is one of a set of possible targets.
- the system can be programmed to know that the target is a yard line and the operator need only input which yard line is the current target.
- the replacement graphics are loaded into memory after being digitized, downloaded or the replacement graphics can be created with processor 222 . Instructions for highlighting or creating replacement graphics can be programmed using processor 222 or processor 200 .
- Processor 200 is connected to video modification unit 214 .
- the output of video modification unit 214 labeled as signal 226 , is the video signal intended for broadcast. This signal can be directly broadcast or sent to other hardware for further modification or recording.
- Video modification unit 214 modifies the video signal from delay 210 with the data/signal from processor 200 .
- the type of modification can vary depending on the desired graphic result.
- One exemplary implementation uses a linear keyer as video modification unit 214 .
- the signal from processor 200 to the keyer includes two signals: YUV and an external key (alpha).
- the YUV signal is called foreground and the signal from delay 210 is called background.
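The keyer's combination of foreground, background and key can be modeled as a per-pixel linear blend. This is a generic sketch of linear keying, not the patent's specific hardware; function and variable names are ours.

```python
import numpy as np

def linear_key(foreground, background, alpha):
    """Per-pixel linear key: out = alpha*foreground + (1-alpha)*background.

    foreground/background: H x W (x C) image arrays; alpha: H x W key in [0, 1].
    Where alpha is 1 the foreground graphic fully replaces the background
    (delayed broadcast) video; where alpha is 0 the background shows through.
    """
    fg = np.asarray(foreground, float)
    bg = np.asarray(background, float)
    a = np.asarray(alpha, float)
    if a.ndim == fg.ndim - 1:          # broadcast a 2-D key over color channels
        a = a[..., np.newaxis]
    return a * fg + (1.0 - a) * bg
```

Intermediate alpha values produce the soft edges a linear keyer uses to blend a replacement graphic into the scene.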
- video modification unit 214 can be another processor or video modification unit 214 can be a part of processor 200 .
- processor 200 determines the field of view of the selected broadcast camera and checks memory 220 to see if any targets are within that field of view. If so, processor 200 then determines the exact position of the target in a frame of video by determining which pixels represent the target. Processor 200 then checks memory 220 for the replacement graphic or instructions to make a replacement graphic (or highlight). If the replacement strategy is to highlight a certain portion of a field, then memory 220 may include instructions for changing the color of a certain portion of the field, shading of a certain portion of the field, etc. Based on the pan, tilt and zoom, and the actual image of the target, processor 200 determines the size and orientation of the replacement graphic (also called mapping).
- the enhancement includes processor 200 creating a frame of video with a graphic at the position of the enhancement.
- the frame created by processor 200 is sent to video modification unit 214 which combines the frame from processor 200 with the frame from delay 210 .
- processor 200 is also used to account for occlusions.
- An alternate embodiment includes eliminating the separate video modification unit and using processor 200 to edit the video signal from the selected broadcast camera.
- FIG. 5 is a flow chart which explains the operation of the present invention.
- video data is captured by a broadcast camera and is digitized. If the broadcast camera is a digital camera, digitizing is unnecessary.
- pan, tilt and zoom data (field of view data) is sensed in step 302 and the field of view is determined in step 304 .
- processor 200 determines if any of the targets are within the field of view.
- Memory 220 includes a database. In one alternative, the database stores the three dimensional locations of all the targets.
- the field of view of a broadcast camera can be thought of as a pyramid whose location and dimensions are determined based on the field of view data.
- Step 306 is a quick method for determining if there is a target within the field of view of the camera. If not, the process is done and the system waits until the next frame of data. If there is a target within the field of view of the selected broadcast camera, then the exact position of the target must be determined within the frame of video of the selected broadcast camera.
- determining the position of the target is a two-step process.
- a rough estimate is made based on the pan, tilt and zoom values and in the second step the estimate of the target's position is refined (step 310 ).
- the target's position in the video frame can be estimated.
- the accuracy of step 308 is determined by the accuracy of the pan/tilt/zoom sensors, the software used to determine the field of view and the stability of the platform on which the camera is located.
- the field of view sensor equipment may be so accurate that the position of the target is adequately determined and step 310 is not necessary.
- the pan, tilt and zoom data only provides a rough estimate in step 308 (e.g., a range of positions or a general area), and step 310 is needed to determine a more accurate position.
- Step 310 provides a more accurate determination of the target's position using pattern recognition techniques which are known in the art.
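A minimal sketch of the coarse-then-refine idea, using a brute-force template match as a stand-in for the pattern-recognition step; all names are hypothetical and real systems would use far faster matchers.

```python
import numpy as np

def refine_position(frame, template, rough_xy, search_radius):
    """Refine a rough (x, y) target estimate from the field of view sensors.

    Slides the stored template of the target over a small window around the
    rough estimate and returns the top-left corner with the lowest
    sum-of-squared-differences score.
    """
    th, tw = template.shape
    rx, ry = rough_xy
    best_score, best_xy = None, tuple(rough_xy)
    for y in range(max(0, ry - search_radius),
                   min(frame.shape[0] - th, ry + search_radius) + 1):
        for x in range(max(0, rx - search_radius),
                       min(frame.shape[1] - tw, rx + search_radius) + 1):
            score = np.sum((frame[y:y + th, x:x + tw] - template) ** 2)
            if best_score is None or score < best_score:
                best_score, best_xy = score, (x, y)
    return best_xy
```

The coarse estimate from step 308 keeps the search window small, which is exactly why the combined approach is faster than searching the whole frame with pattern recognition alone.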
- Examples of known pattern recognition and image processing technology can be found in the following documents: U.S. Pat. No. 3,973,239, Pattern Preliminary Processing System; U.S. Pat. No. 4,612,666, Automatic Pattern Recognition Apparatus; U.S. Pat. No. 4,674,125, Real-Time Hierarchal Pyramid Signal Processing Apparatus; U.S. Pat. No. 4,817,171, Pattern Recognition System; U.S. Pat. No. 4,924,507, Real-Time Optical Multiple Object Recognition and Tracking System and Method; U.S. Pat. No. 4,950,050, Optical Target Recognition System; U.S. Pat. No.
- step 310 can use suitable technology other than pattern recognition technology.
- processor 200 fetches the replacement graphic from memory 220 . If memory 220 is storing instructions for replacement graphics, then processor 200 fetches the instructions and creates the graphic. For example, creating the graphic can include drawing a highlight for the yard line of a football field.
- processor 200 determines the size and orientation of the replacement image, and maps the replacement image to the video frame. Memory 220 stores an image of only one size. Because of the pan, tilt and zoom of the broadcast camera, the image stored in memory 220 may need to be mapped to the video frame (e.g. magnified, reduced, twisted, angled, etc.). Processor 200 can determine the orientation based on the field of view data and/or the pattern recognition analysis in step 310 . For example, by knowing where the broadcast camera is located and the pan, tilt and zoom of the broadcast camera, a computer can be programmed to determine how to map the replacement image or highlight onto the video frame.
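- As a hedged sketch (not the disclosed implementation), mapping a stored one-size image onto the foreshortened target in the frame can be done with a standard four-point perspective solve. The corner coordinates below are invented for illustration.

```python
import numpy as np

def homography(src_pts, dst_pts):
    """Solve for the 3x3 perspective map taking four src corner points to
    four dst corner points (h33 fixed to 1)."""
    A, b = [], []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y])
        b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y])
        b.append(v)
    h = np.linalg.solve(np.array(A, dtype=float), np.array(b, dtype=float))
    return np.append(h, 1.0).reshape(3, 3)

def map_point(H, pt):
    """Apply the perspective map to a single (x, y) point."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return x / w, y / w

# Map the stored image's corners onto the (foreshortened) billboard corners
# observed in the broadcast frame; all coordinates are illustrative.
src = [(0, 0), (200, 0), (200, 100), (0, 100)]          # stored image corners
dst = [(310, 120), (420, 130), (415, 185), (308, 170)]  # corners in the frame
H = homography(src, dst)
print(map_point(H, (0, 0)))  # lands on the first billboard corner
```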
- step 316 the system accounts for occlusions. If there is an object or person in front of the target, then the enhanced video should show the object or person in front of the replacement graphic, highlight, etc. In one embodiment, the system cuts out a silhouette in the shape of the object or person from the replacement image. Step 316 is discussed in more detail with respect to FIG. 6 .
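- A minimal sketch of the silhouette cut-out, assuming the occlusion has already been reported as a boolean mask; grayscale arrays, shapes and values are illustrative, not from the disclosure.

```python
import numpy as np

def composite(frame, replacement, target_mask, occlusion_mask):
    """Insert the replacement graphic over the target region, but cut a
    silhouette out wherever an occluding object was detected (step 316).

    All images are assumed to be single-channel arrays of the same shape.
    """
    draw = target_mask & ~occlusion_mask  # target pixels that are not blocked
    out = frame.copy()
    out[draw] = replacement[draw]
    return out

frame = np.zeros((4, 4), dtype=np.uint8)                  # original video
replacement = np.full((4, 4), 200, dtype=np.uint8)        # replacement graphic
target = np.zeros((4, 4), dtype=bool); target[1:3, 1:3] = True
occluder = np.zeros((4, 4), dtype=bool); occluder[2, 2] = True  # e.g. a player
out = composite(frame, replacement, target, occluder)
print(out[1, 1], out[2, 2])  # 200 0 -> the occluded pixel keeps the original video
```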
- step 318 the system modifies the video of the original broadcast camera. As discussed above, this could include creating a second frame of video which includes a replacement image and using a keyer to combine the second frame of video with the original frame of video. Alternatively, a processor can be used to edit the frame of video of the broadcast camera. It is possible that within a given frame of video there may be more than one target. In that case steps 308 - 318 may be repeated for each target, or steps 308 - 316 may be repeated for each target and step 318 be performed only once for all targets. Subsequent to step 318 , the enhanced frame of video may be broadcast or stored, and the process (steps 300 - 318 ) may repeat for another frame of video.
- FIG. 6 is a more detailed flow diagram explaining how the system accounts for occlusion.
- the steps described in FIG. 6 are performed by a system which includes one or more dedicated cameras (e.g. dedicated camera 142 ).
- Step 350 is performed before the live event occurs.
- Prior to the game, a dedicated camera is pointed directly at one of the targets; the camera is zoomed in such that the target fills a substantial portion of the dedicated camera's frame of video; and the image of the target is stored in memory 220 .
- a substantial portion means that the target typically appears to cover over half of the frame of video of the dedicated camera.
- the dedicated camera should be zoomed in such that the target fills the greatest amount of the frame of video possible while remaining completely within the frame of video, unless it is desired to have clues of the scenery surrounding the target.
- After the dedicated camera is pointed at the target, its pan, tilt and zoom should remain fixed.
- steps 352 - 362 are repeated for each frame where the occlusion analysis is desired.
- a video image is captured and digitized by the dedicated camera.
- a video image is captured by the broadcast camera.
- the digitized image from the dedicated camera is compared to the stored image of the target.
- the stored image is stored in memory 220 .
- the processor knows which stored image to compare with from step 306 of FIG. 5 .
- the step of comparing could include altering one of the images such that both images are the same size and orientation, and then subtracting the data. Alternatively, other methods can be used to compare.
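- The compare-and-subtract step can be sketched as follows. The nearest-neighbour resize and the mean-absolute-difference score are assumptions standing in for whatever comparison method the system actually uses.

```python
import numpy as np

def resize_nearest(img, shape):
    """Crude nearest-neighbour resize so both images share one size."""
    rows = np.arange(shape[0]) * img.shape[0] // shape[0]
    cols = np.arange(shape[1]) * img.shape[1] // shape[1]
    return img[rows][:, cols]

def difference_score(live, stored):
    """Mean absolute pixel difference between the live target image and the
    stored target image (step 354): large -> occlusion, small -> ambient
    change."""
    live = resize_nearest(live, stored.shape)
    return float(np.mean(np.abs(live.astype(int) - stored.astype(int))))

stored = np.full((8, 8), 100, dtype=np.uint8)        # reference target image
live_clear = np.full((16, 16), 102, dtype=np.uint8)  # slightly dimmer lights
print(difference_score(live_clear, stored))  # 2.0 -> far below any occlusion threshold
```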
- step 356 If there is an occlusion blocking the target (step 356 ), then the two images will be significantly different and, in step 358 , an occlusion will be reported. In reporting the occlusion, the system reports the presence of an occlusion and the coordinates of the occlusion.
- step 354 it is possible that there is no occlusion; however, the two images are not exactly the same. The differences between the images must meet a certain minimum threshold to be considered an occlusion. If the differences are not great enough to be an occlusion, then in step 360 the system determines that the differences are due to ambient conditions in the stadium. For example, if the lights have been dimmed then the captured image of the target may appear darker.
- step 360 If small differences are detected in step 360 that do not meet the threshold for occlusions, then the system “learns” the changes to the target by updating the stored image of the target to reflect the new lighting or weather conditions (step 362 ). For example, the new stored image of the target may be darker than the original image. Subsequent to step 362 the system performs the report step 358 and reports that no occlusion was found.
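- The "learning" of step 362 can be sketched as a running blend of the live view into the stored image. The occlusion threshold and blending factor below are invented tuning values, not from the disclosure.

```python
import numpy as np

OCCLUSION_THRESHOLD = 20.0  # assumed tuning value, not from the patent

def update_stored_image(stored, live, score, alpha=0.2):
    """If the difference score is too small to be an occlusion, 'learn' the
    new ambient conditions by blending the live view into the stored image
    (step 362). Otherwise leave the reference image untouched."""
    if score >= OCCLUSION_THRESHOLD:
        return stored
    blended = (1.0 - alpha) * stored.astype(float) + alpha * live.astype(float)
    return blended.astype(np.uint8)

stored = np.full((4, 4), 100, dtype=np.uint8)
live = np.full((4, 4), 80, dtype=np.uint8)  # stadium lights were dimmed
updated = update_stored_image(stored, live, score=5.0)
print(int(updated[0, 0]))  # 96 -> the reference image has darkened slightly
```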
- An alternative to the method of FIG. 6 includes comparing the target image from the broadcast camera to the stored image.
- using the broadcast camera is not as advantageous as using a dedicated camera because it is likely that the broadcast camera would not be zoomed in on the target.
- the target image is likely to be smaller on the broadcast camera than on the dedicated camera. Because there is a smaller image to work with, the system loses the subpixel accuracy obtained from the dedicated camera. Also, using a separate dedicated camera may increase the speed at which the system accounts for occlusions.
- FIG. 7 shows an alternative embodiment of the present invention which utilizes electromagnetic transmitting beacons at or near a target.
- the beacons transmit an electromagnetic signal not visible to the human eye.
- Electromagnetic waves include light, radio, x-rays, gamma rays, microwave, infrared, ultraviolet and others, all involving the propagation of electric and magnetic fields through space. The difference between the various types of electromagnetic waves lies in their frequency or wavelength.
- the human eye is sensitive to electromagnetic radiation of wavelengths from approximately 400-700 nm, the range called light, visible light or the visible spectrum.
- the phrase “electromagnetic signal not visible to a human eye” means an electromagnetic wave outside of the visible spectrum.
- the beacon is an electromagnetic transmitter which includes infrared emitting diodes. Other sources which transmit electromagnetic waves may also be used, for example, radio transmitters, radar repeaters, etc.
- FIG. 7 shows a broadcast camera 400 which outputs a video signal 402 .
- Broadcast camera 400 includes a zoom lens coupled to a zoom detector 404 .
- the output of zoom detector 404 is transmitted to analog to digital converter 406 which sends the digital output to processor 408 .
- Mounted on top of broadcast camera 400 is sensor 410 .
- sensor 410 is an infrared sensor.
- Sensor 410 is mounted on top of broadcast camera 400 so that the optical axis of sensor 410 is as close as possible to the optical axis of broadcast camera 400 . It is also possible to locate sensor 410 near broadcast camera 400 and account for differences between optical axes using matrix transformations or other suitable mathematics.
- one suitable infrared sensor is a progressive scan, full frame shutter camera, for example, the TM-9701 by Pulnix.
- the Pulnix sensor is a high resolution 768(H) by 484(V) black and white full frame shutter camera with asynchronous reset capability.
- the camera has an eight bit digital signal output and progressively scans 525 lines of video data.
- a narrow band infrared filter is affixed in front of the lens of the Pulnix sensor. The purpose of the filter is to block electromagnetic signals that are outside the spectrum of the signal from the beacon.
- the sensor captures a frame of video (data) which comprises a set of pixels. Each pixel is assigned a coordinate corresponding to an x-axis and a y-axis.
- the sensor data includes an eight bit brightness value for each pixel; these values are scanned out pixel by pixel to interface 412 along with other timing information.
- Interface 412 outputs four signals: LDV (line data valid), FDV (frame data valid), CK (pixel clock) and DATA.
- X-Y counters 414 count X and Y coordinates sequentially in order to keep track of the location of the pixel whose data is being scanned in at the current time. When LDV is asserted the X counter counts up, and when FDV is asserted the Y counter is reset.
- the signal DATA includes the eight bit data value for each pixel.
- memory control 416 determines whether each pixel meets a brightness threshold. That is, noise and other sources will cause a large number of pixels to receive some data. However, the pixels receiving the signal from the beacon will have at least a minimum brightness level. This brightness threshold is set in a register (not shown) which can be set by processor 408 . If the data for a particular pixel is above the brightness threshold, memory control 416 sends a write enable (WE) signal to memory 418 , causing memory 418 to store the X and Y coordinates of the pixel, the data for that pixel and a code for that pixel. The code indicates that the data is valid data, a new frame, end of frame or a flash. Processor 408 can read the data from memory 418 and process the data locally or transmit the data to the production center (e.g., to multiplexer 206 ).
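- The brightness-threshold behaviour of memory control 416 can be sketched in software (the embodiment described above is hardware; the frame values and threshold below are illustrative assumptions).

```python
def memory_control(pixels, threshold):
    """Keep only pixels bright enough to be the beacon's signal, storing
    (x, y, value) tuples the way memory 418 stores coordinates plus data."""
    stored = []
    for y, row in enumerate(pixels):
        for x, value in enumerate(row):
            if value > threshold:
                stored.append((x, y, value))
    return stored

# An eight-bit frame with background noise and one bright beacon pixel.
frame = [
    [12, 9, 11, 10],
    [10, 250, 13, 8],   # the beacon shows up at (1, 1)
    [11, 10, 9, 12],
]
print(memory_control(frame, threshold=128))  # [(1, 1, 250)]
```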
- a second embodiment connects a signal from a strobe flash to a computer which causes the system to ignore data sensed during a flash.
- a third embodiment includes using flash detectors.
- the flash detector can be located anywhere in the arena suitable for sensing a strobe flash.
- FIG. 7 shows flash detector 422 which detects a flash and sends a signal to memory control 416 .
- Flash detector 422 includes a photo detector which can comprise, at least, a photo diode and an opamp. In front of the photo detector would be a filter that allows detection of signals in a spectrum that includes the signals emitted by the beacon. Connected to the opamp are components which can detect pulse edges.
- The embodiment of FIG. 7 operates similarly to the embodiment described in FIG. 3 . Some of the differences between the operation of the two embodiments are depicted in FIG. 8 . Similar to the embodiment in FIG. 3 , the embodiment in FIG. 7 first captures and digitizes video data. In step 450 , infrared data is received. In step 452 , the system determines whether a target is found in the infrared data by monitoring the data stored in memory 418 . Since memory control 416 only allows data above a threshold to be stored in memory 418 , if a given frame of data from a sensor has pixel data stored in memory then a target is found. If a sensor is detecting false targets, then various error correction methods known in the art can be utilized.
- step 454 the position of the target is determined in the frame of video by reading the X and Y coordinates stored with the pixel data in memory 418 .
- Step 456 fine tunes the determined position information of the target to account for the error from the camera's platform or pan/tilt/zoom sensors.
- One alternative for accounting for the difference in optical axis is to use a transformation matrix; however, other mathematical solutions known in the art are also suitable.
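- One way to realize the transformation-matrix correction is a fixed rotation between the sensor's optical axis and the broadcast camera's optical axis. This sketch and its offset angles are illustrative assumptions, not the disclosed mathematics.

```python
import math
import numpy as np

def axis_correction(pan_offset_deg, tilt_offset_deg):
    """Build a rotation matrix expressing the fixed angular offset between
    the sensor's optical axis and the broadcast camera's optical axis."""
    p = math.radians(pan_offset_deg)
    t = math.radians(tilt_offset_deg)
    Rz = np.array([[math.cos(p), -math.sin(p), 0.0],   # pan offset (about z)
                   [math.sin(p),  math.cos(p), 0.0],
                   [0.0, 0.0, 1.0]])
    Ry = np.array([[math.cos(t), 0.0, math.sin(t)],    # tilt offset (about y)
                   [0.0, 1.0, 0.0],
                   [-math.sin(t), 0.0, math.cos(t)]])
    return Rz @ Ry

# A direction measured by the sensor, re-expressed in camera coordinates.
R = axis_correction(pan_offset_deg=2.0, tilt_offset_deg=-1.0)
sensor_ray = np.array([1.0, 0.0, 0.0])
camera_ray = R @ sensor_ray
print(camera_ray.round(4))
```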
- the system can perform steps 312 through 318 as described with respect to FIG. 5 , however, any field of view data used is based on the size and position of the beacon's signal in the sensor's frame of video.
- a further alternative of FIG. 7 includes using polarization. That is, the infrared filter on sensor 410 is replaced or augmented with a polarized filter.
- a target to be replaced (e.g., a billboard) can be treated with a spectral coating.
- the filter and spectral coating are designed such that light reflecting off the billboard to sensor 410 will be completely blacked out.
- the pixels that represent the position of the target in the sensor's frame of video will have a brightness value of zero or close to zero.
- memory control 416 is used to only store pixel data that has a brightness value of zero or below a threshold level.
Abstract
Pan, tilt and zoom sensors are coupled to a broadcast camera in order to determine the field of view of the broadcast camera and to make a rough estimate of a target's location in the broadcast camera's field of view. Pattern recognition techniques can be used to determine the exact location of the target in the broadcast camera's field of view. If a preselected target is at least partially within the field of view of the broadcast camera, all or part of the target's image is enhanced. The enhancements include replacing the target image with a second image, overlaying the target image or highlighting the target image. Examples of a target include a billboard, a portion of a playing field or another location at a live event. The enhancements made to the target's image can be seen by the television viewer but are not visible to persons at the live event.
Description
- This application is a continuation of U.S. patent application Ser. No. 09/844,524, filed on Apr. 27, 2001, which is a continuation of U.S. patent application Ser. No. 09/627,106, filed on Jul. 27, 2000, which is a continuation of U.S. patent application Ser. No. 09/264,138, filed Mar. 5, 1999, now U.S. Pat. No. 6,141,060, which is a continuation of U.S. patent application Ser. No. 08/735,020, filed Oct. 22, 1996, now U.S. Pat. No. 5,917,553, incorporated herein by reference.
- 1. Field of the Invention
- The present invention is directed to a method and apparatus for enhancing a television broadcast of a live event.
- 2. Description of the Related Art
- The television presentation of live events could be improved by enhancing the video in real time to make the presentation more interesting to the viewer. For example, television viewers cannot see the entire playing field during a sporting event; therefore, the viewer may lose perspective as to where one of the players or objects is on the field in relation to the rest of the field, players or objects. During the telecast of football games, cameras tend to zoom in on the players, which allows the viewer to see only a small portion of the field. Because the viewer can only see a small portion of the field, a viewer may not know where a particular player is in relation to the pertinent locations on the field. For instance, when a player is carrying the football, the television viewer may not know how far that player has to run for a first down. One enhancement that would be helpful to television viewers of football games is to highlight the field at the point where a player must advance in order to obtain a first down.
- An enhancement that would be helpful to viewers of golf tournaments is to highlight those portions of a golf course that have been notorious trouble spots to golfers. While the professional golfer is aware of these trouble spots and hits the ball to avoid those spots, the television viewer may not be aware of those trouble spots and may wonder why a particular golfer is hitting the ball in a certain direction. If the golf course was highlighted to show these trouble spots, a television viewer would understand the strategy that the golfer is using and get more enjoyment out of viewing the golf tournament. Another useful enhancement would include showing the contours of the green. Similar enhancements to the playing field would be useful in other sports as well.
- Furthermore, live events do not take advantage of the scope of the television audience with respect to advertising. First, advertisements on display at a stadium can be televised; however, many of those advertisements are not applicable to the television audience. For example, a particular sporting event may be played in San Francisco and televised around the world. A local store may pay for a billboard at the stadium. However, viewers in other parts of the United States or in other countries receiving the broadcast may not have access to that store and, thus, the broadcast of the advertisement is not effective. Second, some of the space at a stadium is not used because such use would interfere with the view of the players or the spectators at the stadium. However, using that space for advertisement would be very effective for the television audience. For example, the glass around the perimeter of a hockey rink would provide an effective place for advertisements to the television audience. However, such advertisements would block the view of spectators at the stadium. Third, some advertisements would be more effective if their exposure is limited to particular times when customers are thinking of that type of product. For example, an advertisement for an umbrella would be more effective while it was raining.
- Previous attempts to enhance the video presentation of live events have not been satisfactory. Some broadcasters superimpose advertisements on the screen; however, these advertisements tend to block the view of the event.
- Another solution included digitizing a frame of video and using a computer with pattern recognition software to locate the target image to be replaced in the frame of video. When the target image is found, a replacement image is inserted in its place. The problem with this solution is that the software is too slow and cannot be effectively used in conjunction with a live event. Such systems are even slower when they account for occlusions. An occlusion is something that blocks the target. For example, if the target is a billboard on the boards around a hockey rink, one example of an occlusion is a player standing in front of the billboard. When that billboard is replaced, the new billboard image must be inserted into the video such that the player appears to be in front of the replacement billboard.
- The present invention is directed to a system for enhancing the broadcast of a live event. A target, at a live event, is selected to be enhanced. Examples of targets include advertisements at a stadium, portions of the playing field (e.g., football field, baseball field, soccer field, basketball court, etc.), locations at or near the stadium, or a monochrome background (e.g. for chroma-key) positioned at or near the stadium. The system of the present invention, roughly described, captures video using a camera, senses field of view data for that camera, determines a position and orientation of a video image of the target in the captured video and modifies the captured video by enhancing at least a portion of the video image of the target. Alternative embodiments of the present invention include determining the perspective of the video image of the target and/or accounting for an occlusion of the video image of the target.
- One embodiment of the present invention includes one or more field of view sensors coupled to a camera such that the sensors can detect data from which the field of view of the camera can be determined. The field of view sensors could include pan, tilt and/or zoom sensors. The system also includes a processor, a memory and a video modification unit. The memory stores a location of the target and, optionally, data representing at least a portion of the video image of the target. The processor, which is in communication with the memory and the field of view sensors, is programmed to determine whether the target is within the field of view of the camera and, if so, the position of the target within a frame of video of the camera. Alternate embodiments allow for the processor to determine the position of the target in the frame of video using field of view data, pattern (or image) recognition technology, electromagnetic signals and/or other appropriate means. One exemplary embodiment uses field of view data to find a rough location of the target and then uses pattern recognition to find the exact location. Such a combination of field of view data with pattern recognition technology provides for faster resolution of the target's location than using pattern recognition alone.
- The video modification unit, which is in communication with the processor, modifies the frame of video to enhance at least a portion of the video image of the target. That is, a target can be edited, highlighted, overlaid or replaced with a replacement image. For example, a video modification unit can be used to highlight a portion of a football field (or other playing field) or replace a first billboard in a stadium with a second billboard. Because the system can be configured to use pattern recognition technology and field of view sensors, the system can be used with multiple broadcast cameras simultaneously. Therefore, a producer of a live event is free to switch between the various broadcast cameras at the stadium and the television viewer will see the enhancement regardless of which camera is selected by the producer.
- An alternate embodiment contemplates replacing either the field of view sensors and/or the pattern recognition technology with electromagnetic transmitters and sensors. That is, the target can be used to emit an electromagnetic signal. A sensor can be placed at the camera, or the camera can be used as a sensor, to detect the signal from the target in order to locate the target. Once the target is located within the video frame, the system can enhance the video image of the target. A further alternative includes treating the target with spectral coatings so that the target will reflect (or emit) a distinct signal which can be detected by a camera with a filter or other sensor.
- These and other objects and advantages of the invention will appear more clearly from the following description in which the preferred embodiment of the invention has been set forth in conjunction with the drawings.
-
FIG. 1 depicts a perspective view of part of a football stadium. -
FIG. 2 depicts a perspective view of the football stadium of FIG. 1 as seen by a television viewer after the video has been enhanced. -
FIG. 3 depicts a block diagram of a subset of the components that make up the present invention. -
FIG. 4 depicts a block diagram of a subset of the components that make up the present invention. -
FIG. 5 is a flow chart describing the operation of the present invention. -
FIG. 6 is a flow chart which provides more detail of how the present invention accounts for occlusions. -
FIG. 7 is a partial block diagram of an alternate embodiment of the present invention. -
FIG. 8 is a partial flow chart describing the operation of the alternate embodiment depicted in FIG. 7 . -
FIG. 1 is a partial view of football stadium 100. In the center of stadium 100 is a football field 102. Surrounding football field 102 are the seats 104 for the fans. Between seats 104 and playing field 102 is a retaining wall 106. On retaining wall 106 is an advertisement AD1. For example purposes only, assume that a particular television broadcaster has selected four targets for enhancement. The first target is the advertisement AD1, to be replaced by another advertisement. The second target is a portion of the playing field which is to receive an advertisement. For this example, assume that the broadcaster wishes to place an advertisement in the end zone 108 of the football field. A third target is an area above the stadium. That is, the television broadcaster may wish that when a camera is pointed to the top of the stadium, the viewers see an advertisement suspended above the stadium. A fourth target is a location on the playing field 102 representing where a team must cross in order to get a first down. Although the television broadcaster may be enhancing the video image as discussed above, the spectators and players at the stadium would not see any of these enhancements; rather, they would view the stadium as depicted in FIG. 1 . -
FIG. 2 shows the view of FIG. 1 , as seen by viewers watching the broadcast on television, after enhancements are made to the video. Advertisement AD2 is in the same location as advertisement AD1 was in FIG. 1 . Thus, advertisement AD2 has replaced advertisement AD1. Advertisement AD3 is shown in end zone 108. Advertisement AD3 does not replace another advertisement because there was no advertisement in end zone 108 prior to the enhancement. FIG. 2 also shows advertisement AD4, which to the television viewer appears to be suspended above stadium 100. Also shown in FIG. 2 is a thick line 110 which represents the highlighting of the portion of the field that the team on offense must cross in order to get a first down at a particular moment during the game. In this particular example, the highlighting of the field consists of a bold thick line. Alternatives include different color lines, shading, using a blinking line, varying the brightness, etc. The enhancement need not be a line. The enhancement may also be any other shape or graphic that is appropriate. Thus, for purposes of this patent an enhancement includes editing an image, replacing part of an image with another image, overlaying all or part of an image, highlighting an image using any appropriate method of highlighting, or replacing an image with video. -
FIG. 3 is a block diagram of a subset of the components that make up the present invention. The components shown in FIG. 3 are typically located at a camera bay in the stadium; however, they can be located in other suitable locations. Broadcast camera 140 captures a frame of video which is sent to a production center as shown by the signal BC1. Broadcast camera 140 has a zoom lens, including a 2X Expander (range extender). Connected to broadcast camera 140 is a 2X Expander/zoom/focus sensor 152 (collectively a “zoom sensor”) which senses the zoom of the camera, the focal distance of the camera lens, and whether the 2X Expander is being used. The analog output of sensor 152 is sent to an analog to digital converter 154, which converts the analog signal to a digital signal, and transmits the digital signal to processor 156. One alternative includes using a zoom sensor with a digital output, which would remove the need for analog to digital converter 154. Broadcast camera 140 is mounted on tripod 144 which includes pan and tilt heads that enable broadcast camera 140 to pan and tilt. Attached to tripod 144 are pan sensor 146 and tilt sensor 148, both of which are connected to pan-tilt electronics 150. Alternatively, broadcast camera 140 can include a built-in pan and tilt unit. In either configuration, pan sensor 146, tilt sensor 148 and zoom sensor 152 are considered to be coupled to broadcast camera 140 because they can sense data representing the pan, tilt and zoom of broadcast camera 140. -
Processor 156 is an Intel Pentium processor with supporting electronics; however, various other processors can be substituted. Processor 156 also includes memory and a disk drive to store data and software. In addition to being in communication with pan-tilt electronics 150 and analog to digital converter 154, processor 156 is in communication (via signal CB1) with a production center which is described below. - In one embodiment,
pan sensor 146 and tilt sensor 148 are optical encoders that output a signal, measured as a number of clicks, indicating the rotation of a shaft. Forty thousand (40,000) clicks represent a full 360° rotation. Thus, a processor can divide the number of measured clicks by 40,000 and multiply by 360 to determine the pan or tilt angle in degrees. The pan and tilt sensors use standard technology known in the art and can be replaced by other suitable pan and tilt sensors known by those skilled in the relevant art. Pan/tilt electronics 150 receives the output of pan sensor 146 and tilt sensor 148, converts the output to a digital signal (representing pan and tilt) and transmits the digital signal to processor 156. The pan, tilt and zoom sensors are used to determine the field of view of the broadcast camera. Thus, one or more of the pan, tilt or zoom sensors can be labeled as a field of view sensor(s). For example, if a camera cannot zoom or tilt, the field of view sensor would only include a pan sensor. - An alternative field of view sensor includes placing marks in various known locations in the stadium such that each mark looks different and at least one mark will always be visible to the camera while the camera is pointed at the relevant portions of the stadium. A computer using pattern recognition technology can find the mark in a frame of video and, based on the mark's size and position in the frame of video, determine more precisely the field of view and/or pan, tilt or zoom of the camera. A system can also be set up to use pan/tilt/zoom sensors in combination with the marks described above so that the pan/tilt/zoom can be used to make a rough estimate of where the camera is pointing and the mark is used to achieve a more accurate estimate. In such a combination system the marks need not look different if the placement of the marks is predetermined.
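The click-to-angle conversion for the optical encoders (40,000 clicks per 360° revolution) is simple arithmetic, sketched here for illustration:

```python
CLICKS_PER_REVOLUTION = 40000  # one full 360-degree rotation of the shaft

def clicks_to_degrees(clicks):
    """Convert an optical encoder reading to a pan or tilt angle in degrees."""
    return clicks / CLICKS_PER_REVOLUTION * 360.0

print(clicks_to_degrees(10000))  # 90.0 -> a quarter turn
print(clicks_to_degrees(40000))  # 360.0 -> one full revolution
```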
Another alternative includes placing infrared emitters or beacons along the perimeter of the playing field or other portions of the stadium. A computer can determine an infrared sensor's field of view based on the location of the signal in the infrared sensor's frame of data. If the infrared sensor is mounted on a broadcast camera, determining the pan and tilt of the infrared sensor determines the pan and tilt of the broadcast camera plus a known offset. A more detailed discussion of using infrared technology, pan/tilt/zoom sensors, three dimensional location finding technology and video enhancement can be found in U.S. patent application Ser. No. 08/585,145, A System For Enhancing The Television Presentation Of An Object At A Sporting Event, incorporated herein by reference.
-
FIG. 3 shows a second and optional camera labeled as dedicated camera 142. Dedicated camera 142 is mounted on a tripod 157. In one embodiment, tripod 157 includes an optional pan sensor 158 and an optional tilt sensor 160, both of which are in communication with pan-tilt electronics 150. As will be explained below, in one embodiment the dedicated camera is set to one pan and tilt position; therefore, pan and tilt sensors are not needed. The output of dedicated camera 142 is the camera signal DC1, which is communicated to the production center described below. The present invention will perform its function without the use of dedicated camera 142; however, dedicated camera 142 improves the ability of the system to account for occlusions. Dedicated camera 142 should be located substantially adjacent to broadcast camera 140. That means that dedicated camera 142 should be as close as possible to broadcast camera 140 so that both will function properly yet their optical axes will be as close as practical. Thus, if both cameras are focused on the same object, their pan and tilt angles should be very similar. In various alternatives, each broadcast camera could be associated with more than one dedicated camera. In order to further enhance performance, each broadcast camera would include a plurality of dedicated cameras, one dedicated camera for each potential target the broadcast camera will view. -
FIG. 4 is a block diagram of the production center. Typically, the production center is housed in a truck parked outside of the stadium. However, the production center can be at a central office or the components of the production center can be spread out in multiple locations. The heart of the production center is processor 200. The preferred processor 200 is an Onyx computer from Silicon Graphics; however, various other suitable processors or combinations of processors can perform the necessary functions of the present invention. Processor 200 is in communication with video control 202, video mixer 204 and multiplexor 206. In one alternative, processor 200 includes more than one processor. For example, processor 200 could include two Onyx computers, one for locating the target and one for determining occlusions. - Broadcasters use many broadcast cameras at the stadium to televise a sporting event. The video signals from the various cameras are sent to
video control 202, which is used to select one broadcast camera for transmission to viewers. One embodiment of video control 202 includes a plurality of monitors (one monitor for each video signal) and a selection circuit. A director (or manager, producer, etc.) can monitor the different video signals and choose which signal to broadcast. The choice is communicated to the selection circuit, which selects one camera signal to broadcast. The choice is also communicated to processor 200, video mixer 204 and multiplexer 206 via signal 208. The selected video signal is sent to delay 210 and to processor 200 via analog-to-digital converter 212. If the broadcast camera is a digital camera, then there is no need for analog-to-digital converter 212. - The output of
delay 210 is sent to video modification unit 214. The purpose of delay 210 is to delay the broadcast video signal a fixed number of frames to allow time for processor 200 to receive data, determine the position of the target in the frame of video and prepare any enhancements. Although the video is delayed a small number of frames, the television signal is still defined as live. The delay introduced by the system is a small delay (under one second) which does not accumulate; that is, different frames of video are enhanced with the same small delay. For example, a ten-frame delay is equivalent to one-third of a second, which is not considered a significant delay for television. -
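The fixed, non-accumulating delay described above can be sketched as a simple frame buffer. This is an illustrative sketch, not the patent's hardware; the class name and frame representation are assumptions:

```python
from collections import deque

class FrameDelay:
    """Delay a video stream by a fixed number of frames (cf. delay 210).

    The delay is constant -- it does not accumulate -- so the output stays
    a fixed fraction of a second behind the input.
    """

    def __init__(self, frames=10, fps=30):
        self.buffer = deque()
        self.frames = frames
        self.latency_seconds = frames / fps  # e.g. 10 frames at 30 fps = 1/3 s

    def push(self, frame):
        """Feed one incoming frame; return the frame delayed by N frames,
        or None while the buffer is still filling."""
        self.buffer.append(frame)
        if len(self.buffer) > self.frames:
            return self.buffer.popleft()
        return None
```

Because the buffer length is fixed, every output frame is exactly N frames behind its input, matching the "same small delay for every frame" behavior described above.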
Video mixer 204 receives the video signals from all of the dedicated cameras. FIG. 4 shows signals DC1 and DC2. Signal DC1 is from the dedicated camera associated with broadcast camera BC1. If video control 202 selects BC1, then that selection is communicated to video mixer 204, which selects DC1. As discussed above, it is contemplated that some alternatives include having many dedicated cameras for one broadcast camera. For example, one broadcast camera may have four dedicated cameras. In that case, the dedicated cameras would be labeled DC1a, DC1b, DC1c and DC1d. When broadcast camera BC1 is selected, video mixer 204 would select up to all four dedicated cameras: DC1a, DC1b, DC1c and DC1d. The selected signal(s) from video mixer 204 are sent to analog-to-digital converter 216, which digitizes the video signal(s) and sends the digital signal(s) to processor 200. -
Multiplexer 206 receives signals from the processors at each of the camera locations. For example, FIG. 4 shows multiplexer 206 receiving signal CB1 from processor 156 of FIG. 3. Each of the processor signals (CB1, CB2, . . . ) is associated with a broadcast camera. Thus, the selection by video control 202 is communicated to multiplexer 206 so that multiplexer 206 can send the corresponding signal to processor 200. The signal sent by multiplexer 206 to processor 200 includes the information from the field of view sensors. In one embodiment, processor 156 calculates the field of view and sends the resulting information, via multiplexer 206, to processor 200. In another embodiment, processor 200 receives the data via multiplexer 206 and determines the field of view. Either alternative is suitable for the present invention. -
Processor 200 is connected to memory 220, which stores the locations of the targets and images of the targets (or at least partial images). Memory 220 also stores images of the replacement graphics, instructions for creating replacement graphics and/or instructions for highlighting, editing, etc. Memory 220 is loaded with its data and maintained by processor 222. The inventors contemplate that during operation of this system, processor 200 will be too busy to use compute time for loading and maintaining memory 220. Thus, a separate processor 222 is used to load and maintain the memory during operation. If cost is a factor, processor 222 can be eliminated and processor 200 can be used to load and maintain memory 220; however, for optimal performance, memory 220 should be loaded, if possible, prior to the broadcast. - The images and locations of targets can be loaded into
memory 220 either manually or automatically. For example, if the target's image and location are known in advance (e.g., an advertisement at the stadium), then prior to real-time operation of the system an operator can input the location of the target and scan in (or otherwise download) an image of the target. Alternatively, the operator can point one or more cameras at the target and use a mouse, light pen or other pointing device to select the target's image for storing in memory 220. The location of the target can be determined by physical measurement, using pan/tilt/zoom sensors, etc. If the target is not known in advance (for example, if the target is the first down yard line), then the operator can select the target during operation using a pointing device and the system will download the image of the target and its location (using pan/tilt/zoom data) to memory 220. Alternatively, the system can be programmed to know that the target is one of a set of possible targets. For example, the system can be programmed to know that the target is a yard line, and the operator need only input which yard line is the current target. The replacement graphics are loaded into memory after being digitized or downloaded, or they can be created with processor 222. Instructions for highlighting or creating replacement graphics can be programmed using processor 222 or processor 200. -
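The target records described above might be organized as follows. This is a hypothetical sketch: the record fields and the `load_target` helper are illustrative assumptions, not structures from the disclosure:

```python
# Hypothetical in-memory layout for the target database kept in memory 220:
# each record pairs a target's three-dimensional location with its stored
# image and either a replacement graphic or instructions for drawing one.
targets = {}

def load_target(name, location_xyz, image=None, replacement=None, instructions=None):
    """Register a target, whether entered manually before the event or
    selected by an operator during operation."""
    targets[name] = {
        "location": location_xyz,      # 3-D position in stadium coordinates
        "image": image,                # scanned/downloaded image of the target
        "replacement": replacement,    # pre-rendered replacement graphic, or...
        "instructions": instructions,  # ...instructions for creating one
    }

# A yard-line target whose graphic is created from instructions rather than
# a stored image (coordinates and colors are made-up example values).
load_target("30-yard line", (0.0, 27.4, 0.0),
            instructions={"type": "highlight", "color": "yellow"})
```

Either the `replacement` image or the `instructions` field would be populated, mirroring the text's alternative of storing a graphic versus storing instructions for creating one.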
Processor 200 is connected to video modification unit 214. The output of video modification unit 214, labeled as signal 226, is the video signal intended for broadcast. This signal can be directly broadcast or sent to other hardware for further modification or recording. Video modification unit 214 modifies the video signal from delay 210 with the data/signal from processor 200. The type of modification can vary depending on the desired graphic result. One exemplar implementation uses a linear keyer as video modification unit 214. When using a keyer, the signal from processor 200 to the keyer includes two signals: YUV and an external key (alpha). The YUV signal is called foreground and the signal from delay 210 is called background. Based on the level of the external key, the keyer determines how much of the foreground and background to mix to determine the output signal, from 100 percent foreground and zero percent background to zero percent foreground and 100 percent background, on a pixel-by-pixel basis. Alternatively, video modification unit 214 can be another processor, or video modification unit 214 can be a part of processor 200. - In operation,
processor 200 determines the field of view of the selected broadcast camera and checks memory 220 to see if any targets are within that field of view. If so, processor 200 then determines the exact position of the target in a frame of video by determining which pixels represent the target. Processor 200 then checks memory 220 for the replacement graphic or instructions to make a replacement graphic (or highlight). If the replacement strategy is to highlight a certain portion of the field, then memory 220 may include instructions for changing the color of a certain portion of the field, shading a certain portion of the field, etc. Based on the pan, tilt and zoom, and the actual image of the target, processor 200 determines the size and orientation of the replacement graphic (also called mapping). In one embodiment, the enhancement includes processor 200 creating a frame of video with a graphic at the position of the enhancement. The frame created by processor 200 is sent to video modification unit 214, which combines the frame from processor 200 with the frame from delay 210. As will be described below, processor 200 is also used to account for occlusions. An alternate embodiment eliminates the separate video modification unit and uses processor 200 to edit the video signal from the selected broadcast camera. -
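The linear keyer's foreground/background mix described above can be illustrated in a few lines. A real keyer is hardware operating on YUV video; representing frames as nested lists of brightness values is an assumption of this sketch:

```python
def key_mix(foreground, background, alpha):
    """Per-pixel linear keyer mix (cf. video modification unit 214).

    alpha = 1.0 selects 100% foreground, alpha = 0.0 selects 100%
    background, with linear blending in between. Frames are nested lists
    of pixel values; alpha is a matching matrix of key levels.
    """
    return [
        [a * f + (1.0 - a) * b
         for f, b, a in zip(fg_row, bg_row, a_row)]
        for fg_row, bg_row, a_row in zip(foreground, background, alpha)
    ]
```

An occlusion silhouette (step 316) would simply drive the corresponding alpha values to 0.0, letting the delayed background frame show through the replacement graphic.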
FIG. 5 is a flow chart which explains the operation of the present invention. In step 300, video data is captured by a broadcast camera and is digitized. If the broadcast camera is a digital camera, digitizing is unnecessary. Simultaneously with step 300, pan, tilt and zoom data (field of view data) is sensed in step 302 and the field of view is determined in step 304. In step 306, processor 200 determines if any of the targets are within the field of view. Memory 220 (depicted in FIG. 4) includes a database. In one alternative, the database stores the three-dimensional locations of all the targets. The field of view of a broadcast camera can be thought of as a pyramid whose location and dimensions are determined based on the field of view data. After determining the dimensions and location of the pyramid, processor 200 accesses memory 220 to determine if any of the targets are within the pyramid. Step 306 is a quick method for determining if there is a target within the field of view of the camera. If not, the process is done and the system waits until the next frame of data. If there is a target within the field of view of the selected broadcast camera, then the exact position of the target must be determined within the frame of video of the selected broadcast camera. - Preferably, determining the position of the target is a two-step process. In the first step (step 308) a rough estimate is made based on the pan, tilt and zoom values, and in the second step (step 310) the estimate of the target's position is refined. In regard to step 308, by knowing where the camera is pointed and the target's three-dimensional location, the target's position in the video frame can be estimated. The accuracy of
step 308 is determined by the accuracy of the pan/tilt/zoom sensors, the software used to determine the field of view and the stability of the platform on which the camera is located. In some alternatives, the field of view sensor equipment may be so accurate that the position of the target is adequately determined and step 310 is not necessary. In other instances, the pan, tilt and zoom data only provide a rough estimate (e.g., a range of positions or a general area), and step 310 is needed to determine a more accurate position. - Step 310 provides a more accurate determination of the target's position using pattern recognition techniques which are known in the art. Examples of known pattern recognition and image processing technology can be found in the following documents: U.S. Pat. No. 3,973,239, Pattern Preliminary Processing System; U.S. Pat. No. 4,612,666, Automatic Pattern Recognition Apparatus; U.S. Pat. No. 4,674,125, Real-Time Hierarchal Pyramid Signal Processing Apparatus; U.S. Pat. No. 4,817,171, Pattern Recognition System; U.S. Pat. No. 4,924,507, Real-Time Optical Multiple Object Recognition and Tracking System and Method; U.S. Pat. No. 4,950,050, Optical Target Recognition System; U.S. Pat. No. 4,995,090, Optoelectronic Pattern Comparison System; U.S. Pat. No. 5,060,282, Optical Pattern Recognition Architecture Implementing The Mean-Square Error Correlation Algorithm; U.S. Pat. No. 5,142,590, Pattern Recognition System; U.S. Pat. No. 5,241,616, Optical Pattern Recognition System Utilizing Resonator Array; U.S. Pat. No. 5,274,716, Optical Pattern Recognition Apparatus; U.S. Pat. No. 5,465,308, Pattern Recognition System; U.S. Pat. No. 5,469,512, Pattern Recognition Device; and U.S. Pat. No. 5,524,065, Method and Apparatus For Pattern Recognition. It is contemplated that
step 310 can use suitable technology other than pattern recognition technology. - In
step 312, processor 200 fetches the replacement graphic from memory 220. If memory 220 is storing instructions for replacement graphics, then processor 200 fetches the instructions and creates the graphic. For example, creating the graphic can include drawing a highlight for a yard line of a football field. In step 314, processor 200 determines the size and orientation of the replacement image, and maps the replacement image to the video frame. Memory 220 stores only one size of each image. Because of the pan, tilt and zoom of the broadcast camera, the image stored in memory 220 may need to be mapped to the video frame (e.g., magnified, reduced, twisted, angled, etc.). Processor 200 can determine the orientation based on the field of view data and/or the pattern recognition analysis in step 310. For example, by knowing where the broadcast camera is located and the pan, tilt and zoom of the broadcast camera, a computer can be programmed to determine how to map the replacement image or highlight onto the video frame. - In step 316, the system accounts for occlusions. If there is an object or person in front of the target, then the enhanced video should show the object or person in front of the replacement graphic, highlight, etc. In one embodiment, the system cuts out a silhouette in the shape of the object or person from the replacement image. Step 316 is discussed in more detail with respect to
FIG. 6. - In
step 318, the system modifies the video of the original broadcast camera. As discussed above, this could include creating a second frame of video which includes a replacement image and using a keyer to combine the second frame of video with the original frame of video. Alternatively, a processor can be used to edit the frame of video of the broadcast camera. It is possible that within a given frame of video there may be more than one target. In that case, steps 308-318 may be repeated for each target, or steps 308-316 may be repeated for each target and step 318 performed only once for all targets. Subsequent to step 318, the enhanced frame of video may be broadcast or stored, and the process (steps 300-318) may repeat for another frame of video. -
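The quick pyramid test of step 306 can be approximated in a few lines. As a simplification, this sketch models the viewing pyramid as a cone about the optical axis; the function name, angle conventions and cone approximation are assumptions, not the patent's method:

```python
import math

def target_in_view(camera_pos, pan_deg, tilt_deg, half_angle_deg, target_pos):
    """Rough test of whether a 3-D target lies inside the viewing
    pyramid (step 306). The pyramid is approximated as a cone around
    the optical axis; half_angle_deg narrows as the camera zooms in."""
    # Optical axis direction from pan (azimuth) and tilt (elevation).
    pan, tilt = math.radians(pan_deg), math.radians(tilt_deg)
    axis = (math.cos(tilt) * math.cos(pan),
            math.cos(tilt) * math.sin(pan),
            math.sin(tilt))
    # Unit vector from camera to target.
    v = tuple(t - c for t, c in zip(target_pos, camera_pos))
    norm = math.sqrt(sum(c * c for c in v))
    if norm == 0.0:
        return False
    cos_angle = sum(a * b for a, b in zip(axis, v)) / norm
    # Inside the cone when the angle off-axis is within the half-angle.
    return cos_angle >= math.cos(math.radians(half_angle_deg))
```

Only targets passing this cheap test would proceed to the more expensive position refinement of steps 308-310.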
FIG. 6 is a more detailed flow diagram explaining how the system accounts for occlusion. The steps described in FIG. 6 are performed by a system which includes one or more dedicated cameras (e.g., dedicated camera 142). Step 350 is performed before the live event occurs. In one embodiment, there is a dedicated camera substantially adjacent to a broadcast camera for each target that the broadcast camera may view. For example, if there are three advertisements which are to be replaced in the stadium and a particular camera can view two of those advertisements, then the system can include two dedicated cameras substantially adjacent to that particular camera. Prior to the game, a dedicated camera is pointed directly at one of the targets; the camera is zoomed in such that the target fills a substantial portion of the dedicated camera's frame of video; and the image of the target is stored in memory 220. A substantial portion means that the target typically appears to cover over half of the frame of video of the dedicated camera. For optimal results, the dedicated camera should be zoomed in such that the target fills the greatest amount of the frame of video possible while remaining completely within the frame of video, unless it is desired to have clues of the scenery surrounding the target. After the dedicated camera is pointed at the target, its pan, tilt and zoom should remain fixed. - Once the television broadcast of the live event begins, steps 352-362 are repeated for each frame where the occlusion analysis is desired. In
step 352, a video image is captured and digitized by the dedicated camera. Simultaneously, a video image is captured by the broadcast camera. In step 354, the digitized image from the dedicated camera is compared to the stored image of the target. The stored image is kept in memory 220. The processor knows which stored image to compare against from step 306 of FIG. 5. The step of comparing could include altering one of the images such that both images have the same size and orientation, and then subtracting the data. Alternatively, other comparison methods can be used. If there is an occlusion blocking the target (step 356), then the two images will be significantly different and, in step 358, an occlusion will be reported. In reporting the occlusion, the system reports the presence of an occlusion and the coordinates of the occlusion. When performing step 354, it is possible that there is no occlusion even though the two images are not exactly the same. The differences between the images must meet a certain minimum threshold to be considered an occlusion. If the differences are not great enough to be an occlusion, then in step 360 the system determines that the differences are due to ambient conditions in the stadium. For example, if the lights have been dimmed, then the captured image of the target may appear darker. Weather conditions could also have an effect on the appearance of the target image. If small differences are detected in step 360 that do not meet the threshold for occlusions, then the system “learns” the changes to the target by updating the stored image of the target to reflect the new lighting or weather conditions (step 362). For example, the new stored image of the target may be darker than the original image. Subsequent to step 362, the system performs report step 358 and reports that no occlusion was found. - An alternative to the method of
FIG. 6 includes comparing the target image from the broadcast camera to the stored image. However, using the broadcast camera is not as advantageous as using a dedicated camera because it is likely that the broadcast camera would not be zoomed in on the target. Thus, the target image is likely to be smaller on the broadcast camera than it is on the dedicated camera. Because there is a smaller image to work with, the system loses the subpixel accuracy obtained from the dedicated camera. Also, using a separate dedicated camera may increase the speed at which the system accounts for occlusions. -
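The compare/learn/report loop of steps 354-362 can be sketched as follows. The mean-absolute-difference metric, the threshold value and the learning rate are illustrative assumptions; the disclosure does not fix a particular metric or values:

```python
def check_occlusion(live, stored, occlusion_threshold=40.0, learn_rate=0.1):
    """Compare the dedicated camera's frame against the stored target
    image (steps 354-362). Returns (occluded, updated_stored).

    Differences above the threshold are reported as an occlusion; smaller
    differences are treated as ambient changes (lighting, weather) and
    "learned" into the stored image. Frames are equal-length flat lists
    of pixel brightness values.
    """
    mean_diff = sum(abs(a - b) for a, b in zip(live, stored)) / len(stored)
    if mean_diff >= occlusion_threshold:
        return True, stored          # occlusion: report it, keep the reference
    # Below threshold: drift the reference toward the live image so the
    # stored target tracks dimmed lights or changing weather (step 362).
    updated = [s + learn_rate * (l - s) for l, s in zip(live, stored)]
    return False, updated
```

Calling this once per frame reproduces the loop's behavior: occlusions leave the reference image untouched, while gradual ambient drift is absorbed into it.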
FIG. 7 shows an alternative embodiment of the present invention which utilizes electromagnetic transmitting beacons at or near a target. The beacons transmit an electromagnetic signal not visible to the human eye. Electromagnetic waves include light, radio, x-rays, gamma rays, microwave, infrared, ultraviolet and others, all involving the propagation of electric and magnetic fields through space. The difference between the various types of electromagnetic waves is in the frequency or wavelength. The human eye is sensitive to electromagnetic radiation of wavelengths from approximately 400-700 nm, the range called light, visible light or the visible spectrum. Thus, the phrase "electromagnetic signal not visible to a human eye" means an electromagnetic wave outside of the visible spectrum. It is important that the signal transmitted from the beacon is not visible to the human eye so that the visual appearance of the target will not be altered for those people attending the live event. In one embodiment, the beacon is an electromagnetic transmitter which includes infrared emitting diodes. Other sources which transmit electromagnetic waves may also be used, for example, radio transmitters, radar repeaters, etc. -
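The wavelength definition above reduces to a one-line check. The 880 nm emitter wavelength below is a typical infrared-LED value assumed purely for illustration, not a value from the disclosure:

```python
def visible_to_human_eye(wavelength_nm):
    """True when a wavelength falls in the visible band (~400-700 nm),
    per the definition above; a beacon must emit outside this band so
    spectators at the event cannot see it."""
    return 400.0 <= wavelength_nm <= 700.0

# An infrared emitter at ~880 nm (assumed typical IR LED wavelength)
# satisfies the "not visible to a human eye" requirement.
assert not visible_to_human_eye(880.0)
```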
FIG. 7 shows a broadcast camera 400 which outputs a video signal 402. Broadcast camera 400 includes a zoom lens coupled to a zoom detector 404. The output of zoom detector 404 is transmitted to analog-to-digital converter 406, which sends the digital output to processor 408. Mounted on top of broadcast camera 400 is sensor 410. In the embodiment which utilizes an infrared emitter as a beacon, sensor 410 is an infrared sensor. Sensor 410 is mounted on top of broadcast camera 400 so that the optical axis of sensor 410 is as close as possible to the optical axis of broadcast camera 400. It is also possible to locate sensor 410 near broadcast camera 400 and account for differences between optical axes using matrix transformations or other suitable mathematics. - One example of an infrared sensor is a progressive scan, full frame shutter camera, for example, the TM-9701 by Pulnix. The Pulnix sensor is a high resolution 768(H) by 484(V) black and white full frame shutter camera with asynchronous reset capability. The camera has an eight bit digital signal output and progressively scans 525 lines of video data. A narrow band infrared filter is affixed in front of the lens of the Pulnix sensor. The purpose of the filter is to block electromagnetic signals that are outside the spectrum of the signal from the beacon. The sensor captures a frame of video (data) which comprises a set of pixels. Each pixel is assigned a coordinate corresponding to an x-axis and a y-axis. The sensor data includes an eight bit brightness value for each pixel; these values are scanned out pixel by pixel to interface 412 along with other timing information.
Interface 412 outputs four signals: LDV, FDV, CK and DATA. LDV (line data valid), which is transmitted to X-Y counters 414, indicates that a new line of valid data is being scanned out of sensor 410. FDV (frame data valid), which is transmitted to X-Y counters 414 and memory control 416, indicates that valid data for the next frame is being transmitted. CK (pixel clock) is a 14.318 MHZ clock from sensor 410 sent to X-Y counters 414 and memory control 416. X-Y counters 414 count X and Y coordinates sequentially in order to keep track of the location of the pixel whose data is being scanned in at the current time. When LDV is asserted, the X counter is reset. When FDV is asserted, the Y counter is reset. - The signal DATA includes the eight bit data value for each pixel. As data is read from
sensor 410, memory control 416 determines whether each pixel meets a brightness threshold. That is, noise and other sources will cause a large number of pixels to receive some data; however, the pixels receiving the signal from the beacon will have at least a minimum brightness level. This brightness threshold is set in a register (not shown) which can be set by processor 408. If the data for a particular pixel is above the brightness threshold, memory control 416 sends a write enable (WE) signal to memory 418, causing memory 418 to store the X and Y coordinates of the pixel, the data for that pixel and a code for that pixel. The code indicates whether the data is valid data, a new frame, end of frame or a flash. Processor 408 can read the data from memory 418 and process the data locally or transmit the data to the production center (e.g., to multiplexer 206). - Many arenas do not allow photographers to use flashes on their cameras in order to prevent impairing a player's vision with random flashes during a sporting event. In lieu of individual camera flashes, many arenas install a set of strobe flashes at or near the ceiling of the arena and provide for communication between each photographer's camera and the set of strobe flashes. When a photographer takes a picture, the strobe flashes emit a flash of light, which may include an electromagnetic wave in the infrared spectrum. In one embodiment, the system avoids using incorrect data due to sensors detecting a flash by using filters. A second embodiment connects a signal from a strobe flash to a computer which causes the system to ignore data sensed during a flash. A third embodiment includes using flash detectors. The flash detector can be located anywhere in the arena suitable for sensing a strobe flash.
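The thresholding performed by memory control 416 can be modeled in software as follows. Representing memory 418 as a list of (x, y, value) triples, and omitting the per-pixel code word, are simplifying assumptions of this sketch:

```python
def capture_bright_pixels(frame, threshold):
    """Mimic memory control 416: keep only pixels whose 8-bit brightness
    meets the threshold, storing (x, y, value) triples the way memory 418
    stores coordinates and data for beacon pixels.

    `frame` is a list of rows of brightness values (0-255); noise pixels
    below the threshold are discarded rather than stored.
    """
    stored = []
    for y, row in enumerate(frame):
        for x, value in enumerate(row):
            if value >= threshold:
                stored.append((x, y, value))
    return stored
```

Because only above-threshold pixels are written, a frame containing a beacon yields a small cluster of stored coordinates, while noise leaves memory empty, matching the target-found test of step 452 described later.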
FIG. 7 shows flash detector 422, which detects a flash and sends a signal to memory control 416. Flash detector 422 includes a photo detector which can comprise, at least, a photodiode and an op-amp. In front of the photo detector is a filter that allows detection of signals in a spectrum that includes the signals emitted by the beacon. Connected to the op-amp are components which can detect pulse edges. - The embodiment described in
FIG. 7 operates similarly to the embodiment described in FIG. 3. Some of the differences between the operation of the two embodiments are depicted in FIG. 8. Similar to the embodiment of FIG. 3, the embodiment of FIG. 7 first captures and digitizes video data. In step 450, infrared data is received. In step 452, the system determines whether a target is found in the infrared data by monitoring the data stored in memory 418. Since memory control 416 only allows data above a threshold to be stored in memory 418, if a given frame of data from a sensor has pixel data stored in memory, then a target is found. If a sensor is detecting false targets, then various error correction methods known in the art can be utilized. In step 454, the position of the target in the frame of video is determined by reading the X and Y coordinates stored with the pixel data in memory 418. Step 456 fine-tunes the determined position information of the target to account for the error from the camera's platform or pan/tilt/zoom sensors. One alternative for accounting for the difference in optical axes is to use a transformation matrix; however, other mathematical solutions known in the art are also suitable. After step 456, the system can perform steps 312 through 318 as described with respect to FIG. 5; however, any field of view data used is based on the size and position of the beacon's signal in the sensor's frame of video. - A further alternative of
FIG. 7 includes using polarization. That is, the infrared filter on sensor 410 is replaced or augmented with a polarizing filter. A target to be replaced (e.g., a billboard) is treated with a spectral coating that allows only polarized light to reflect off the billboard. The filter and spectral coating are designed such that light reflecting off the billboard toward sensor 410 will be completely blacked out. The pixels that represent the position of the target in the sensor's frame of video will have a brightness value of zero or close to zero. Thus, memory control 416 is used to store only pixels that have a brightness value of zero or below a threshold level. - The foregoing detailed description of the invention has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed, and obviously many modifications and variations are possible in light of the above teaching. The described embodiments of the system for enhancing the broadcast of a live event were chosen in order to best explain the principles of the invention and its practical application, to thereby enable others skilled in the art to best utilize the invention in various embodiments and with various modifications as are suited to the particular use contemplated. The invention is, thus, intended to be used with many different types of live events, including various sporting events and nonsporting events. It is intended that the scope of the invention be defined by the claims appended hereto.
Claims (1)
1. A method for enhancing the broadcast of a live event, comprising the steps of:
capturing first video using a first camera;
sensing field of view data representing a field of view of said first camera;
determining a position and orientation of a video image of a target in said captured video at least partially based on recognizing one or more portions of said video image of said target in said captured video; and
modifying said captured video data by enhancing at least a segment of said video image of said target.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/560,237 US20070085908A1 (en) | 1996-10-22 | 2006-11-15 | A method and apparatus for enhancing the broadcast of a live event |
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US08/735,020 US5917553A (en) | 1996-10-22 | 1996-10-22 | Method and apparatus for enhancing the broadcast of a live event |
US09/264,138 US6141060A (en) | 1996-10-22 | 1999-03-05 | Method and apparatus for adding a graphic indication of a first down to a live video of a football game |
US62710600A | 2000-07-27 | 2000-07-27 | |
US09/884,524 US6535681B2 (en) | 2001-06-19 | 2001-06-19 | Fiber-optic cable routing and bend limiting device and system |
US11/560,237 US20070085908A1 (en) | 1996-10-22 | 2006-11-15 | A method and apparatus for enhancing the broadcast of a live event |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/884,524 Continuation US6535681B2 (en) | 1996-10-22 | 2001-06-19 | Fiber-optic cable routing and bend limiting device and system |
Publications (1)
Publication Number | Publication Date |
---|---|
US20070085908A1 true US20070085908A1 (en) | 2007-04-19 |
Family
ID=25384820
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/884,524 Expired - Lifetime US6535681B2 (en) | 1996-10-22 | 2001-06-19 | Fiber-optic cable routing and bend limiting device and system |
US11/560,237 Abandoned US20070085908A1 (en) | 1996-10-22 | 2006-11-15 | A method and apparatus for enhancing the broadcast of a live event |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/884,524 Expired - Lifetime US6535681B2 (en) | 1996-10-22 | 2001-06-19 | Fiber-optic cable routing and bend limiting device and system |
Country Status (1)
Country | Link |
---|---|
US (2) | US6535681B2 (en) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050251843A1 (en) * | 2004-05-04 | 2005-11-10 | Walker Gordon K | Method and apparatus for programming blackout and retune |
US20080007567A1 (en) * | 2005-12-18 | 2008-01-10 | Paul Clatworthy | System and Method for Generating Advertising in 2D or 3D Frames and Scenes |
US20080100731A1 (en) * | 2006-10-30 | 2008-05-01 | Jerry Moscovitch | System and Method for Producing and Displaying Images |
US20080184571A1 (en) * | 2007-02-01 | 2008-08-07 | Thomas Kivley | Methods, apparatuses, and systems for advertising on a first-down measurement device |
US20080297304A1 (en) * | 2007-06-01 | 2008-12-04 | Jerry Moscovitch | System and Method for Recording a Person in a Region of Interest |
US20130159295A1 (en) * | 2007-08-14 | 2013-06-20 | John Nicholas Gross | Method for identifying and ranking news sources |
US20170208355A1 (en) * | 2014-07-15 | 2017-07-20 | Motorola Solutions, Inc | Method and apparatus for notifying a user whether or not they are within a camera's field of view |
CN108981670A (en) * | 2018-09-07 | 2018-12-11 | 成都川江信息技术有限公司 | A kind of scene automatic positioning seat calibration method by real-time video |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6839498B2 (en) * | 2001-03-21 | 2005-01-04 | Lucent Technologies Inc. | Optical fiber cable swivel for fiber optic distribution frames |
US20040244310A1 (en) * | 2003-03-28 | 2004-12-09 | Blumberg Marvin R. | Data center |
US7136566B2 (en) * | 2004-11-15 | 2006-11-14 | Sbc Knowledge Ventures, L.P. | Fiber optic jumper routing and securing system having a series of interconnected and anchored enclosures |
US7599582B2 (en) * | 2004-11-22 | 2009-10-06 | Honeywell International Inc. | Optical fiber cable take-up mechanism for scanning sensors |
US6951986B1 (en) * | 2004-12-29 | 2005-10-04 | Sbc Knowledge Ventures, L.P. | Adjustable routing device for routing fiber optic jumpers from fiber optic jumper raceways |
US20060266468A1 (en) * | 2005-05-31 | 2006-11-30 | Lockheed Martin Corporation | Method for facilitating the routing of radio frequency cables |
US7302153B2 (en) * | 2005-10-26 | 2007-11-27 | Telect Inc. | Fiber management access system |
US7245809B1 (en) * | 2005-12-28 | 2007-07-17 | Adc Telecommunications, Inc. | Splitter modules for fiber distribution hubs |
US9285546B2 (en) | 2014-06-04 | 2016-03-15 | All Systems Broadband, Inc. | Conduit for passing a plurality of fiber optic cables through a fiber optic cassette shelf |
CN105676380B (en) | 2014-11-21 | 2019-07-12 | 泰科电子(上海)有限公司 | Cable runs system and multifibre joint |
CN204462469U (en) * | 2015-01-07 | 2015-07-08 | 泰科电子(上海)有限公司 | Optical cable fan-out device |
WO2018022721A1 (en) | 2016-07-26 | 2018-02-01 | Chatsworth Products, Inc. | Features for cable managers and other electronic equipment structures |
US10288818B2 (en) | 2017-04-21 | 2019-05-14 | The Boeing Company | Cable bend limiter adapter |
US11818860B1 (en) | 2020-12-15 | 2023-11-14 | Chatsworth Products, Inc. | Frame structure for electronic equipment enclosure |
US11678456B1 (en) | 2020-12-15 | 2023-06-13 | Chatsworth Products, Inc. | Slidable mounting hardware for electronic equipment enclosure and method for installing same |
US11622458B1 (en) | 2020-12-15 | 2023-04-04 | Chatsworth Products, Inc. | Brush port assembly and method for installing same |
US11920392B1 (en) | 2021-02-02 | 2024-03-05 | Chatsworth Products, Inc. | Electrical bonding door hinges |
TW202343993A (en) | 2021-12-31 | 2023-11-01 | 美商科斯考普科技有限公司 | Data and high voltage power network with consolidated enclosure |
Citations (51)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3351347A (en) * | 1964-04-10 | 1967-11-07 | Charles J Smith | Electroluminescent game ball |
US3580993A (en) * | 1968-09-27 | 1971-05-25 | Diebold Inc | Multiple camera superimposed message closed circuit television system |
US3840699A (en) * | 1972-05-25 | 1974-10-08 | W Bowerman | Television system for enhancing and tracking an object |
US3944738A (en) * | 1973-12-11 | 1976-03-16 | Minnesota Mining And Manufacturing Company | Method to increase the visibility of game objects during telecasting |
US3973239A (en) * | 1973-10-17 | 1976-08-03 | Hitachi, Ltd. | Pattern preliminary processing system |
US4179704A (en) * | 1977-12-27 | 1979-12-18 | Cbs Inc. | Television system for displaying and recording paths of motion |
US4319266A (en) * | 1979-09-04 | 1982-03-09 | The Grass Valley Group, Inc. | Chroma keying system |
US4386363A (en) * | 1981-04-10 | 1983-05-31 | Ampex Corporation | Chroma key switching signal generator |
US4490741A (en) * | 1982-10-08 | 1984-12-25 | Heath Company | Synchronization signal stabilization for video image overlay |
US4521196A (en) * | 1981-06-12 | 1985-06-04 | Giravions Dorand | Method and apparatus for formation of a fictitious target in a training unit for aiming at targets |
US4541013A (en) * | 1982-07-19 | 1985-09-10 | Alpert Sidney A | Football signaling system |
US4612666A (en) * | 1984-07-05 | 1986-09-16 | The United States Of America As Represented By The Secretary Of The Navy | Automatic pattern recognition apparatus |
US4647969A (en) * | 1984-11-30 | 1987-03-03 | Graham Sr Richard P | Instant T.V. penalty flag alert system |
US4674125A (en) * | 1983-06-27 | 1987-06-16 | Rca Corporation | Real-time hierarchal pyramid signal processing apparatus |
US4700306A (en) * | 1981-06-24 | 1987-10-13 | Kungalvsgruppen Areng, Hjerpe, Wallmander Ab | System for the visualization of the movements of marine vessels by television display |
US4739406A (en) * | 1986-04-11 | 1988-04-19 | Morton Richard G | Method and apparatus for interacting with television images |
US4811084A (en) * | 1984-04-09 | 1989-03-07 | Corporate Communications Consultants, Inc. | Video color detector and chroma key device and method |
US4817171A (en) * | 1984-04-10 | 1989-03-28 | British Telecommunications Public Limited Company | Pattern recognition system |
US4897726A (en) * | 1986-04-11 | 1990-01-30 | Morton Richard G | Method and apparatus for interacting with television images |
US4924507A (en) * | 1988-02-11 | 1990-05-08 | The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration | Real-time optical multiple object recognition and tracking system and method |
US4950050A (en) * | 1987-06-19 | 1990-08-21 | Grumman Aerospace Corporation | Optical target recognition system |
US4957297A (en) * | 1986-01-06 | 1990-09-18 | Newcomb Nelson F | Method of playing golf at night |
US4970666A (en) * | 1988-03-30 | 1990-11-13 | Land Development Laboratory, Inc. | Computerized video imaging system for creating a realistic depiction of a simulated object in an actual environment |
US4975770A (en) * | 1989-07-31 | 1990-12-04 | Troxell James D | Method for the enhancement of contours for video broadcasts |
US4995090A (en) * | 1989-02-07 | 1991-02-19 | The University Of Michigan | Optoelectronic pattern comparison system |
US4999709A (en) * | 1988-01-27 | 1991-03-12 | Sony Corporation | Apparatus for inserting title pictures |
US5060282A (en) * | 1990-03-26 | 1991-10-22 | The United States Of America As Represented By The United States Department Of Energy | Optical pattern recognition architecture implementing the mean-square error correlation algorithm |
US5142590A (en) * | 1985-11-27 | 1992-08-25 | Trustees Of Boston University | Pattern recognition system |
US5150895A (en) * | 1990-11-06 | 1992-09-29 | Richard Berger | Method of and system for determining a position of ball relative to a playing field, and ball provided therefor |
US5179421A (en) * | 1990-08-20 | 1993-01-12 | Parkervision, Inc. | Remote tracking system particularly for moving picture cameras and method |
US5184820A (en) * | 1987-03-31 | 1993-02-09 | Keating Michael D | Hockey puck |
US5207720A (en) * | 1992-02-12 | 1993-05-04 | Fortron International Inc. | Hockey puck device |
US5241616A (en) * | 1992-08-31 | 1993-08-31 | The United States Of America As Represented By The Secretary Of The Navy | Optical pattern recognition system utilizing resonator array |
US5249039A (en) * | 1991-11-18 | 1993-09-28 | The Grass Valley Group, Inc. | Chroma key method and apparatus |
US5264933A (en) * | 1991-07-19 | 1993-11-23 | Princeton Electronic Billboard, Inc. | Television displays having selected inserted indicia |
US5268734A (en) * | 1990-05-31 | 1993-12-07 | Parkervision, Inc. | Remote tracking system for moving picture cameras and method |
US5274716A (en) * | 1990-09-05 | 1993-12-28 | Seiko Instruments Inc. | Optical pattern recognition apparatus |
US5305107A (en) * | 1991-04-12 | 1994-04-19 | Alpha Image Limited | Combining digital video key signals |
US5340108A (en) * | 1991-11-22 | 1994-08-23 | Donald A. Wilson | Apparatus for projecting and moving a spot of light in a scene projected on a screen and for controlling operation of a stepper motor used therewith |
US5346210A (en) * | 1992-08-28 | 1994-09-13 | Teem Systems, Inc. | Object locator system |
US5388825A (en) * | 1994-01-24 | 1995-02-14 | Myers Innovation Group | Illuminable ball |
US5392088A (en) * | 1992-09-04 | 1995-02-21 | Nikon Corporation | Target follow-up device and camera comprising the same |
US5413345A (en) * | 1993-02-19 | 1995-05-09 | Nauck; George S. | Golf shot tracking and analysis system |
US5419562A (en) * | 1993-08-10 | 1995-05-30 | Cromarty; John I. | Method and apparatus for analyzing movements of an individual |
US5419565A (en) * | 1993-08-20 | 1995-05-30 | Gordon; Theodore J. | Electrical device for detecting the location and speed or force of impact with a target |
US5465308A (en) * | 1990-06-04 | 1995-11-07 | Datron/Transoc, Inc. | Pattern recognition system |
US5469536A (en) * | 1992-02-25 | 1995-11-21 | Imageware Software, Inc. | Image editing system including masking capability |
US5469512A (en) * | 1992-09-08 | 1995-11-21 | Sony Corporation | Pattern recognition device |
US5524065A (en) * | 1992-02-07 | 1996-06-04 | Canon Kabushiki Kaisha | Method and apparatus for pattern recognition |
US5892554A (en) * | 1995-11-28 | 1999-04-06 | Princeton Video Image, Inc. | System and method for inserting static and dynamic images into a live video broadcast |
US5917553A (en) * | 1996-10-22 | 1999-06-29 | Fox Sports Productions Inc. | Method and apparatus for enhancing the broadcast of a live event |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5249252A (en) * | 1992-08-31 | 1993-09-28 | Alcatel Network Systems, Inc. | Optical fiber splice tray with cable tray hinge |
US5394502A (en) * | 1993-12-21 | 1995-02-28 | United Technologies Corporation | Fiber optic cable harness break-out fitting |
US6056245A (en) * | 1996-01-25 | 2000-05-02 | Phillip E. White | Flared cable support for telecommunication system installations |
US6321017B1 (en) * | 1999-09-21 | 2001-11-20 | Lucent Technologies Inc. | Portal bend limiter/strain reliever for fiber optic closure exit portal |
- 2001
  - 2001-06-19 US US09/884,524 patent/US6535681B2/en not_active Expired - Lifetime
- 2006
  - 2006-11-15 US US11/560,237 patent/US20070085908A1/en not_active Abandoned
Patent Citations (54)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3351347A (en) * | 1964-04-10 | 1967-11-07 | Charles J Smith | Electroluminescent game ball |
US3580993A (en) * | 1968-09-27 | 1971-05-25 | Diebold Inc | Multiple camera superimposed message closed circuit television system |
US3840699A (en) * | 1972-05-25 | 1974-10-08 | W Bowerman | Television system for enhancing and tracking an object |
US4064528A (en) * | 1972-05-25 | 1977-12-20 | Bowerman William R | Method and apparatus for enhancing a televised object |
US3973239A (en) * | 1973-10-17 | 1976-08-03 | Hitachi, Ltd. | Pattern preliminary processing system |
US3944738A (en) * | 1973-12-11 | 1976-03-16 | Minnesota Mining And Manufacturing Company | Method to increase the visibility of game objects during telecasting |
US4179704A (en) * | 1977-12-27 | 1979-12-18 | Cbs Inc. | Television system for displaying and recording paths of motion |
US4319266A (en) * | 1979-09-04 | 1982-03-09 | The Grass Valley Group, Inc. | Chroma keying system |
US4386363A (en) * | 1981-04-10 | 1983-05-31 | Ampex Corporation | Chroma key switching signal generator |
US4521196A (en) * | 1981-06-12 | 1985-06-04 | Giravions Dorand | Method and apparatus for formation of a fictitious target in a training unit for aiming at targets |
US4700306A (en) * | 1981-06-24 | 1987-10-13 | Kungalvsgruppen Areng, Hjerpe, Wallmander Ab | System for the visualization of the movements of marine vessels by television display |
US4541013A (en) * | 1982-07-19 | 1985-09-10 | Alpert Sidney A | Football signaling system |
US4490741A (en) * | 1982-10-08 | 1984-12-25 | Heath Company | Synchronization signal stabilization for video image overlay |
US4674125A (en) * | 1983-06-27 | 1987-06-16 | Rca Corporation | Real-time hierarchal pyramid signal processing apparatus |
US4811084A (en) * | 1984-04-09 | 1989-03-07 | Corporate Communications Consultants, Inc. | Video color detector and chroma key device and method |
US4817171A (en) * | 1984-04-10 | 1989-03-28 | British Telecommunications Public Limited Company | Pattern recognition system |
US4612666A (en) * | 1984-07-05 | 1986-09-16 | The United States Of America As Represented By The Secretary Of The Navy | Automatic pattern recognition apparatus |
US4647969A (en) * | 1984-11-30 | 1987-03-03 | Graham Sr Richard P | Instant T.V. penalty flag alert system |
US5142590A (en) * | 1985-11-27 | 1992-08-25 | Trustees Of Boston University | Pattern recognition system |
US4957297A (en) * | 1986-01-06 | 1990-09-18 | Newcomb Nelson F | Method of playing golf at night |
US4739406A (en) * | 1986-04-11 | 1988-04-19 | Morton Richard G | Method and apparatus for interacting with television images |
US4897726A (en) * | 1986-04-11 | 1990-01-30 | Morton Richard G | Method and apparatus for interacting with television images |
US5184820A (en) * | 1987-03-31 | 1993-02-09 | Keating Michael D | Hockey puck |
US4950050A (en) * | 1987-06-19 | 1990-08-21 | Grumman Aerospace Corporation | Optical target recognition system |
US4999709A (en) * | 1988-01-27 | 1991-03-12 | Sony Corporation | Apparatus for inserting title pictures |
US4924507A (en) * | 1988-02-11 | 1990-05-08 | The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration | Real-time optical multiple object recognition and tracking system and method |
US4970666A (en) * | 1988-03-30 | 1990-11-13 | Land Development Laboratory, Inc. | Computerized video imaging system for creating a realistic depiction of a simulated object in an actual environment |
US4995090A (en) * | 1989-02-07 | 1991-02-19 | The University Of Michigan | Optoelectronic pattern comparison system |
US4975770A (en) * | 1989-07-31 | 1990-12-04 | Troxell James D | Method for the enhancement of contours for video broadcasts |
US5060282A (en) * | 1990-03-26 | 1991-10-22 | The United States Of America As Represented By The United States Department Of Energy | Optical pattern recognition architecture implementing the mean-square error correlation algorithm |
US5268734A (en) * | 1990-05-31 | 1993-12-07 | Parkervision, Inc. | Remote tracking system for moving picture cameras and method |
US5465308A (en) * | 1990-06-04 | 1995-11-07 | Datron/Transoc, Inc. | Pattern recognition system |
US5179421A (en) * | 1990-08-20 | 1993-01-12 | Parkervision, Inc. | Remote tracking system particularly for moving picture cameras and method |
US5274716A (en) * | 1990-09-05 | 1993-12-28 | Seiko Instruments Inc. | Optical pattern recognition apparatus |
US5150895A (en) * | 1990-11-06 | 1992-09-29 | Richard Berger | Method of and system for determining a position of ball relative to a playing field, and ball provided therefor |
US5305107A (en) * | 1991-04-12 | 1994-04-19 | Alpha Image Limited | Combining digital video key signals |
US5264933A (en) * | 1991-07-19 | 1993-11-23 | Princeton Electronic Billboard, Inc. | Television displays having selected inserted indicia |
US5249039A (en) * | 1991-11-18 | 1993-09-28 | The Grass Valley Group, Inc. | Chroma key method and apparatus |
US5340108A (en) * | 1991-11-22 | 1994-08-23 | Donald A. Wilson | Apparatus for projecting and moving a spot of light in a scene projected on a screen and for controlling operation of a stepper motor used therewith |
US5524065A (en) * | 1992-02-07 | 1996-06-04 | Canon Kabushiki Kaisha | Method and apparatus for pattern recognition |
US5207720A (en) * | 1992-02-12 | 1993-05-04 | Fortron International Inc. | Hockey puck device |
US5469536A (en) * | 1992-02-25 | 1995-11-21 | Imageware Software, Inc. | Image editing system including masking capability |
US5346210A (en) * | 1992-08-28 | 1994-09-13 | Teem Systems, Inc. | Object locator system |
US5241616A (en) * | 1992-08-31 | 1993-08-31 | The United States Of America As Represented By The Secretary Of The Navy | Optical pattern recognition system utilizing resonator array |
US5392088A (en) * | 1992-09-04 | 1995-02-21 | Nikon Corporation | Target follow-up device and camera comprising the same |
US5469512A (en) * | 1992-09-08 | 1995-11-21 | Sony Corporation | Pattern recognition device |
US5413345A (en) * | 1993-02-19 | 1995-05-09 | Nauck; George S. | Golf shot tracking and analysis system |
US5419562A (en) * | 1993-08-10 | 1995-05-30 | Cromarty; John I. | Method and apparatus for analyzing movements of an individual |
US5419565A (en) * | 1993-08-20 | 1995-05-30 | Gordon; Theodore J. | Electrical device for detecting the location and speed or force of impact with a target |
US5388825A (en) * | 1994-01-24 | 1995-02-14 | Myers Innovation Group | Illuminable ball |
US5892554A (en) * | 1995-11-28 | 1999-04-06 | Princeton Video Image, Inc. | System and method for inserting static and dynamic images into a live video broadcast |
US5917553A (en) * | 1996-10-22 | 1999-06-29 | Fox Sports Productions Inc. | Method and apparatus for enhancing the broadcast of a live event |
US6141060A (en) * | 1996-10-22 | 2000-10-31 | Fox Sports Productions, Inc. | Method and apparatus for adding a graphic indication of a first down to a live video of a football game |
US7154540B2 (en) * | 1996-10-22 | 2006-12-26 | Fox Sports Productions, Inc. | System for enhancing video |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8499318B2 (en) | 2004-05-04 | 2013-07-30 | Qualcomm Incorporated | Method and apparatus for programming blackout and retune |
US7555012B2 (en) * | 2004-05-04 | 2009-06-30 | Qualcomm Incorporated | Method and apparatus for programming blackout and retune |
US20090235301A1 (en) * | 2004-05-04 | 2009-09-17 | Qualcomm Incorporated | Method and apparatus for programming blackout and retune |
US20050251843A1 (en) * | 2004-05-04 | 2005-11-10 | Walker Gordon K | Method and apparatus for programming blackout and retune |
US20080007567A1 (en) * | 2005-12-18 | 2008-01-10 | Paul Clatworthy | System and Method for Generating Advertising in 2D or 3D Frames and Scenes |
US20080100731A1 (en) * | 2006-10-30 | 2008-05-01 | Jerry Moscovitch | System and Method for Producing and Displaying Images |
US20080184571A1 (en) * | 2007-02-01 | 2008-08-07 | Thomas Kivley | Methods, apparatuses, and systems for advertising on a first-down measurement device |
US7694424B2 (en) | 2007-02-01 | 2010-04-13 | Thomas Kivley | Methods, apparatuses, and systems for advertising on a first-down measurement device |
US20080297304A1 (en) * | 2007-06-01 | 2008-12-04 | Jerry Moscovitch | System and Method for Recording a Person in a Region of Interest |
US20130159295A1 (en) * | 2007-08-14 | 2013-06-20 | John Nicholas Gross | Method for identifying and ranking news sources |
US8775405B2 (en) * | 2007-08-14 | 2014-07-08 | John Nicholas Gross | Method for identifying and ranking news sources |
US20170208355A1 (en) * | 2014-07-15 | 2017-07-20 | Motorola Solutions, Inc | Method and apparatus for notifying a user whether or not they are within a camera's field of view |
CN108981670A (en) * | 2018-09-07 | 2018-12-11 | 成都川江信息技术有限公司 | A kind of scene automatic positioning seat calibration method by real-time video |
Also Published As
Publication number | Publication date |
---|---|
US20020191936A1 (en) | 2002-12-19 |
US6535681B2 (en) | 2003-03-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7154540B2 (en) | System for enhancing video | |
US20070085908A1 (en) | A method and apparatus for enhancing the broadcast of a live event | |
EP0683961B1 (en) | Apparatus and method for detecting, identifying and incorporating advertisements in a video | |
US6252632B1 (en) | System for enhancing a video presentation | |
US5903317A (en) | Apparatus and method for detecting, identifying and incorporating advertisements in a video | |
US5489886A (en) | Automatic line officiating system and method thereof | |
US5953077A (en) | System for displaying an object that is not visible to a camera | |
AU709805B2 (en) | A system for enhancing the television presentation of an object at a sporting event | |
US6597406B2 (en) | System for enhancing a video presentation of a live event | |
JP3738035B2 (en) | Method and apparatus for automatic electronic replacement of billboards in video images | |
US6154250A (en) | System for enhancing the television presentation of an object at a sporting event | |
EP1010129B1 (en) | Re-registering a sensor during live recording of an event | |
US8441476B2 (en) | Image repair interface for providing virtual viewpoints | |
US8049750B2 (en) | Fading techniques for virtual viewpoint animations | |
US8073190B2 (en) | 3D textured objects for virtual viewpoint animations | |
US20090128577A1 (en) | Updating background texture for virtual viewpoint animations | |
WO1998032094A9 (en) | A system for re-registering a sensor during a live event | |
HU220409B (en) | Method, apparatus and system for implanting an image into a video stream | |
JP2004522327A (en) | Method for correcting visible objects shot by TV camera | |
JPH08511392A (en) | Device and method for detection, identification and incorporation of advertisements in video |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |