US20120162480A1 - Image capturing apparatus and display control method - Google Patents
- Publication number: US20120162480A1 (application US13/408,975)
- Authority: United States (US)
- Prior art keywords
- frame
- displayed
- information
- image
- display
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- H04N 23/633 — Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
- H04N 23/635 — Region indicators; field of view indicators
- H04N 23/61 — Control of cameras or camera modules based on recognised objects
- H04N 23/611 — Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
Description
- the present invention relates generally to image capturing apparatuses. More particularly, the present invention relates to an image capturing apparatus having a live view shooting function.
- a live view shooting function allows users to capture images while images of a subject formed on an image pickup element through a lens are displayed in real time on a display screen, such as a liquid crystal display.
- various kinds of information on image capturing settings such as the white balance, the recording image quality, and the color space, are superimposed on displayed live view images, thereby allowing users to easily recognize various settings while watching the displayed images.
- an auto focus (AF) frame that specifies an area of the displayed image subjected to auto focus processing
- an enlargement frame that specifies an area to be enlarged
- a face detection frame that informs users of a detected face of a subject can be superimposed on the displayed images.
- the size and position of the AF frame or the enlargement frame can be freely changed in response to user operations. Additionally, the size and position of the face detection frame change in accordance with a subject. Accordingly, the following problem may be caused: for example, when a displayed AF frame overlaps various kinds of displayed setting information, such as the white balance, the recording image quality, and the color space, one object is hidden by another, which makes it difficult for users to recognize the displayed content.
- Japanese Patent Laid-Open No. 2003-134358 discloses a technique for stopping displaying setting information if a displayed AF cursor, which indicates an in-focus position, overlaps the setting information.
- the present invention provides an image capturing apparatus that allows users to optimally recognize displayed information and displayed frames that are superimposed on displayed images.
- An image capturing apparatus includes: an image capturing unit configured to capture an image of a subject to acquire image data; a display unit configured to display the image of the subject based on the image data acquired by the image capturing unit; and a display control unit configured to perform a control operation so that information displayed at a predetermined position and a movably displayed frame are superimposed on the image of the subject displayed by the display unit.
- the display control unit displays either the information or the frame preferentially in accordance with an overlapping state of the information and the frame when the information overlaps the frame.
- FIG. 2 is a back view of a digital camera according to an exemplary embodiment of the present invention.
- FIG. 3 is a diagram showing various kinds of information displayed during live view shooting.
- FIGS. 4A and 4B are diagrams showing switching of kinds of information displayed during live view shooting.
- FIG. 5 is a diagram showing a screen on which a white balance setting is changed.
- FIG. 6 is a diagram illustrating an AF frame displayed during live view shooting.
- FIGS. 7A-7H are diagrams showing criteria employed when determining whether an AF frame or various kinds of information is displayed in front.
- FIG. 9 is a diagram showing a state where a face detection frame is displayed according to a second exemplary embodiment of the present invention.
- FIG. 10 is a flowchart showing processing for controlling display of a face detection frame according to a second exemplary embodiment of the present invention.
- FIGS. 11A-11D are diagrams showing criteria employed when determining whether a circular face detection frame or various kinds of information is displayed in front.
- FIG. 1 is a block diagram showing a configuration of a digital camera according to the first exemplary embodiment of the present invention.
- the digital camera includes a camera main body 100 and a lens unit 150 that is exchangeably attached to the camera main body 100 .
- the camera main body 100 includes an image pickup element 14 for converting an optical image into electrical signals, and a shutter 12 for controlling an exposure amount of the image pickup element 14 .
- An image processing circuit 16 converts analog signals output from the image pickup element 14 into digital signals (image data) with an analog-to-digital (A/D) converter.
- the image processing circuit 16 also performs various kinds of processing, such as pixel interpolation processing and color conversion processing, on image data supplied from a memory control circuit.
- the image processing circuit 16 performs predetermined calculation processing using image data.
- a system control circuit 50 controls a shutter control unit, a focus control unit, and an aperture control unit to perform auto focus (AF) processing, auto exposure (AE) processing, and pre-flash (EF) processing.
- the image processing circuit 16 performs auto white balance (AWB) processing based on the acquired calculation result.
- the camera main body 100 also includes an image display memory 24 and an image display unit 28 , which may be a liquid crystal display. Image data written in the image display memory 24 is displayed on the image display unit 28 after being processed by a digital-to-analog (D/A) converter.
- the system control circuit 50 controls the camera main body 100 .
- a memory 52 stores parameters, variables, and programs for operations of the system control circuit 50 .
- a display unit 54 displays, using characters, images, and audio, an operation state and messages in accordance with execution of the programs by the system control circuit 50 .
- One or more display units 54 are provided at positions near an operation unit 68 of the camera main body 100 where they are easily seen by users.
- the display unit 54 may be constituted by a combination of, for example, a liquid crystal display (LCD), light-emitting diodes (LEDs), and a sound-emitting element.
- a nonvolatile memory 56 may be an electrically erasable programmable read-only memory (EEPROM). Data is electrically erased from or recorded on the nonvolatile memory 56 .
- a display control unit 70 displays various kinds of information superimposed on an image displayed on the image display unit 28 .
- the information to be displayed includes setting information indicating settings regarding the white balance and the recording image quality, a histogram indicating luminance distribution of a captured live view image, status information indicating a remaining battery level and the number of capturable images, and warning information indicating that the current state is not suitable for use. These various kinds of information are displayed at predetermined fixed positions on the image display unit 28 .
- the display control unit 70 also displays an AF frame specifying an area of a display image subjected to AF processing, an enlargement frame specifying an area to be enlarged, and a face detection frame informing users of a detected face of a subject.
- the display positions of these frames are not fixed and can be moved to given positions through user operations.
- the display control unit 70 performs a control operation so that either the information or the frame is displayed preferentially in accordance with the overlapping state.
- a face detecting unit 82 performs a predetermined face detecting operation on image data supplied from the image processing circuit 16 or image data supplied from the memory control circuit.
- Interfaces 90 and 94 serve as interfaces with recording media, such as a memory and a hard disk.
- Connectors 92 and 96 connect the camera main body 100 to recording media, such as a memory card and a hard disk.
- a communication unit 110 has various communication functions, such as communication through RS-232C (recommended standard 232 version C), USB, IEEE 1394, P1284, SCSI, modem, LAN, and wireless communication.
- a connector 112 connects the camera main body 100 with other apparatuses through the communication unit 110 .
- the connector 112 may be an antenna when wireless communication is carried out.
- Recording media/units 120 and 130 may be a memory card and a hard disk.
- FIG. 2 is a back view of the digital camera according to the exemplary embodiment.
- FIG. 2 shows a state where an image formed on the image pickup element 14 is displayed on the image display unit 28 .
- a shooting operation performed with the image formed on the image pickup element 14 being displayed on the image display unit 28 is referred to as live view shooting.
- a back face 200 of the camera main body 100 includes a liquid crystal display 201 , which corresponds to the image display unit 28 shown in FIG. 1 .
- Various buttons 210 - 220 correspond to the operation unit 68 shown in FIG. 1 .
- An optical finder 240 is also included.
- An image formed on the image pickup element 14 through a lens is displayed on the liquid crystal display 201 in real time.
- Various kinds of information and various frames can be superimposed on the displayed image.
- setting information 221 regarding image capturing functions, warning information 223 , and a histogram 224 are displayed as the various kinds of information.
- An AF frame 222 indicating an area subjected to the auto focus processing is also displayed.
- the various kinds of information will be described in detail later with reference to FIG. 3 , whereas the AF frame 222 will be described in detail later with reference to FIG. 6 .
- the display control unit 70 controls display of the various kinds of information and the AF frame 222 .
- Various kinds of information displayed on the liquid crystal display 201 during live view shooting will now be described using FIG. 3.
- Pieces of information 301 - 304 regarding image capturing functions are displayed in accordance with a content of a setting made by a user.
- the information 301 is white balance setting information.
- the information 302 is recording image quality setting information.
- the information 303 is setting information regarding a photometry method.
- the information 304 is color space setting information.
- the above-described setting information is only an example and other kinds of information regarding image capturing functions may be displayed.
- Warning information 310 indicates that there is no available space in a recording medium inserted into an image capturing apparatus.
- a histogram 320 shows luminance distribution of a live view image.
- the various kinds of information are superimposed on images displayed on the image display unit 28 during live view shooting.
- semitransparent processing may be performed so that the image displayed under the various kinds of information can be seen therethrough.
- When the various kinds of information are displayed preferentially over a frame, such as an AF frame, the semitransparent processing is performed on the various kinds of information so that the image displayed under the information can be seen therethrough but the frame cannot.
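The semitransparent display described above can be modeled as ordinary alpha compositing of the information layer over the live view image. The following sketch is illustrative only; the patent does not specify the blending arithmetic, and the function name and fixed alpha value are assumptions:

```python
def blend_pixel(info_rgb, image_rgb, alpha=0.5):
    """Composite one semitransparent information pixel over the live-view
    image pixel: out = alpha * info + (1 - alpha) * image, per channel."""
    return tuple(round(alpha * i + (1 - alpha) * b)
                 for i, b in zip(info_rgb, image_rgb))
```

A preferentially displayed frame would then be drawn opaquely on top of the blended result, so that the image shows through the information but the frame does not.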
- FIGS. 4A and 4B are diagrams showing switching of kinds of information displayed on the liquid crystal display 201 .
- While various kinds of information are displayed on the liquid crystal display 201 as shown in FIG. 3, the kinds of displayed information can be switched by operating the INFO button 218, shown in FIG. 2, which instructs switching of the display.
- FIG. 4A shows a state where only the histogram 320 is hidden from the state shown in FIG. 3.
- FIG. 4B shows a state where the setting information 301 - 304 regarding image capturing functions, the warning information 310 , and the histogram 320 are hidden from the state shown in FIG. 3 .
- FIG. 5 is a diagram showing a screen on which settings regarding image capturing functions are changed. As one example, FIG. 5 shows a screen on which a white balance setting is changed.
- a selection frame 501 is for selecting a white balance setting.
- the setting selected with the selection frame 501 is reflected in the digital camera and the displayed screen is switched into the screen shown in FIG. 3 .
- the white balance setting information displayed on the image display unit 28 is also changed into the selected setting.
- FIG. 6 is a diagram illustrating an AF frame displayed during live view shooting.
- An AF frame 601 is similar to the AF frame 222 .
- the AF frame 601 may be drawn by a solid line or a broken line in any given color.
- a user can freely move the AF frame 601 within the screen.
- the size of the AF frame 601 can also be changed freely.
- the size of the AF frame 601 can be enlarged as shown by frames 604 and 605.
- the size of the AF frame 601 can be reduced as shown by frames 602 and 603.
- An inner area of the AF frame 601 is transparent so that a user can check the displayed image.
- AF processing according to a TVAF (contrast-detection) method is performed during live view shooting.
- an AF evaluation value indicating sharpness of an image is calculated on the basis of a video signal of an image resulting from photoelectric conversion of an image pickup element.
- a focus lens is then driven so that the maximum AF evaluation value is obtained. In this manner, the focus is adjusted.
- the AF evaluation value is calculated on the basis of high-frequency components of the video signal extracted by a bandpass filter. Generally, the position of the focus lens that gives the maximum AF evaluation value corresponds to a focal point.
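The TVAF evaluation described above amounts to measuring high-frequency energy inside the AF area. As a minimal sketch (the actual bandpass filter and signal path are not specified in the text; a squared horizontal difference stands in for the high-frequency extraction):

```python
def af_evaluation(area):
    """Crude sharpness score for a luminance area (list of pixel rows):
    the sum of squared horizontal differences, a stand-in for the energy
    of high-frequency components extracted by a bandpass filter."""
    return sum((row[x + 1] - row[x]) ** 2
               for row in area
               for x in range(len(row) - 1))
```

Contrast AF would sweep the focus lens, evaluate this score at each lens position, and settle where the score peaks.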
- AF processing is performed on the area enclosed by the AF frame.
- a setting for displaying a movable AF frame on the image display unit 28 can be selected from an AF frame setting of the menu items. Additionally, for example, a setting for fixing the AF-processing target area at a predetermined area at the center of the image display unit 28 and a setting for automatically selecting the AF-processing target area from a plurality of predetermined areas of the image display unit 28 in accordance with an image capturing state can be selected in the AF frame setting.
- FIGS. 7A-7H are diagrams showing criteria employed when the display control unit 70 determines which object (i.e., the AF frame or the various kinds of information) to display preferentially in accordance with an overlapping state of the various kinds of information and the AF frame, and controls display of the information and the frame.
- displaying an object in front corresponds to preferentially displaying the object.
- the display control unit 70 displays the various kinds of information in front when the size of the AF frame can be estimated even if the various kinds of information are displayed in front of the AF frame. If estimation of the size of the AF frame is difficult, the display control unit 70 displays the AF frame in front.
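The criterion just stated can be sketched geometrically: if any whole side of the AF frame would be hidden by an information item, the frame's size can no longer be estimated, so the frame is drawn in front; otherwise the information wins. This is an illustrative reading of FIGS. 7A-7H rather than the patent's literal algorithm; the rectangle representation and function names are assumptions:

```python
def frame_in_front(frame, infos):
    """frame and each info item are axis-aligned rectangles (x0, y0, x1, y1).
    Returns True (draw the AF frame in front) when some entire side of the
    frame lies inside a single info rectangle, i.e. size estimation fails."""
    x0, y0, x1, y1 = frame
    sides = [((x0, y0), (x1, y0)),   # top edge
             ((x0, y1), (x1, y1)),   # bottom edge
             ((x0, y0), (x0, y1)),   # left edge
             ((x1, y0), (x1, y1))]   # right edge

    def inside(pt, r):
        return r[0] <= pt[0] <= r[2] and r[1] <= pt[1] <= r[3]

    # A rectangle is convex, so a segment whose two endpoints lie inside
    # one info rectangle is hidden along its whole length.
    return any(inside(a, r) and inside(b, r)
               for a, b in sides for r in infos)
```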
- FIGS. 7A, 7B, and 7C show cases where it is determined that the AF frame is displayed in front of the various kinds of information.
- FIGS. 7D, 7E, and 7F show cases where it is determined that the various kinds of information are displayed in front of the AF frame.
- FIG. 7G shows a case where an object displayed in front is changed from the AF frame to the various kinds of information in response to a change in the size of the AF frame.
- an entire AF frame 802 overlaps a histogram 801 .
- the display control unit 70 performs a control operation to display the AF frame 802 in front of the histogram 801 .
- the display control unit 70 performs a control operation to display the AF frame in front of the histogram 816 .
- an entire side 821 of the AF frame overlaps white balance setting information 825 and an entire side 823 of the AF frame also overlaps color space setting information 826 .
- both of sides 822 and 824 of the AF frame partially overlap the displayed white balance setting information 825 and the displayed color space setting information 826 .
- the display control unit 70 performs a control operation to display the AF frame in front of the setting information.
- a part of a side 841 of the AF frame overlaps white balance setting information 840 .
- a part of a side 843 of the AF frame overlaps color space setting information 848 .
- a part of a side 844 of the AF frame overlaps the white balance setting information 840 and the color space setting information 848 .
- the display control unit 70 performs a control operation to display the setting information in front of the AF frame.
- the display control unit 70 performs a control operation to display the setting information 855 in front of the AF frame.
- the display control unit 70 performs a control operation to display the AF frame in front of the setting information.
- the display control unit 70 performs a control operation to display the setting information in front of the AF frame.
- When estimation of the size of the AF frame becomes difficult in response to reduction of the AF frame, the display control unit 70 performs a control operation so that the object displayed in front is changed from the setting information to the AF frame.
- the AF frame is displayed in front of the various kinds of information.
- the display control unit 70 performs a control operation to display the AF frame in front of the various kinds of information.
- the display control unit 70 may perform a control operation to display the AF frame in front of the various kinds of information as shown in FIG. 7H .
- the display control unit 70 performs processing at steps S102 to S108.
- the system control circuit 50 starts live view shooting (S101).
- the display control unit 70 determines whether a setting for displaying various kinds of information, such as setting information regarding image capturing functions and a histogram, on the image display unit 28 is selected (S102).
- the kinds of the information to be displayed on the image display unit 28 can be changed by operating the INFO button 218. No information may be displayed on the image display unit 28.
- the display control unit 70 determines whether to display an AF frame on the image display unit 28 (S103).
- the AF frame may be displayed after a user selects the setting for displaying the AF frame and specifies a position of the AF frame.
- the AF frame may be displayed at a predetermined initial position in response to the user's selection of the setting for displaying the AF frame.
- the display control unit 70 determines whether the size of the AF frame can be estimated, based on the position and size of the AF frame and the position of the various kinds of information, even if the various kinds of information are displayed in front of the AF frame (S104). This determination is performed based on the criteria shown in FIGS. 7A-7H.
- the display control unit 70 displays the AF frame in front of the various kinds of information (S105).
- the display control unit 70 displays the various kinds of information in front of the AF frame (S106).
- the display control unit 70 determines whether the position or the size of the AF frame has been changed (S107). If the position or the size of the AF frame has been changed (YES at S107), the process returns to STEP S104. If not (NO at S107), the process proceeds to STEP S108. If the displayed AF frame is hidden, the process also proceeds to STEP S108. Additionally, if the AF frame, which has been hidden, is newly displayed, the process returns to STEP S104.
- the display control unit 70 determines whether the setting for displaying the various kinds of information on the image display unit 28 has been changed (S108).
- the kinds of information to be displayed on the image display unit 28 can be changed by operating the INFO button 218.
- An AF frame that has not been overlapping the various kinds of information may overlap the newly displayed information. Conversely, information that has been overlapping the AF frame may be hidden.
- the processing at STEP S108 is performed for such cases.
- the system control circuit 50 determines whether termination of live view shooting is instructed with the live view button 211 (S109). If the termination is instructed (YES at S109), the system control circuit 50 terminates the live view shooting operation (S110). If the termination is not instructed (NO at S109), the process returns to STEP S107.
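The flowchart steps above can be replayed as a small event loop. This is an illustrative sketch, not the patent's interface: the event encoding, function names, and the reduction of the FIGS. 7A-7H criteria to a single predicate are all assumptions.

```python
def run_live_view(events, size_estimable):
    """Replay a simplified version of the FIG. 8 loop (steps S104-S109).

    events: sequence of ('move', frame), ('info', info), or ('stop',)
    tuples standing in for frame moves/resizes (S107), information-setting
    changes (S108), and the live view button (S109).  size_estimable(frame,
    info) stands in for the FIGS. 7A-7H criteria checked at S104.
    Returns the object drawn in front after each re-evaluation."""
    frame = info = None
    front = []

    def reevaluate():                 # S104 -> S105 or S106
        if frame is not None:
            front.append('info' if size_estimable(frame, info) else 'frame')

    for ev in events:
        if ev[0] == 'stop':           # S109: termination instructed
            break
        if ev[0] == 'move':           # S107: frame moved or resized
            frame = ev[1]
        else:                         # S108: displayed information changed
            info = ev[1]
        reevaluate()
    return front
```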
- An object displayed in front (i.e., an AF frame or various kinds of information) is controlled in accordance with an overlapping state of the AF frame and the various kinds of information, thereby allowing a user to optimally recognize both the AF frame and the various kinds of information, which results in an improved user interface.
- the display control operation may be performed in a similar manner on an enlargement frame, which specifies an area of a displayed image to be enlarged, as well as on the AF frame.
- a digital camera according to the second exemplary embodiment of the present invention has a configuration shown in the block diagram of FIG. 1 . Since the configuration is similar to that employed in the first exemplary embodiment, a description thereof is omitted.
- FIG. 9 is a diagram showing an image displayed on the liquid crystal display 201 during live view shooting.
- Each of face detection frames 1001 , 1002 , and 1003 is displayed at an area that is recognized to include a face of a subject by the face detecting unit 82 .
- a plurality of face detection frames are displayed to enclose areas recognized to include the faces.
- the face detection frames are categorized into a main frame and a sub frame based on the size of an area recognized to include a face and the position of the area in an image.
- the system control circuit 50 determines whether a face detection frame is a main frame or a sub frame on the basis of a detection result of the face detecting unit 82 .
- the image display unit 28 displays a main frame for a face determined as a main subject and displays a sub frame for a face determined as a subject other than the main subject.
- a user may select a face detection frame serving as the main frame by operating the operation unit 68 .
- the frame 1001 is a main frame
- the frames 1002 and 1003 are sub frames. Different control operations are performed depending on whether a main frame or a sub frame overlaps various kinds of information. Since a sub frame is displayed for a face of a subject other than the main subject, a lower priority may be set for the sub frame. Accordingly, when the sub frame overlaps the various kinds of information, the various kinds of information are preferentially displayed. That is, the various kinds of information are displayed in front regardless of whether the size of the sub frame can be estimated.
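The priority rule for the two frame categories can be condensed into a small decision function. This is an illustrative restatement with assumed names, folding the size-estimation criterion of FIGS. 7A-7H into a single boolean:

```python
def front_object(frame_kind, size_estimable):
    """What is drawn in front when a face detection frame overlaps the
    displayed information.  A sub frame always yields to the information;
    a main frame falls back to the size-estimation criterion (True means
    the frame size is still estimable with the information in front)."""
    if frame_kind == 'sub':
        return 'info'
    return 'info' if size_estimable else 'frame'
```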
- the display control unit 70 performs a control operation to display the face detection frame 1001 in front of the setting information 1006 .
- the object to be displayed in front (i.e., the main frame or the various kinds of information) is determined on the basis of the criteria shown in FIGS. 7A-7H.
- the display control unit 70 performs a control operation to display the histogram 1005 in front of the face detection frame 1003 since the face detection frame 1003 is a sub frame.
- the display control unit 70 performs a control operation to display the setting information 1006 in front of the face detection frame 1001 and to display the face detection frame 1003 in front of the histogram 1005 .
- Steps similar to those included in the flowchart shown in FIG. 8 are designated by similar or like references and a description thereof is omitted.
- the display control unit 70 performs processing at steps S203 to S209.
- the display control unit 70 determines whether to display a face detection frame on the image display unit 28 (S 203 ).
- the face detection frame is displayed to enclose a face of a subject automatically detected in a captured live view image.
- When a plurality of faces of subjects are detected, a plurality of face detection frames are displayed.
- the face detection frames are categorized into a main frame and a sub frame based on the size of an area determined to include a face and the position of the area.
- the display control unit 70 determines whether the face detection frame is a main frame (S204). If the face detection frame is determined to be a main frame (YES at S204), the process proceeds to STEP S205. If the face detection frame is determined to be a sub frame (NO at S204), the display control unit 70 displays the various kinds of information in front of the face detection frame regardless of the overlapping state of the face detection frame and the various kinds of information (S207).
- the display control unit 70 determines whether the size of the face detection frame can be estimated, on the basis of the position and size of the face detection frame and the position of the various kinds of information, even if the various kinds of information are displayed in front of the face detection frame (S205). This determination is performed based on the criteria shown in FIGS. 7A-7H.
- the display control unit 70 displays the face detection frame in front of the various kinds of information (S206).
- the display control unit 70 displays the various kinds of information in front of the face detection frame (S207).
- the display control unit 70 determines whether there is a face detection frame that has not been displayed yet after displaying the face detection frame at STEP S206 or S207 (S208). If there is a face detection frame that has not been displayed yet (YES at S208), the process returns to STEP S204. Otherwise, the process proceeds to STEP S209.
- the display control unit 70 determines whether the position or the size of the face detection frame has been changed (S209). If the position or the size of the face detection frame has been changed (YES at S209), the process returns to STEP S204. Otherwise, the process proceeds to STEP S108. If the displayed face detection frame is hidden, the process also proceeds to STEP S108. If a face detection frame is newly displayed, the process returns to STEP S204. Additionally, if the position or the size of the face detection frame has not been changed but the face detection frame is switched from the main frame to the sub frame or vice versa, the process also returns to STEP S204.
- An object displayed in front is controlled in accordance with whether a face detection frame is a main frame or a sub frame, thereby allowing a user to optimally recognize both the face detection frame and the various kinds of information, which results in an improved user interface.
- Although the determination of whether each face detection frame is a main frame is performed one by one when a plurality of face detection frames are displayed in the flowchart shown in FIG. 10, the determination may be performed for the plurality of face detection frames at the same time.
- Alternatively, this determination may be omitted after one face detection frame has been determined to be the main frame, and the process may then proceed to STEP S207.
- Although FIG. 10 shows the flowchart of processing performed during live view shooting, this display control operation can also be applied to a playback mode, which is started in response to an operation of the play button 212, shown in FIG. 2, for starting playback of images that have been captured and recorded.
- the display control operation can be applied to a case where various kinds of information and face detection frames are displayed at the same time in order to confirm the various kinds of information employed when the images are captured or a face of a main subject.
- a digital camera according to the third exemplary embodiment of the present invention has a configuration shown in the block diagram of FIG. 1 . Since the configuration is similar to that employed in the first exemplary embodiment, a description thereof is omitted.
- Whereas a rectangular face detection frame is used in the second exemplary embodiment, a circular face detection frame is used in the third exemplary embodiment.
- Accordingly, the criteria used to determine which object (the face detection frame or the various kinds of information) is displayed in front differ from the criteria employed when a rectangular face detection frame is used.
- FIGS. 11A-11D are diagrams showing criteria employed when the display control unit 70 determines which to display in front, a face detection frame or various kinds of information, in accordance with an overlapping state of the face detection frame and the various kinds of information.
- FIG. 11A shows a case where it is determined that the face detection frame is displayed in front of the various kinds of information.
- FIGS. 11B and 11C show cases where it is determined that the various kinds of information are displayed in front of the face detection frame.
- Half or more of the circumference of a face detection frame 1201 continuously overlaps a histogram 1202.
- estimation of the size of the face detection frame 1201 is difficult if the histogram 1202 is displayed in front of the face detection frame 1201 .
- the display control unit 70 performs a control operation to display the face detection frame 1201 in front of the histogram 1202 .
- the display control unit 70 performs a control operation to display the histogram 1212 in front of the face detection frame 1211 .
- a half or more of the total circumference of the face detection frame 1221 overlaps white balance setting information 1223 and color space setting information 1224 .
- the display control unit 70 performs a control operation to display the various kinds of information in front of the face detection frame 1221 .
- an object to be displayed in front, i.e., a face detection frame or various kinds of information, may be controlled using similar conditions.
- the display control operation may be performed using similar conditions when an AF frame or an enlargement frame specifying an area of a displayed image to be enlarged is circular.
- Processing for determining an object to be displayed in front, i.e., the face detection frame or the various kinds of information, according to this exemplary embodiment is similar to that shown in the flowchart of FIG. 10. Since only the criteria used at STEP S205 differ, a description thereof is omitted.
- an overlapping state of a face detection frame and various kinds of information is determined even if the face detection frame is circular, and an object to be displayed in front is controlled based on the determination. This allows a user to optimally recognize both the face detection frame and the various kinds of information, which results in an improved user interface.
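As a purely illustrative sketch (not part of the application), the circular-frame criterion above can be expressed by sampling points along the frame's circumference and measuring how many fall inside the information areas. The function names, the point-sampling approach, and the one-half threshold are assumptions of this sketch, and the requirement that the overlap be continuous is omitted for brevity.

```python
import math

def covered_fraction(cx, cy, r, rects, samples=360):
    """Fraction of the circular frame's circumference that lies inside any
    of the information rectangles, each given as (x0, y0, x1, y1)."""
    covered = 0
    for i in range(samples):
        a = 2 * math.pi * i / samples
        x, y = cx + r * math.cos(a), cy + r * math.sin(a)
        if any(x0 <= x <= x1 and y0 <= y <= y1 for (x0, y0, x1, y1) in rects):
            covered += 1
    return covered / samples

def frame_in_front(cx, cy, r, rects, threshold=0.5):
    # Half or more of the circumference hidden: draw the frame in front.
    return covered_fraction(cx, cy, r, rects) >= threshold
```

For example, an information panel covering the right half of a circular frame hides about half of its circumference, so the frame would be drawn in front.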
- a lens-integrated digital camera having a live-view shooting function may be employed.
- different kinds of frames such as an AF frame and a face detection frame, may be displayed on the image display unit 28 at the same time.
- a smaller frame may be displayed in front.
- a frame to be displayed in front may be determined in accordance with the kinds of frames. For example, when the AF frame overlaps the face detection frame, the AF frame, which is displayed at a position reflecting a user operation, may be displayed in front of the face detection frame since the face detection frame is displayed at a position of a face of a subject.
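The two rules above, priority by frame kind with a smaller-frame preference as a tie-break, can be sketched as follows. The priority table, tuple layout, and function name are invented for illustration and do not come from the application.

```python
# Hypothetical priorities: 'af' and 'enlargement' frames reflect a user
# operation, so they win over an automatically placed 'face' frame.
KIND_PRIORITY = {"af": 2, "enlargement": 2, "face": 1}

def front_frame(frame_a, frame_b):
    """Each frame is a (kind, width, height) tuple; returns the frame that
    should be drawn in front when the two frames overlap."""
    pa, pb = KIND_PRIORITY[frame_a[0]], KIND_PRIORITY[frame_b[0]]
    if pa != pb:
        return frame_a if pa > pb else frame_b
    # Equal priority: prefer the smaller frame so it is not hidden entirely.
    area_a = frame_a[1] * frame_a[2]
    area_b = frame_b[1] * frame_b[2]
    return frame_a if area_a <= area_b else frame_b
```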
- the shape of frames is not limited to a rectangle or a circle.
- Various shapes such as a star shape and a heart shape, may be used.
- Criteria for determining the object to be displayed in front, i.e., various frames or various kinds of information, may be set in accordance with whether the size of the frames can be estimated even if the various kinds of information are displayed in front of the frames. For example, when the ratio of the part overlapping the various kinds of information to the entire frame is equal to or higher than a predetermined value, the frame may be displayed in front; if the ratio is lower than the predetermined value, the various kinds of information may be displayed in front. Alternatively, when a predetermined part of the frame overlaps the various kinds of information, the frame may be displayed in front; otherwise, the various kinds of information may be displayed in front.
- the predetermined value and the predetermined part may be set in accordance with the shape of the frame.
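Because the criterion is stated as a ratio of the overlapping part to the entire frame, it can be evaluated the same way for any frame outline. A hedged sketch under that assumption (the names, the sampled-outline representation, and the 0.5 default are all illustrative choices):

```python
def perimeter_covered_ratio(outline, rects):
    """outline: points sampled along the frame's border, so the same test
    works for rectangular, circular, star-shaped, or heart-shaped frames.
    rects: information areas as (x0, y0, x1, y1) tuples."""
    inside = sum(
        1 for (x, y) in outline
        if any(x0 <= x <= x1 and y0 <= y <= y1 for (x0, y0, x1, y1) in rects)
    )
    return inside / len(outline)

def object_in_front(outline, rects, threshold=0.5):
    # 'frame' once the overlapping ratio reaches the predetermined value.
    return "frame" if perimeter_covered_ratio(outline, rects) >= threshold else "info"
```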
- the various kinds of information may be displayed in front of the frame. Whether to perform semitransparent processing may be determined on the basis of the overlapping state.
Abstract
When various kinds of information and various frames are superimposed on a displayed image of a subject during live view shooting, an object to be preferentially displayed is switched between the various kinds of information and the various frames in accordance with an overlapping state of the various kinds of information and the various frames if the various kinds of information overlap the various frames.
Description
- This application is a continuation of U.S. patent application Ser. No. 12/402,376 filed Mar. 11, 2009, which claims the benefit of Japanese Patent Application No. 2008-061289 filed on Mar. 11, 2008, all of which are hereby incorporated by reference herein in their entirety.
- 1. Field of the Invention
- The present invention relates generally to image capturing apparatuses. More particularly, the present invention relates to an image capturing apparatus having a live view shooting function.
- 2. Description of the Related Art
- There are digital cameras according to the related art having a live view shooting function that allows users to capture images while displaying images of a subject formed on an image pickup element through a lens on a display screen, such as a liquid crystal display, in real time. During live view shooting, various kinds of information on image capturing settings, such as the white balance, the recording image quality, and the color space, are superimposed on displayed live view images, thereby allowing users to easily recognize various settings while watching the displayed images. In addition, an auto focus (AF) frame that specifies an area of the displayed image subjected to auto focus processing, an enlargement frame that specifies an area to be enlarged, and a face detection frame that informs users of a detected face of a subject can be superimposed on the displayed images.
- The size and position of the AF frame or the enlargement frame can be freely changed in response to user operations. Additionally, the size and position of the face detection frame change in accordance with a subject. Accordingly, the following problem may be caused. For example, when a displayed AF frame overlaps displayed various kinds of setting information, such as the white balance, the recording image quality, and the color space, one object is hidden by another object, which thus makes it difficult for users to recognize the displayed content.
- To cope with such a problem, Japanese Patent Laid-Open No. 2003-134358 discloses a technique for stopping displaying setting information if a displayed AF cursor, which indicates an in-focus position, overlaps the setting information.
- However, since the setting information is hidden if the displayed AF cursor overlaps the setting information in the technique disclosed in Japanese Patent Laid-Open No. 2003-134358, users may be unable to recognize the content of the setting information while the AF cursor is overlapping the setting information.
- The present invention provides an image capturing apparatus that allows users to optimally recognize displayed information and displayed frames that are superimposed on displayed images.
- An image capturing apparatus according to an aspect of the present invention includes: an image capturing unit configured to capture an image of a subject to acquire image data; a display unit configured to display the image of the subject based on the image data acquired by the image capturing unit; and a display control unit configured to perform a control operation so that information displayed at a predetermined position and a movably displayed frame are superimposed on the image of the subject displayed by the display unit. The display control unit displays either the information or the frame preferentially in accordance with an overlapping state of the information and the frame when the information overlaps the frame.
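The claimed arrangement of units can be illustrated with a minimal sketch. The class name, method names, and the `frame_size_still_estimable` flag are invented for illustration only; the application itself does not prescribe this interface.

```python
from dataclasses import dataclass
from typing import List, Tuple

Rect = Tuple[int, int, int, int]  # x0, y0, x1, y1

@dataclass
class DisplayControlUnit:
    """Sketch of the claimed display control unit: it knows the fixed
    positions of the displayed information and arbitrates against a
    movably displayed frame."""
    info_areas: List[Rect]

    def overlaps(self, frame: Rect) -> bool:
        fx0, fy0, fx1, fy1 = frame
        return any(fx0 < x1 and x0 < fx1 and fy0 < y1 and y0 < fy1
                   for (x0, y0, x1, y1) in self.info_areas)

    def front_object(self, frame: Rect, frame_size_still_estimable: bool) -> str:
        # No overlap: no preference is needed; both are drawn normally.
        if not self.overlaps(frame):
            return "both"
        # Overlap: prefer the information only while the frame's size can
        # still be estimated (the first embodiment's criterion).
        return "info" if frame_size_still_estimable else "frame"
```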
- Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
-
FIG. 1 is a block diagram showing a configuration of a digital camera according to an exemplary embodiment of the present invention. -
FIG. 2 is a back view of a digital camera according to an exemplary embodiment of the present invention. -
FIG. 3 is a diagram showing various kinds of information displayed during live view shooting. -
FIGS. 4A and 4B are diagrams showing switching of kinds of information displayed during live view shooting. -
FIG. 5 is a diagram showing a screen on which a white balance setting is changed. -
FIG. 6 is a diagram illustrating an AF frame displayed during live view shooting. -
FIGS. 7A-7H are diagrams showing criteria employed when whether an AF frame or various kinds of information is displayed in front is determined. -
FIG. 8 is a flowchart showing processing for controlling display of an AF frame according to a first exemplary embodiment of the present invention. -
FIG. 9 is a diagram showing a state where a face detection frame is displayed according to a second exemplary embodiment of the present invention. -
FIG. 10 is a flowchart showing processing for controlling display of a face detection frame according to a second exemplary embodiment of the present invention. -
FIGS. 11A-11D are diagrams showing criteria employed when whether a circular face detection frame or various kinds of information is displayed in front is determined. - A first exemplary embodiment of the present invention will be described.
FIG. 1 is a block diagram showing a configuration of a digital camera according to the first exemplary embodiment of the present invention. The digital camera includes a camera main body 100 and a lens unit 150 that is exchangeably attached to the camera main body 100. The camera main body 100 includes an image pickup element 14 for converting an optical image into electrical signals, and a shutter 12 for controlling an exposure amount of the image pickup element 14. An image processing circuit 16 converts analog signals output from the image pickup element 14 into digital signals (image data) with an analog-to-digital (A/D) converter. The image processing circuit 16 also performs various kinds of processing, such as pixel interpolation processing and color conversion processing, on image data supplied from a memory control circuit. Additionally, the image processing circuit 16 performs predetermined calculation processing using image data. On the basis of the acquired calculation result, a system control circuit 50 controls a shutter control unit, a focus control unit, and an aperture control unit to perform auto focus (AF) processing, auto exposure (AE) processing, and pre-flash (EF) processing. Furthermore, the image processing circuit 16 performs auto white balance (AWB) processing based on the acquired calculation result. The camera main body 100 also includes an image display memory 24 and an image display unit 28, which may be a liquid crystal display. Image data written in the image display memory 24 is displayed on the image display unit 28 after being processed by a digital-to-analog (D/A) converter. - The
system control circuit 50 controls the camera main body 100. A memory 52 stores parameters, variables, and programs for operations of the system control circuit 50. A display unit 54 displays, using characters, images, and audio, an operation state and messages in accordance with execution of the programs by the system control circuit 50. One or more display units 54 are provided at a position that is near an operation unit 68 of the camera main body 100 and is easily recognized by users. The display unit 54 may be constituted by a combination of, for example, a liquid crystal display (LCD), light-emitting diodes (LEDs), and a sound-emitting element. A nonvolatile memory 56 may be an electrically erasable programmable read-only memory (EEPROM). Data is electrically erased from or recorded on the nonvolatile memory 56. - When a shutter button, not shown, is pressed halfway, a shutter switch SW1 62 is turned ON to instruct starting of AF processing, AE processing, AWB processing, and EF processing. When the shutter button is pressed fully, a shutter switch SW2 64 is turned ON to instruct the
system control circuit 50 to start a series of image capturing operations. The series of image capturing operations includes exposure processing, development processing, and recording processing. The operation unit 68 may include various buttons and a touch panel. - A
display control unit 70 displays various kinds of information superimposed on an image displayed on the image display unit 28. The information to be displayed includes setting information indicating settings regarding the white balance and the recording image quality, a histogram indicating luminance distribution of a captured live view image, status information indicating a remaining battery level and the number of capturable images, and warning information indicating that the current state is not suitable for use. These various kinds of information are displayed at predetermined fixed positions on the image display unit 28. - The
display control unit 70 also displays an AF frame specifying an area of a displayed image subjected to AF processing, an enlargement frame specifying an area to be enlarged, and a face detection frame informing users of a detected face of a subject. The display positions of these frames are not fixed and can be moved to given positions through user operations. - Furthermore, when the position of the displayed information overlaps the position of the displayed frame, the
display control unit 70 performs a control operation so that either the information or the frame is displayed preferentially in accordance with the overlapping state. - A
face detecting unit 82 performs a predetermined face detecting operation on image data supplied from the image processing circuit 16 or image data supplied from the memory control circuit. Interfaces and connectors connect the camera main body 100 to recording media, such as a memory card and a hard disk. A communication unit 110 has various communication functions, such as communication through RS-232C (recommended standard 232 version C), USB, IEEE 1394, P1284, SCSI, modem, LAN, and wireless communication. A connector 112 connects the camera main body 100 with other apparatuses through the communication unit 110. The connector 112 may be an antenna when wireless communication is carried out. -
FIG. 2 is a back view of the digital camera according to the exemplary embodiment. FIG. 2 shows a state where an image formed on the image pickup element 14 is displayed on the image display unit 28. A shooting operation performed with the image formed on the image pickup element 14 being displayed on the image display unit 28 is referred to as live view shooting. - A
back face 200 of the camera main body 100 includes a liquid crystal display 201, which corresponds to the image display unit 28 shown in FIG. 1. Various buttons 210-220 correspond to the operation unit 68 shown in FIG. 1. An optical finder 240 is also included. - An image formed on the
image pickup element 14 through a lens is displayed on the liquid crystal display 201 in real time. Various kinds of information and various frames can be superimposed on the displayed image. - Referring to
FIG. 2, setting information 221 regarding image capturing functions, warning information 223, and a histogram 224 are displayed as the various kinds of information. An AF frame 222 indicating an area subjected to the auto focus processing is also displayed. The various kinds of information will be described in detail later with reference to FIG. 3, whereas the AF frame 222 will be described in detail later with reference to FIG. 6. The display control unit 70 controls display of the various kinds of information and the AF frame 222. - Various kinds of information displayed on the
liquid crystal display 201 during live view shooting will now be described using FIG. 3. - Pieces of information 301-304 regarding image capturing functions are displayed in accordance with the settings made by a user. The
information 301 is white balance setting information. The information 302 is recording image quality setting information. The information 303 is setting information regarding a photometry method. The information 304 is color space setting information. The above-described setting information is only an example and other kinds of information regarding image capturing functions may be displayed. -
Warning information 310 indicates that there is no available space in a recording medium inserted into the image capturing apparatus. A histogram 320 shows luminance distribution of a live view image. - The various kinds of information are superimposed on images displayed on the
image display unit 28 during live view shooting. At this time, semitransparent processing may be performed so that the image displayed under the various kinds of information can be seen therethrough. When the various kinds of information are displayed preferentially over a frame, such as an AF frame, the semitransparent processing is performed on the various kinds of information so that the image displayed under the various kinds of information can be seen therethrough but the frame is not. -
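The semitransparent processing described above can be sketched as simple alpha blending; the function name and the default weight are assumptions of this sketch, not values taken from the application.

```python
def blend(image_px, info_px, alpha=0.5):
    """Semitransparent overlay of an information pixel on a live view pixel.
    Pixels are (r, g, b) tuples with 0-255 components; alpha is the weight
    of the information layer."""
    return tuple(round(alpha * i + (1 - alpha) * p)
                 for i, p in zip(info_px, image_px))
```

When the information is preferred over a frame, the frame would simply not be drawn beneath the blended information area, so only the live view image shows through.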
FIGS. 4A and 4B are diagrams showing switching of kinds of information displayed on the liquid crystal display 201. Although various kinds of information are displayed on the liquid crystal display 201 shown in FIG. 3, the kinds of displayed information can be switched by operating the INFO button 218, shown in FIG. 2, for instructing switching of display. FIG. 4A shows a state where only the histogram 320 is hidden from the state shown in FIG. 3, whereas FIG. 4B shows a state where the setting information 301-304 regarding image capturing functions, the warning information 310, and the histogram 320 are hidden from the state shown in FIG. 3. -
FIG. 5 is a diagram showing a screen on which settings regarding image capturing functions are changed. As one example, FIG. 5 shows a screen on which a white balance setting is changed. - In response to pressing of the
menu button 210, shown in FIG. 2, for displaying menu items, the liquid crystal display 201 switches the screen shown in FIG. 3 into the screen shown in FIG. 5. A selection frame 501 is for selecting a white balance setting. - If a user presses the
SET button 217, shown in FIG. 2, after operating the arrow keys 213-216 shown in FIG. 2 to move the selection frame 501, the setting selected with the selection frame 501 is reflected in the digital camera and the displayed screen is switched back to the screen shown in FIG. 3. At this time, the white balance setting information displayed on the image display unit 28 is also changed to the selected setting. -
FIG. 6 is a diagram illustrating an AF frame displayed during live view shooting. - An
AF frame 601 is similar to the AF frame 222. The AF frame 601 may be drawn by a solid line or a broken line in any given color. By operating the arrow keys 213-216 shown in FIG. 2, a user can freely move the AF frame 601 within the screen. The size of the AF frame 601 can also be changed freely: in response to pressing of the enlarge button 219 shown in FIG. 2, the AF frame 601 is enlarged, and in response to pressing of the reduce button 220 shown in FIG. 2, the AF frame 601 is reduced. The inside of the AF frame 601 is transparent so that a user can check the displayed image. - The AF processing according to a TVAF method is performed during live view shooting. In the TVAF method, an AF evaluation value indicating sharpness of an image is calculated on the basis of a video signal of an image resulting from photoelectric conversion of an image pickup element. A focus lens is then driven so that the maximum AF evaluation value is obtained. In this manner, the focus is adjusted. The AF evaluation value is calculated on the basis of high-frequency components of the video signal extracted by a bandpass filter. Generally, the position of the focus lens that gives the maximum AF evaluation value corresponds to a focal point. When the AF frame is displayed on the
image display unit 28, AF processing is performed on the area enclosed by the AF frame. - A setting for displaying a movable AF frame on the
image display unit 28 can be selected from an AF frame setting of the menu items. Additionally, for example, a setting for fixing the AF-processing target area at a predetermined area at the center of the image display unit 28 and a setting for automatically selecting the AF-processing target area from a plurality of predetermined areas of the image display unit 28 in accordance with an image capturing state can be selected in the AF frame setting. -
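The TVAF evaluation described above, scoring the high-frequency content inside the AF frame, can be sketched as follows. A sum of squared horizontal differences stands in for the bandpass filter; that substitution, and all names, are assumptions of this sketch.

```python
def af_evaluation(luma, frame):
    """luma: 2-D list of luminance samples; frame: (x0, y0, x1, y1).
    Sums squared horizontal differences inside the AF frame as a crude
    stand-in for bandpass-filtered high-frequency energy; a sharper
    (better focused) image yields a larger value."""
    x0, y0, x1, y1 = frame
    total = 0
    for y in range(y0, y1):
        row = luma[y]
        for x in range(x0, x1 - 1):
            d = row[x + 1] - row[x]
            total += d * d
    return total
```

A focus lens sweep would then keep the lens position that maximizes this value for the area enclosed by the AF frame.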
FIGS. 7A-7H are diagrams showing criteria employed when the display control unit 70 determines an object, i.e., an AF frame or various kinds of information, to be preferentially displayed in accordance with an overlapping state of the various kinds of information and the AF frame and controls display of the information and the frame. Hereinafter, displaying an object in front corresponds to preferentially displaying the object. - In the case where the displayed various kinds of information overlap the AF frame, the
display control unit 70 displays the various kinds of information in front when the size of the AF frame can be estimated even if the various kinds of information are displayed in front of the AF frame. If estimation of the size of the AF frame is difficult, the display control unit 70 displays the AF frame in front. -
FIGS. 7A, 7B, and 7C show cases where it is determined that the AF frame is displayed in front of the various kinds of information. -
FIGS. 7D, 7E, and 7F show cases where it is determined that the various kinds of information are displayed in front of the AF frame. -
FIG. 7G shows a case where an object displayed in front is changed from the AF frame to the various kinds of information in response to a change in the size of the AF frame. - Referring to
FIG. 7A, an entire AF frame 802 overlaps a histogram 801. In this case, when the histogram 801 is displayed in front of the AF frame 802, recognition of the AF frame 802 becomes difficult. Accordingly, the display control unit 70 performs a control operation to display the AF frame 802 in front of the histogram 801. - Referring to
FIG. 7B, since an entire side 811 of the AF frame overlaps a histogram 816 though parts of the other sides and an entire side 813 can be seen, estimation of the size of the AF frame is difficult if the histogram 816 is displayed in front. Accordingly, the display control unit 70 performs a control operation to display the AF frame in front of the histogram 816. - Referring to
FIG. 7C, an entire side 821 of the AF frame overlaps white balance setting information 825 and an entire side 823 of the AF frame also overlaps color space setting information 826. On the other hand, the remaining sides do not overlap the displayed white balance setting information 825 or the displayed color space setting information 826. In this case, if the white balance setting information 825 and the color space setting information 826 are displayed in front, estimation of the size of the AF frame is difficult. Accordingly, the display control unit 70 performs a control operation to display the AF frame in front of the setting information. - Referring to
FIG. 7D, parts of some sides of the AF frame overlap a histogram 835 but the remaining entire sides can be seen. In this case, since the size of the AF frame can be estimated even if the histogram 835 is displayed in front, the display control unit 70 performs a control operation to display the histogram 835 in front of the AF frame. - Referring to
FIG. 7E, a part of a side 841 of the AF frame overlaps white balance setting information 840. In addition, a part of a side 843 of the AF frame overlaps color space setting information 848. A part of a side 844 of the AF frame overlaps the white balance setting information 840 and the color space setting information 848. As described above, although there are areas where sides of the AF frame overlap the setting information, parts of those sides and an entire side 842 can be seen. In this case, since the size of the AF frame can be estimated even if the white balance setting information 840 and the color space setting information 848 are displayed in front, the display control unit 70 performs a control operation to display the setting information in front of the AF frame. - Referring to
FIG. 7F, a part of a side 854 of the AF frame overlaps white balance setting information 855 but the remaining entire sides can be seen. In this case, since the size of the AF frame can be estimated even if the white balance setting information 855 is displayed in front, the display control unit 70 performs a control operation to display the setting information 855 in front of the AF frame. - Referring to
FIG. 7G, when the AF frame is small, an entire upper side of an AF frame 861 overlaps white balance setting information 865. Thus, estimation of the size of the AF frame is difficult if the white balance setting information 865 is displayed in front. Accordingly, the display control unit 70 performs a control operation to display the AF frame in front of the setting information. - When the AF frame is enlarged by a user operation, only a part of the upper side of the AF frame overlaps the white
balance setting information 865. Since the size of the AF frame can be estimated due to enlargement of the AF frame even if the white balance setting information 865 is displayed in front, the display control unit 70 performs a control operation to display the setting information in front of the AF frame. - When estimation of the size of the AF frame becomes difficult in response to reduction of the AF frame, the
display control unit 70 performs a control operation so that the object displayed in front is changed from the setting information to the AF frame. - As described above, when at least one entire side of the AF frame overlaps the displayed various kinds of information, estimation of the size of the AF frame is difficult if the various kinds of information are displayed in front of the AF frame. Thus, the AF frame is displayed in front of the various kinds of information.
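The "at least one entire side hidden" criterion used in FIGS. 7A-7G can be sketched with a sampling-based coverage test. The function names, the sample count, and the point-in-rectangle test are assumptions of this illustrative sketch.

```python
def side_fully_covered(p, q, rects, samples=32):
    """True if every sampled point on the side from p to q lies inside one
    of the information rectangles (x0, y0, x1, y1)."""
    for i in range(samples + 1):
        t = i / samples
        x = p[0] + t * (q[0] - p[0])
        y = p[1] + t * (q[1] - p[1])
        if not any(x0 <= x <= x1 and y0 <= y <= y1 for (x0, y0, x1, y1) in rects):
            return False
    return True

def af_frame_in_front(frame, rects):
    """frame: (x0, y0, x1, y1). The AF frame is drawn in front when at
    least one entire side would be hidden by the information."""
    x0, y0, x1, y1 = frame
    corners = [(x0, y0), (x1, y0), (x1, y1), (x0, y1)]
    sides = list(zip(corners, corners[1:] + corners[:1]))
    return any(side_fully_covered(p, q, rects) for p, q in sides)
```

A frame whose bottom side is completely hidden by an information panel would be drawn in front; a frame with only partial overlap on each side would not.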
- In addition, as shown in
FIG. 7G, when the size of the AF frame is shown by displaying only both ends of each side of the AF frame, estimation of the size of the AF frame may be difficult if only a part of the side of the AF frame overlaps the various kinds of information. For example, this case corresponds to a case where both ends of a side of the AF frame overlap the various kinds of information. In such a case, the display control unit 70 performs a control operation to display the AF frame in front of the various kinds of information. - Additionally, when a gap between the white
balance setting information 840 and the color space setting information 848 is smaller in FIG. 7E, estimation of the size of the AF frame may be difficult even if the part of the side 844 is displayed. Accordingly, when both ends of at least one side of the AF frame overlap the various kinds of information, the display control unit 70 may perform a control operation to display the AF frame in front of the various kinds of information as shown in FIG. 7H.
FIG. 8 . Thedisplay control unit 70 performs processing at steps S102 to S108. - First, in response to pressing of the
live view button 211, shown inFIG. 2 , for instructing starting or termination of live view shooting, thesystem control circuit 50 starts live view shooting (S101). - The
display control unit 70 determines whether a setting for displaying various kinds of information, such as setting information regarding image capturing functions and a histogram, on theimage display unit 28 is selected (S102). The kinds of the information to be displayed on theimage display unit 28 can be changed by operating theINFO button 218. No information may be displayed on theimage display unit 28. - If the setting for displaying the various kinds of information on the
image display unit 28 is selected (YES at S102), thedisplay control unit 70 determines whether to display an AF frame on the image display unit 28 (S103). When the setting for displaying the AF frame on theimage display unit 28 is selected (YES at S103), the AF frame may be displayed after a user selects the setting for displaying the AF frame and specifies a position of the AF frame. Alternatively, the AF frame may be displayed at a predetermined initial position in response to the user's selection of the setting for displaying the AF frame. - When the setting for displaying the AF frame on the
image display unit 28 is selected (YES at S103), thedisplay control unit 70 determines whether the size of the AF frame can be estimated based on the position and size of the AF frame and the position of the various kinds of information even if the various kinds of information are displayed in front of the AF frame (S104). This determination regarding whether the size of the AF frame can be estimated even if the various kinds of information are displayed in front of the AF frame is performed based on the criteria shown inFIGS. 7A-7H . - When it is determined that estimation of the size of the AF frame is difficult if the various kinds of information are displayed in front of the AF frame (NO at S104), the
display control unit 70 displays the AF frame in front of the various kinds of information (S105). - When the size of the AF frame can be estimated even if the various kinds of information are displayed in front of the AF frame (YES at S104), the
display control unit 70 displays the various kinds of information in front of the AF frame (S106). - The
display control unit 70 then determines whether the position or the size of the AF frame has been changed (S107). If the position or the size of the AF frame has been changed (YES at S107), the process returns to STEP S104. If the position or the size of the AF frame has not been changed (NO at S107), the process proceeds to STEP S108. If the displayed AF frame is hidden, the process also proceeds to STEP S108. Additionally, if the AF frame, which has been hidden, is newly displayed, the process returns to STEP S104. - The
display control unit 70 then determines whether the setting for displaying the various kinds of information on theimage display unit 28 has been changed (S108). The kinds of information to be displayed on theimage display unit 28 can be changed by operating theINFO button 218. An AF frame, which has not been overlapping the various kinds of information, may overlap the newly displayed information. Conversely, information, which has been overlapping the AF frame, may be hidden. The processing at STEP S108 is performed for such cases. - The
system control circuit 50 then determines whether termination of live view shooting is instructed with the live view button 211 (S109). If the termination is instructed (YES at S109), thesystem control circuit 50 terminates the live view shooting operation (S110). If the termination is not instructed (NO at S109), the process returns to STEP S107. - As described above, an object displayed in front, i.e., an AF frame or various kinds of information, is controlled in accordance with an overlapping state of the AF frame and the various kinds of information, thereby allowing a user to optimally recognize both of the AF frame and the various kinds of information, which thus results in an improvement of a user interface.
- The display control operation may be performed in the similar manner on an enlargement frame specifying an area of a displayed image to be enlarged as well as the AF frame.
- A second exemplary embodiment of the present invention will now be described. A digital camera according to the second exemplary embodiment of the present invention has a configuration shown in the block diagram of
FIG. 1 . Since the configuration is similar to that employed in the first exemplary embodiment, a description thereof is omitted. - Although a method for controlling display of an AF frame and various kinds of information has been described in the first exemplary embodiment, a method for controlling display of a face detection frame and various kinds of information will be described in the second exemplary embodiment.
-
FIG. 9 is a diagram showing an image displayed on the liquid crystal display 201 during live view shooting. Each of the face detection frames encloses an area recognized by the face detecting unit 82 to include a face. When a plurality of faces of subjects are detected, a plurality of face detection frames are displayed to enclose areas recognized to include the faces. The face detection frames are categorized into a main frame and a sub frame based on the size of an area recognized to include a face and the position of the area in an image. The system control circuit 50 determines whether a face detection frame is a main frame or a sub frame on the basis of a detection result of the face detecting unit 82. The image display unit 28 displays a main frame for a face determined as a main subject and displays a sub frame for a face determined as a subject other than the main subject. A user may select a face detection frame serving as the main frame by operating the operation unit 68. Referring to FIG. 9, the frame 1001 is a main frame, whereas the other frames are sub frames. - Referring to
FIG. 9, an entire upper side of the face detection frame 1001 overlaps white balance setting information 1006. Since the face detection frame 1001 is a main frame, the display control unit 70 performs a control operation to display the face detection frame 1001 in front of the setting information 1006. The object to be displayed in front, i.e., the main frame or the various kinds of information, is determined on the basis of the criteria shown in FIGS. 7A-7H.
- Although an entire upper side of the
face detection frame 1003 overlaps a histogram 1005, the display control unit 70 performs a control operation to display the histogram 1005 in front of the face detection frame 1003 since the face detection frame 1003 is a sub frame.
- If the
face detection frame 1003 is selected as the main frame instead, so that the frame 1001 becomes a sub frame, the display control unit 70 performs a control operation to display the setting information 1006 in front of the face detection frame 1001 and to display the face detection frame 1003 in front of the histogram 1005.
- Processing for determining the object to be displayed in front, i.e., a face detection frame or the various kinds of information, will now be described using the flowchart shown in
FIG. 10. Steps similar to those included in the flowchart shown in FIG. 8 are designated by similar or like references and a description thereof is omitted. The display control unit 70 performs the processing at steps S203 to S209.
- If a setting for displaying various kinds of information on the
image display unit 28 during live view shooting is selected (YES at S102), the display control unit 70 determines whether to display a face detection frame on the image display unit 28 (S203). When a face detection mode is selected by a user, the face detection frame is displayed to enclose a face of a subject automatically detected in a captured live view image. When a plurality of faces of subjects are detected, a plurality of face detection frames are displayed. The face detection frames are categorized into a main frame and a sub frame based on the size of an area determined to include a face and the position of the area.
- If a setting for displaying the face detection frame on the
image display unit 28 is selected (YES at S203), the display control unit 70 determines whether the face detection frame is a main frame (S204). If the face detection frame is determined to be a main frame (YES at S204), the process proceeds to STEP S205. If the face detection frame is determined to be a sub frame (NO at S204), the display control unit 70 displays the various kinds of information in front of the face detection frame regardless of the overlapping state of the face detection frame and the various kinds of information (S207).
- If the face detection frame is a main frame (YES at S204), the
display control unit 70 determines whether the size of the face detection frame can be estimated on the basis of the position and size of the face detection frame and the position of the various kinds of information even if the various kinds of information are displayed in front of the face detection frame (S205). Whether the size of the face detection frame can be estimated even if the various kinds of information are displayed in front of the face detection frame is determined based on the criteria shown in FIGS. 7A-7H.
- If it is determined that estimation of the size of the face detection frame is difficult when the various kinds of information are displayed in front of the face detection frame (NO at S205), the
display control unit 70 displays the face detection frame in front of the various kinds of information (S206). - If the size of the face detection frame can be estimated when the various kinds of information are displayed in front of the face detection frame (YES at S205), the
display control unit 70 displays the various kinds of information in front of the face detection frame (S207). - The
display control unit 70 then determines whether there is a face detection frame that has not been displayed yet after displaying the face detection frame at STEP S206 or S207 (S208). If there is a face detection frame that has not been displayed yet (YES at S208), the process returns to STEP S204. Otherwise, the process proceeds to STEP S209. - The
display control unit 70 then determines whether the position or the size of the face detection frame has been changed (S209). If the position or the size of the face detection frame has been changed (YES at S209), the process returns to STEP S204. Otherwise, the process proceeds to STEP S108. If the displayed face detection frame is hidden, the process also proceeds to STEP S108. If a face detection frame is newly displayed, the process returns to STEP S204. Additionally, if the position or the size of the face detection frame has not been changed but the face detection frame is switched from the main frame to the sub frame or from the sub frame to the main frame, the process also returns to STEP S204.
- As described above, different control operations are performed in accordance with whether a face detection frame is a main frame or a sub frame, thereby allowing a user to optimally recognize both the face detection frame and the various kinds of information, which results in an improved user interface.
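The per-frame decision at STEPs S204 to S207 can be sketched as follows. This is a hypothetical illustration, not the apparatus's implementation: a sub frame always yields to the information, while a main frame is drawn in front only when the information would make its size impossible to estimate. The dictionary representation of frames and the `size_estimable` callback, which stands in for the FIGS. 7A-7H criteria, are assumptions.

```python
# Hypothetical sketch of STEPs S204-S207, repeated per frame (S208).
# `size_estimable` stands in for the FIG. 7A-7H overlap criteria and is
# an assumption for illustration.

def draw_order(frames, info, size_estimable):
    """Return a list of (frame id, 'frame_front' | 'info_front') decisions."""
    decisions = []
    for frame in frames:                       # S208: repeat for every frame
        if not frame["is_main"]:               # S204: sub frame
            decisions.append((frame["id"], "info_front"))    # S207
        elif size_estimable(frame, info):      # S205: size still estimable?
            decisions.append((frame["id"], "info_front"))    # S207
        else:
            decisions.append((frame["id"], "frame_front"))   # S206
    return decisions

frames = [{"id": 1001, "is_main": True}, {"id": 1003, "is_main": False}]
# Assume the information hides too much of every frame it overlaps.
print(draw_order(frames, info=None, size_estimable=lambda f, i: False))
# -> [(1001, 'frame_front'), (1003, 'info_front')]
```

The loop mirrors the flowchart's structure: only the main frame ever triggers the overlap test, which is why the sub-frame branch short-circuits straight to displaying the information in front.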
- Although the determination of whether a face detection frame is a main frame is performed one by one when a plurality of face detection frames are displayed in the flowchart shown in
FIG. 10, the determination may be performed regarding the plurality of face detection frames at the same time.
- Additionally, when the determination of whether the face detection frame is a main frame is performed one by one and only one main frame is displayed in a single image, this determination may be omitted after one face detection frame is determined as the main frame and the process may then proceed to STEP S207.
- In addition, although
FIG. 10 shows the flowchart of processing performed during live view shooting, the above-described display control operation regarding the face detection frame and the various kinds of information is not limited to the live view shooting operation. For example, this display control operation can be applied to a playback mode, which is started in response to an operation of the play button 212, shown in FIG. 2, for starting playback of images that have been captured and recorded. During playback of the images that have been captured and recorded, the display control operation can be applied to a case where various kinds of information and face detection frames are displayed at the same time in order to confirm the various kinds of information employed when the images were captured or a face of a main subject.
- A third exemplary embodiment of the present invention will now be described. A digital camera according to the third exemplary embodiment of the present invention has a configuration shown in the block diagram of
FIG. 1. Since the configuration is similar to that employed in the first exemplary embodiment, a description thereof is omitted.
- Although a rectangular face detection frame is used in the second exemplary embodiment, a circular face detection frame is used in the third exemplary embodiment. When the circular face detection frame is used, the criteria for determining the object to be displayed in front, i.e., the face detection frame or the various kinds of information, differ from the criteria employed when a rectangular face detection frame is used.
-
FIGS. 11A-11D are diagrams showing the criteria employed when the display control unit 70 determines which to display in front, a face detection frame or various kinds of information, in accordance with an overlapping state of the face detection frame and the various kinds of information.
-
FIG. 11A shows a case where it is determined that the face detection frame is displayed in front of the various kinds of information. -
FIGS. 11B and 11C show cases where it is determined that the various kinds of information are displayed in front of the face detection frame. - Referring to
FIG. 11A, half or more of the circumference of a face detection frame 1201 continuously overlaps a histogram 1202. In this case, estimation of the size of the face detection frame 1201 is difficult if the histogram 1202 is displayed in front of the face detection frame 1201. Accordingly, the display control unit 70 performs a control operation to display the face detection frame 1201 in front of the histogram 1202.
- Referring to
FIG. 11B, less than half of the circumference of a face detection frame 1211 continuously overlaps a histogram 1212. In this case, the size of the face detection frame 1211 can be estimated even if the histogram 1212 is displayed in front of the face detection frame 1211. Accordingly, the display control unit 70 performs a control operation to display the histogram 1212 in front of the face detection frame 1211.
- Referring to
FIG. 11C, half or more of the total circumference of the face detection frame 1221 overlaps white balance setting information 1223 and color space setting information 1224. However, since the continuously overlapping part is less than a half, the size of the face detection frame 1221 can be estimated even if the white balance setting information 1223 and the color space setting information 1224 are displayed in front of the face detection frame 1221. Accordingly, the display control unit 70 performs a control operation to display the various kinds of information in front of the face detection frame 1221.
- When a gap between the various kinds of information is smaller in
FIG. 11C, estimation of the size of the face detection frame 1221 may be difficult even though the part continuously overlapping the various kinds of information is less than a half. Accordingly, when the total part overlapping the various kinds of information is equal to or more than a half of the circumference, a control operation may be performed so that the face detection frame 1221 is displayed in front of the histogram 1222, as shown in FIG. 11D.
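The two circular-frame criteria, a continuous overlap of half or more of the circumference, or a total overlap of half or more of the circumference, can be sketched as below. This is a hypothetical illustration under stated assumptions: the circumference is sampled at discrete points, information items are axis-aligned rectangles, and the sampling resolution is arbitrary; none of this is part of the disclosed apparatus.

```python
import math

# Hypothetical sketch of the circular-frame criteria: sample points on the
# frame's circumference, measure (a) the longest continuous run and (b) the
# total number of points hidden by information boxes, and draw the frame
# in front when either reaches half the circumference.

def hidden_flags(cx, cy, r, info_boxes, n=360):
    """For n evenly spaced circumference points, flag whether each point
    is hidden by any information box (left, top, right, bottom)."""
    flags = []
    for k in range(n):
        a = 2 * math.pi * k / n
        x, y = cx + r * math.cos(a), cy + r * math.sin(a)
        flags.append(any(l <= x <= rt and t <= y <= b
                         for (l, t, rt, b) in info_boxes))
    return flags

def longest_run(flags):
    """Longest continuous hidden arc, treating the circle as wrapping."""
    if all(flags):
        return len(flags)
    doubled, best, run = flags + flags, 0, 0
    for f in doubled:
        run = run + 1 if f else 0
        best = max(best, run)
    return min(best, len(flags))

def circular_frame_in_front(cx, cy, r, info_boxes, n=360):
    flags = hidden_flags(cx, cy, r, info_boxes, n)
    return (longest_run(flags) >= n / 2   # continuous half, as in FIG. 11A
            or sum(flags) >= n / 2)       # total half, as in FIG. 11D

# A box hiding the entire right half of the circle -> frame drawn in front.
print(circular_frame_in_front(0, 0, 10, [(0, -11, 11, 11)]))  # True
```

Doubling the flag list is a standard trick for finding the longest run on a wrapped (circular) sequence; the `min(..., len(flags))` guard keeps the result from exceeding one full circumference.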
- In addition to the face detection frame, the display control operation may be performed using the similar conditions when an AF frame or an enlargement frame specifying an area of a displayed image to be enlarged is in a circular shape.
- Processing for determining an object to be displayed in front, i.e., the face detection frame or the various kinds of information, according to this exemplary embodiment is similar to that shown in the flowchart of
FIG. 10 . Since only the criteria used at STEP S205 differ, a description thereof is omitted. - As described above, an overlapping state of a face detection frame and various kinds of information is determined even if the face detection frame is circular and an object to be displayed in front is controlled based on the determination, thereby allowing a user to optimally recognize both of the face detection frame and the various kinds of information, which thus results in an improvement of a user interface.
- Although the above-described three exemplary embodiments are described regarding a digital camera having an exchangeable lens unit, a lens-integrated digital camera having a live-view shooting function may be employed.
- In addition, for example, different kinds of frames, such as an AF frame and a face detection frame, may be displayed on the
image display unit 28 at the same time. - Additionally, when one frame overlaps another frame in the case where a plurality of frames are displayed on the
image display unit 28, a smaller frame may be displayed in front. Alternatively, a frame to be displayed in front may be determined in accordance with the kinds of frames. For example, when the AF frame overlaps the face detection frame, the AF frame, which is displayed at a position reflecting a user operation, may be displayed in front of the face detection frame since the face detection frame is displayed at a position of a face of a subject. - Furthermore, the shape of frames is not limited to a rectangle or a circle. Various shapes, such as a star shape and a heart shape, may be used. Criteria for determining the object to be displayed in front, i.e., various frames or various kinds of information, may be set in accordance with whether the size of the frames can be estimated even if the various kinds of information are displayed in front of the frames. For example, when a ratio of a part overlapping the various kinds of information to the entire frame is equal to or higher than a predetermined value, the frame may be displayed in front. If the ratio is lower than the predetermined value, the various kinds of information may be displayed in front. Alternatively, when a predetermined part of the frame overlaps the various kinds of information, the frame may be displayed in front. Otherwise, the various kinds of information may be displayed in front. The predetermined value and the predetermined part may be set in accordance with the shape of the frame.
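The shape-independent ratio rule described above can be sketched as follows. This is a hypothetical illustration, not the apparatus's implementation: the frame outline is given as a list of sample points so the same rule covers rectangles, circles, stars, or hearts, and the 0.5 default threshold merely stands in for the per-shape "predetermined value".

```python
# Hypothetical sketch: draw the frame in front when the fraction of its
# outline hidden by information boxes reaches a per-shape threshold.
# Outline points and the 0.5 default threshold are assumptions.

def frame_in_front(outline_points, info_boxes, threshold=0.5):
    """Ratio test over sampled outline points of an arbitrary frame shape."""
    hidden = sum(
        1 for (x, y) in outline_points
        if any(l <= x <= r and t <= y <= b for (l, t, r, b) in info_boxes)
    )
    return hidden / len(outline_points) >= threshold

# Outline of a tiny square frame, sampled at its four corners only.
square = [(0, 0), (10, 0), (10, 10), (0, 10)]
print(frame_in_front(square, [(-1, -1, 11, 5)]))  # True: 2 of 4 points hidden
```

Because the rule only counts hidden outline samples, adapting it to a star- or heart-shaped frame is just a matter of supplying that shape's outline points and an appropriate threshold.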
- Moreover, when the frame overlaps the various kinds of information in a configuration where semitransparent processing is performed on the displayed various kinds of information so that an image displayed under the various kinds of information can be seen, the various kinds of information may be displayed in front of the frame. Whether to perform semitransparent processing may be determined on the basis of the overlapping state.
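The semitransparency variation above can be sketched as a simple style switch. This is a hypothetical illustration: the `overlap_hides_frame` callback stands in for the overlap criteria, and the alpha values are assumptions, not values from the disclosure.

```python
# Hypothetical sketch: overlapping information stays in front of the frame,
# but is rendered semitransparent when the overlap would otherwise hide
# the frame, so the frame remains visible through it.

def info_style(frame, info, overlap_hides_frame):
    """Return how the information box should be rendered over the frame."""
    if overlap_hides_frame(frame, info):
        # The frame would be unrecognizable: let it show through.
        return {"z": "front", "alpha": 0.5}
    return {"z": "front", "alpha": 1.0}       # opaque information

print(info_style("frame", "info", lambda f, i: True))
# -> {'z': 'front', 'alpha': 0.5}
```

Note that, unlike the z-order switching of the earlier embodiments, here the information is always in front and only its opacity changes with the overlapping state.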
- While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all modifications and equivalent structures and functions.
Claims (20)
1. An image capturing apparatus comprising:
an image capturing unit configured to capture an image of a subject to acquire image data;
a display unit configured to display the image of the subject based on the image data acquired by the image capturing unit; and
a display control unit configured to perform a control operation so that information displayed at a predetermined position and a movably displayed frame are superimposed on the image of the subject displayed by the display unit, wherein the display control unit switches whether to display preferentially either the information or the frame in accordance with an overlapping state of the information and the frame when the information overlaps the frame.
2. The apparatus according to claim 1 , wherein the display control unit performs the control operation so that the frame is displayed more preferentially than the information when a predetermined part of the frame overlaps the information.
3. The apparatus according to claim 1 , wherein the frame is a rectangular frame that encloses a part of the image displayed by the display unit, and
wherein the display control unit performs the control operation so that the frame is displayed more preferentially than the information when at least one entire side of the frame overlaps the information.
4. The apparatus according to claim 1 , wherein the frame is a rectangular frame that encloses a part of the image displayed by the display unit, and
wherein the display control unit performs the control operation so that the frame is displayed more preferentially than the information when both ends of at least one side of the frame overlap the information.
5. The apparatus according to claim 1 , wherein the frame is formed by only both end parts of each side of a rectangle that encloses a part of the image displayed by the display unit, and
wherein the display control unit performs the control operation so that the frame is displayed more preferentially than the information when the both ends of at least one side of the frame overlap the information.
6. The apparatus according to claim 1 , wherein the display control unit performs the control operation so that the frame is displayed more preferentially than the information when a ratio of a part of the frame overlapping the information to the entire frame is equal to or higher than a predetermined value.
7. The apparatus according to claim 1 , wherein the frame is a circular frame that encloses a part of the image displayed by the display unit, and
wherein the display control unit performs the control operation so that the frame is displayed more preferentially than the information when a half or more of the circumference of the frame overlaps the information.
8. The apparatus according to claim 1 , wherein the frame is a circular frame that encloses a part of the image displayed by the display unit, and
wherein the display control unit performs the control operation so that the frame is displayed more preferentially than the information when a half or more of the circumference of the frame continuously overlaps the information.
9. The apparatus according to claim 1 , further comprising:
a face detecting unit configured to detect faces of the subjects based on the image data acquired by the image capturing unit; and
a selecting unit configured to select a face of a main subject from the faces of the subjects detected by the face detecting unit,
wherein, when a plurality of the frames that enclose the faces of the subjects detected by the face detecting unit are displayed, the display control unit switches, in accordance with an overlapping state of the frame and the information, an object to be preferentially displayed between the information and the frame regarding the frame that encloses the face of the main subject selected by the selecting unit.
10. The apparatus according to claim 9 , wherein the display control unit performs the control operation so that the information is displayed preferentially when the information overlaps the frame that encloses a face of a subject other than the main subject.
11. The apparatus according to claim 1 , wherein a plurality of the frames are displayed on the display unit, and
wherein the display control unit performs the control operation so that a smaller frame is displayed preferentially when two of the plurality of frames overlap each other.
12. The apparatus according to claim 1 , wherein the display control unit changes at least one of the position and the size of the frame in accordance with a user operation.
13. An image capturing apparatus comprising:
an image capturing unit configured to capture an image of a subject to acquire image data;
a display unit configured to display the image of the subject based on the image data acquired by the image capturing unit; and
a display control unit configured to perform a control operation so that a movably displayed frame and information that is displayed at a predetermined position and is semitransparent so that the image of the subject can be seen therethrough are superimposed on the image of the subject displayed by the display unit,
wherein the display control unit performs the control operation so that the information is displayed more preferentially than the frame when the information overlaps the frame and switches whether to make the information semitransparent so that the frame is seen therethrough in accordance with an overlapping state of the information and the frame.
14. An image capturing apparatus comprising:
an image capturing unit configured to capture an image of a subject to acquire image data;
a display unit configured to display the image of the subject based on the image data acquired by the image capturing unit; and
a display control unit configured to perform a control operation so that information displayed at a predetermined position and a movably displayed frame are superimposed on the image of the subject displayed by the display unit,
wherein the display control unit switches whether to display the entire frame or a part of the frame according to the position to display the frame.
15. The apparatus according to claim 14 , wherein the frame encloses a part of the image displayed by the display unit.
16. The apparatus according to claim 14 , wherein the frame is displayed based on a result of a face detection operation.
17. The apparatus according to claim 14 , wherein the frame is formed by only both end parts of each side of a rectangle that encloses a part of the image displayed by the display unit.
18. A display control method comprising:
displaying an image of a subject on a display unit based on image data acquired by an image capturing unit; and
controlling information displayed at a predetermined position and a movably displayed frame to be superimposed on the image of the subject displayed on the display unit,
wherein, in the controlling step, it is switched whether to display preferentially either the information or the frame in accordance with an overlapping state of the information and the frame when the information overlaps the frame.
19. A display control method comprising:
displaying an image of a subject on a display unit based on image data acquired by an image capturing unit; and
controlling a movably displayed frame and information, which is displayed at a predetermined position and is semitransparent so that the image of the subject can be seen therethrough, to be superimposed on the image of the subject displayed on the display unit,
wherein, in the controlling step, the information is displayed more preferentially than the frame when the information overlaps the frame and whether to make the information semitransparent so that the frame is seen therethrough is switched in accordance with an overlapping state of the information and the frame.
20. A display control method comprising:
displaying an image of a subject on a display unit based on image data acquired by an image capturing unit; and
controlling information displayed at a predetermined position and a movably displayed frame to be superimposed on the image of the subject displayed on the display unit,
wherein, in the controlling step, it is switched whether to display the entire frame or a part of the frame according to the position to display the frame.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/408,975 US20120162480A1 (en) | 2008-03-11 | 2012-02-29 | Image capturing apparatus and display control method |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008-061289 | 2008-03-11 | ||
JP2008061289A JP5116514B2 (en) | 2008-03-11 | 2008-03-11 | Imaging apparatus and display control method |
US12/402,376 US8773567B2 (en) | 2008-03-11 | 2009-03-11 | Image capturing apparatus having display control of information and frames on displayed images and display control method |
US13/408,975 US20120162480A1 (en) | 2008-03-11 | 2012-02-29 | Image capturing apparatus and display control method |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/402,376 Continuation US8773567B2 (en) | 2008-03-11 | 2009-03-11 | Image capturing apparatus having display control of information and frames on displayed images and display control method |
Publications (1)
Publication Number | Publication Date |
---|---|
US20120162480A1 true US20120162480A1 (en) | 2012-06-28 |
Family
ID=41062619
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/402,376 Expired - Fee Related US8773567B2 (en) | 2008-03-11 | 2009-03-11 | Image capturing apparatus having display control of information and frames on displayed images and display control method |
US13/408,975 Abandoned US20120162480A1 (en) | 2008-03-11 | 2012-02-29 | Image capturing apparatus and display control method |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/402,376 Expired - Fee Related US8773567B2 (en) | 2008-03-11 | 2009-03-11 | Image capturing apparatus having display control of information and frames on displayed images and display control method |
Country Status (2)
Country | Link |
---|---|
US (2) | US8773567B2 (en) |
JP (1) | JP5116514B2 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130316763A1 (en) * | 2012-05-23 | 2013-11-28 | Steven Earl Kader | Method of displaying images while charging a smartphone |
US20180359422A1 (en) * | 2012-11-21 | 2018-12-13 | Canon Kabushiki Kaisha | Transmission apparatus, setting apparatus, transmission method, reception method, and storage medium |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5610738B2 (en) * | 2009-10-28 | 2014-10-22 | キヤノン株式会社 | Display control apparatus, control method, program, and storage medium |
JPWO2011074296A1 (en) * | 2009-12-15 | 2013-04-25 | シャープ株式会社 | Electronics |
JP5561019B2 (en) * | 2010-08-23 | 2014-07-30 | ソニー株式会社 | Imaging apparatus, program, and imaging method |
US9712750B2 (en) | 2012-04-09 | 2017-07-18 | Sony Corporation | Display control device and associated methodology of identifying a subject in an image |
JP5970937B2 (en) * | 2012-04-25 | 2016-08-17 | ソニー株式会社 | Display control apparatus and display control method |
JP6066593B2 (en) * | 2012-06-13 | 2017-01-25 | キヤノン株式会社 | Imaging system and driving method of imaging system |
JP6247527B2 (en) * | 2013-12-24 | 2017-12-13 | キヤノン株式会社 | Imaging apparatus, control method, program, and storage medium |
JP2019198008A (en) * | 2018-05-10 | 2019-11-14 | キヤノン株式会社 | Imaging apparatus, control method of the same, and program |
JPWO2022181055A1 (en) * | 2021-02-26 | 2022-09-01 |
Citations (44)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5469536A (en) * | 1992-02-25 | 1995-11-21 | Imageware Software, Inc. | Image editing system including masking capability |
US5491332A (en) * | 1991-07-11 | 1996-02-13 | A.D.P. Adaptive Visual Perception Ltd. | Transparency viewing apparatus having display and masking functions |
US5515494A (en) * | 1992-12-17 | 1996-05-07 | Seiko Epson Corporation | Graphics control planes for windowing and other display operations |
US5796402A (en) * | 1993-12-03 | 1998-08-18 | Microsoft Corporation | Method and system for aligning windows on a computer screen |
US5805163A (en) * | 1996-04-22 | 1998-09-08 | Ncr Corporation | Darkened transparent window overlapping an opaque window |
US5943050A (en) * | 1994-04-07 | 1999-08-24 | International Business Machines Corporation | Digital image capture control |
US6043817A (en) * | 1995-06-30 | 2000-03-28 | Microsoft Corporation | Method and apparatus for arranging displayed graphical representations on a computer interface |
US6344860B1 (en) * | 1998-11-27 | 2002-02-05 | Seriate Solutions, Inc. | Methods and apparatus for a stereoscopic graphic user interface |
US20020196369A1 (en) * | 2001-06-01 | 2002-12-26 | Peter Rieder | Method and device for displaying at least two images within one combined picture |
US20040233224A1 (en) * | 2000-10-06 | 2004-11-25 | Akio Ohba | Image processor, image processing method, recording medium, computer program and semiconductor device |
US20050175251A1 (en) * | 2004-02-09 | 2005-08-11 | Sanyo Electric Co., Ltd. | Image coding apparatus, image decoding apparatus, image display apparatus and image processing apparatus |
US20050219395A1 (en) * | 2004-03-31 | 2005-10-06 | Fuji Photo Film Co., Ltd. | Digital still camera and method of controlling same |
US20060026535A1 (en) * | 2004-07-30 | 2006-02-02 | Apple Computer Inc. | Mode-based graphical user interfaces for touch sensitive input devices |
US20060078173A1 (en) * | 2004-10-13 | 2006-04-13 | Fuji Photo Film Co., Ltd. | Image processing apparatus, image processing method and image processing program |
US20070008338A1 (en) * | 2005-05-28 | 2007-01-11 | Young-Chan Kim | Display system, display apparatus, and method of controlling video source and display apparatus |
US20070070186A1 (en) * | 2005-06-30 | 2007-03-29 | Sony Corporation | Interactive communication apparatus and connecting method |
WO2007052382A1 (en) * | 2005-11-02 | 2007-05-10 | Matsushita Electric Industrial Co., Ltd. | Display-object penetrating apparatus |
US20070121012A1 (en) * | 2004-02-27 | 2007-05-31 | Yoichi Hida | Information display method and information display device |
US7257777B1 (en) * | 2000-07-19 | 2007-08-14 | International Business Machines Corporation | System and method for automatic control of window viewing |
US20070188646A1 (en) * | 2004-05-13 | 2007-08-16 | Sony Corporation | Imaging device,image display method, and user interface |
US20070229695A1 (en) * | 2006-03-31 | 2007-10-04 | Nikon Corporation | Digital camera |
US20070263997A1 (en) * | 2006-05-10 | 2007-11-15 | Canon Kabushiki Kaisha | Focus adjustment method, focus adjustment apparatus, and control method thereof |
US20080024643A1 (en) * | 2006-07-25 | 2008-01-31 | Fujifilm Corporation | Image-taking apparatus and image display control method |
US20080068487A1 (en) * | 2006-09-14 | 2008-03-20 | Canon Kabushiki Kaisha | Image display apparatus, image capturing apparatus, and image display method |
US7359003B1 (en) * | 2001-11-09 | 2008-04-15 | Synerdyne Corporation | Display, input and form factor for portable instruments |
US20080118156A1 (en) * | 2006-11-21 | 2008-05-22 | Sony Corporation | Imaging apparatus, image processing apparatus, image processing method and computer program |
US20080123953A1 (en) * | 2006-11-29 | 2008-05-29 | Gateway Inc. | Digital camera with histogram zoom |
US20080136958A1 (en) * | 2006-12-11 | 2008-06-12 | Pentax Corporation | Camera having a focus adjusting system and a face recognition function |
US20080240563A1 (en) * | 2007-03-30 | 2008-10-02 | Casio Computer Co., Ltd. | Image pickup apparatus equipped with face-recognition function |
US20090021576A1 (en) * | 2007-07-18 | 2009-01-22 | Samsung Electronics Co., Ltd. | Panoramic image production |
US20090089661A1 (en) * | 2007-10-02 | 2009-04-02 | Canon Kabushiki Kaisha | Information processing apparatus and information processing method |
US20090096810A1 (en) * | 2007-10-11 | 2009-04-16 | Green Brian D | Method for selectively remoting windows |
US20090167633A1 (en) * | 2007-12-31 | 2009-07-02 | Searete Llc | Managing multiple display surfaces |
US20090273667A1 (en) * | 2006-04-11 | 2009-11-05 | Nikon Corporation | Electronic Camera |
US7620905B2 (en) * | 2006-04-14 | 2009-11-17 | International Business Machines Corporation | System and method of windows management |
US20090288036A1 (en) * | 2005-12-22 | 2009-11-19 | Kazuya Osawa | Multi-window display apparatus, multi-window display method, and integrated circuit |
US20100013981A1 (en) * | 2007-07-10 | 2010-01-21 | Canon Kabushiki Kaisha | Focus control apparatus, image sensing apparatus, and control method therefor |
US20100142842A1 (en) * | 2008-12-04 | 2010-06-10 | Harris Corporation | Image processing device for determining cut lines and related methods |
US7893950B2 (en) * | 1999-12-22 | 2011-02-22 | Adobe Systems Incorporated | Color compositing using transparency groups |
US8179338B2 (en) * | 1999-08-19 | 2012-05-15 | Igt | Method and system for displaying information |
US8191003B2 (en) * | 2007-02-14 | 2012-05-29 | International Business Machines Corporation | Managing transparent windows |
US8271425B2 (en) * | 2005-07-21 | 2012-09-18 | Konica Minolta Business Technologies, Inc. | Image processing system and image processing device implementing a specific image processing function for each user as well as a computer program product for the same |
US8386956B2 (en) * | 2003-06-20 | 2013-02-26 | Apple Inc. | Computer interface having a virtual single-layer mode for viewing overlapping objects |
US8436873B2 (en) * | 2005-10-05 | 2013-05-07 | Pure Depth Limited | Method of manipulating visibility of images on a volumetric display |
Family Cites Families (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CA2109681C (en) * | 1993-03-10 | 1998-08-25 | Donald Edgar Blahut | Method and apparatus for the coding and display of overlapping windows with transparency |
KR100228618B1 (en) * | 1994-05-31 | 1999-11-01 | 아끼구사 나오유끼 | Method and apparatus for assigning temporary and true labels to digital image |
US6654014B2 (en) * | 1995-04-20 | 2003-11-25 | Yoshinori Endo | Bird's-eye view forming method, map display apparatus and navigation system |
DE69632808T2 (en) * | 1995-04-28 | 2005-06-30 | Koninklijke Philips Electronics N.V. | WIRELESS COMMUNICATION SYSTEM FOR SECURE COMMUNICATION BETWEEN DIFFERENT DEVICES |
JPH0927921A (en) * | 1995-07-13 | 1997-01-28 | Canon Inc | Video camera |
EP1460604A3 (en) * | 1996-04-16 | 2006-11-02 | Xanavi Informatics Corporation | Map display device, navigation device and map display method |
JPH09288477A (en) * | 1996-04-19 | 1997-11-04 | Mitsubishi Electric Corp | Picture display controller |
US6587118B1 (en) * | 1999-03-31 | 2003-07-01 | Sony Corporation | Image displaying processing method, medium including an image displaying processing program stored thereon, and image displaying processing apparatus |
US6670970B1 (en) * | 1999-12-20 | 2003-12-30 | Apple Computer, Inc. | Graduated visual and manipulative translucency for windows |
JP2003134358A (en) * | 2001-10-19 | 2003-05-09 | Minolta Co Ltd | Digital camera |
US7425968B2 (en) * | 2003-06-16 | 2008-09-16 | Gelber Theodore J | System and method for labeling maps |
US7265762B2 (en) * | 2003-12-17 | 2007-09-04 | Quid Novi, S.A., Inc. | Method and apparatus for representing data using layered objects |
JP2006254129A (en) * | 2005-03-11 | 2006-09-21 | Casio Comput Co Ltd | Histogram display device |
JP2006303961A (en) * | 2005-04-21 | 2006-11-02 | Canon Inc | Imaging apparatus |
JP2007094633A (en) * | 2005-09-28 | 2007-04-12 | Fujifilm Corp | Face detector and program |
US7433741B2 (en) * | 2005-09-30 | 2008-10-07 | Rockwell Automation Technologies, Inc. | Hybrid user interface having base presentation information with variably prominent supplemental information |
JP4880292B2 (en) * | 2005-11-24 | 2012-02-22 | 富士フイルム株式会社 | Image processing method, image processing program, and image processing apparatus |
US8200416B2 (en) * | 2005-12-22 | 2012-06-12 | The Boeing Company | Methods and systems for controlling display of en-route maps |
JP2007199311A (en) * | 2006-01-25 | 2007-08-09 | Fujifilm Corp | Image display device and imaging apparatus |
JP4721434B2 (en) * | 2006-03-31 | 2011-07-13 | キヤノン株式会社 | Imaging apparatus and control method thereof |
JP2007281680A (en) * | 2006-04-04 | 2007-10-25 | Sony Corp | Image processor and image display method |
JP4457358B2 (en) * | 2006-05-12 | 2010-04-28 | 富士フイルム株式会社 | Display method of face detection frame, display method of character information, and imaging apparatus |
JP4264660B2 (en) * | 2006-06-09 | 2009-05-20 | ソニー株式会社 | IMAGING DEVICE, IMAGING DEVICE CONTROL METHOD, AND COMPUTER PROGRAM |
JP4110178B2 (en) * | 2006-07-03 | 2008-07-02 | キヤノン株式会社 | IMAGING DEVICE, ITS CONTROL METHOD, PROGRAM, AND STORAGE MEDIUM |
US7697053B2 (en) * | 2006-11-02 | 2010-04-13 | Eastman Kodak Company | Integrated display having multiple capture devices |
JP4201809B2 (en) * | 2006-11-13 | 2008-12-24 | 三洋電機株式会社 | Camera shake correction apparatus and method, and imaging apparatus |
JP4816538B2 (en) * | 2007-03-28 | 2011-11-16 | セイコーエプソン株式会社 | Image processing apparatus and image processing method |
US8599315B2 (en) * | 2007-07-25 | 2013-12-03 | Silicon Image, Inc. | On screen displays associated with remote video source devices |
JP2009118009A (en) * | 2007-11-02 | 2009-05-28 | Sony Corp | Imaging apparatus, method for controlling same, and program |
US7961908B2 (en) * | 2007-12-21 | 2011-06-14 | Zoran Corporation | Detecting objects in an image being acquired by a digital camera or other electronic image acquisition device |
- 2008-03-11 JP JP2008061289A patent/JP5116514B2/en not_active Expired - Fee Related
- 2009-03-11 US US12/402,376 patent/US8773567B2/en not_active Expired - Fee Related
- 2012-02-29 US US13/408,975 patent/US20120162480A1/en not_active Abandoned
Patent Citations (45)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5491332A (en) * | 1991-07-11 | 1996-02-13 | A.D.P. Adaptive Visual Perception Ltd. | Transparency viewing apparatus having display and masking functions |
US5469536A (en) * | 1992-02-25 | 1995-11-21 | Imageware Software, Inc. | Image editing system including masking capability |
US5515494A (en) * | 1992-12-17 | 1996-05-07 | Seiko Epson Corporation | Graphics control planes for windowing and other display operations |
US5796402A (en) * | 1993-12-03 | 1998-08-18 | Microsoft Corporation | Method and system for aligning windows on a computer screen |
US5943050A (en) * | 1994-04-07 | 1999-08-24 | International Business Machines Corporation | Digital image capture control |
US6043817A (en) * | 1995-06-30 | 2000-03-28 | Microsoft Corporation | Method and apparatus for arranging displayed graphical representations on a computer interface |
US5805163A (en) * | 1996-04-22 | 1998-09-08 | Ncr Corporation | Darkened transparent window overlapping an opaque window |
US6344860B1 (en) * | 1998-11-27 | 2002-02-05 | Seriate Solutions, Inc. | Methods and apparatus for a stereoscopic graphic user interface |
US8179338B2 (en) * | 1999-08-19 | 2012-05-15 | Igt | Method and system for displaying information |
US7893950B2 (en) * | 1999-12-22 | 2011-02-22 | Adobe Systems Incorporated | Color compositing using transparency groups |
US7257777B1 (en) * | 2000-07-19 | 2007-08-14 | International Business Machines Corporation | System and method for automatic control of window viewing |
US20040233224A1 (en) * | 2000-10-06 | 2004-11-25 | Akio Ohba | Image processor, image processing method, recording medium, computer program and semiconductor device |
US20020196369A1 (en) * | 2001-06-01 | 2002-12-26 | Peter Rieder | Method and device for displaying at least two images within one combined picture |
US7359003B1 (en) * | 2001-11-09 | 2008-04-15 | Synerdyne Corporation | Display, input and form factor for portable instruments |
US8386956B2 (en) * | 2003-06-20 | 2013-02-26 | Apple Inc. | Computer interface having a virtual single-layer mode for viewing overlapping objects |
US20050175251A1 (en) * | 2004-02-09 | 2005-08-11 | Sanyo Electric Co., Ltd. | Image coding apparatus, image decoding apparatus, image display apparatus and image processing apparatus |
US20070121012A1 (en) * | 2004-02-27 | 2007-05-31 | Yoichi Hida | Information display method and information display device |
US20050219395A1 (en) * | 2004-03-31 | 2005-10-06 | Fuji Photo Film Co., Ltd. | Digital still camera and method of controlling same |
US20070188646A1 (en) * | 2004-05-13 | 2007-08-16 | Sony Corporation | Imaging device,image display method, and user interface |
US20060026535A1 (en) * | 2004-07-30 | 2006-02-02 | Apple Computer Inc. | Mode-based graphical user interfaces for touch sensitive input devices |
US20060078173A1 (en) * | 2004-10-13 | 2006-04-13 | Fuji Photo Film Co., Ltd. | Image processing apparatus, image processing method and image processing program |
US20070008338A1 (en) * | 2005-05-28 | 2007-01-11 | Young-Chan Kim | Display system, display apparatus, and method of controlling video source and display apparatus |
US20070070186A1 (en) * | 2005-06-30 | 2007-03-29 | Sony Corporation | Interactive communication apparatus and connecting method |
US8271425B2 (en) * | 2005-07-21 | 2012-09-18 | Konica Minolta Business Technologies, Inc. | Image processing system and image processing device implementing a specific image processing function for each user as well as a computer program product for the same |
US8436873B2 (en) * | 2005-10-05 | 2013-05-07 | Pure Depth Limited | Method of manipulating visibility of images on a volumetric display |
US20090138811A1 (en) * | 2005-11-02 | 2009-05-28 | Masaki Horiuchi | Display object penetrating apparatus |
WO2007052382A1 (en) * | 2005-11-02 | 2007-05-10 | Matsushita Electric Industrial Co., Ltd. | Display-object penetrating apparatus |
US20090288036A1 (en) * | 2005-12-22 | 2009-11-19 | Kazuya Osawa | Multi-window display apparatus, multi-window display method, and integrated circuit |
US20070229695A1 (en) * | 2006-03-31 | 2007-10-04 | Nikon Corporation | Digital camera |
US20090273667A1 (en) * | 2006-04-11 | 2009-11-05 | Nikon Corporation | Electronic Camera |
US7620905B2 (en) * | 2006-04-14 | 2009-11-17 | International Business Machines Corporation | System and method of windows management |
US20070263997A1 (en) * | 2006-05-10 | 2007-11-15 | Canon Kabushiki Kaisha | Focus adjustment method, focus adjustment apparatus, and control method thereof |
US20080024643A1 (en) * | 2006-07-25 | 2008-01-31 | Fujifilm Corporation | Image-taking apparatus and image display control method |
US20080068487A1 (en) * | 2006-09-14 | 2008-03-20 | Canon Kabushiki Kaisha | Image display apparatus, image capturing apparatus, and image display method |
US20080118156A1 (en) * | 2006-11-21 | 2008-05-22 | Sony Corporation | Imaging apparatus, image processing apparatus, image processing method and computer program |
US20080123953A1 (en) * | 2006-11-29 | 2008-05-29 | Gateway Inc. | Digital camera with histogram zoom |
US20080136958A1 (en) * | 2006-12-11 | 2008-06-12 | Pentax Corporation | Camera having a focus adjusting system and a face recognition function |
US8191003B2 (en) * | 2007-02-14 | 2012-05-29 | International Business Machines Corporation | Managing transparent windows |
US20080240563A1 (en) * | 2007-03-30 | 2008-10-02 | Casio Computer Co., Ltd. | Image pickup apparatus equipped with face-recognition function |
US20100013981A1 (en) * | 2007-07-10 | 2010-01-21 | Canon Kabushiki Kaisha | Focus control apparatus, image sensing apparatus, and control method therefor |
US20090021576A1 (en) * | 2007-07-18 | 2009-01-22 | Samsung Electronics Co., Ltd. | Panoramic image production |
US20090089661A1 (en) * | 2007-10-02 | 2009-04-02 | Canon Kabushiki Kaisha | Information processing apparatus and information processing method |
US20090096810A1 (en) * | 2007-10-11 | 2009-04-16 | Green Brian D | Method for selectively remoting windows |
US20090167633A1 (en) * | 2007-12-31 | 2009-07-02 | Searete Llc | Managing multiple display surfaces |
US20100142842A1 (en) * | 2008-12-04 | 2010-06-10 | Harris Corporation | Image processing device for determining cut lines and related methods |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130316763A1 (en) * | 2012-05-23 | 2013-11-28 | Steven Earl Kader | Method of displaying images while charging a smartphone |
US8718716B2 (en) * | 2012-05-23 | 2014-05-06 | Steven Earl Kader | Method of displaying images while charging a smartphone |
US9031618B1 (en) * | 2012-05-23 | 2015-05-12 | Steven Earl Kader | Method of displaying images while charging a smartphone |
US20180359422A1 (en) * | 2012-11-21 | 2018-12-13 | Canon Kabushiki Kaisha | Transmission apparatus, setting apparatus, transmission method, reception method, and storage medium |
US10715732B2 (en) * | 2012-11-21 | 2020-07-14 | Canon Kabushiki Kaisha | Transmission apparatus, setting apparatus, transmission method, reception method, and storage medium |
Also Published As
Publication number | Publication date |
---|---|
US20090231470A1 (en) | 2009-09-17 |
JP5116514B2 (en) | 2013-01-09 |
US8773567B2 (en) | 2014-07-08 |
JP2009218915A (en) | 2009-09-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8773567B2 (en) | Image capturing apparatus having display control of information and frames on displayed images and display control method | |
US7649563B2 (en) | Digital photographing apparatus that adaptively displays icons and method of controlling the digital photographing apparatus | |
JP5779959B2 (en) | Imaging device | |
JP5025532B2 (en) | Imaging apparatus, imaging apparatus control method, and imaging apparatus control program | |
US8228419B2 (en) | Method of controlling digital photographing apparatus for out-focusing operation and digital photographing apparatus adopting the method | |
US20150271387A1 (en) | Digital image processing apparatus and method of controlling the same | |
JP6124700B2 (en) | IMAGING DEVICE, ITS CONTROL METHOD, PROGRAM, AND STORAGE MEDIUM | |
US9177395B2 (en) | Display device and display method for providing image display in first color mode and second color mode | |
US20060082661A1 (en) | Method of controlling digital photographing apparatus for classification reproduction and digital photographing apparatus using the method | |
US11245846B2 (en) | Image capturing control apparatus and control method therefor | |
US7936988B2 (en) | Imaging apparatus and control method therefor | |
JP4879127B2 (en) | Digital camera and digital camera focus area selection method | |
US9635281B2 (en) | Imaging apparatus method for controlling imaging apparatus and storage medium | |
JP2010050573A (en) | Image capturing apparatus and method of controlling the same | |
US11477356B2 (en) | Image capturing apparatus, control method thereof, and non-transitory computer-readable storage medium | |
KR20110057603A (en) | Apparatus and method for digital picturing image | |
US20050185082A1 (en) | Focusing method for digital photographing apparatus | |
JP7378999B2 (en) | Imaging device, imaging method, program and recording medium | |
KR20090043162A (en) | Apparatus for processing digital image capable of multi-disiaplay using television and thereof method | |
JP2009049639A (en) | Image pick-up device | |
US9083939B2 (en) | Moving image playback apparatus, control method therefor, and recording medium | |
US20060152613A1 (en) | Method and apparatus for displaying digital images | |
JP5800625B2 (en) | Imaging apparatus, control method therefor, and program | |
JP4077683B2 (en) | Camera and display control method | |
JP2008060844A (en) | Image processor and image processing method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |