US20150082201A1 - Terminal device and sharing method thereof
- Publication number
- US20150082201A1 (application US 14/479,493)
- Authority
- US
- United States
- Prior art keywords
- chatting
- screen
- application execution
- user
- execution screen
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/02—Details
- H04L12/16—Arrangements for providing special services to substations
- H04L12/18—Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
- H04L12/1813—Arrangements for providing special services to substations for broadcast or conference, e.g. multicast for computer conferences, e.g. chat rooms
- H04L12/1831—Tracking arrangements for later retrieval, e.g. recording contents, participants activities or behavior, network status
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04B—TRANSMISSION
- H04B1/00—Details of transmission systems, not covered by a single one of groups H04B3/00 - H04B13/00; Details of transmission systems not characterised by the medium used for transmission
- H04B1/38—Transceivers, i.e. devices in which transmitter and receiver form a structural unit and in which at least one part is used for functions of transmitting and receiving
- H04B1/40—Circuits
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/04—Real-time or near real-time messaging, e.g. instant messaging [IM]
- H04L51/046—Interoperability with other network applications or services
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/04—Real-time or near real-time messaging, e.g. instant messaging [IM]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04803—Split screen, i.e. subdividing the display area or the window area into separate subareas
Definitions
- Methods and apparatuses consistent with the exemplary embodiments relate to a terminal apparatus and a sharing method thereof, more particularly, to a terminal apparatus providing a chatting service and a sharing method thereof.
- users are being provided with chatting services through terminal apparatuses.
- users may chat with one or more other users through their terminal apparatuses without any restrictions in terms of space and time.
- One or more exemplary embodiments may resolve the aforementioned problems, that is, provide a terminal apparatus for sharing with a chatting service counterpart not only chatting contents but also various application execution screens, and a method for sharing thereof.
- a terminal apparatus configured to provide a chatting service
- the apparatus comprising: a communicator configured to perform communication with a chatting service counterpart of another terminal apparatus through a server; a display configured to display on a chatting screen of the chatting service at least one application execution screen shared with the chatting service counterpart of the other terminal apparatus; and a controller configured to control the communicator to share a function related to the application execution screen executed according to a user's manipulation with the chatting service counterpart of the other terminal apparatus.
- the controller may further be configured to control such that a GUI (Graphic User Interface) corresponding to the user's manipulation is displayed on the application execution screen, and to control the communicator to share the GUI with the chatting service counterpart of the other terminal apparatus.
- the controller may be further configured to control a GUI to be displayed on a point on the map screen selected according to the user's manipulation, and to control the communicator to share the map screen where the GUI is displayed with the chatting service counterpart of the other terminal apparatus.
- the controller may be further configured to control a GUI to be displayed on a date on the calendar screen selected according to the user's manipulation, and to control the communicator to share the calendar screen where the GUI is displayed with the chatting service counterpart of the other terminal apparatus.
- the controller may be further configured to execute a function corresponding to a user's manipulation of selecting a menu item, on the condition that the application execution screen shared by the chatting service counterpart is displayed and a user's manipulation of selecting a menu item included in the application execution screen is input.
- the controller may be further configured to control access of an address corresponding to the menu item and downloading of an application.
- the controller may be further configured to store a function related to the application execution screen executed according to a user's manipulation in an application corresponding to the application execution screen.
- the controller may be further configured to store a function related to the application execution screen executed according to a user's manipulation in an integrated application.
- the controller may be further configured to control the communicator to share a chatting content entered in the chatting screen with the chatting service counterpart of the other terminal apparatus.
- the controller may be further configured to control the display to display on one area of the chatting screen a chatting content selected from among chatting contents entered in the chatting screen according to a user's manipulation, and to control the display to display a selecting menu item related to an application execution screen related to the selected chatting content together with the selected chatting content.
- the controller may be further configured to control each application execution screen related to the selected chatting content to be scrapped and stored.
- a sharing method of a terminal apparatus providing a chatting service comprising: displaying on a chatting screen providing the chatting service at least one application execution screen shared with a chatting service counterpart of another terminal apparatus; and controlling such that a function related to the application execution screen executed according to a user's manipulation is shared with the chatting service counterpart of the other terminal apparatus.
- the controlling may involve controlling such that the application execution screen, where a GUI (Graphic User Interface) corresponding to the user's manipulation is displayed, is shared with the chatting service counterpart of the other terminal apparatus.
- the controlling may comprise controlling a GUI to be displayed on a point on the map screen selected according to the user's manipulation, and controlling the map screen where the GUI is displayed to be shared with the chatting service counterpart of the other terminal apparatus.
- the controlling may comprise controlling a GUI to be displayed on a date on the calendar screen selected according to the user's manipulation, and controlling the calendar screen where the GUI is displayed to be shared with the chatting service counterpart of the other terminal apparatus.
- the method may further comprise executing a function corresponding to the user's manipulation of selecting a menu item, on a condition that an application execution screen shared by the chatting service counterpart is displayed and a user's manipulation of selecting a menu item included in the application execution screen is input.
- the method may further comprise accessing an address corresponding to the menu item and downloading an application.
- the method according to an exemplary embodiment may further comprise controlling a chatting content entered in the chatting screen to be shared with the chatting service counterpart of the other terminal apparatus.
- the method according to an exemplary embodiment may further comprise displaying on one area of the chatting screen a chatting content selected according to a user's manipulation from among chatting contents entered in the chatting screen, and displaying a selecting menu item related to an application execution screen related to the selected chatting content together with the selected chatting content.
- the method according to an exemplary embodiment may further comprise controlling each application execution screen related to the selected chatting content to be scrapped and stored.
- a terminal apparatus having a communicator configured to communicate with a sharing service counterpart of another terminal apparatus through a server, a display configured to display on a screen of the sharing service at least one application execution screen, and a controller configured to control the communicator to share data corresponding to the application execution screen with the sharing service counterpart of the other terminal apparatus.
- the sharing service and the sharing service counterpart may comprise a chatting service and a chatting service counterpart, respectively, and the controller may be further configured to control the communicator to share chatting content entered in the chatting screen with the chatting service counterpart.
- FIG. 1 is a view for explaining a configuration of a system according to an exemplary embodiment
- FIG. 2 is a block diagram for explaining a configuration of a terminal apparatus according to an exemplary embodiment
- FIGS. 3A and 3B illustrate views for explaining a method for displaying an application execution screen according to an exemplary embodiment
- FIGS. 4A to 6 are views for explaining a method for sharing an application execution screen according to an exemplary embodiment
- FIGS. 7A to 7F illustrate views for explaining a method for fixing a particular chatting content in place and displaying the chatting content according to an exemplary embodiment
- FIG. 8 is a block diagram for explaining a detailed configuration of a terminal apparatus according to an exemplary embodiment.
- FIG. 9 is a flowchart for explaining a sharing method of a terminal apparatus that provides a chatting service according to an exemplary embodiment.
- FIG. 1 is a view for explaining a configuration of a system according to an exemplary embodiment.
- the system 1000 may comprise a terminal apparatus 100 , server 200 , and first terminal apparatus 100 - 1 , second terminal apparatus 100 - 2 , . . . , and nth terminal apparatus 100 - n.
- the terminal apparatus 100 and terminal apparatuses 100 - 1 , 100 - 2 , 100 - n may be embodied as portable terminal apparatuses such as a mobile phone, a tablet, etc.
- the terminal apparatus 100 and terminal apparatuses 100 - 1 , 100 - 2 , and 100 - n are depicted as portable terminal apparatuses, the terminal apparatuses may be non-portable terminal apparatuses.
- users of the terminal apparatus 100 and terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n may be provided with a chatting service through the server 200 .
- the server 200 may identify the terminal apparatus 100 and terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n through the logged-in ID, and then either transmit the chatting content entered in the terminal apparatus 100 to the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n , or, conversely, transmit the chatting content entered in the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n to the terminal apparatus 100 , providing chatting services between the users.
- the terminal apparatus 100 may display the map screen on one area of a chatting window, and transmit information on the map screen to the server 200 .
- the server 200 may transmit the information on the map screen received from the terminal apparatus 100 to the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n , enabling the chatting services users to share the map screen.
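The relay role described above can be sketched as follows. This is a hypothetical Python illustration, not code from the patent; all class, method, and field names are assumptions. The server tracks logged-in terminals and forwards whatever one terminal submits, whether chatting content or screen-sharing data, to every counterpart.

```python
# Hypothetical sketch of the relay step: the server keeps a registry of
# logged-in terminals and forwards a payload received from one terminal
# to every chatting counterpart. All names are illustrative assumptions.

class ChatRelayServer:
    def __init__(self):
        self.sessions = {}  # user_id -> list of payloads delivered to that terminal

    def login(self, user_id):
        self.sessions[user_id] = []

    def relay(self, sender_id, payload):
        """Forward `payload` (chat text or screen-sharing data) from the
        sender to every other logged-in terminal."""
        for user_id, inbox in self.sessions.items():
            if user_id != sender_id:
                inbox.append(payload)

server = ChatRelayServer()
server.login("terminal_100")
server.login("terminal_100_1")
server.login("terminal_100_2")

# Terminal 100 shares its map screen; the counterparts receive the same data.
server.relay("terminal_100", {"type": "map", "area": "Seoul", "scale": 12})
```

The same relay path serves both plain chatting content and application execution screen data, which is why the patent treats screen sharing as an extension of the chatting service.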
- FIG. 2 is a block diagram for explaining a configuration of a terminal apparatus according to an exemplary embodiment.
- the terminal apparatus 100 is a terminal apparatus that provides a chatting service, the terminal apparatus 100 comprising a communicator 110 , display (or displayer) 120 , and controller 130 .
- the terminal apparatus 100 may be a terminal apparatus 100 of FIG. 1 , or in some cases, one of a first terminal apparatus 100 - 1 , second terminal apparatus 100 - 2 , . . . , and nth terminal apparatus 100 - n.
- the communicator 110 performs communication with the terminal apparatus ( 100 - 1 , 100 - 2 , . . . , 100 - n of FIG. 1 ) of the counterpart of the chatting service through the server ( 200 of FIG. 1 ).
- the communicator 110 uses various communication standards such as 3G, 4G, and Wi-Fi, to connect to a network and perform communication with the chatting service counterpart of another terminal apparatus ( 100 - 1 , 100 - 2 , . . . , 100 - n of FIG. 1 ).
- the server 200 may be a server that relays communication between the terminal apparatus 100 and the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n to provide a chatting service.
- the server 200 may control such that the users of the terminal apparatus 100 and terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n perform chatting with one another and share application execution screens.
- the display 120 may display various screens.
- the application execution screen may comprise a map screen, calendar screen, app download screen and memo screen etc.
- the application execution screen may be a screen provided by a sub function execution of the chatting application itself or a screen provided as functions of other applications are executed through interlocked operation with other applications.
- the application may be a native application that was installed in the terminal apparatus 100 when it was manufactured, or an application that was downloaded from outside later on.
- the display 120 may be embodied as a touch screen which receives various touch manipulations and transmits the touch manipulations to the controller 130 .
- the controller 130 may perform functions according to touch manipulations.
- the controller 130 controls the overall operations of the terminal apparatus 100 .
- the controller 130 may comprise a microcomputer (MICOM) (or a MICOM and a CPU (Central Processing Unit)), a RAM (Random Access Memory) for operating the user terminal apparatus, and a ROM (Read Only Memory).
- these modules may be embodied in an SoC (System on Chip) format.
- the controller 130 may execute an application corresponding thereto.
- the application installed in the terminal apparatus 100 may be displayed in an icon format, and the controller 130 may execute an application corresponding to the icon touched when the user manipulation of touching the icon is input.
- the controller 130 may execute the chatting application and provide a chatting service to a user.
- the controller 130 may display a list of other users who subscribed to the server 200 through a predetermined certification process so that a chatting service counterpart may be selected.
- the user list may be received from the server 200 , and the controller 130 may transmit information on the selected user displayed on the list to the server 200 . Accordingly, the user may be provided with the chatting service with the selected users through the server 200 .
- the chatting screen may comprise a contents display area for displaying information on the chatting service counterpart (for example, name, image, ID, telephone number etc.), a window for receiving input of chatting content, a send item for transmitting the entered chatting content, and the transmitted chatting content.
- the controller 130 may display a virtual keyboard and receive input of chatting content, and when the send item is selected, may display the entered chatting content on the contents display area while transmitting the entered chatting content to the server 200 .
- the server 200 may transmit the chatting content received from the terminal apparatus 100 to the chatting service counterparts of terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n , and provide a chatting service between the users of the terminal apparatus 100 and the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n.
- the controller 130 may display an application execution screen. For example, when one of the application execution screen selecting menus provided on the chatting screen is selected, the controller 130 may display an application execution screen corresponding to the selected item on one area of the chatting screen.
- See FIGS. 3A to 4D for a more detailed explanation.
- the terminal apparatus 100 is embodied as a mobile phone.
- the controller 130 may display a calendar screen 330 on one area of the chatting screen 310 as illustrated in FIG. 3B .
- an application execution screen selecting menu 320 is disposed on an upper end of the chatting screen 310 .
- although one format of the selecting menu 320 is illustrated, exemplary embodiments are not limited to the illustrated format, and the format of the application execution screen selecting menu 320 may be changed in various ways.
- controller 130 may configure and display each application execution screen differently.
- the controller 130 may display a memo set as default on the application execution screen. Furthermore, the controller 130 may display the most recently written and stored memo on the application execution screen.
- the controller 130 may control the communicator 110 such that the displayed application execution screen is shared with the chatting service counterparts of terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n.
- the controller 130 may control such that data on the displayed application execution screen is transmitted to the server 200 so that the displayed application execution screen is shared with the chatting service counterpart.
- the controller 130 may transmit the data on the application execution screen to the server 200 at the point in time when the application execution screen is displayed on one area of the chatting screen.
- the controller 130 may transmit the data on the application execution screen to the server 200 when an additional user command is input.
- the data on the application execution screen may comprise various information according to the type of the application execution screen.
- in the case of a map screen, the data may be the area name, GPS information, scale information, etc. necessary for displaying the map screen that is currently being displayed, while in the case of a calendar screen, the data may be information on the date currently displayed on the calendar screen.
- the data may be address information (for example, URL information) of the application download screen that is currently being displayed.
- the data may be information on the text or image etc. included in the memo that is currently being displayed.
- the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n may execute a chatting application and use the data received from the server 200 to display on one area of the chatting screen an application execution screen that is identical to the application execution screen displayed on the terminal apparatus 100 .
- the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n may use the area name, GPS information, and scale information etc. received from the server 200 to configure a map screen that is identical to the map screen being displayed on the terminal apparatus 100 , and display the configured map screen on an area of the chatting screen.
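Because the transmitted data differs by screen type (area name, GPS information, and scale for a map; the displayed date for a calendar; address information for an app download screen; text for a memo), a terminal might assemble its payload as sketched below. This is an illustrative Python sketch; the field names are assumptions, since the patent only enumerates the kinds of information involved.

```python
# Illustrative sketch of the screen-type-dependent payload described above.
# Field names are assumptions; the patent only lists the kinds of information.

def build_share_payload(screen_type, **info):
    """Assemble the data a terminal would send to the server so that
    counterpart terminals can reconstruct an identical screen."""
    required = {
        "map": {"area_name", "gps", "scale"},  # reconstruct the same map view
        "calendar": {"displayed_date"},        # reconstruct the same date view
        "app_download": {"url"},               # reload the same download page
        "memo": {"text"},                      # redraw the same memo contents
    }
    missing = required[screen_type] - info.keys()
    if missing:
        raise ValueError(f"missing fields for {screen_type}: {missing}")
    return {"type": screen_type, **info}

payload = build_share_payload("map", area_name="Gangnam",
                              gps=(37.49, 127.02), scale=14)
```

Validating the payload before transmission mirrors the idea that the counterpart must be able to configure an identical screen purely from the received data.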
- the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n may execute a sub function provided in the chatting application itself, or execute another application additionally, and then interlock it with the chatting application to display the application execution screen.
- the controller 130 may control the communicator 110 to share the function executed according to the user manipulation regarding the application execution screen with the chatting service counterparts of the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n.
- the function executed according to the user's manipulation may be in various formats according to the type of the application execution screen.
- the controller 130 may control such that the application execution screen where a GUI (Graphic User Interface) corresponding to the user's manipulation is displayed is shared with the chatting service counterparts of the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n.
- the controller 130 may control the GUI to be displayed on the point selected according to the user's manipulation, and control the map screen where the GUI is displayed to be shared with the chatting service counterparts of the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n.
- the controller 130 may display the GUI for distinguishing the point selected by the user from other points.
- the GUI may be one of various types of GUIs such as a circle, line, certain icon etc. that may show that the user selected the corresponding point.
- the controller 130 may display the GUI for distinguishing the touched date, that is, the date selected by the user, from other dates.
- the GUI may be one of various types such as a circle, line, and a particular icon etc., or a GUI representing the weather (for example, sun, cloud, rain etc.).
- the controller 130 may control such that a schedule is added to a date selected according to the user's manipulation on the calendar screen, and that the added schedule information is shared with the chatting service counterparts of the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n.
- the controller 130 may display a virtual keyboard to receive an input of a schedule regarding the touched date. In this case, the controller 130 may display the schedule input by the user on the calendar screen.
- the controller 130 may transmit to the server 200 data on the date where a schedule has been added and on the schedule input for the corresponding date.
- the server 200 may transmit the data received from the terminal apparatus 100 to the chatting service counterparts of the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n . Accordingly, in the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n , it is possible to add and display the schedule that the user of the terminal apparatus 100 input to the calendar screen using the received data.
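The calendar flow above (adding a schedule on the sender's terminal, transmitting the date and schedule, and reproducing the entry on the counterpart's calendar) can be sketched as follows. This is a minimal Python illustration under assumed data shapes, not the patent's implementation.

```python
# Minimal sketch, with assumed data shapes, of the shared-calendar flow:
# the sender adds a schedule to a date, the {date, schedule} data is relayed
# via the server, and a counterpart terminal applies it to its own calendar.

def add_schedule(calendar, date, schedule):
    """Add a schedule entry to a date and return the data to transmit."""
    calendar.setdefault(date, []).append(schedule)
    return {"date": date, "schedule": schedule}

def apply_received_schedule(calendar, data):
    """Counterpart side: reproduce the sender's schedule from received data."""
    calendar.setdefault(data["date"], []).append(data["schedule"])

sender_calendar, counterpart_calendar = {}, {}
data = add_schedule(sender_calendar, "2014-09-06", "Lunch meeting")
apply_received_schedule(counterpart_calendar, data)  # relayed via the server
```

After the exchange, both terminals display an identical calendar screen, which is the shared-screen behavior the embodiment describes.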
- the controller 130 may control such that a text input according to the user's manipulation is displayed on the memo screen, and that the text is shared with the chatting service counterparts of the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n.
- the controller 130 may display a virtual keyboard, and display the text input through the virtual keyboard on the memo screen.
- the controller 130 may transmit the information on the text input to the memo to the server 200 .
- the server 200 may transmit the data received from the terminal apparatus 100 to the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n .
- in the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n , it is possible to display a particular text on the memo screen that is currently displayed using the received data. That is, the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n may display the memo screen where the text input by the user of the terminal apparatus 100 is displayed.
- the controller 130 may control such that an application download screen connected according to a user's command is displayed, and that the application download screen is shared with the chatting service counterparts of the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n.
- in an application download screen, there may be included a search window for entering a search word for searching for an application pre-stored in an application provision server (not illustrated).
- the controller 130 may control the communicator 110 to transmit the input search word to the application provision server (not illustrated), and receive a search result.
- the controller 130 may use the received search result to display an application download screen from which the searched application may be downloaded.
- the controller 130 may transmit data including address information regarding the application download screen that is currently displayed (for example URL information) to the server 200 .
- the server 200 may transmit the received data to the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n .
- in the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n , it is possible to update the application download screen or blank screen that is currently displayed using the received data. That is, the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n may display the application download screen provided at the address information that the user of the terminal apparatus 100 accessed.
- the application download screen may include an address window for inputting address information, and the controller 130 may access the address information input into the address window, and download and display the application download screen.
- the controller 130 may transmit data corresponding thereto to the server 200 , and share the function regarding the application execution screen executed according to a user's manipulation with the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n.
- the controller 130 may display a GUI on the selected particular date.
- the controller 130 may control such that information on the GUI format and information on the date on the calendar screen where the GUI is displayed is transmitted to the server 200 and shared with the chatting service counterparts of the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n.
- the controller 130 may execute the chatting application, and use the data received from the server 200 to display on an area of the chatting screen an application execution screen identical to the application execution screen displayed on the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n.
- the controller 130 may use the received data to determine the point selected by the user in the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n , and display a GUI at a corresponding point, the GUI having the same format as the GUI displayed on the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n .
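Sharing a GUI in this way amounts to transmitting only the GUI's format and the selected point, so each terminal re-renders the marker locally rather than receiving a screen image. A hypothetical Python sketch, with all names assumed:

```python
# Hypothetical sketch of sharing a GUI marker: the sender records the marker's
# format and the selected point, and the counterpart renders a marker of the
# same format at the same point. All names are illustrative assumptions.

def place_marker(screen_markers, point, marker_format="circle"):
    """Display a GUI at the selected point and return data for the server."""
    screen_markers.append({"point": point, "format": marker_format})
    return {"point": point, "format": marker_format}

def apply_received_marker(screen_markers, data):
    """Counterpart side: display a GUI of the same format at the same point."""
    screen_markers.append({"point": data["point"], "format": data["format"]})

sender_markers, counterpart_markers = [], []
data = place_marker(sender_markers, point=(120, 340), marker_format="circle")
apply_received_marker(counterpart_markers, data)  # relayed via the server
```

Transmitting only format and position keeps the payload small while still letting both screens show an identical marker, whether it is a circle, a line, or a weather icon.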
- the controller 130 may execute the function corresponding to the user's manipulation of selecting a menu item.
- the controller 130 may control such that the address corresponding to the menu item is accessed and an application is downloaded.
- the controller 130 may access the application provision server (not illustrated) via the internet address mapped to the menu item, download an application, and install the downloaded application in the terminal apparatus 100 .
- the controller 130 may control such that the chatting content entered in the chatting screen may be shared with the chatting service counterparts of the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n.
- the controller 130 may display the chatting content on the contents display area of the chatting screen, and transmit data on a text or image corresponding to the chatting content to the server 200 .
- the server 200 may transmit the data received from the terminal apparatus 100 to the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n , and the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n may use the received data to display a chatting screen including the chatting content that was input in the terminal apparatus 100 .
- the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n may continue displaying the application execution screen on an area of the chatting screen, and use the data on the chatting content received to display on the contents display area the chatting content that the user of the terminal apparatus 100 entered.
- the controller 130 may display the new chatting content on the chatting screen.
- the controller 130 may update the contents display area, and display a new chatting content underneath the chatting content which had been displayed on the contents display area.
- the controller 130 may remove the existing chatting content by gradually moving it upwards so as to display new chatting contents.
- the controller 130 may display on an area of the chatting screen the chatting content selected, according to a predetermined user's manipulation, from among the chatting contents entered in the chatting screen.
- the predetermined user's manipulation may be a manipulation of touching the chatting content displayed on the contents display area for a predetermined time or longer.
- the controller 130 may affix the chatting content on which a touch manipulation has been input for the predetermined time or longer to the top end of the contents display area and display the same. Accordingly, even if a new chatting content is entered or received from the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n of the chatting counterparts and is displayed on the contents display area, the chatting content on which the touch manipulation has been input for the predetermined time or longer may not be removed from the contents display area and may continue to be displayed on one area of the chatting screen.
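The pinning behavior above can be sketched as follows; this is an illustrative model only, and the one-second threshold is an assumed value (the patent only says "predetermined time"):

```python
# Minimal sketch: a contents display area that keeps a long-pressed chatting
# content fixed at the top while new contents scroll the rest away.
LONG_PRESS_SECONDS = 1.0  # assumed threshold

class ContentsArea:
    def __init__(self, visible=3):
        self.visible = visible
        self.pinned = []      # contents touched for the predetermined time or longer
        self.scrolling = []   # ordinary contents, oldest removed first

    def add(self, content):
        self.scrolling.append(content)
        overflow = len(self.pinned) + len(self.scrolling) - self.visible
        if overflow > 0:
            del self.scrolling[:overflow]   # existing contents removed upwards

    def long_press(self, content, seconds):
        if seconds >= LONG_PRESS_SECONDS and content in self.scrolling:
            self.scrolling.remove(content)
            self.pinned.append(content)     # affixed to the top end of the area

    def rows(self):
        return self.pinned + self.scrolling

area = ContentsArea(visible=3)
area.add("LET'S MEET HERE!")
area.add("GOOD, SINCE IT'S NEAR THE HAN RIVER TOO")
area.long_press("LET'S MEET HERE!", 1.5)
area.add("THERE IS A BICYCLE ROAD TOO")
area.add("LET'S EAT DINNER AFTER RIDING THE BICYCLE")
```

After the long press, new contents displace only the scrolling contents; the pinned content remains at the top end of the area.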
- the controller 130 may display a selecting item regarding the application execution screen related to the selected chatting content together with the selected chatting content.
- the application execution screen related to the selected chatting content may be an application execution screen indicating the selected chatting content.
- in the case of a map screen, it may be a map screen where a corresponding point related to the chatting content mentioning a particular point is displayed so as to be differentiated from other points.
- in the case of a calendar screen, it may be a calendar screen where a corresponding date related to the chatting content mentioning a particular date is displayed so as to be differentiated from other dates.
- in the case of a memo screen, it may be a memo screen that includes contents related to the chatting content.
- in the case of an application download screen, it may be an application download screen from which an application related to the chatting content may be downloaded.
- an application execution screen related to the selected chatting content may be determined by various methods.
- the controller 130 may determine that the shared application execution screen is related to the corresponding chatting content.
- the controller 130 may determine that the shared map screen, that is, the map screen where the point that the user selected is displayed so as to be differentiated from other points, is an application execution screen related to the chatting content “let's meet here”.
- the controller 130 may determine that the shared application download screen is an application execution screen related to the chatting contents “it is this app”.
- the controller 130 may determine that the application execution screen is an application execution screen related to the chatting content in a method similar to the aforementioned.
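One concrete way to realize the determination above is time proximity: treat the screen shared closest in time to when the chatting content was entered as the related screen. This heuristic and the 60-second window are assumptions for illustration, not a claimed method:

```python
# Minimal sketch: relate a chatting content to the application execution
# screen that was shared nearest in time to it.

def related_screen(content_time, shared_screens, window=60):
    """shared_screens: list of (timestamp, screen_id) pairs.
    Returns the screen shared nearest in time to the chatting content,
    within `window` seconds, or None if nothing qualifies."""
    best = None
    for ts, screen in shared_screens:
        gap = abs(ts - content_time)
        if gap <= window and (best is None or gap < best[0]):
            best = (gap, screen)
    return best[1] if best else None
```

So a chat entry typed just after a map screen was shared would be associated with that map screen, while an entry far removed in time would have no related screen.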
- the controller 130 may display an application execution screen related to the selected chatting content.
- the controller 130 may display the application execution screen related to the selected chatting content in a full screen format.
- the controller 130 may control so that data including information on the selected chatting content and the application execution screen related to the selected chatting content is transmitted to the server 200 and is shared with the chatting service counterparts of the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n.
- the controller 130 may transmit data including information on the selected chatting content and the application execution screen related to the selected chatting content to the server 200 .
- information on the selected chatting content may include information on the time when the chatting content was entered, the user who entered the chatting content, and a text or image included in the chatting content.
- information related to the application execution screen related to the selected chatting content may include various information according to the type of the application execution screen.
- information related to the application execution screen related to the selected chatting content may include information on the type of GUI displayed on the point that the user selected, coordinates of the point where the GUI is displayed, and GPS information etc.
- the server 200 may transmit the received data to the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n .
- the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n may use the received data to arrange and display the selected chatting content and selecting item on a top end of the contents display area, and when a selecting item is selected, the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n may display the application execution screen related to the chatting contents.
- the controller 130 may control so that each application execution screen related to the selected chatting content is scrapped and stored.
- the controller 130 may capture and store the application execution screen related to the selected chatting content, and capture and store the application execution screen related to the next selected chatting content, thereby scrapping the application execution screen related to each of the selected chatting content.
- the controller 130 may store the captured screen in a storage (not illustrated) provided in the terminal apparatus 100 or in the server 200 .
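The scrapping described above can be sketched as follows; the class and file names are hypothetical, and a real implementation would capture actual screen bitmaps to the storage 150 or the server 200:

```python
# Minimal sketch: scrapped application execution screens are stored keyed
# by the selected chatting content, which later serves as each scrap's
# title, and can be paged through (e.g. by a flick manipulation).

class ScrapBoard:
    def __init__(self):
        self._scraps = []   # ordered (title, captured_screen) pairs

    def scrap(self, chatting_content, captured_screen):
        # the selected chatting content becomes the title of the scrap
        self._scraps.append((chatting_content, captured_screen))

    def page(self, index):
        # a flick manipulation advances to the next scrap screen
        return self._scraps[index]

board = ScrapBoard()
board.scrap("LET'S MEET HERE!", "map_screen_1.png")
board.scrap("LET'S EAT DINNER HERE AFTER RIDING THE BICYCLE", "map_screen_2.png")
```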
- the controller 130 may display the scrapped application execution screen.
- the controller 130 may display each scrapped application execution screen with the selected chatting content as its title.
- the controller 130 may generate a scrap screen consisting of the map screen 1 and map screen 2 , and when a flick manipulation from the right to left direction is input, the controller 130 may newly display a scrap screen in the direction the flick manipulation has been input.
- a chatting content may be displayed together on each scrap screen. That is, “let's meet here” may be added to the map screen 1 and “let's eat dinner here after riding the bicycle” may be added to the map screen 2 , and then the map screen 1 and the map screen 2 may be displayed.
- the controller 130 may store a function regarding the application execution screen executed according to a user's manipulation in an application corresponding to each application execution screen. In this case, in the case where a user's manipulation is input on the application execution screen, or where a user's manipulation is input on the application execution screen and then a chatting application is ended, the controller 130 may store data on various information that has been input in the application execution screen according to the user's manipulation per application.
- the data stored in each application may include various information according to the type of application.
- the controller 130 may store in a map application installed in the terminal apparatus 100 the data including the type of the GUI input in the map screen by the user or chatting counterparts, and on the location of the point where the GUI is displayed.
- the controller 130 may store in a calendar application installed in the terminal apparatus 100 the data including the type of the GUI input in the calendar screen by the user or chatting counterparts, the date when the corresponding GUI was displayed, schedule information, and information on the date when the corresponding schedule was entered.
- the controller 130 may store in an application download application installed in the terminal apparatus 100 the data including address information on the application download screen which was shared by the user or chatting counterparts.
- the controller 130 may store in a memo application installed in the terminal apparatus 100 the information on a text or image which was entered in the memo screen by the user or chatting counterparts.
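The per-application storage described in the preceding paragraphs can be sketched as a simple dispatch on the screen type. All field names and the example address are hypothetical:

```python
# Minimal sketch: when the chatting application ends (or on each user
# manipulation), data produced on the shared screen is routed into the
# store of the matching installed application.

PER_APP_STORE = {"map": [], "calendar": [], "download": [], "memo": []}

def store_per_application(screen_type, data):
    if screen_type == "map":
        PER_APP_STORE["map"].append({"gui_type": data["gui_type"], "point": data["point"]})
    elif screen_type == "calendar":
        PER_APP_STORE["calendar"].append({"gui_type": data["gui_type"], "date": data["date"]})
    elif screen_type == "download":
        PER_APP_STORE["download"].append({"address": data["address"]})
    elif screen_type == "memo":
        PER_APP_STORE["memo"].append({"text": data["text"]})

store_per_application("map", {"gui_type": "pin", "point": (37.5, 127.0)})
store_per_application("download", {"address": "http://apps.example/app1"})
```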
- the controller 130 may control such that, when an application is executed, the application execution screen that was shared with the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n is displayed using pre-stored data.
- the controller 130 may display a map screen where the GUI is displayed at the same point as the point that the user or chatting counterparts selected during chatting. That is, the controller 130 may display the map screen that was shared with the chatting counterparts via the previous chatting screen.
- the user is able to receive the application execution screen that was shared during chatting through separate applications.
- the controller 130 may store the function regarding the application execution screen executed according to the user's manipulation in an integrated application.
- the integrated application may be an application that manages the user's schedule etc. in an integrated manner.
- the controller 130 may synchronize the integrated application with each application, and store data on the application execution screen stored per application in the integrated application.
- the controller 130 may use the information pre-stored in each application to update the schedule provided in the integrated application in an integrated manner.
- the controller 130 may use the information stored in the calendar application to store schedule information in a particular date of the schedule provided in the integrated application, and use the information stored in the map application to store the map screen where a GUI was displayed on a particular point.
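The synchronization just described amounts to merging the per-application stores into one view keyed by date. A minimal sketch, with assumed field names:

```python
# Minimal sketch: the integrated application merges calendar entries and
# map markers stored per application into a single date-keyed schedule.

def sync_integrated(calendar_entries, map_entries):
    """calendar_entries: [{'date': ..., 'schedule': ...}]
    map_entries:      [{'date': ..., 'map_marker': ...}]
    Returns a dict mapping each date to its merged information."""
    integrated = {}
    for e in calendar_entries:
        integrated.setdefault(e["date"], {})["schedule"] = e["schedule"]
    for e in map_entries:
        integrated.setdefault(e["date"], {})["map_marker"] = e["map_marker"]
    return integrated
```

With this merge, the user sees, for a given date, both the schedule that used to live in the calendar application and the map marker that used to live in the map application.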
- the user may be provided with schedule information that used to be provided through the calendar application and the map screen that used to be provided through the map application through the integrated application at the same time.
- in the above, each application is synchronized with the integrated application, but this is just an example. That is, the controller 130 may obviously store the various information that had been input in the application execution screen during chatting directly in the integrated application.
- FIGS. 4A to 7F are views for explaining a method for sharing an application execution screen according to an exemplary embodiment.
- FIGS. 4A to 4D illustrate views for explaining a method for sharing a map screen.
- information on chatting service counterparts 421 , input window 440 , and send item 450 may be displayed on the chatting screen 410 .
- the information on chatting service counterparts 421 may be images of the chatting service counterparts preregistered in the server 200 .
- a virtual keyboard 460 for receiving an entering of a chatting content may be displayed. Accordingly, when a chatting content is entered through the virtual keyboard 460 , the entered chatting content may be displayed on the input window 440 . For example, when the user inputs a chatting content “LET'S MEET HERE!”, “LET'S MEET HERE!” may be displayed on the input window 440 .
- the “LET'S MEET HERE!” which has been input may be transmitted to the chatting service counterparts of the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n , and the chatting content “LET'S MEET HERE!” 471 may be displayed on the contents display area as illustrated in FIG. 4B .
- an application execution screen selecting menu 423 may be displayed on the chatting screen 410 .
- a map screen 431 may be displayed on one area of the chatting screen.
- a GUI 433 may be displayed on the selected point, as illustrated in FIG. 4A .
- the type and color of the GUI displayed may be determined by the user.
- map screen 431 where the GUI 433 is displayed on the point selected by the user may be shared with the chatting service counterparts.
- the terminal apparatus 100 may receive from the server 200 the chatting content that the chatting service counterparts entered.
- the chatting contents may be aligned in the contents display area in the order they were entered.
- the chatting contents that the chatting service counterparts entered, that is, “GOOD, SINCE ITS NEAR THE HAN RIVER TOO” 473 , “THERE IS A BICYCLE ROAD TOO ⁇ ” 475 , and “LET'S EAT DINNER AFTER RIDING THE BICYCLE” 477 , may be displayed on the contents display area consecutively.
- the terminal apparatus 100 may use the data received as illustrated in FIG. 4D to display the GUI 435 at the point where the chatting service counterparts selected on the map screen 431 .
- the calendar screen 521 may be shared, or as illustrated in FIG. 6 , an application download screen 621 may be shared.
- a calendar screen 521 may be displayed on one area of the chatting screen 510 .
- a GUI 523 may be displayed on the date selected by the user or the chatting service counterparts.
- the chatting content that the user and the chatting service counterparts input may be displayed on the contents display area 530 .
- an application download screen 621 may be displayed on one area of the chatting screen 610 .
- a menu item 623 for downloading an application may be included on the application download screen 621 . Accordingly, in the case where the user selects the menu item 623 , the corresponding app may be downloaded from outside and be installed.
- the chatting contents entered by the user and chatting service counterparts may be displayed on the contents display area 630 .
- FIGS. 7A to 7F illustrate views for explaining a method for affixing a particular chatting content and displaying the same according to an exemplary embodiment.
- a map screen 721 may be displayed on one area of the chatting screen 710 .
- the GUI 723 , 725 may be displayed on each point where the user and chatting service counterparts selected.
- the chatting contents each entered by the user and the chatting service counterpart may be displayed. That is, as illustrated in FIG. 7A , the chatting contents “LET'S MEET HERE!” 731 that the user entered and “GOOD, SINCE ITS NEAR THE HAN RIVER TOO” 733 and “THERE IS A BICYCLE ROAD TOO ⁇ ” 735 , and “LET'S EAT DINNER AFTER RIDING THE BICYCLE” 737 that the chatting service counterparts entered may be displayed.
- the selected chatting content may be displayed on the top end of the contents display area in a fixed manner. That is, in the case of touching “LET'S MEET HERE!” 731 for the predetermined time or longer, as illustrated in FIG. 7B , the chatting content “LET'S MEET HERE!” may be displayed on the top end 741 of the contents display area. In this case, the chatting content “LET'S MEET HERE!” 731 may be displayed on the top end of the contents display area in the terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n of the chatting service counterparts as well.
- the chatting content may be displayed together with the menu item for displaying an application execution screen related to the corresponding chatting content.
- the menu item may be displayed such as “SEE MAP”.
- the application execution screen related to the chatting content that is fixed and displayed may be displayed.
- the map screen 721 related to the chatting content “LET'S MEET HERE!” may be displayed.
- the map screen 721 may be a map screen where a GUI 723 is displayed on the point that the user selected on the map screen within a predetermined time after entering “LET'S MEET HERE!”.
- the chatting service counterparts may have a particular chatting content displayed at a fixed location on the chatting screen through their terminal apparatuses 100 - 1 , 100 - 2 , . . . , 100 - n .
- This method is the same as that performed in the terminal apparatus 100 .
- “LET'S EAT DINNER HERE AFTER RIDING THE BICYCLE” may be displayed on the top end of the contents display area.
- “LET'S EAT DINNER HERE AFTER RIDING THE BICYCLE” may be arranged in time order underneath the “LET'S MEET HERE!” that was fixed and displayed.
- the menu item for displaying the application execution screen related to the chatting content may be displayed together.
- the chatting content may be displayed together with “SEE MAP”.
- the selected chatting content and the application execution screen related thereto may be provided through a separate screen.
- For example, as illustrated in FIG. 7F , when the user inputs a flick manipulation from the right to left, “LET'S MEET HERE!” and the map screen 751 related thereto, and “LET'S EAT DINNER HERE AFTER RIDING THE BICYCLE” and the map screen 753 related thereto, may be displayed on one scrap board.
- FIG. 8 is a block diagram for explaining a detailed configuration of a terminal apparatus according to an exemplary embodiment.
- the terminal apparatus 100 may further include a location information generator 140 , storage 150 , audio processor 160 , video processor 170 , speaker 180 , button 181 , camera 182 , and microphone 183 besides a display 120 , communicator 110 , and controller 130 , and these configurations may be controlled by the controller 130 as well.
- specific explanation on elements that overlap with those explained in FIG. 2 will be omitted.
- the controller 130 may execute the application using the location information generated in the location information generator 140 .
- the controller 130 may determine the current location of the terminal apparatus 100 using the location information generated in the location information generator 140 , and display the map screen within a certain area based on the current location.
- the storage 150 stores an O/S (Operating System) for driving the terminal apparatus 100 .
- the storage 150 may store various application programs and data related to execution of various application programs.
- the storage 150 may store various data for sharing the application execution screen and various data stored as a result of sharing together with the chatting service.
- the storage 150 may store data including various information input on the application execution screen per application.
- the audio processor 160 may perform processing on audio data. For example, in the audio processor 160 , various processing such as decoding, amplification, noise filtering etc. on audio data may be performed.
- the video processor 170 may perform processing on video data.
- the video processor 170 may perform various image processing such as decoding, scaling, noise filtering, frame rate conversion, and resolution conversion etc. regarding video data.
- the speaker 180 is a configurative element that outputs not only various audio data processed in the audio processor 160 but also various alarm sounds and sound messages etc.
- the button 181 may be one of various types such as a mechanical button, wheel etc. formed on an area such as the front portion, side portion, rear portion etc. of the exterior of the main body of the terminal apparatus 100 .
- a button for turning on/off power of the terminal apparatus 100 may be provided.
- the camera 182 is a configurative element for photographing a still image or video.
- the camera 182 may be embodied as a plurality of cameras including a front camera and rear camera etc.
- the microphone 183 is a configurative element for receiving the user's voice or other sounds and converting the received sound into audio data.
- the controller 130 may use the user's voice input through the microphone 183 in a call process or convert it into audio data and store it in the storage 150 .
- the terminal apparatus 100 may further include various external input ports for connecting with various external terminals such as headset etc.
- the controller 130 includes a RAM 131 , ROM 132 , main CPU 133 , graphic processor 134 , first to nth interfaces 135 - 1 to 135 - n , and bus 136 .
- the RAM 131 , ROM 132 , main CPU 133 , graphic processor 134 , first to nth interfaces 135 - 1 to 135 - n may be connected to one another through the bus 136 .
- the first to nth interfaces 135 - 1 to 135 - n are connected to the aforementioned various configurative elements.
- One of the interfaces may be a network interface that is connected to an external apparatus through network.
- the main CPU 133 may access the storage 150 , and perform booting using an O/S stored in the storage 150 . In addition, the main CPU 133 may perform various operations using various application programs and data etc. stored in the storage 150 .
- in the ROM 132 , command sets for booting the system are stored.
- the main CPU 133 copies the O/S stored in the storage 150 to the RAM 131 according to the command stored in the ROM, and executes the O/S to boot the system.
- the main CPU 133 copies various application programs stored in the storage 150 to the RAM 131 , and executes the application programs copied to the RAM 131 to perform various operations.
- the graphic processor 134 uses a calculator (not illustrated) and rendering part (not illustrated) to generate a screen including various objects such as an icon, image, and text etc.
- the calculator (not illustrated) calculates the feature values such as the coordinates, types, sizes, and colors etc. regarding the objects to be displayed according to the layout of the screen based on the control command received.
- the rendering part (not illustrated) generates screens of various layouts that include objects based on the feature values calculated in the calculator (not illustrated). The screens generated in the rendering part (not illustrated) are displayed within the display area of the display 120 .
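The division of labor between the calculator and the rendering part can be sketched as follows. The margin and row-height values are illustrative assumptions only:

```python
# Minimal sketch: the calculator computes feature values (coordinates and
# size) for each object from the screen layout, and the rendering part
# assembles the displayed screen from those feature values.

def calculate_features(objects, screen_width):
    features = []
    for i, obj in enumerate(objects):
        features.append({
            "name": obj,
            "x": 10,                 # left margin assumed by this sketch
            "y": 40 * i,             # objects stacked vertically, 40px rows
            "w": screen_width - 20,  # width minus both margins
        })
    return features

def render(features):
    # stand-in for the rendering part: one drawn row per object
    return [f"{f['name']}@({f['x']},{f['y']},{f['w']})" for f in features]
```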
- FIG. 9 is a flowchart for explaining a method for sharing a terminal apparatus providing a chatting service according to an exemplary embodiment.
- when the application execution screen is a map screen, it can be controlled such that a GUI is displayed on the point selected according to the user's manipulation, and that the map screen where the GUI is displayed is shared with the chatting service counterpart of the terminal apparatus.
- when the application execution screen is a calendar screen, it can be controlled such that a GUI is displayed on a date selected according to the user's manipulation on the calendar screen, and that the calendar screen where the GUI is displayed is shared with the chatting service counterpart of the terminal apparatus.
- the function corresponding to the user's manipulation of selecting a menu item may be executed.
- the address corresponding to the menu item may be accessed to download an application.
- the sharing method may be provided in a non-transitory computer readable medium where a program for consecutively performing the sharing method is stored.
- a non-transitory computer readable medium refers to a medium that may be read by an apparatus and that may store data semi-permanently unlike media that stores data for a short period of time such as a register, cache, and memory etc.
- the various applications and programs mentioned above may be provided as being stored in non-transitory computer readable medium such as a CD, DVD, hard disk, Blu-ray disk, USB, memory card, and ROM etc.
- a terminal apparatus may further comprise a processor such as a CPU and microprocessor for performing the various steps mentioned above.
Abstract
Description
- This application claims priority from Korean Patent Application No. 10-2013-0112139, filed in the Korean Intellectual Property Office on Sep. 17, 2013, the disclosure of which is incorporated herein by reference in its entirety.
- 1. Technical Field
- Methods and apparatuses consistent with the exemplary embodiments relate to a terminal apparatus and a sharing method thereof, more particularly, to a terminal apparatus providing a chatting service and a sharing method thereof.
- 2. Description of the Prior Art
- With the recent development of electronic technologies, users are being provided with chatting services through terminal apparatuses. For example, users may perform chatting with one or more users through their terminal apparatuses without any restrictions in terms of space and time.
- However, in conventional terminal apparatuses, users could send and receive mostly texts and sometimes photos while chatting, and thus there existed limitations in terms of sending information.
- One or more exemplary embodiments may resolve the aforementioned problems, that is, provide a terminal apparatus for sharing with a chatting service counterpart not only chatting contents but also various application execution screens, and a method for sharing thereof.
- According to an exemplary embodiment, there is provided a terminal apparatus configured to provide a chatting service, the apparatus comprising: a communicator configured to perform communication with a chatting service counterpart of another terminal apparatus through a server; a display configured to display on a chatting screen of the chatting service at least one application execution screen shared with the chatting service counterpart of the other terminal apparatus; and a controller configured to control the communicator to share a function related to the application execution screen executed according to a user's manipulation with the chatting service counterpart of the other terminal apparatus.
- Herein, the controller may further be configured to control a GUI (Graphic User Interface) corresponding to the user's manipulation to be displayed on the application execution screen and to control the communicator to share the GUI with the chatting service counterpart of the other terminal apparatus.
- Furthermore, if the application execution screen is a map screen, the controller may be further configured to control a GUI to be displayed on a point on the map screen selected according to the user's manipulation, and to control the communicator to share the map screen where the GUI is displayed with the chatting service counterpart of the other terminal apparatus.
- Furthermore, if the application execution screen is a calendar screen, the controller may be further configured to control a GUI to be displayed on a date on the calendar screen selected according to the user's manipulation, and to control the communicator to share the calendar screen where the GUI is displayed with the chatting service counterpart of the other terminal apparatus.
- In addition, the controller may be further configured to execute a function corresponding to a user's manipulation of selecting a menu item on the condition that the application execution screen shared by the chatting service counterpart is displayed and a user's manipulation of selecting a menu item included in the application execution screen is input.
- Herein, the controller may be further configured to control access of an address corresponding to the menu item and downloading of an application.
- Meanwhile, the controller may be further configured to store a function related to the application execution screen executed according to a user's manipulation in an application corresponding to the application execution screen.
- Furthermore, the controller may be further configured to store a function related to the application execution screen executed according to a user's manipulation in an integrated application.
- In addition, the controller may be further configured to control the communicator to share a chatting content entered in the chatting screen with the chatting service counterpart of the other terminal apparatus.
- Furthermore, the controller may be further configured to control the display to display on one area of the chatting screen a chatting content selected from among chatting contents entered in the chatting screen according to a user's manipulation, and to control the display to display a selecting menu item related to an application execution screen related to the selected chatting content together with the selected chatting content.
- Herein, the controller may be further configured to control each application execution screen related to the selected chatting content to be scrapped and stored.
- According to an exemplary embodiment, there is provided a sharing method of a terminal apparatus providing a chatting service, the method comprising: displaying on a chatting screen providing the chatting service at least one application execution screen shared with a chatting service counterpart of another terminal apparatus; and controlling such that a function related to the application execution screen executed according to a user's manipulation is shared with the chatting service counterpart of the other terminal apparatus.
- Herein, the controlling may involve controlling such that the application execution screen, where a GUI (Graphic User Interface) corresponding to the user's manipulation is displayed, is shared with the chatting service counterpart of the other terminal apparatus.
- Furthermore, if the application execution screen is a map screen, the controlling may comprise controlling a GUI to be displayed on a point on the map screen selected according to the user's manipulation, and controlling the map screen where the GUI is displayed to be shared with the chatting service counterpart of the other terminal apparatus.
- Furthermore, if the application execution screen is a calendar screen, the controlling may comprise controlling a GUI to be displayed on a date on the calendar screen selected according to the user's manipulation, and controlling the calendar screen where the GUI is displayed to be shared with the chatting service counterpart of the other terminal apparatus.
- In addition, the method may further comprise executing a function corresponding to the user's manipulation of selecting a menu item, on a condition that an application execution screen shared by the chatting service counterpart is displayed and a user's manipulation of selecting a menu item included in the application execution screen is input.
- Herein, the method may further comprise accessing an address corresponding to the menu item and downloading an application.
- In addition, the method according to an exemplary embodiment may further comprise storing a function related to the application execution screen executed according to a user's manipulation in an application corresponding to the application execution screen.
- In addition, the method according to an exemplary embodiment may further comprise storing a function related to the application execution screen, executed according to a user's manipulation, in an integrated application.
- In addition, the method according to an exemplary embodiment may further comprise controlling a chatting content entered in the chatting screen to be shared with the chatting service counterpart of the other terminal apparatus.
- In addition, the method according to an exemplary embodiment may further comprise displaying on one area of the chatting screen a chatting content selected according to a user's manipulation from among chatting contents entered in the chatting screen, and displaying a selecting menu item related to an application execution screen related to the selected chatting content together with the selected chatting content.
- In this case, the method according to an exemplary embodiment may further comprise controlling each application execution screen related to the selected chatting content to be scrapped and stored.
- According to another exemplary embodiment, there is provided a terminal apparatus having a communicator configured to communicate with a sharing service counterpart of another terminal apparatus through a server, a display configured to display on a screen of the sharing service at least one application execution screen, and a controller configured to control the communicator to share data corresponding to the application execution screen with the sharing service counterpart of the other terminal apparatus.
- In addition, the sharing service and the sharing service counterpart may comprise a chatting service and a chatting service counterpart, respectively, and the controller may be further configured to control the communicator to share chatting content entered in the chatting screen with the chatting service counterpart.
- As aforementioned, according to various exemplary embodiments, it is possible to share not only the chatting contents of the users performing chatting but also their application execution screens. That is, users of the chatting service become able to display specific information related to the chatting on various types of application execution screens, and share this information with counterparts, thereby improving users' convenience.
- The above and/or other aspects of one or more exemplary embodiments will be more apparent by describing certain exemplary embodiments of the present disclosure with reference to the accompanying drawings, in which:
-
FIG. 1 is a view for explaining a configuration of a system according to an exemplary embodiment; -
FIG. 2 is a block diagram for explaining a configuration of a terminal apparatus according to an exemplary embodiment; -
FIGS. 3A and 3B illustrate views for explaining a method for displaying an application execution screen according to an exemplary embodiment; -
FIGS. 4A to 6 are views for explaining a method for sharing an application execution screen according to an exemplary embodiment; -
FIGS. 7A to 7F illustrate views for explaining a method for fixating a particular chatting content and displaying the chatting content according to an exemplary embodiment; -
FIG. 8 is a block diagram for explaining a detailed configuration of a terminal apparatus according to an exemplary embodiment; and -
FIG. 9 is a flowchart for explaining a sharing method of a terminal apparatus that provides a chatting service according to an exemplary embodiment. - Certain exemplary embodiments are described in greater detail below with reference to the accompanying drawings.
- In the following description, like drawing reference numerals are used for the like elements, even in different drawings. The matters defined in the description, such as detailed construction and elements, are provided to assist in a comprehensive understanding of exemplary embodiments. However, exemplary embodiments can be practiced without those specifically defined matters. Also, well-known functions or constructions are not described in detail since they would obscure the application with unnecessary detail.
-
FIG. 1 is a view for explaining a configuration of a system according to an exemplary embodiment. According to FIG. 1, the system 1000 may comprise a terminal apparatus 100, a server 200, and a first terminal apparatus 100-1, a second terminal apparatus 100-2, . . . , and an nth terminal apparatus 100-n. - Herein, the
terminal apparatus 100 and the terminal apparatuses 100-1, 100-2, . . . , 100-n may be embodied as portable terminal apparatuses such as mobile phones and tablets, etc. Although the terminal apparatus 100 and the terminal apparatuses 100-1, 100-2, and 100-n are depicted as portable terminal apparatuses, the terminal apparatuses may be non-portable terminal apparatuses. - Meanwhile, users of the
terminal apparatus 100 and the terminal apparatuses 100-1, 100-2, . . . , 100-n may be provided with a chatting service through the server 200. - More specifically, the users of the
terminal apparatus 100 and the terminal apparatuses 100-1, 100-2, . . . , 100-n may download a chatting application that the server 200 provides, install the downloaded chatting application in each of the terminal apparatuses 100, 100-1, 100-2, . . . , 100-n, create an ID and password through a predetermined certification process, and log in on the server 200. Herein, the chatting application may be a chatting application that provides real-time chatting (hereinafter referred to as the chatting application). Alternatively, the application may be provided by some means other than the server 200. - In such a case, the
server 200 may identify the terminal apparatus 100 and the terminal apparatuses 100-1, 100-2, . . . , 100-n through the logged-in ID, and then either transmit the chatting content entered in the terminal apparatus 100 to the terminal apparatuses 100-1, 100-2, . . . , 100-n, or, conversely, transmit the chatting content entered in the terminal apparatuses 100-1, 100-2, . . . , 100-n to the terminal apparatus 100, providing chatting services between the users. - Meanwhile, the chatting service may enable sharing not only chatting contents, but also application execution screens, between the chatting service users. Herein, the application execution screen may be a map screen, calendar screen, app download screen, or memo screen, etc.
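The relay behavior described above — the server identifying each logged-in apparatus by its ID and forwarding entered chatting content to every other participant — can be sketched as follows. This is an illustrative sketch only; the class and method names are assumptions, not part of the disclosed apparatus.

```python
# Hypothetical sketch of the server-side relay described above.
# The server maps each logged-in ID to an inbox and forwards any chatting
# content it receives to every counterpart in the session.

class ChatRelayServer:
    def __init__(self):
        self.inboxes = {}  # logged-in ID -> list of (sender, content) pairs

    def log_in(self, user_id):
        # A terminal apparatus is identified by the ID created at certification.
        self.inboxes[user_id] = []

    def relay(self, sender_id, chatting_content):
        # Forward the entered chatting content to all counterparts.
        for user_id, inbox in self.inboxes.items():
            if user_id != sender_id:
                inbox.append((sender_id, chatting_content))

server = ChatRelayServer()
for uid in ("user_100", "user_100_1", "user_100_2"):
    server.log_in(uid)
server.relay("user_100", "Shall we meet on Friday?")

print(server.inboxes["user_100_1"])  # counterpart received the content
print(server.inboxes["user_100"])    # sender's own inbox stays empty
```

The same relay path is reused later for application execution screen data, with only the payload differing.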
- For example, the
terminal apparatus 100 may display the map screen on one area of a chatting window, and transmit information on the map screen to the server 200. Accordingly, the server 200 may transmit the information on the map screen received from the terminal apparatus 100 to the terminal apparatuses 100-1, 100-2, . . . , 100-n, enabling the chatting service users to share the map screen. - As such, according to the exemplary embodiment, the users performing chatting become able to share not only chatting contents but also application execution screens. In this case, the users are provided with increased convenience, in that the application execution screen may include specific information related to the chatting content.
- Meanwhile, it was explained in the aforementioned example that a map screen was shared, but this is just an example, and thus, various types of application execution screens may obviously be shared between users who perform chatting.
- In addition, it was explained in the aforementioned example that a user of the
terminal apparatus 100 performs chatting with a plurality of users, but this is also just an example. That is, an application execution screen may also be shared in the case where a user chats with only one user. -
FIG. 2 is a block diagram for explaining a configuration of a terminal apparatus according to an exemplary embodiment. According to FIG. 2, the terminal apparatus 100 is a terminal apparatus that provides a chatting service, the terminal apparatus 100 comprising a communicator 110, a display (or displayer) 120, and a controller 130. - Herein, the
terminal apparatus 100 may be the terminal apparatus 100 of FIG. 1, or in some cases, one of the first terminal apparatus 100-1, the second terminal apparatus 100-2, . . . , and the nth terminal apparatus 100-n. - The
communicator 110 performs communication with the terminal apparatus (100-1, 100-2, . . . , 100-n of FIG. 1) of the counterpart of the chatting service through the server (200 of FIG. 1). For example, the communicator 110 uses various communication standards such as 3G, 4G, and Wi-Fi to connect to a network and perform communication with the chatting service counterpart of another terminal apparatus (100-1, 100-2, . . . , 100-n of FIG. 1). - Herein, the
server 200 may be a server that relays communication between the terminal apparatus 100 and the terminal apparatuses 100-1, 100-2, . . . , 100-n to provide a chatting service. - Specifically, the
server 200 may control such that the users of the terminal apparatus 100 and the terminal apparatuses 100-1, 100-2, . . . , 100-n perform chatting with one another and share application execution screens. - The
display 120 may display various screens. - More specifically, the
display 120 may display at least one application execution screen that is shared with the chatting service counterpart of one or more of the terminal apparatuses 100-1, 100-2, . . . , 100-n on a chatting screen that provides the chatting service. - Herein, the application execution screen may comprise a map screen, calendar screen, app download screen and memo screen etc. In this case, the application execution screen may be a screen provided by a sub function execution of the chatting application itself or a screen provided as functions of other applications are executed through interlocked operation with other applications. Herein, the application may be a native application that was installed in the
terminal apparatus 100 when it was manufactured, or an application that was downloaded from outside later on. - Meanwhile, the
display 120 may be embodied as a touch screen which receives various touch manipulations and transmits the touch manipulations to the controller 130. In this case, the controller 130 may perform functions according to the touch manipulations. - The
controller 130 controls the overall operations of the terminal apparatus 100. The controller 130 may comprise a MICOM (or, a MICOM and a CPU (Central Processing Unit)), a RAM (Random Access Memory) for operating the user terminal apparatus, and a ROM (Read Only Memory). In this case, these modules may be embodied in an SoC (System on Chip) format. - First of all, when a user manipulation for executing an application is input, the
controller 130 may execute an application corresponding thereto. In this case, the applications installed in the terminal apparatus 100 may be displayed in an icon format, and the controller 130 may execute the application corresponding to the touched icon when the user manipulation of touching the icon is input. - For example, when an icon regarding a chatting application is selected, the
controller 130 may execute the chatting application and provide a chatting service to a user. - In this case, the
controller 130 may display a list of other users that subscribed to the server 200 through a predetermined certification process so that a chatting service counterpart may be selected. Herein, the user list may be received from the server 200, and the controller 130 may transmit information on the user selected from the displayed list to the server 200. Accordingly, the user may be provided with the chatting service with the selected users through the server 200. - Meanwhile, when a chatting service counterpart is selected from the list, the
controller 130 may display, on the display 120, a chatting screen on which chatting with the users may be performed. - The chatting screen may comprise a contents display area for displaying information on the chatting service counterpart (for example, name, image, ID, telephone number, etc.), a window for receiving entry of chatting content, a send item for transmitting the entered chatting content, and the transmitted chatting content.
- More specifically, when an input window is selected, the
controller 130 may display a virtual keyboard and receive entry of chatting content, and when the send item is selected, may display the entered chatting content on the contents display area while transmitting the entered chatting content to the server 200. - In this case, the
server 200 may transmit the chatting content received from the terminal apparatus 100 to the chatting service counterparts of the terminal apparatuses 100-1, 100-2, . . . , 100-n, and provide a chatting service between the users of the terminal apparatus 100 and the terminal apparatuses 100-1, 100-2, . . . , 100-n. - Meanwhile, the
controller 130 may control such that at least one application execution screen that is shared with the chatting service counterparts of terminal apparatuses 100-1, 100-2, . . . , 100-n is displayed on the chatting screen. - In this case, when a predetermined event occurs, the
controller 130 may display an application execution screen. For example, when one of the application execution screen selecting menus provided on the chatting screen is selected, the controller 130 may display an application execution screen corresponding to the selected item on one area of the chatting screen. - See
FIGS. 3A to 4D for a more detailed explanation. For convenience of explanation, hereinafter is an explanation on the case where the terminal apparatus 100 is embodied as a mobile phone. - For example, when a "calendar" application is selected from among the application
execution selecting menus 320 provided on the chatting screen 310, the controller 130 may display a calendar screen 330 on one area of the chatting screen 310 as illustrated in FIG. 3B. - Meanwhile, in
FIG. 3A, it is illustrated that an application execution screen selecting menu 320 is disposed on an upper end of the chatting screen 310. Although one format of the selecting menu 320 is illustrated, exemplary embodiments are not limited to the illustrated format and the format of the application execution screen selecting menu 320 may be changed in various ways. - In addition, the application execution
screen selecting menu 320 is illustrated as comprising "calendar", "map", and "memo", but this is also just an example, and thus the application execution screen selecting menu 320 may also be changed into various formats, either by default in the chatting application itself or by a setting changed by the user.
- For example, by default in the chatting application itself, the application execution screen selecting menu may consist of "calendar", "map", and "memo", and then the user may change the set value such that the application execution screen selecting menu consists of "calendar", "map", "app", and "memo". - In this case, the
controller 130 may configure and display each application execution screen differently. - For example, when a calendar screen is displayed, the
controller 130 may display a calendar comprising a date set as default, a date preset by the user, or the current date, on the application execution screen. Alternatively, the controller 130 may display the calendar which was most recently displayed on the application execution screen. - In addition, when a map screen is displayed, the
controller 130 may display a map of an area set as default, an area selected by the user, or the area where the terminal apparatus 100 is currently located on the application execution screen. In addition, the controller may display the most recently displayed map on the application execution screen. - In addition, when an application download screen is displayed, the
controller 130 may display a blank screen, or an application download screen regarding the application set as a default, on the application execution screen. In addition, the controller 130 may display the application download screen that was most recently connected or most recently downloaded on the application execution screen. - In addition, when a memo screen is displayed, the
controller 130 may display a memo set as default on the application execution screen. Furthermore, the controller 130 may display the most recently written and stored memo on the application execution screen. - Meanwhile, when the application execution screen is displayed, the
controller 130 may control the communicator 110 such that the displayed application execution screen is shared with the chatting service counterparts of the terminal apparatuses 100-1, 100-2, . . . , 100-n. - More specifically, when the application execution screen is displayed, the
controller 130 may control such that data on the displayed application execution screen is transmitted to the server 200 so that the displayed application execution screen may be shared with the chatting service counterpart. - In this case, the
controller 130 may transmit the data on the application execution screen to the server 200 at the point when the application execution screen is displayed on one area of the chatting screen. However, this is just an example, and the controller 130 may instead transmit the data on the application execution screen to the server 200 when an additional user command is input. - Herein, the data on the application execution screen may comprise various information according to the type of the application execution screen.
- For example, in the case of a map screen, the data may be the title of the area, GPS information, and scale information etc. necessary for displaying the map screen that is currently being displayed, while in the case of a calendar screen, the data may be information on the date when the calendar screen is currently being displayed. In addition, in the case of an application download screen, the data may be address information (for example, URL information) of the application download screen that is currently being displayed. Furthermore, in the case of a memo screen, the data may be information on the text or image etc. included in the memo that is currently being displayed.
- Meanwhile, the
server 200 may transmit the data received from the terminal apparatus 100 to the chatting service counterparts of the terminal apparatuses 100-1, 100-2, . . . , 100-n. In this case, the server 200 may additionally transmit a separate command for executing a chatting application to the terminal apparatuses 100-1, 100-2, . . . , 100-n. - Accordingly, the terminal apparatuses 100-1, 100-2, . . . , 100-n may execute a chatting application and use the data received from the
server 200 to display on one area of the chatting screen an application execution screen that is identical to the application execution screen displayed on the terminal apparatus 100. - For example, in the case where a map screen is displayed on the
terminal apparatus 100, the terminal apparatuses 100-1, 100-2, . . . , 100-n may use the area name, GPS information, scale information, etc. received from the server 200 to configure a map screen that is identical to the map screen being displayed on the terminal apparatus 100, and display the configured map screen on an area of the chatting screen. In this case, the terminal apparatuses 100-1, 100-2, . . . , 100-n may execute a sub function provided in the chatting application itself, or execute another application additionally, and then interlock it with the chatting application to display the application execution screen. - Meanwhile, in the case of the latter, the
server 200 may either transmit a command for executing an application to be interlocked with the chatting application to the terminal apparatuses 100-1, 100-2, . . . , 100-n, or the terminal apparatuses 100-1, 100-2, . . . , 100-n themselves may analyze the data received from the server 200 and determine the application that must be executed to display the application execution screen. - Meanwhile, the
controller 130 may control the communicator 110 to share the function executed according to the user manipulation regarding the application execution screen with the chatting service counterparts of the terminal apparatuses 100-1, 100-2, . . . , 100-n.
- First, hereinafter is an explanation on a case based on an assumption where a map or calendar is displayed on an application execution screen.
- In this case, the
controller 130 may control such that the application execution screen where a GUI (Graphic User Interface) corresponding to the user's manipulation is displayed is shared with the chatting service counterparts of the terminal apparatuses 100-1, 100-2, . . . , 100-n. - More specifically, in the case where the application execution screen is a map screen, the
controller 130 may control to display the GUI on the point selected according to the user's manipulation and to share the map screen where the GUI is displayed with the chatting service counterparts of the terminal apparatuses 100-1, 100-2, . . . , 100-n. - That is, when a user's manipulation of touching the map screen is input, the
controller 130 may display the GUI for distinguishing the point selected by the user from other points. For example, the GUI may be one of various types of GUIs, such as a circle, a line, a certain icon, etc., that may show that the user selected the corresponding point. - In addition, the
controller 130 may transmit the information on the GUI format and the information on the point where the GUI is displayed on the map screen (for example, information on the coordinates, area, GPS, etc.) to the server 200. In this case, the server 200 may transmit the data received from the terminal apparatus 100 to the terminal apparatuses 100-1, 100-2, . . . , 100-n. Accordingly, in the terminal apparatuses 100-1, 100-2, . . . , 100-n, it is possible to display the GUI on a particular point on the map screen that is currently displayed using the received data. That is, the terminal apparatuses 100-1, 100-2, . . . , 100-n may display the map screen where the GUI is displayed on the particular point selected by the user of the terminal apparatus 100.
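The map-marker round trip just described — sending only the GUI format and the selected point, then redrawing the same GUI on the counterpart's own map — can be sketched as follows. Function names and the dictionary layout are illustrative assumptions.

```python
# Hypothetical sketch of sharing a GUI marker on a map screen.

def make_marker_data(gui_format, coordinates):
    # Data the terminal apparatus sends to the server when the map is touched:
    # the GUI format and the point where the GUI is displayed.
    return {"gui_format": gui_format, "point": coordinates}

def apply_marker_data(map_screen, marker_data):
    # A counterpart terminal draws the identical GUI at the identical point
    # on the map screen it is already displaying.
    map_screen["markers"].append((marker_data["gui_format"], marker_data["point"]))
    return map_screen

sent = make_marker_data("circle", (37.57, 126.98))   # user touches this point
counterpart_map = {"area_title": "Seoul", "markers": []}
apply_marker_data(counterpart_map, sent)
print(counterpart_map["markers"])
```

The calendar case works the same way, with a date taking the place of the coordinates.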
controller 130 may control to display the GUI at the date selected on the calendar screen according to the user's manipulation and to share the calendar screen where the GUI is displayed with the chatting service counterparts of the terminal apparatuses 100-1, 100-2, . . . , 100-n. - More specifically, when a user's manipulation of touching the calendar screen is input, the
controller 130 may display the GUI for distinguishing the touched date, that is, the date selected by the user, from other dates. In this case, the GUI may be one of various types, such as a circle, a line, a particular icon, etc., or a GUI representing the weather (for example, sun, cloud, rain, etc.). - In addition, the
controller 130 may transmit data including information on the GUI format and information on the date where the GUI is displayed on the calendar screen to the server 200. In this case, the server 200 may transmit the data received from the terminal apparatus 100 to the chatting service counterparts of the terminal apparatuses 100-1, 100-2, . . . , 100-n. Accordingly, in the terminal apparatuses 100-1, 100-2, . . . , 100-n, it is possible to display the GUI at a particular date on the calendar screen that is currently displayed using the received data. That is, the terminal apparatuses 100-1, 100-2, . . . , 100-n may display the calendar screen where the GUI is displayed at the particular date selected by the user of the terminal apparatus 100.
- That is, in the case where the application execution screen is a calendar screen, the
controller 130 may control such that a schedule is added to a date selected according to the user's manipulation on the calendar screen, and that the added schedule information is shared with the chatting service counterparts of the terminal apparatuses 100-1, 100-2, . . . , 100-n. - More specifically, when a user manipulation of touching a calendar screen is input, the
controller 130 may display a virtual keyboard to receive an input of a schedule regarding the touched date. In this case, the controller 130 may display the schedule input by the user on the calendar screen. - Furthermore, the
controller 130 may transmit, to the server 200, data on the date where a schedule has been added and on the schedule input for the corresponding date. In this case, the server 200 may transmit the data received from the terminal apparatus 100 to the chatting service counterparts of the terminal apparatuses 100-1, 100-2, . . . , 100-n. Accordingly, in the terminal apparatuses 100-1, 100-2, . . . , 100-n, it is possible to add to the calendar screen and display the schedule that the user of the terminal apparatus 100 input, using the received data. - Meanwhile, hereinbelow is an explanation on a case based on the assumption that a memo is displayed on the application execution screen. In the case where the application execution screen is a memo screen, the
controller 130 may control such that a text input according to the user's manipulation is displayed on the memo screen, and that the text is shared with the chatting service counterparts of the terminal apparatuses 100-1, 100-2, . . . , 100-n. - More specifically, when the memo screen is touched, the
controller 130 may display a virtual keyboard, and display the text input through the virtual keyboard on the memo screen. In addition, the controller 130 may transmit the information on the text input to the memo to the server 200. In this case, the server 200 may transmit the data received from the terminal apparatus 100 to the terminal apparatuses 100-1, 100-2, . . . , 100-n. Accordingly, in the terminal apparatuses 100-1, 100-2, . . . , 100-n, it is possible to display a particular text on the memo screen that is currently displayed using the received data. That is, the terminal apparatuses 100-1, 100-2, . . . , 100-n may display the memo screen where the text input by the user of the terminal apparatus 100 is displayed.
- The
controller 130 may control such that an application download screen connected according to a user's command is displayed, and that the application download screen is shared with the chatting service counterparts of the terminal apparatuses 100-1, 100-2, . . . , 100-n. - More specifically, in the case of an application download screen, there may be included a search window for entering a search word for searching an application pre-stored in an application provision server (not illustrated).
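For the app download case, only the address of the download screen needs to travel through the server: counterparts open the same page themselves. The sketch below assumes a hypothetical catalog and example URLs; none of these names appear in the disclosure.

```python
# Hypothetical sketch: searching the application provision server yields a
# download-screen address, and that address is what gets shared.

APP_CATALOG = {  # stand-in for the application provision server's index
    "metro map": "https://apps.example.com/download/metro-map",
    "weather": "https://apps.example.com/download/weather",
}

def search_app(query):
    # Returns the address of the download screen for the matching application,
    # or None if no pre-stored application matches the search word.
    return APP_CATALOG.get(query.lower())

def share_download_screen(url):
    # Only address information (URL) needs to travel through the server.
    return {"screen_type": "app_download", "url": url}

url = search_app("Metro Map")
payload = share_download_screen(url)
print(payload["url"])
```

Because the payload carries a URL rather than screen contents, a counterpart that later selects the download menu item fetches the application directly from the provision server.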
- When a search word is entered into a search window, the
controller 130 may control the communicator 110 to transmit the input search word to the application provision server (not illustrated), and receive a search result. In addition, the controller 130 may use the received search result to display an application download screen from which the searched application may be downloaded. - In this case, the
controller 130 may transmit data including address information regarding the application download screen that is currently displayed (for example, URL information) to the server 200. In this case, the server 200 may transmit the received data to the terminal apparatuses 100-1, 100-2, . . . , 100-n. Accordingly, it is possible to update the application download screen or blank screen that is currently displayed on the terminal apparatuses 100-1, 100-2, . . . , 100-n using the received data. That is, the terminal apparatuses 100-1, 100-2, . . . , 100-n may display the application download screen provided at the address information to which the user of the terminal apparatus 100 connected. - Meanwhile, in the aforementioned example, it was explained that a search window is displayed, but this is just an example. That is, the application download screen may include an address window for inputting address information, and the
controller 130 may access the address information input into the address window and download the application download screen and display the same. - Meanwhile, hereinabove it was explained that in the aforementioned exemplary embodiments, the
controller 130 may control such that a function executed for each application execution screen is shared with the chatting service counterparts of the terminal apparatuses 100-1, 100-2, . . . , 100-n, at the point when a user's manipulation regarding the application execution screen is input. - That is, at the point when a particular point is selected on the calendar screen or map screen, or an app download screen is displayed, or a text is entered into the memo screen, the
controller 130 may transmit data corresponding thereto to the server 200, and share the function regarding the application execution screen executed according to a user's manipulation with the terminal apparatuses 100-1, 100-2, . . . , 100-n. - However, this is just an example, and the
controller 130 may share the function regarding the application execution screen executed according to a user's manipulation with the terminal apparatuses 100-1, 100-2, . . . , 100-n when a user's command is input. - For example, when a particular date is selected from the calendar screen, the
controller 130 may display a GUI on the selected particular date. Next, when the send item provided on the chatting screen is selected, the controller 130 may control such that the information on the GUI format and the information on the date on the calendar screen where the GUI is displayed are transmitted to the server 200 and shared with the chatting service counterparts of the terminal apparatuses 100-1, 100-2, . . . , 100-n. - Meanwhile, the
controller 130 may display the data received from the server 200 on the application execution screen that is shared with the chatting service counterparts. - More specifically, when data is received from the
server 200, the controller 130 may execute the chatting application, and use the data received from the server 200 to display on an area of the chatting screen an application execution screen identical to the application execution screen displayed on the terminal apparatuses 100-1, 100-2, . . . , 100-n. - For example, with a map screen displayed, when data including information on the format of the GUI and the point where the GUI is displayed is received from the
server 200, the controller 130 may use the received data to determine the point selected by the user of the terminal apparatuses 100-1, 100-2, . . . , 100-n, and display a GUI at the corresponding point, the GUI having the same format as the GUI displayed on the terminal apparatuses 100-1, 100-2, . . . , 100-n. Furthermore, when the application execution screen that is shared with the chatting service counterparts is displayed and a user's manipulation of selecting a menu item included in the application execution screen is input, the controller 130 may execute the function corresponding to the user's manipulation of selecting the menu item. - In this case, the
controller 130 may control such that the address corresponding to the menu item is accessed and an application is downloaded. - That is, in the case of an application download screen, there may be included a menu item for downloading applications provided via the corresponding screen. Accordingly, when a menu item is selected with the application download screen that is shared with the chatting service counterparts being displayed, the
controller 130 may access the application provision server (not illustrated) via the internet address mapped to the menu item, download an application, and install the downloaded application in theterminal apparatus 100. - Meanwhile, the
controller 130 may control such that the chatting content entered in the chatting screen is shared with the chatting service counterparts of the terminal apparatuses 100-1, 100-2, . . . , 100-n. - More specifically, when a chatting content is entered in the input window provided on the chatting screen and the send item is then selected, the
controller 130 may display the chatting content on the contents display area of the chatting screen, and transmit data on a text or image corresponding to the chatting content to the server 200. In this case, the server 200 may transmit the data received from the terminal apparatus 100 to the terminal apparatuses 100-1, 100-2, . . . , 100-n, and the terminal apparatuses 100-1, 100-2, . . . , 100-n may use the received data to display a chatting screen including the chatting content that was input in the terminal apparatus 100. In this case, the terminal apparatuses 100-1, 100-2, . . . , 100-n may continue displaying the application execution screen on an area of the chatting screen, and use the received data on the chatting content to display on the contents display area the chatting content that the user of the terminal apparatus 100 entered. - Furthermore, when a new chatting content is input, or a new chatting content is received from the terminal apparatuses 100-1, 100-2, . . . , 100-n of the chatting counterparts, the
controller 130 may display the new chatting content on the chatting screen. - More specifically, the
controller 130 may update the contents display area, and display the new chatting content underneath the chatting content already displayed on the contents display area. When there are more chatting contents than fit in the contents display area, the controller 130 may gradually push the existing chatting content upwards and off the area so as to display the new chatting contents. - Meanwhile, the
controller 130 may display, on an area of the chatting screen, the chatting content selected according to a predetermined user's manipulation from among the chatting contents entered in the chatting screen. Herein, the predetermined user's manipulation may be a manipulation of touching the chatting content displayed on the contents display area for a predetermined time or longer. - For example, from among the chatting contents displayed on the contents display area provided on the chatting screen, the
controller 130 may affix the chatting content for which a touch manipulation has been input for the predetermined time or longer to the top end of the contents display area and display the same. Accordingly, even if a new chatting content is entered or received from the terminal apparatuses 100-1, 100-2, . . . , 100-n of the chatting counterparts and is displayed on the contents display area, the chatting content for which the touch manipulation has been input is not removed from the contents display area and continues to be displayed on one area of the chatting screen. - In this case, the
controller 130 may display a selecting item regarding the application execution screen related to the selected chatting content together with the selected chatting content. Herein, the application execution screen related to the selected chatting content may be an application execution screen illustrating the selected chatting content. - For example, in the case of a map screen, it may be a map screen where a corresponding point related to the chatting content mentioning a particular point is displayed so as to be differentiated from other points. Likewise, in the case of a calendar screen, it may be a calendar screen where a corresponding date related to the chatting content mentioning a particular date is displayed so as to be differentiated from other dates. In addition, in the case of a memo screen, it may be a memo screen that includes contents related to the chatting content.
- By another example, in the case of an application download screen, it may be an application download screen where an application related to the chatting content may be downloaded.
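- The menu-item handling described above (accessing the internet address mapped to a selected menu item and downloading the application from the application provision server) can be sketched as follows. This is an illustrative sketch only; the names MENU_ADDRESS_MAP, on_menu_item_selected, and download_from are assumptions, not part of the disclosure.

```python
# Hypothetical mapping from a menu item to the internet address of the
# application provision server (illustrative values only).
MENU_ADDRESS_MAP = {
    "download_app_x": "http://apps.example.com/app-x",
}

def on_menu_item_selected(item_id, download_from):
    """Look up the address mapped to the selected menu item and
    fetch the application from it via the supplied downloader."""
    address = MENU_ADDRESS_MAP[item_id]
    return download_from(address)
```

In an actual terminal apparatus the downloader would also trigger installation of the fetched package; here it is passed in so the routing step can be shown in isolation.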
- Meanwhile, an application execution screen related to the selected chatting content may be determined in various ways.
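- One such way, described in the examples that follow, compares the entry time of the chatting content with the time the screen was shared. A minimal sketch (the 30-second window is an assumed stand-in for the "predetermined time"; the disclosure does not specify a value):

```python
RELATED_WINDOW_SEC = 30  # assumed stand-in for the "predetermined time"

def is_related(chat_time_sec, share_time_sec, window=RELATED_WINDOW_SEC):
    """A chatting content and a shared application execution screen are
    treated as related when one occurs within the predetermined time
    before or after the other (either order)."""
    return abs(chat_time_sec - share_time_sec) <= window
```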
- For example, in the case where a chatting content is entered within a predetermined time before or after the point when the application execution screen is shared with the terminal apparatuses 100-1, 100-2, . . . , 100-n, or where an application execution screen is shared within a predetermined time before or after the point when a chatting content has been entered and transmitted to the terminal apparatuses 100-1, 100-2, . . . , 100-n, the
controller 130 may determine that the shared application execution screen is related to the corresponding chatting content. - For example, assuming that a user entered a chatting content "let's meet here" in the input window, then selected a particular point on the map screen within a predetermined time, and thus a map screen where the particular point is selected is shared, the
controller 130 may determine that the shared map screen, that is, the map screen where the point that the user selected is displayed so as to be differentiated from other points, is an application execution screen related to the chatting content "let's meet here". - Furthermore, assuming that the user searched for an application to be downloaded and shared an application download screen, and the chatting content "it is this app" is entered within a predetermined time, the
controller 130 may determine that the shared application download screen is an application execution screen related to the chatting content "it is this app". - Meanwhile, even when the send item is selected and the application execution screen is thereby shared, the
controller 130 may determine that the application execution screen is related to the chatting content in a similar manner to the aforementioned. - Meanwhile, when a selecting item is selected, the
controller 130 may display an application execution screen related to the selected chatting content. In this case, the controller 130 may display the application execution screen related to the selected chatting content in a full screen format. - In addition, the
controller 130 may control so that data including information on the selected chatting content and on the application execution screen related to the selected chatting content is transmitted to the server 200 and shared with the chatting service counterparts of the terminal apparatuses 100-1, 100-2, . . . , 100-n. - That is, when a user's manipulation of selecting a chatting content for a predetermined time or longer is input, the
controller 130 may transmit data including information on the selected chatting content and the application execution screen related to the selected chatting content to theserver 200. - Herein, information on the selected chatting content may include information on the time when the chatting content was entered, the user who entered the chatting content, and a text or image included in the chatting content.
- Furthermore, information related to the application execution screen related to the selected chatting content may include various information according to the type of the application execution screen. For example, in the case of a map screen, information related to the application execution screen related to the selected chatting content may include information on the type of GUI displayed on the point that the user selected, coordinates of the point where the GUI is displayed, and GPS information etc.
- In this case, the
server 200 may transmit the received data to the terminal apparatuses 100-1, 100-2, . . . , 100-n. Accordingly, the terminal apparatuses 100-1, 100-2, . . . , 100-n may use the received data to arrange and display the selected chatting content and the selecting item on the top end of the contents display area, and when the selecting item is selected, the terminal apparatuses 100-1, 100-2, . . . , 100-n may display the application execution screen related to the chatting content. - Meanwhile, the
controller 130 may control so that each application execution screen related to a selected chatting content is scrapped and stored. - More specifically, the
controller 130 may capture and store the application execution screen related to the selected chatting content, and capture and store the application execution screen related to the next selected chatting content, thereby scrapping the application execution screen related to each of the selected chatting contents. In this case, the controller 130 may store the captured screens in a storage (not illustrated) provided in the terminal apparatus 100 or in the server 200. - For example, when a predetermined user's manipulation is input, the
controller 130 may display the scrapped application execution screens. In this case, the controller 130 may display each scrapped application execution screen with its selected chatting content as the title. - For example, assuming a case where a map screen 1 related to "let's meet here" is scrapped, and a
map screen 2 related to "let's eat dinner here after riding the bicycle" is scrapped, the controller 130 may generate a scrap screen consisting of the map screen 1 and the map screen 2, and when a flick manipulation from the right to the left direction is input, the controller 130 may newly display a scrap screen in the direction in which the flick manipulation has been input. In this case, on each map screen, the chatting content may be displayed together. That is, "let's meet here" may be added to the map screen 1 and "let's eat dinner here after riding the bicycle" may be added to the map screen 2, and then the map screen 1 and the map screen 2 may be displayed. - Meanwhile, the
controller 130 may store a function regarding the application execution screen executed according to a user's manipulation in the application corresponding to each application execution screen. In this case, when a user's manipulation is input on the application execution screen, or when a user's manipulation is input on the application execution screen and the chatting application is then ended, the controller 130 may store, per application, data on the various information that has been input in the application execution screen according to the user's manipulation. - In this case, the data stored in each application may include various information according to the type of application.
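- The per-application storage described here amounts to routing by screen type; a minimal sketch (the function and store names are illustrative assumptions, not part of the disclosure):

```python
def store_per_application(app_storage, screen_type, data):
    """Route data entered on a shared screen to the store of the matching
    application (map, calendar, application download, or memo)."""
    app_name = {
        "map": "map_app",
        "calendar": "calendar_app",
        "download": "download_app",
        "memo": "memo_app",
    }[screen_type]
    app_storage.setdefault(app_name, []).append(data)
    return app_name
```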
- For example, when a map screen is being shared, the
controller 130 may store, in a map application installed in the terminal apparatus 100, data including the type of the GUI input in the map screen by the user or chatting counterparts and the location of the point where the GUI is displayed. - Furthermore, when a calendar screen is being shared, the
controller 130 may store, in a calendar application installed in the terminal apparatus 100, data including the type of the GUI input in the calendar screen by the user or chatting counterparts, the date on which the corresponding GUI was displayed, schedule information, and information on the date when the corresponding schedule was entered. - Furthermore, when an application download screen is being shared, the
controller 130 may store, in an application download application installed in the terminal apparatus 100, data including address information on the application download screen which was shared by the user or chatting counterparts. - In addition, when a memo screen is being shared, the
controller 130 may store, in a memo application installed in the terminal apparatus 100, information on a text or image which was entered in the memo screen by the user or chatting counterparts. - Furthermore, the
controller 130 may control such that, when an application is executed, the application execution screen that was shared with the terminal apparatuses 100-1, 100-2, . . . , 100-n is displayed using the pre-stored data. - For example, in the case where the user executes a map application, the
controller 130 may display a map screen where the GUI is displayed at the same point as the point that the user or chatting counterparts selected during chatting. That is, the controller 130 may display the map screen that was shared with the chatting counterparts via the previous chatting screen.
- Accordingly, the user is able to view the application execution screen that was shared during chatting through each separate application. - Meanwhile, the
controller 130 may store the function regarding the application execution screen executed according to the user's manipulation in an integrated application. Herein, the integrated application may be an application that manages the user's schedule etc. in an integrated manner. - In this case, the
controller 130 may synchronize the integrated application with each application, and store in the integrated application the data on the application execution screen stored per application.
- For example, assuming that information on the schedule content input on a particular date and on the date for which the corresponding schedule was input is pre-stored in a calendar application, and that information on the type of the GUI, the point where the GUI is displayed, and the date when the map screen was shared is pre-stored in a map application, the controller may use the information pre-stored in each application to update the schedule provided in the integrated application in an integrated manner. - More specifically, the
controller 130 may use the information stored in the calendar application to store schedule information on a particular date of the schedule provided in the integrated application, and use the information stored in the map application to store the map screen where a GUI was displayed on a particular point.
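- The synchronization just described can be sketched as a merge of per-application records into one integrated schedule keyed by date (the function name and record shapes are illustrative assumptions, not part of the disclosure):

```python
def sync_into_integrated(calendar_store, map_store):
    """Merge records stored per application into one integrated
    schedule keyed by date, as the integrated application would."""
    integrated = {}
    for date, schedule in calendar_store.items():
        integrated.setdefault(date, {})["schedule"] = schedule
    for date, marker in map_store.items():
        integrated.setdefault(date, {})["map"] = marker
    return integrated
```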
- Meanwhile, it was explained in the aforementioned example, that in order to manage the user's schedule in the integrated application, each application is synchronized in the integrated application, but this is just an example. That is, the
controller 130 may obviously store the various information that had been input in the application execution screen during chatting directly in the integrated application. - Meanwhile, hereinbelow is a detailed explanation with reference to
FIGS. 4A to 7F . -
FIGS. 4A to 7F are views for explaining a method for sharing an application execution screen according to an exemplary embodiment. - Specifically,
FIGS. 4A to 4D illustrate views for explaining a method for sharing a map screen. - First of all, as illustrated in
FIG. 4A , information on the chatting service counterparts 421, an input window 440, and a send item 450 may be displayed on the chatting screen 410. Herein, the information on the chatting service counterparts 421 may be images of the chatting service counterparts preregistered in the server 200. - Meanwhile, when the
input window 440 is selected, a virtual keyboard 460 for receiving entry of a chatting content may be displayed. Accordingly, when a chatting content is entered through the virtual keyboard 460, the entered chatting content may be displayed on the input window 440. For example, when the user inputs the chatting content “LET'S MEET HERE!”, “LET'S MEET HERE!” may be displayed on the input window 440. - Next, when the
send item 450 is selected, the “LET'S MEET HERE!” which has been input may be transmitted to the chatting service counterparts of the terminal apparatuses 100-1, 100-2, . . . , 100-n, and the chatting content “LET'S MEET HERE!” 471 may be displayed on the contents display area as illustrated in FIG. 4B . - Meanwhile, as illustrated in
FIG. 4A , on the chatting screen 410, an application execution screen selecting menu 423 may be displayed. In this case, when “map” is selected through the application execution screen selecting menu 423, a map screen 431 may be displayed on one area of the chatting screen. - Next, when a particular point is selected by the user on the
map screen 431, a GUI 433 may be displayed on the selected point, as illustrated in FIG. 4A . In this case, the type and color of the GUI displayed may be determined by the user. - Furthermore, the
map screen 431 where the GUI 433 is displayed on the point selected by the user may be shared with the chatting service counterparts. - Meanwhile, in the case where the chatting service counterparts entered a chatting content through their terminal apparatuses 100-1, 100-2, . . . , 100-n, the
terminal apparatus 100 may receive from the server 200 the chatting content that the chatting service counterparts entered. - In this case, the chatting contents may be aligned in the contents display area in the order in which they were entered. For example, as illustrated in
FIG. 4C and FIG. 4D , the chatting contents that the chatting service counterparts entered, that is, “GOOD, SINCE ITS NEAR THE HAN RIVER TOO” 473, “THERE IS A BICYCLE ROAD TOO˜” 475, and “LET'S EAT DINNER AFTER RIDING THE BICYCLE” 477, may be displayed on the contents display area consecutively. - In addition, in the case where the chatting service counterparts select a particular point on the map screen displayed on their terminal apparatuses 100-1, 100-2, . . . , 100-n, data for displaying a GUI on the point selected on the map screen may be received from the
server 200. In this case, the terminal apparatus 100 may use the received data, as illustrated in FIG. 4D , to display the GUI 435 at the point that the chatting service counterparts selected on the map screen 431.
- However, this is just an example, and thus various application execution screens may obviously be shared.
- For example, as illustrated in
FIG. 5 , the calendar screen 521 may be shared, or as illustrated in FIG. 6 , an application download screen 621 may be shared. - First, in the case of sharing a calendar screen with the chatting service counterparts of the terminal apparatuses 100-1, 100-2, . . . , 100-n, as in
FIG. 5 , a calendar screen 521 may be displayed on one area of the chatting screen 510. In this case, on a particular date of the calendar screen 521, a GUI 523 may be displayed on the point selected by the user or the chatting service counterparts. In addition, as illustrated in FIG. 5 , the chatting content that the user and the chatting service counterparts input may be displayed on the contents display area 530. - Meanwhile, in the case of sharing the application download screen with the chatting service counterparts of the terminal apparatuses 100-1, 100-2, . . . , 100-n, as illustrated in
FIG. 6 , an application download screen 621 may be displayed on one area of the chatting screen 610. In this case, the app download screen 621 may include a menu item 623 for receiving an application. Accordingly, in the case where the user selects the menu item 623, the corresponding app may be downloaded from outside and installed. In addition, as illustrated in FIG. 6 , the chatting contents entered by the user and the chatting service counterparts may be displayed on the contents display area 630. -
FIGS. 7A to 7F illustrate views for explaining a method for affixing a particular chatting content and displaying the same according to an exemplary embodiment. - First, in the case of sharing a map screen with the chatting service counterparts of the terminal apparatuses 100-1, 100-2, . . . , 100-n, as illustrated in
FIG. 7A , a map screen 721 may be displayed on one area of the chatting screen 710. In this case, on the map screen 721, GUIs may be displayed at the points selected by the user and the chatting service counterparts.
- Furthermore, on the contents display area provided on the chatting screen 710, the chatting contents entered by the user and the chatting service counterparts may each be displayed. That is, as illustrated in FIG. 7A , the chatting content “LET'S MEET HERE!” 731 that the user entered, and “GOOD, SINCE ITS NEAR THE HAN RIVER TOO” 733, “THERE IS A BICYCLE ROAD TOO˜” 735, and “LET'S EAT DINNER AFTER RIDING THE BICYCLE” 737 that the chatting service counterparts entered, may be displayed. - Meanwhile, when the user selects one of the chatting contents displayed on the contents display area for a predetermined time or longer, the selected chatting content may be displayed fixedly on the top end of the contents display area. That is, in the case of touching “LET'S MEET HERE!” 731 for the predetermined time or longer, as illustrated in
FIG. 7B , the chatting content “LET'S MEET HERE!” may be displayed on the top end 741 of the contents display area. In this case, the chatting content “LET'S MEET HERE!” 731 may be displayed on the top end of the contents display area in the terminal apparatuses 100-1, 100-2, . . . , 100-n of the chatting service counterparts as well. - In this case, the chatting content may be displayed together with the menu item for displaying an application execution screen related to the corresponding chatting content. For example, as illustrated in
FIG. 7B , a menu item such as “SEE MAP” may be displayed. - In this case, when the user selects the menu item, the application execution screen related to the chatting content that is fixed and displayed may be displayed. For example, as illustrated in
FIG. 7C , when “SEE MAP” is selected, the map screen 721 related to the chatting content “LET'S MEET HERE!” may be displayed. For example, the map screen 721 may be a map screen where a GUI 723 is displayed on the particular point that the user selected on the map screen within a predetermined time after entering “LET'S MEET HERE!”.
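- The pinning behavior of FIGS. 7A to 7C, where a long-pressed chatting content stays fixed at the top while other contents scroll off, can be sketched as follows; the class and parameter names are illustrative assumptions, not part of the disclosure:

```python
from collections import deque

class ContentsDisplayArea:
    """Pinned entries stay fixed at the top; others scroll within `size` slots."""
    def __init__(self, size):
        self.size = size
        self.pinned = []         # (content, related_screen_id) pairs
        self.scrolling = deque()

    def add(self, content):
        self.scrolling.append(content)
        while len(self.pinned) + len(self.scrolling) > self.size:
            self.scrolling.popleft()   # oldest unpinned entry scrolls off

    def pin(self, content, related_screen_id):
        """Long-press handler: fix the entry at the top together with
        the selecting item for its related application execution screen."""
        if content in self.scrolling:
            self.scrolling.remove(content)
        self.pinned.append((content, related_screen_id))

    def visible(self):
        return [p[0] for p in self.pinned] + list(self.scrolling)
```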
terminal apparatus 100. - For example, in the case where the chatting service counterparts touched “LET'S EAT DINNER HERE AFTER RIDING THE BICYCLE” 737 for or more than a predetermined time, “LET'S EAT DINNER HERE AFTER RIDING THE BICYCLE” may be displayed on the top end of the contents display area. In this case, as illustrated in
FIG. 7D , “LET'S EAT DINNER HERE AFTER RIDING THE BICYCLE” may be arranged underneath the “LET'S MEET HERE!” that was fixated and displayed in the time order. - In this case, the menu item for displaying the application execution screen related to the chatting content may be displayed together. For example, as illustrated in
FIG. 7B , the chatting content may be displayed together with “SEE MAP”. - In this case, when the user selects the menu item, the application execution screen related to the chatting contents fixated and displayed may be displayed. For example, when “SEE MAP” is selected, as illustrated in
FIG. 7E , themap screen 721 related to the chatting content “LET'S EAT DINNER HERE AFTER RIDING THE BICYCLE” may be displayed. Herein, themap screen 721 may be a map screen where aGUI 725 is displayed on the point that the chatting service counterparts who entered “LET'S EAT DINNER HERE AFTER RIDING THE BICYCLE” selected on the map screen after selecting a particular point on the map screen within a predetermined time. - Meanwhile, the selected chatting content and the application execution screen related thereto may be provided through a separate screen. For example, as illustrated in
FIG. 7F , when the user inputs a flick manipulation from the right to left, “LET'S MEET HERE!” and themap screen 751 related thereto and “LET'S EAT DINNER HERE AFTER RIDING THE BICYCLE” and themap screen 753 related thereto may be displayed on one scrap board. -
FIG. 8 is a block diagram for explaining a detailed configuration of a terminal apparatus according to an exemplary embodiment. According to FIG. 8 , the terminal apparatus 100 may further include a location information generator 140, a storage 150, an audio processor 160, a video processor 170, a speaker 180, a button 181, a camera 182, and a microphone 183 besides the display 120, communicator 110, and controller 130, and these configurations may be controlled by the controller 130 as well. Meanwhile, specific explanation on elements that overlap with those explained in FIG. 2 will be omitted. - The
location information generator 140 generates location information that indicates the location of the terminal apparatus 100. More specifically, the location information generator 140 uses a GPS (Global Positioning System) module (not illustrated) to search for the location of the terminal apparatus 100. For example, the GPS module (not illustrated) may receive signals transmitted from a plurality of GPS satellites, and use the time difference between the transmission time and the receiving time to calculate the distance between each satellite and the terminal apparatus 100. In addition, it may calculate the current location of the terminal apparatus 100 through an arithmetic operation such as trilateration, in consideration of the distances calculated to the plurality of satellites, the locations of the satellites, etc. - Accordingly, the controller may execute the application using the
location information generated in the location information generator 140. For example, when the map application is executed, the controller 130 may determine the current location of the terminal apparatus 100 using the location information generated in the location information generator 140, and display the map screen of a certain area based on the current location. - The
storage 150 stores an O/S (Operating System) for driving the terminal apparatus 100. - Specifically, the
storage 150 may store various application programs and data related to the execution of those application programs. In addition, the storage 150 may store various data for sharing the application execution screen and various data stored as a result of sharing together with the chatting service. In this case, the storage 150 may store, per application, data including the various information input on the application execution screen. - The
audio processor 160 may perform processing on audio data. For example, in the audio processor 160, various processing such as decoding, amplification, and noise filtering may be performed on audio data. - The
video processor 170 may perform processing on video data. For example, the video processor 170 may perform various image processing such as decoding, scaling, noise filtering, frame rate conversion, and resolution conversion on video data. - The
speaker 180 is a configurative element that outputs not only the various audio data processed in the audio processor 160 but also various alarm sounds, sound messages, etc. - The
button 181 may be one of various types, such as a mechanical button or wheel, formed on an area such as the front portion, side portion, or rear portion of the exterior of the main body of the terminal apparatus 100. For example, a button for turning on/off the power of the terminal apparatus 100 may be provided. - The
camera 182 is a configurative element for photographing a still image or video. The camera 182 may be embodied as a plurality of cameras, including a front camera and a rear camera. - The
microphone 183 is a configurative element for receiving the user's voice or other sounds and converting the received sound into audio data. The controller 130 may use the user's voice input through the microphone 183 in a call process, or convert it into audio data and store it in the storage 150. - Besides the above, the
terminal apparatus 100 may further include various external input ports for connecting with various external terminals such as a headset. - Meanwhile, the
controller 130 includes a RAM 131, a ROM 132, a main CPU 133, a graphic processor 134, first to nth interfaces 135-1 to 135-n, and a bus 136. - The
RAM 131, ROM 132, main CPU 133, graphic processor 134, and first to nth interfaces 135-1 to 135-n may be connected to one another through the bus 136. -
The first to nth interfaces 135-1 to 135-n are connected to the aforementioned various configurative elements. One of the interfaces may be a network interface that is connected to an external apparatus through a network. - The
main CPU 133 may access the storage 150, and perform booting using the O/S stored in the storage 150. In addition, the main CPU 133 may perform various operations using the various application programs and data stored in the storage 150. - In the
ROM 132, command sets for booting the system are stored. When a turn-on command is input and power is supplied, the main CPU 133 copies the O/S stored in the storage 150 to the RAM 131 according to the commands stored in the ROM 132, and executes the O/S to boot the system. When booting is completed, the main CPU 133 copies the various application programs stored in the storage 150 to the RAM 131, and executes the application programs copied to the RAM 131 to perform various operations. - The
graphic processor 134 uses a calculator (not illustrated) and a rendering part (not illustrated) to generate a screen including various objects such as an icon, image, and text. The calculator (not illustrated) calculates feature values such as the coordinates, types, sizes, and colors of the objects to be displayed according to the layout of the screen, based on the control command received. The rendering part (not illustrated) generates screens of various layouts that include the objects, based on the feature values calculated in the calculator (not illustrated). The screens generated in the rendering part (not illustrated) are displayed within the display area of the display 120. -
FIG. 9 is a flowchart for explaining a method for sharing a terminal apparatus providing a chatting service according to an exemplary embodiment. - First, on the chatting screen providing the chatting service, at least one application execution screen that is shared with the chatting service counterparts is displayed (S910).
- Next, it is controlled such that a function regarding the application execution screen executed according to the user's manipulation is shared with the chatting service counterpart of the terminal apparatus (S920).
- More specifically, it may be controlled such that the application execution screen where the GUI (Graphic User Interface) that corresponds to the user's manipulation is displayed is shared with the chatting service counterpart of the terminal apparatus.
- For example, when the application execution screen is a map screen, it can be controlled such that a GUI is displayed on the point selected according to the user's manipulation, and that the map screen where the GUI is displayed is shared with the chatting service counterpart of the terminal apparatus.
- By another example, in the case where the application execution screen is a calendar screen, it can be controlled such that a GUI is displayed on a date selected according to the user's manipulation on the calendar screen, and that the calendar screen where the GUI is displayed is shared with the chatting service counterpart of the terminal apparatus.
- In addition, when the application execution screen shared with the chatting service counterpart is displayed and the user's manipulation of selecting a menu item included in the application execution screen is input, the function corresponding to that manipulation may be executed. In this case, the address corresponding to the menu item may be accessed to download an application.
- However, this is just an example, and thus various application execution screens may obviously be shared.
- Meanwhile, it is possible to store the function regarding the application execution screen executed according to the user's manipulation in the application corresponding to each application execution screen. In addition, it is possible to store the function regarding the application execution screen executed according to the user's manipulation in the integrated application.
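The two storage strategies just described, per-application storage versus storage in the integrated application, can be sketched side by side. The store names and record format below are hypothetical.

```python
# Sketch of the two storage strategies for executed functions.
per_app_store = {}      # strategy 1: keyed by each application
integrated_store = []   # strategy 2: one integrated application log

def record_function(app, function):
    """Store an executed function under its application and in the
    integrated application."""
    per_app_store.setdefault(app, []).append(function)
    integrated_store.append({"app": app, "function": function})

record_function("map", "mark_point")
record_function("calendar", "mark_date")
```

A real implementation would likely choose one strategy; recording into both here is only to show the difference in structure.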
- In addition, it is possible to control such that the chatting contents entered in the chatting screen may be shared with the chatting service counterpart of the terminal apparatus.
- Furthermore, it is possible to display, on one area, the chatting content selected according to a predetermined user manipulation from among the chatting contents entered in the chatting screen, and to display the selection menu item regarding the application execution screen related to the selected chatting content together with the selected chatting content. In this case, it is possible to control such that each application execution screen related to the selected chatting content is scrapped and stored.
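The scrapping step above amounts to storing the selected chatting content together with each related application execution screen. A minimal sketch, assuming a simple in-memory scrapbook (all names hypothetical):

```python
# Hypothetical "scrap" store: selected chatting content is saved
# together with each related application execution screen.
scrapbook = []

def scrap(chat_content, related_screens):
    """Store the selected chatting content with its related screens."""
    entry = {"content": chat_content, "screens": list(related_screens)}
    scrapbook.append(entry)
    return entry

entry = scrap(
    "Let's meet at City Hall at 3pm",
    [{"type": "map", "marker": "City Hall"},
     {"type": "calendar", "date": "2013-09-17"}])
```

Storing the screens alongside the content lets the selection menu later reopen the map or calendar screen directly from the scrapped chatting content.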
- Meanwhile, the specific method for sharing an application execution screen, and the screens displayed for this purpose, were explained above with reference to FIGS. 1 to 9. - Meanwhile, there may be provided a non-transitory computer readable medium storing a program for consecutively performing the sharing method described above.
- A non-transitory computer readable medium refers to a medium that may be read by an apparatus and that stores data semi-permanently, unlike media such as a register, cache, or memory that store data only for a short period of time. For example, the various applications and programs mentioned above may be provided stored in a non-transitory computer readable medium such as a CD, DVD, hard disk, Blu-ray disc, USB memory, memory card, or ROM.
- Furthermore, although not illustrated in the aforementioned block diagram, communication between each of the constituent elements may be made through a bus. In addition, the terminal apparatus may further comprise a processor, such as a CPU or microprocessor, for performing the various steps mentioned above.
- Although a few exemplary embodiments have been shown and described, it would be appreciated by those skilled in the art that changes may be made in these exemplary embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the claims and their equivalents.
Claims (24)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/441,461 US11003315B2 (en) | 2013-09-17 | 2017-02-24 | Terminal device and sharing method thereof |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020130112139A KR102057944B1 (en) | 2013-09-17 | 2013-09-17 | Terminal device and sharing method thereof |
KR10-2013-0112139 | 2013-09-17 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/441,461 Division US11003315B2 (en) | 2013-09-17 | 2017-02-24 | Terminal device and sharing method thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
US20150082201A1 true US20150082201A1 (en) | 2015-03-19 |
Family
ID=51494108
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/479,493 Abandoned US20150082201A1 (en) | 2013-09-17 | 2014-09-08 | Terminal device and sharing method thereof |
US15/441,461 Active 2035-03-04 US11003315B2 (en) | 2013-09-17 | 2017-02-24 | Terminal device and sharing method thereof |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/441,461 Active 2035-03-04 US11003315B2 (en) | 2013-09-17 | 2017-02-24 | Terminal device and sharing method thereof |
Country Status (3)
Country | Link |
---|---|
US (2) | US20150082201A1 (en) |
EP (1) | EP2849391B1 (en) |
KR (1) | KR102057944B1 (en) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DK201670649A1 (en) * | 2016-05-18 | 2017-12-04 | Apple Inc | Devices, Methods, and Graphical User Interfaces for Messaging |
US9959037B2 (en) | 2016-05-18 | 2018-05-01 | Apple Inc. | Devices, methods, and graphical user interfaces for messaging |
US20180181560A1 (en) * | 2015-03-24 | 2018-06-28 | Beijing Sogou Technology Development Co., Ltd. | Information input method and device |
US20180260092A1 (en) * | 2016-02-17 | 2018-09-13 | Christopher Alsante | Consumer electronic entertainment and display system |
US10904189B2 (en) * | 2014-09-11 | 2021-01-26 | Lg Electronics Inc. | Terminal and method for displaying previous conversation information while displaying message of current conversation at the terminal |
US11159922B2 (en) | 2016-06-12 | 2021-10-26 | Apple Inc. | Layers in messaging applications |
US11221751B2 (en) | 2016-05-18 | 2022-01-11 | Apple Inc. | Devices, methods, and graphical user interfaces for messaging |
US11392275B2 (en) * | 2020-06-10 | 2022-07-19 | Snap Inc. | Contextual sending menu |
US11543935B2 (en) * | 2019-01-31 | 2023-01-03 | Vivo Mobile Communication Co., Ltd. | Information processing method and terminal device |
US11966579B2 (en) | 2016-08-24 | 2024-04-23 | Apple Inc. | Devices, methods, and graphical user interfaces for messaging |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105005457B (en) * | 2014-04-25 | 2019-04-09 | 腾讯科技(深圳)有限公司 | Geographical location methods of exhibiting and device |
US20170357411A1 (en) | 2016-06-11 | 2017-12-14 | Apple Inc. | User interface for initiating a telephone call |
US11765114B2 (en) | 2017-05-16 | 2023-09-19 | Apple Inc. | Voice communication method |
USD950587S1 (en) | 2018-08-31 | 2022-05-03 | Zoox, Inc. | Display screen or portion thereof having a graphical user interface |
KR20210106651A (en) * | 2020-02-21 | 2021-08-31 | 삼성전자주식회사 | An electronic device sharing at least one object and control method thereof |
US11165734B1 (en) | 2020-06-10 | 2021-11-02 | Snap Inc. | Messaging system share-to-chat |
US11893203B2 (en) * | 2021-08-31 | 2024-02-06 | Apple Inc. | Methods and interfaces for initiating communications |
Citations (121)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5848373A (en) * | 1994-06-24 | 1998-12-08 | Delorme Publishing Company | Computer aided map location system |
US6321158B1 (en) * | 1994-06-24 | 2001-11-20 | Delorme Publishing Company | Integrated routing/mapping information |
US20020130904A1 (en) * | 2001-03-19 | 2002-09-19 | Michael Becker | Method, apparatus and computer readable medium for multiple messaging session management with a graphical user interfacse |
US20030037110A1 (en) * | 2001-08-14 | 2003-02-20 | Fujitsu Limited | Method for providing area chat rooms, method for processing area chats on terminal side, computer-readable medium for recording processing program to provide area chat rooms, apparatus for providing area chat rooms, and terminal-side apparatus for use in a system to provide area chat rooms |
US20030154250A1 (en) * | 2001-12-11 | 2003-08-14 | Sony Corporation | Service providing system, information providing apparatus and method, information processing apparatus and method, and program |
US20030163525A1 (en) * | 2002-02-22 | 2003-08-28 | International Business Machines Corporation | Ink instant messaging with active message annotation |
US20030210265A1 (en) * | 2002-05-10 | 2003-11-13 | Haimberg Nadav Y. | Interactive chat messaging |
US6683538B1 (en) * | 1998-08-29 | 2004-01-27 | Robert D Wilkes, Jr. | Position dependent messaging system |
US20040054428A1 (en) * | 2002-03-01 | 2004-03-18 | Sheha Michael A. | Method and apparatus for sending, retrieving and planning location relevant information |
US6748318B1 (en) * | 1993-05-18 | 2004-06-08 | Arrivalstar, Inc. | Advanced notification systems and methods utilizing a computer network |
US6784901B1 (en) * | 2000-05-09 | 2004-08-31 | There | Method, system and computer program product for the delivery of a chat message in a 3D multi-user environment |
US6853911B1 (en) * | 1999-10-12 | 2005-02-08 | Taskin Sakarya | Downloading geographical data to a mobile station and displaying a map |
US20050273496A1 (en) * | 2004-06-07 | 2005-12-08 | Jean Yves D | System for presenting applications on instant messaging clients |
US20060015812A1 (en) * | 2004-07-15 | 2006-01-19 | Cingular Wireless Ii, Llc | Using emoticons, such as for wireless devices |
US20060063539A1 (en) * | 2004-09-21 | 2006-03-23 | Beyer Malcolm K Jr | Cellular phone/pda communication system |
US7089278B1 (en) * | 1999-09-07 | 2006-08-08 | Fuji Xerox Co., Ltd. | Anchored conversations: adhesive, in-context, virtual discussion forums |
US20060184886A1 (en) * | 1999-12-22 | 2006-08-17 | Urbanpixel Inc. | Spatial chat in a multiple browser environment |
US20060223518A1 (en) * | 2005-04-04 | 2006-10-05 | Haney Richard D | Location sharing and tracking using mobile phones or other wireless devices |
US20060221858A1 (en) * | 2005-04-01 | 2006-10-05 | Microsoft Corporation | User experience for collaborative ad-hoc networks |
US20070032945A1 (en) * | 2005-08-02 | 2007-02-08 | Jason Kaufman | System and method for providing location related information to a network user |
US20070050716A1 (en) * | 1995-11-13 | 2007-03-01 | Dave Leahy | System and method for enabling users to interact in a virtual space |
US20070161382A1 (en) * | 2006-01-09 | 2007-07-12 | Melinger Daniel J | System and method including asynchronous location-based messaging |
US20070208802A1 (en) * | 2006-03-03 | 2007-09-06 | Gogroups | Method And System For Messaging And Communication Based On Groups |
US20070266088A1 (en) * | 2006-03-31 | 2007-11-15 | Business Objects, S.A. | Apparatus and method for report sharing within an instant messaging framework |
US20070266104A1 (en) * | 2006-03-31 | 2007-11-15 | Business Objects, S.A. | Apparatus and method for report sharing within an instant messaging framework |
US20070270159A1 (en) * | 2005-09-30 | 2007-11-22 | Sunit Lohtia | Location sensitive messaging |
US20070294229A1 (en) * | 1998-05-28 | 2007-12-20 | Q-Phrase Llc | Chat conversation methods traversing a provisional scaffold of meanings |
US20080036586A1 (en) * | 2006-08-11 | 2008-02-14 | Eric Shigeru Ohki | Method and system for receiving and sending navigational data via a wireless messaging service on a navigation system |
US7370269B1 (en) * | 2001-08-31 | 2008-05-06 | Oracle International Corporation | System and method for real-time annotation of a co-browsed document |
US20080119200A1 (en) * | 2006-11-21 | 2008-05-22 | Verizon Corporate Services Group Inc. | Method and system for flexible product and service bundling |
US20080132252A1 (en) * | 2006-06-01 | 2008-06-05 | Altman Samuel H | Network Manager System for Location-Aware Mobile Communication Devices |
US7386799B1 (en) * | 2002-11-21 | 2008-06-10 | Forterra Systems, Inc. | Cinematic techniques in avatar-centric communication during a multi-user online simulation |
US20080141150A1 (en) * | 2006-12-11 | 2008-06-12 | Yahoo! Inc. | Graphical messages |
US20080171555A1 (en) * | 2007-01-11 | 2008-07-17 | Helio, Llc | Location-based text messaging |
US20080182598A1 (en) * | 2007-01-29 | 2008-07-31 | Research In Motion Limited | Method of e-mailing a map location using predefined context-sensitive messages |
US20090005072A1 (en) * | 2007-06-28 | 2009-01-01 | Apple Inc. | Integration of User Applications in a Mobile Device |
US20090005018A1 (en) * | 2007-06-28 | 2009-01-01 | Apple Inc. | Route Sharing and Location |
US20090005981A1 (en) * | 2007-06-28 | 2009-01-01 | Apple Inc. | Integration of Map Services and User Applications in a Mobile Device |
US20090019085A1 (en) * | 2007-07-10 | 2009-01-15 | Fatdoor, Inc. | Hot news neighborhood banter in a geo-spatial social network |
US20090061833A1 (en) * | 2007-08-30 | 2009-03-05 | Junius Ho | System, method and device to use messaging to implement programmatic actions |
US20090119255A1 (en) * | 2006-06-28 | 2009-05-07 | Metacarta, Inc. | Methods of Systems Using Geographic Meta-Metadata in Information Retrieval and Document Displays |
US20090125228A1 (en) * | 2007-11-09 | 2009-05-14 | Research In Motion Limited | System and method for providing dynamic route information to users of wireless communications devices |
US20090144366A1 (en) * | 2007-12-04 | 2009-06-04 | International Business Machines Corporation | Incorporating user emotion in a chat transcript |
US20090204885A1 (en) * | 2008-02-13 | 2009-08-13 | Ellsworth Thomas N | Automated management and publication of electronic content from mobile nodes |
US20090253512A1 (en) * | 2008-04-07 | 2009-10-08 | Palo Alto Research Center Incorporated | System And Method For Providing Adjustable Attenuation Of Location-Based Communication In An Online Game |
US20090254840A1 (en) * | 2008-04-04 | 2009-10-08 | Yahoo! Inc. | Local map chat |
US20090325603A1 (en) * | 2008-06-30 | 2009-12-31 | Apple Inc. | Location sharing |
US20100070842A1 (en) * | 2008-09-15 | 2010-03-18 | Andrew Aymeloglu | One-click sharing for screenshots and related documents |
US20100088634A1 (en) * | 2007-01-25 | 2010-04-08 | Akira Tsuruta | Multi-window management apparatus and program, storage medium and information processing apparatus |
US20100093370A1 (en) * | 2007-04-27 | 2010-04-15 | Sung-Yong Choi | Method for confirming a reading position using a short message service message and system for performing the same |
US20100110105A1 (en) * | 2008-10-31 | 2010-05-06 | Nokia Corporation | Method, apparatus and computer program product for providing synchronized navigation |
US20100123737A1 (en) * | 2008-11-19 | 2010-05-20 | Apple Inc. | Techniques for manipulating panoramas |
US20100153499A1 (en) * | 2008-12-15 | 2010-06-17 | International Business Machines Corporation | System and method to provide context for an automated agent to service mulitple avatars within a virtual universe |
US20100158097A1 (en) * | 2008-12-23 | 2010-06-24 | At&T Mobility Ii Llc | Dynamically scaled messaging content |
US20100178948A1 (en) * | 2009-01-13 | 2010-07-15 | Samsung Electronics Co., Ltd. | Apparatus and method for providing map information using short message service in portable terminal |
US20100205541A1 (en) * | 2009-02-11 | 2010-08-12 | Jeffrey A. Rapaport | social network driven indexing system for instantly clustering people with concurrent focus on same topic into on-topic chat rooms and/or for generating on-topic search results tailored to user preferences regarding topic |
US20100216492A1 (en) * | 2009-02-16 | 2010-08-26 | Comverse, Ltd. | Employment of a text message by a user of a first mobile telephone to invoke a process that provides information to a user of a second mobile telephone |
US7797642B1 (en) * | 2005-12-30 | 2010-09-14 | Google Inc. | Method, system, and graphical user interface for meeting-spot-related contact lists |
US20100261489A1 (en) * | 2008-11-26 | 2010-10-14 | Almodovar Herraiz Daniel | Including information in a message |
US20100332218A1 (en) * | 2009-06-29 | 2010-12-30 | Nokia Corporation | Keyword based message handling |
US20110010656A1 (en) * | 2009-07-13 | 2011-01-13 | Ta Keo Ltd | Apparatus and method for improved user interface |
US20110055355A1 (en) * | 2009-08-21 | 2011-03-03 | Samsung Electronics Co., Ltd. | Application downloading method, application providing method, user terminal using the same |
US20110053578A1 (en) * | 2009-09-01 | 2011-03-03 | Nokia Corporation | Centralized control of multiple services |
US7917866B1 (en) * | 2005-12-30 | 2011-03-29 | Google Inc. | Method, system, and graphical user interface for meeting-spot-related online communications |
US20110080356A1 (en) * | 2009-10-05 | 2011-04-07 | Lg Electronics Inc. | Mobile terminal and method of controlling application execution in a mobile terminal |
US20110081973A1 (en) * | 2005-11-30 | 2011-04-07 | Hall Robert J | Geogame for mobile device |
US20110087970A1 (en) * | 2009-10-14 | 2011-04-14 | At&T Mobility Ii Llc | Systems, apparatus, methods and computer-readable storage media for facilitating integrated messaging, contacts and social media for a selected entity |
US20110102459A1 (en) * | 2009-11-04 | 2011-05-05 | At&T Intellectual Property I, L.P. | Augmented reality gaming via geographic messaging |
US20110107227A1 (en) * | 2008-04-07 | 2011-05-05 | Express Mobile Inc. | Systems and methods for presenting information on mobile devices |
US20110105093A1 (en) * | 2009-10-30 | 2011-05-05 | Samsung Electronics Co., Ltd. | Mobile device and control method of the same |
US20110137550A1 (en) * | 2009-12-04 | 2011-06-09 | Samsung Electronics Co., Ltd. | Apparatus and method for generating sketch map information in portable terminal |
US20110161419A1 (en) * | 2005-07-22 | 2011-06-30 | Rathod Yogesh Chunilal | Method and system for dynamically providing a journal feed and searching, sharing and advertising |
US20110173337A1 (en) * | 2010-01-13 | 2011-07-14 | Oto Technologies, Llc | Proactive pre-provisioning for a content sharing session |
US20110238302A1 (en) * | 2010-03-29 | 2011-09-29 | Htc Corporation | Method, mobile device and computer-readable medium for processing location information |
US20110238762A1 (en) * | 2010-02-09 | 2011-09-29 | Google Inc. | Geo-coded comments in a messaging service |
US20110246490A1 (en) * | 2010-04-01 | 2011-10-06 | Sony Ericsson Mobile Communications Ab | Updates with context information |
US20110264783A1 (en) * | 2010-04-23 | 2011-10-27 | Research In Motion Limited | Method and apparatus for receiving data from a plurality of feed sources |
US20110298618A1 (en) * | 2010-06-02 | 2011-12-08 | Apple Inc. | Remote User Status Indicators |
US20120008526A1 (en) * | 2010-07-07 | 2012-01-12 | Hooman Borghei | Ad Hoc Formation and Tracking of Location-Sharing Groups |
US8099462B2 (en) * | 2008-04-28 | 2012-01-17 | Cyberlink Corp. | Method of displaying interactive effects in web camera communication |
US20120166281A1 (en) * | 2010-12-23 | 2012-06-28 | Research In Motion Limited | Method and apparatus for displaying applications on a mobile device |
US20120167154A1 (en) * | 2010-12-24 | 2012-06-28 | Kt Corporation | System and method for providing social network service to multiple screen devices |
US20120189272A1 (en) * | 2009-08-12 | 2012-07-26 | Sony Computer Entertainment Inc. | Information Processing System and Information Processing Device |
US8255810B2 (en) * | 2008-11-19 | 2012-08-28 | Apple Inc. | Portable touch screen device, method, and graphical user interface for using emoji characters while in a locked mode |
US20120226757A1 (en) * | 2011-03-01 | 2012-09-06 | Mcfarland Keith | Location Filtered Messaging |
US20120291110A1 (en) * | 2011-05-10 | 2012-11-15 | Microsoft Corporation | Presenting messages associated with locations |
US20120290950A1 (en) * | 2011-05-12 | 2012-11-15 | Jeffrey A. Rapaport | Social-topical adaptive networking (stan) system allowing for group based contextual transaction offers and acceptances and hot topic watchdogging |
US20130007665A1 (en) * | 2011-06-05 | 2013-01-03 | Apple Inc. | Systems and methods for displaying notifications received from multiple applications |
US20130021368A1 (en) * | 2011-07-20 | 2013-01-24 | Nhn Corporation | System and method for managing and sharing images on per album basis |
US20130067389A1 (en) * | 2011-09-09 | 2013-03-14 | Lg Electronics Inc. | Mobile terminal and control method thereof |
US20130073387A1 (en) * | 2011-09-15 | 2013-03-21 | Stephan HEATH | System and method for providing educational related social/geo/promo link promotional data sets for end user display of interactive ad links, promotions and sale of products, goods, and/or services integrated with 3d spatial geomapping, company and local information for selected worldwide locations and social networking |
US20130225087A1 (en) * | 2012-02-29 | 2013-08-29 | Pantech Co., Ltd. | Mobile terminal device and method for sharing application |
US20130226453A1 (en) * | 2005-02-08 | 2013-08-29 | Bryan Gardner Trussel | Systems and methods for mobile communication integration |
US20130227455A1 (en) * | 2012-02-24 | 2013-08-29 | Samsung Electronics Co. Ltd. | Method of sharing content and mobile terminal thereof |
US8555203B1 (en) * | 2004-06-18 | 2013-10-08 | Verizon Laboratories Inc. | Stackable icons |
US20130275531A1 (en) * | 2012-04-16 | 2013-10-17 | Samsung Electronics Co., Ltd. | Method and apparatus for collecting feed information in mobile terminal |
US20130293664A1 (en) * | 2012-05-02 | 2013-11-07 | Research In Motion Limited | Systems and Methods to Manage Video Chat Contacts |
US8606297B1 (en) * | 2010-03-24 | 2013-12-10 | Grindr LLC | Systems and methods for providing location-based cascading displays |
US20130331070A1 (en) * | 2012-05-08 | 2013-12-12 | 24/7 Customer, Inc. | Data assistance application for mobile devices |
US20130332860A1 (en) * | 2012-06-11 | 2013-12-12 | Samsung Electronics Co., Ltd. | User terminal apparatus, server and controlling method thereof |
US20140068467A1 (en) * | 2011-11-08 | 2014-03-06 | Kakao Corp. | Method of providing instant messaging service and multiple services expanded from instant messaging service |
US20140068497A1 (en) * | 2012-08-31 | 2014-03-06 | Samsung Electronics Co., Ltd. | Method and apparatus for providing intelligent service using inputted character in a user device |
US20140101553A1 (en) * | 2012-10-10 | 2014-04-10 | Jens Nagel | Media insertion interface |
US20140114801A1 (en) * | 2011-06-15 | 2014-04-24 | Kt Corporation | User terminal for providing in-app service and in-app service server |
US20140136990A1 (en) * | 2012-11-14 | 2014-05-15 | invi Labs, Inc. | System for and method of embedding rich media into text messages |
US20140195621A1 (en) * | 2013-01-08 | 2014-07-10 | Vmware, Inc. | Intelligent chat system |
US20140214986A1 (en) * | 2013-01-28 | 2014-07-31 | Naver Corporation | Apparatus, method and computer readable recording medium for sharing real time video through chatting window of messenger service |
US20140240440A1 (en) * | 2013-02-28 | 2014-08-28 | Lg Uplus Corp. | Method for sharing function between terminals and terminal thereof |
US20140365923A1 (en) * | 2013-06-10 | 2014-12-11 | Samsung Electronics Co., Ltd. | Home screen sharing apparatus and method thereof |
US20150032686A1 (en) * | 2013-07-23 | 2015-01-29 | Salesforce.Com, Inc. | Application sharing functionality in an information networking environment |
US20150038161A1 (en) * | 2007-10-12 | 2015-02-05 | Gabriel Jakobson | Mashing mapping content displayed on mobile devices |
US20150067080A1 (en) * | 2013-09-05 | 2015-03-05 | Samsung Electronics Co., Ltd. | Method and apparatus for remotely controlling home device in home network system |
US20150074575A1 (en) * | 2013-09-12 | 2015-03-12 | Samsung Electronics Co., Ltd. | Method and apparatus for controlling application using key inputs or combination thereof |
US20150281145A1 (en) * | 2012-10-22 | 2015-10-01 | Daum Kakao Corp. | Device and method for displaying image in chatting area and server for managing chatting data |
US20150309720A1 (en) * | 2014-04-25 | 2015-10-29 | Timothy Isaac FISHER | Messaging with drawn graphic input |
US9175964B2 (en) * | 2007-06-28 | 2015-11-03 | Apple Inc. | Integrated calendar and map applications in a mobile device |
US20160335686A1 (en) * | 2013-05-23 | 2016-11-17 | yTrre, Inc. | Real-time customer experience management systems and methods |
US9521252B2 (en) * | 2008-02-28 | 2016-12-13 | Computer Products Introductions, Corporation | Computer control of online social interactions based on conversation processing |
US20170221072A1 (en) * | 2013-05-23 | 2017-08-03 | GiriSrinivasaRao AthuluruTlrumala | End-to-end situation aware operations solution for customer experience centric businesses |
US20180367484A1 (en) * | 2017-06-15 | 2018-12-20 | Google Inc. | Suggested items for use with embedded applications in chat conversations |
US20180367483A1 (en) * | 2017-06-15 | 2018-12-20 | Google Inc. | Embedded programs and interfaces for chat conversations |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003150527A (en) * | 2001-11-05 | 2003-05-23 | Internatl Business Mach Corp <Ibm> | Chat system, terminal unit therefor, chat server and program |
US20070192168A1 (en) * | 2006-02-15 | 2007-08-16 | Leviathan Entertainment, Llc | Map and Inventory-Based On-Line Purchases |
US20070288164A1 (en) * | 2006-06-08 | 2007-12-13 | Microsoft Corporation | Interactive map application |
US20080086368A1 (en) * | 2006-10-05 | 2008-04-10 | Google Inc. | Location Based, Content Targeted Online Advertising |
US8724789B2 (en) * | 2007-08-06 | 2014-05-13 | Yellow Pages | Systems and methods to connect people for real time communications via directory assistance |
CA2820983C (en) * | 2008-05-18 | 2019-02-05 | Google Inc. | Secured electronic transaction system |
US8725819B2 (en) * | 2009-03-23 | 2014-05-13 | Sony Corporation | Chat system, server device, chat method, chat execution program, storage medium stored with chat execution program, information processing unit, image display method, image processing program, storage medium stored with image processing program |
-
2013
- 2013-09-17 KR KR1020130112139A patent/KR102057944B1/en active IP Right Grant
-
2014
- 2014-09-03 EP EP14183398.8A patent/EP2849391B1/en active Active
- 2014-09-08 US US14/479,493 patent/US20150082201A1/en not_active Abandoned
-
2017
- 2017-02-24 US US15/441,461 patent/US11003315B2/en active Active
US20090253512A1 (en) * | 2008-04-07 | 2009-10-08 | Palo Alto Research Center Incorporated | System And Method For Providing Adjustable Attenuation Of Location-Based Communication In An Online Game |
US8099462B2 (en) * | 2008-04-28 | 2012-01-17 | Cyberlink Corp. | Method of displaying interactive effects in web camera communication |
US20090325603A1 (en) * | 2008-06-30 | 2009-12-31 | Apple Inc. | Location sharing |
US20100070842A1 (en) * | 2008-09-15 | 2010-03-18 | Andrew Aymeloglu | One-click sharing for screenshots and related documents |
US20100110105A1 (en) * | 2008-10-31 | 2010-05-06 | Nokia Corporation | Method, apparatus and computer program product for providing synchronized navigation |
US20100123737A1 (en) * | 2008-11-19 | 2010-05-20 | Apple Inc. | Techniques for manipulating panoramas |
US8255810B2 (en) * | 2008-11-19 | 2012-08-28 | Apple Inc. | Portable touch screen device, method, and graphical user interface for using emoji characters while in a locked mode |
US20100261489A1 (en) * | 2008-11-26 | 2010-10-14 | Almodovar Herraiz Daniel | Including information in a message |
US20100153499A1 (en) * | 2008-12-15 | 2010-06-17 | International Business Machines Corporation | System and method to provide context for an automated agent to service multiple avatars within a virtual universe |
US20100158097A1 (en) * | 2008-12-23 | 2010-06-24 | At&T Mobility Ii Llc | Dynamically scaled messaging content |
US20100178948A1 (en) * | 2009-01-13 | 2010-07-15 | Samsung Electronics Co., Ltd. | Apparatus and method for providing map information using short message service in portable terminal |
US20100205541A1 (en) * | 2009-02-11 | 2010-08-12 | Jeffrey A. Rapaport | social network driven indexing system for instantly clustering people with concurrent focus on same topic into on-topic chat rooms and/or for generating on-topic search results tailored to user preferences regarding topic |
US20100216492A1 (en) * | 2009-02-16 | 2010-08-26 | Comverse, Ltd. | Employment of a text message by a user of a first mobile telephone to invoke a process that provides information to a user of a second mobile telephone |
US20100332218A1 (en) * | 2009-06-29 | 2010-12-30 | Nokia Corporation | Keyword based message handling |
US20110010656A1 (en) * | 2009-07-13 | 2011-01-13 | Ta Keo Ltd | Apparatus and method for improved user interface |
US20120189272A1 (en) * | 2009-08-12 | 2012-07-26 | Sony Computer Entertainment Inc. | Information Processing System and Information Processing Device |
US20110055355A1 (en) * | 2009-08-21 | 2011-03-03 | Samsung Electronics Co., Ltd. | Application downloading method, application providing method, user terminal using the same |
US20110053578A1 (en) * | 2009-09-01 | 2011-03-03 | Nokia Corporation | Centralized control of multiple services |
US20110080356A1 (en) * | 2009-10-05 | 2011-04-07 | Lg Electronics Inc. | Mobile terminal and method of controlling application execution in a mobile terminal |
US20110087970A1 (en) * | 2009-10-14 | 2011-04-14 | At&T Mobility Ii Llc | Systems, apparatus, methods and computer-readable storage media for facilitating integrated messaging, contacts and social media for a selected entity |
US20110105093A1 (en) * | 2009-10-30 | 2011-05-05 | Samsung Electronics Co., Ltd. | Mobile device and control method of the same |
US20110102459A1 (en) * | 2009-11-04 | 2011-05-05 | At&T Intellectual Property I, L.P. | Augmented reality gaming via geographic messaging |
US20110137550A1 (en) * | 2009-12-04 | 2011-06-09 | Samsung Electronics Co., Ltd. | Apparatus and method for generating sketch map information in portable terminal |
US20110173337A1 (en) * | 2010-01-13 | 2011-07-14 | Oto Technologies, Llc | Proactive pre-provisioning for a content sharing session |
US20110238762A1 (en) * | 2010-02-09 | 2011-09-29 | Google Inc. | Geo-coded comments in a messaging service |
US8606297B1 (en) * | 2010-03-24 | 2013-12-10 | Grindr LLC | Systems and methods for providing location-based cascading displays |
US20110238302A1 (en) * | 2010-03-29 | 2011-09-29 | Htc Corporation | Method, mobile device and computer-readable medium for processing location information |
US20110246490A1 (en) * | 2010-04-01 | 2011-10-06 | Sony Ericsson Mobile Communications Ab | Updates with context information |
US20110264783A1 (en) * | 2010-04-23 | 2011-10-27 | Research In Motion Limited | Method and apparatus for receiving data from a plurality of feed sources |
US20110298618A1 (en) * | 2010-06-02 | 2011-12-08 | Apple Inc. | Remote User Status Indicators |
US20120008526A1 (en) * | 2010-07-07 | 2012-01-12 | Hooman Borghei | Ad Hoc Formation and Tracking of Location-Sharing Groups |
US20120166281A1 (en) * | 2010-12-23 | 2012-06-28 | Research In Motion Limited | Method and apparatus for displaying applications on a mobile device |
US20120167154A1 (en) * | 2010-12-24 | 2012-06-28 | Kt Corporation | System and method for providing social network service to multiple screen devices |
US20120226757A1 (en) * | 2011-03-01 | 2012-09-06 | Mcfarland Keith | Location Filtered Messaging |
US20120291110A1 (en) * | 2011-05-10 | 2012-11-15 | Microsoft Corporation | Presenting messages associated with locations |
US20120290950A1 (en) * | 2011-05-12 | 2012-11-15 | Jeffrey A. Rapaport | Social-topical adaptive networking (stan) system allowing for group based contextual transaction offers and acceptances and hot topic watchdogging |
US20130007665A1 (en) * | 2011-06-05 | 2013-01-03 | Apple Inc. | Systems and methods for displaying notifications received from multiple applications |
US20140114801A1 (en) * | 2011-06-15 | 2014-04-24 | Kt Corporation | User terminal for providing in-app service and in-app service server |
US20130021368A1 (en) * | 2011-07-20 | 2013-01-24 | Nhn Corporation | System and method for managing and sharing images on per album basis |
US20130067389A1 (en) * | 2011-09-09 | 2013-03-14 | Lg Electronics Inc. | Mobile terminal and control method thereof |
US20130073387A1 (en) * | 2011-09-15 | 2013-03-21 | Stephan HEATH | System and method for providing educational related social/geo/promo link promotional data sets for end user display of interactive ad links, promotions and sale of products, goods, and/or services integrated with 3d spatial geomapping, company and local information for selected worldwide locations and social networking |
US20140068467A1 (en) * | 2011-11-08 | 2014-03-06 | Kakao Corp. | Method of providing instant messaging service and multiple services expanded from instant messaging service |
US20130227455A1 (en) * | 2012-02-24 | 2013-08-29 | Samsung Electronics Co. Ltd. | Method of sharing content and mobile terminal thereof |
US20130225087A1 (en) * | 2012-02-29 | 2013-08-29 | Pantech Co., Ltd. | Mobile terminal device and method for sharing application |
US20130275531A1 (en) * | 2012-04-16 | 2013-10-17 | Samsung Electronics Co., Ltd. | Method and apparatus for collecting feed information in mobile terminal |
US20130293664A1 (en) * | 2012-05-02 | 2013-11-07 | Research In Motion Limited | Systems and Methods to Manage Video Chat Contacts |
US20130331070A1 (en) * | 2012-05-08 | 2013-12-12 | 24/7 Customer, Inc. | Data assistance application for mobile devices |
US20130332860A1 (en) * | 2012-06-11 | 2013-12-12 | Samsung Electronics Co., Ltd. | User terminal apparatus, server and controlling method thereof |
US20140068497A1 (en) * | 2012-08-31 | 2014-03-06 | Samsung Electronics Co., Ltd. | Method and apparatus for providing intelligent service using inputted character in a user device |
US20140101553A1 (en) * | 2012-10-10 | 2014-04-10 | Jens Nagel | Media insertion interface |
US20150281145A1 (en) * | 2012-10-22 | 2015-10-01 | Daum Kakao Corp. | Device and method for displaying image in chatting area and server for managing chatting data |
US20140136990A1 (en) * | 2012-11-14 | 2014-05-15 | invi Labs, Inc. | System for and method of embedding rich media into text messages |
US20140195621A1 (en) * | 2013-01-08 | 2014-07-10 | Vmware, Inc. | Intelligent chat system |
US20140214986A1 (en) * | 2013-01-28 | 2014-07-31 | Naver Corporation | Apparatus, method and computer readable recording medium for sharing real time video through chatting window of messenger service |
US20140240440A1 (en) * | 2013-02-28 | 2014-08-28 | Lg Uplus Corp. | Method for sharing function between terminals and terminal thereof |
US20160335686A1 (en) * | 2013-05-23 | 2016-11-17 | yTrre, Inc. | Real-time customer experience management systems and methods |
US20170221072A1 (en) * | 2013-05-23 | 2017-08-03 | GiriSrinivasaRao AthuluruTlrumala | End-to-end situation aware operations solution for customer experience centric businesses |
US20140365923A1 (en) * | 2013-06-10 | 2014-12-11 | Samsung Electronics Co., Ltd. | Home screen sharing apparatus and method thereof |
US20150032686A1 (en) * | 2013-07-23 | 2015-01-29 | Salesforce.Com, Inc. | Application sharing functionality in an information networking environment |
US20150067080A1 (en) * | 2013-09-05 | 2015-03-05 | Samsung Electronics Co., Ltd. | Method and apparatus for remotely controlling home device in home network system |
US20150074575A1 (en) * | 2013-09-12 | 2015-03-12 | Samsung Electronics Co., Ltd. | Method and apparatus for controlling application using key inputs or combination thereof |
US20150309720A1 (en) * | 2014-04-25 | 2015-10-29 | Timothy Isaac FISHER | Messaging with drawn graphic input |
US20180367484A1 (en) * | 2017-06-15 | 2018-12-20 | Google Inc. | Suggested items for use with embedded applications in chat conversations |
US20180367483A1 (en) * | 2017-06-15 | 2018-12-20 | Google Inc. | Embedded programs and interfaces for chat conversations |
Cited By (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10904189B2 (en) * | 2014-09-11 | 2021-01-26 | Lg Electronics Inc. | Terminal and method for displaying previous conversation information while displaying message of current conversation at the terminal |
US10628524B2 (en) * | 2015-03-24 | 2020-04-21 | Beijing Sogou Technology Development Co., Ltd. | Information input method and device |
US20180181560A1 (en) * | 2015-03-24 | 2018-06-28 | Beijing Sogou Technology Development Co., Ltd. | Information input method and device |
US10838613B2 (en) * | 2016-02-17 | 2020-11-17 | Trufan Llc | Consumer electronic entertainment and display system |
US20180260092A1 (en) * | 2016-02-17 | 2018-09-13 | Christopher Alsante | Consumer electronic entertainment and display system |
US10949081B2 (en) | 2016-05-18 | 2021-03-16 | Apple Inc. | Devices, methods, and graphical user interfaces for messaging |
DK201670649A1 (en) * | 2016-05-18 | 2017-12-04 | Apple Inc | Devices, Methods, and Graphical User Interfaces for Messaging |
US10592098B2 (en) | 2016-05-18 | 2020-03-17 | Apple Inc. | Devices, methods, and graphical user interfaces for messaging |
US10254956B2 (en) * | 2016-05-18 | 2019-04-09 | Apple Inc. | Devices, methods, and graphical user interfaces for messaging |
US9959037B2 (en) | 2016-05-18 | 2018-05-01 | Apple Inc. | Devices, methods, and graphical user interfaces for messaging |
US10852935B2 (en) | 2016-05-18 | 2020-12-01 | Apple Inc. | Devices, methods, and graphical user interfaces for messaging |
DK179174B1 (en) * | 2016-05-18 | 2018-01-02 | Apple Inc | Devices, methods and graphical user interfaces for messaging |
US11625165B2 (en) | 2016-05-18 | 2023-04-11 | Apple Inc. | Devices, methods, and graphical user interfaces for messaging |
US10983689B2 (en) | 2016-05-18 | 2021-04-20 | Apple Inc. | Devices, methods, and graphical user interfaces for messaging |
US11112963B2 (en) | 2016-05-18 | 2021-09-07 | Apple Inc. | Devices, methods, and graphical user interfaces for messaging |
US11126348B2 (en) | 2016-05-18 | 2021-09-21 | Apple Inc. | Devices, methods, and graphical user interfaces for messaging |
US10331336B2 (en) * | 2016-05-18 | 2019-06-25 | Apple Inc. | Devices, methods, and graphical user interfaces for messaging |
US11221751B2 (en) | 2016-05-18 | 2022-01-11 | Apple Inc. | Devices, methods, and graphical user interfaces for messaging |
US11320982B2 (en) | 2016-05-18 | 2022-05-03 | Apple Inc. | Devices, methods, and graphical user interfaces for messaging |
US11954323B2 (en) | 2016-05-18 | 2024-04-09 | Apple Inc. | Devices, methods, and graphical user interfaces for initiating a payment action in a messaging session |
US11513677B2 (en) | 2016-05-18 | 2022-11-29 | Apple Inc. | Devices, methods, and graphical user interfaces for messaging |
US11159922B2 (en) | 2016-06-12 | 2021-10-26 | Apple Inc. | Layers in messaging applications |
US11778430B2 (en) | 2016-06-12 | 2023-10-03 | Apple Inc. | Layers in messaging applications |
US11966579B2 (en) | 2016-08-24 | 2024-04-23 | Apple Inc. | Devices, methods, and graphical user interfaces for messaging |
US11543935B2 (en) * | 2019-01-31 | 2023-01-03 | Vivo Mobile Communication Co., Ltd. | Information processing method and terminal device |
US11392275B2 (en) * | 2020-06-10 | 2022-07-19 | Snap Inc. | Contextual sending menu |
Also Published As
Publication number | Publication date |
---|---|
KR20150032095A (en) | 2015-03-25 |
EP2849391B1 (en) | 2018-08-15 |
EP2849391A3 (en) | 2015-04-01 |
US20170160890A1 (en) | 2017-06-08 |
KR102057944B1 (en) | 2019-12-23 |
US11003315B2 (en) | 2021-05-11 |
EP2849391A2 (en) | 2015-03-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11003315B2 (en) | Terminal device and sharing method thereof | |
US11366490B2 (en) | User terminal device and displaying method thereof | |
US11347372B2 (en) | User terminal device and displaying method thereof | |
US10671115B2 (en) | User terminal device and displaying method thereof | |
CN106227344B (en) | Electronic device and control method thereof | |
EP3105657B1 (en) | User terminal device and displaying method thereof | |
US9002699B2 (en) | Adaptive input language switching | |
CN110168487B (en) | Touch control method and device | |
US20100030549A1 (en) | Mobile device having human language translation capability with positional feedback | |
KR20140144104A (en) | Electronic apparatus and Method for providing service thereof | |
KR20140011073A (en) | Method and apparatus for recommending text | |
CN106227452B (en) | Device, method and the graphical user interface of view are selected in three-dimensional map | |
CN106233237A (en) | A kind of method and apparatus of the new information processed with association | |
EP3944070A1 (en) | Mini-program production method and apparatus, and terminal and storage medium | |
KR20160073714A (en) | Electronic Device and Method of Displaying Web Page Using the same | |
CN110377220B (en) | Instruction response method and device, storage medium and electronic equipment | |
CN107025051B (en) | Information embedding method and device and client equipment | |
CN111399722A (en) | Mail signature generation method, device, terminal and storage medium | |
EP3185515B1 (en) | Method and device for inputting information | |
CN110853643A (en) | Method, device, equipment and storage medium for voice recognition in fast application | |
KR102425957B1 (en) | Mobile apparatus displaying end effect and cotrol method there of | |
KR101875485B1 (en) | Electronic apparatus and Method for providing service thereof | |
CN112230906A (en) | List control creating method, device and equipment and readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: SUNG, JU-YUN; NAH, HYUN-SOO; KIM, HYE-RIN; AND OTHERS; REEL/FRAME: 033690/0107; Effective date: 20140724 |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |