Detailed Description
Exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the invention are shown in the drawings, it should be understood that the invention can be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
According to an aspect of an embodiment of the present invention, an operation control method is provided, which is applied to a terminal.
The terminal includes at least a first screen and a second screen arranged opposite each other; that is, the first screen and the second screen are located on different sides of the terminal. For example, the first screen is located on the front side of the terminal and the second screen on the back side, or the first screen is located on the back side and the second screen on the front side. The first screen and the second screen are respectively controlled by two independent touch chips, and the two touch chips are controlled by the same CPU. The touch chips of the first screen and the second screen are enabled simultaneously to capture touch information from the user.
In the embodiment of the present invention, the terminal may be a mobile terminal (e.g., a mobile phone, a tablet computer, a laptop computer, a palmtop computer, a vehicle-mounted terminal, a wearable device, or a pedometer), a desktop computer, a smart television, or another electronic device.
As shown in fig. 1, the operation control method includes:
Step 101: displaying a target object at a second position on the second screen.
The target object described herein includes, but is not limited to, at least one of: a screen-locking interface of the terminal, a main interface of the terminal, an application program interface, a virtual key, and an object for performing message prompting.
Step 102: receiving a touch operation of a user.
The touch operation described herein includes: a first touch operation performed by the user at a first position on the first screen, and/or a second touch operation performed by the user at the second position on the second screen. That is, in the embodiment of the present invention, the target object displayed on the second screen may be controlled by input on the first screen or by input on the second screen.
In the embodiment of the present invention, a correspondence is pre-established between the position coordinates in the first screen and the position coordinates in the second screen. The display position of the target object on the second screen (i.e., the second position) has a corresponding position on the first screen (i.e., the first position); in other words, the first position on the first screen corresponds to the second position on the second screen.
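As an illustrative sketch (not part of the claimed method; the function and parameter names are hypothetical), the pre-established correspondence between first-screen and second-screen coordinates can be modeled as a proportional mapping, which reduces to the identity mapping when the two screens are the same size:

```python
def map_first_to_second(x, y, first_size, second_size):
    """Map a first position (x, y) on the first screen to its corresponding
    second position on the second screen by proportional scaling."""
    fw, fh = first_size    # first-screen width and height
    sw, sh = second_size   # second-screen width and height
    return (x * sw / fw, y * sh / fh)
```

When both screens have identical dimensions, the mapping leaves coordinates unchanged, matching the aligned-edge case described later in this section.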
The touch operation includes but is not limited to: a slide operation on the screen, a click operation, a double click operation, a press operation, or the like. When these operations are performed, the operations may be single-point touch operations, such as a sliding operation, a single-click operation, a double-click operation, or a pressing operation on the screen with a single finger, or multi-point touch operations, such as a sliding operation, a single-click operation, a double-click operation, or a pressing operation on the screen with two fingers.
Step 103: executing, in response to the touch operation, a first instruction corresponding to the target object.
Upon receiving the touch operation, the terminal determines whether the touch operation on the target object generates an instruction. If so, the terminal responds to the touch operation and executes the instruction corresponding to the target object (i.e., the first instruction); if not, the touch operation is ignored.
For example, if the target object is a screen-locking interface, the terminal may determine whether the touch operation on the screen-locking interface generates a corresponding instruction in the screen-locked state (e.g., an instruction to unlock the screen); if the target object is the main page of the terminal, it may determine whether the touch operation on the main page generates a corresponding instruction in the main-page state (e.g., an instruction to switch the desktop page); if the target object is an application program interface, it may determine whether the touch operation on the application program interface generates an instruction related to that application program.
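The decision in steps 102 and 103 can be sketched as a simple lookup: if the touch on the target object generates an instruction, execute it; otherwise ignore the touch. This is a hypothetical illustration only; the table entries and names are assumptions, not the patent's actual implementation.

```python
# Assumed mapping from (target object, gesture) to the instruction it generates.
INSTRUCTION_TABLE = {
    ("lock_screen", "swipe"): "unlock_screen",
    ("main_page", "swipe"): "switch_desktop_page",
    ("app_interface", "tap"): "mark_message_read",
}

def handle_touch(target_object, gesture):
    """Return the first instruction generated by the touch, or None to ignore it."""
    return INSTRUCTION_TABLE.get((target_object, gesture))
```

A gesture with no entry in the table returns None, corresponding to the "ignore the touch operation" branch above.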
In the embodiment of the invention, the target object displayed on the second screen can be operated by touch through the first screen, which increases the cooperation and interaction among the multiple screens of the terminal.
Step 104: displaying the execution result of the first instruction on the second screen.
In this step, the terminal executes the first instruction and controls to display an execution result of the first instruction on the second screen.
Continuing the foregoing example: if the first instruction is an instruction to unlock the screen, the second screen is controlled to unlock; if the first instruction is an instruction to switch the desktop page, the desktop page displayed on the second screen is controlled to switch; and if the first instruction is to mark an unread message as read, the unread message displayed in the application program interface on the second screen is controlled to be marked as read.
In order to further understand the method described in the above steps 101 to 104, the following explanation is continued by way of an example.
For example, the terminal has two screens, with the first screen on the back side of the terminal and the second screen on the front side. When the user uses the terminal, the front face of the terminal faces the user and the back face faces away; when the user holds the terminal, the thumb rests on the second-screen side and the other four fingers rest on the first-screen side. Assume the user is reading a novel in application software on the terminal, and the novel interface (i.e., the target object) is displayed on the second screen. Ordinarily, pages are turned by sliding left and right on the second screen. In the embodiment of the present invention, the touch control right for page turning may instead be assigned to the first screen: when the user wants to turn a page, a finger on the first-screen side slides left and right on the first screen to control the novel displayed on the second screen to turn the page. This avoids the novel's content being blocked by the finger when the user turns pages by touching the second screen.
For another example, when a user views a video played on the second screen (in this case, the video interface is the target object), the screen brightness or volume is usually adjusted by a touch operation on the second screen; in the embodiment of the present invention, this adjustment may likewise be performed by a touch operation at the corresponding position on the first screen.
In the embodiment of the invention, the touch operation for generating the instruction is carried out on one screen of the terminal, and the execution result of executing the instruction is displayed on the other screen.
Preferably, in the embodiment of the present invention, the projections of the first position and the second position on a first plane overlap, where the first plane is parallel to the first screen or the second screen.
For example, when the first screen and the second screen are the same size, parallel to each other, and aligned at their edges, the projections of the first position and the second position on the first plane overlap.
When the first screen and the second screen are different in size (e.g., same in shape but different in area), the first position and the second position may also be represented as overlapping projections on the first plane. If the shape of the first screen is the same as that of the second screen but the area of the first screen is larger than that of the second screen, a touch area can be set in the first screen, the touch area and the second screen have the same area and shape, and the edge of the touch area and the edge of the second screen are aligned with each other. The user can control the target object in the touch area, and at the moment, the first position and the second position are projected and overlapped on the first plane.
It will of course be appreciated that, when the first and second screens are different sizes, the projections of the first and second positions on the first plane may also fail to overlap. If the first screen and the second screen have the same shape but the area of the first screen is smaller than that of the second screen, especially when the area difference is large, it is difficult to determine a touch area in the first screen that has the same area and shape as the second screen with their edges aligned. In that case, based on the correspondence established in advance between the position coordinates in the first screen and those in the second screen, the projections on the first plane of a first position and its corresponding second position may not overlap.
In any case, the correspondence between the first position and the second position is determined based on a correspondence previously established between the position coordinates in the first screen and the position coordinates in the second screen.
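The edge-aligned touch area described above can be sketched as follows (an illustrative sketch; the coordinate conventions are assumptions): a touch inside the area translates directly into second-screen coordinates, while touches outside it are rejected.

```python
def touch_area_to_second(x, y, area_origin, second_size):
    """Translate a first-screen touch inside the aligned touch area into
    second-screen coordinates; return None if the touch falls outside it."""
    ox, oy = area_origin   # top-left corner of the touch area on the first screen
    sw, sh = second_size   # the touch area shares the second screen's dimensions
    lx, ly = x - ox, y - oy
    if 0 <= lx < sw and 0 <= ly < sh:
        return (lx, ly)
    return None
```

Because the touch area and the second screen have the same size and aligned edges, the translated coordinates equal the corresponding second position directly, with no scaling needed.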
Further, the target object in the embodiment of the present invention may be not only a screen-locking interface of the terminal, a main interface of the terminal, or an application interface, but also a virtual key, an object for performing message prompting (such as a message prompt box, a message prompt bar, or a message prompt icon), and the like. The following examples illustrate this:
For example, sports game applications demand high operability and generally provide at least eight virtual keys. These eight virtual keys are usually operated with the two thumbs, leaving the remaining fingers idle. Assuming the game interface is currently displayed on the second screen, in the embodiment of the present invention the touch control right of at least one virtual key in the game interface may be assigned to the first screen. The virtual key is then touched directly with a finger on the first-screen side, improving the interactivity between the fingers and the screens. To help the user locate the virtual keys assigned to the first screen, they are displayed on the second screen; for example, the circular figures drawn with lighter-colored lines in fig. 2 are the virtual keys on the first screen. Preferably, to distinguish them from the virtual keys whose touch right is on the second screen, the virtual keys whose touch right is on the first screen may be displayed differently, for example by adjusting their transparency or color, which may be designed according to actual requirements.
For another example, as shown in fig. 3, when the user is experiencing a game application through the second screen, the terminal receives a new message for another application (e.g., WeChat) and performs a message prompt (e.g., through a message prompt box). In the embodiment of the invention, the message may be viewed by a touch operation on the first screen: if the user wants to view the received new message, a click operation may be performed at the position on the first screen corresponding to the position of the message prompt box displayed on the second screen; the terminal responds to the click operation, generates an instruction for displaying the message interface, and displays the message interface on the second screen for the user to view the message. In addition, the user can drag the message prompt box, message prompt bar, or message prompt icon displayed on the second screen through the first screen, moving it to a position where it does not block the game interface the user needs to watch. Of course, the message prompt box, message prompt bar, or message prompt icon may also be controlled by a touch operation on the second screen.
Preferably, since the user cannot see the touch object on the first-screen side, the user may find it difficult to use the touch object to control the target object displayed on the second screen, which reduces touch accuracy. To solve this technical problem, in the embodiment of the present invention, feature information of the touch object located on the first-screen side may be detected, and a virtual image matching the touch object is displayed at a third position on the second screen according to the detected feature information. When the position of the touch object changes, the virtual image changes correspondingly, so that the user can better control the touch object on the first-screen side to operate the target object according to the virtual image, improving the accuracy of the user's touches on the first screen.
The touch object herein includes, but is not limited to, a user's hand, a stylus, and the like. The third position corresponds to a fourth position on the first screen, the fourth position being where the orthographic projection of the touch object on the first screen falls.
It should be noted that, when the virtual image is displayed on the second screen, it is mapped for display, according to the correspondence between the position coordinates in the first screen and those in the second screen, from the fourth position, where the orthographic projection of the touch object on the first screen is located, to the corresponding third position on the second screen.
To prevent the virtual image from blocking the picture displayed on the second screen, the transparency of the virtual image can be set.
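Setting the virtual image's transparency amounts to alpha-blending it over the second screen's picture. A minimal per-pixel sketch follows (illustrative only, not the patent's rendering pipeline; the function name is hypothetical):

```python
def overlay_pixel(frame_rgb, image_rgb, alpha):
    """Alpha-blend one virtual-image pixel over one frame pixel.
    alpha in [0, 1]: 0 leaves the frame untouched, 1 fully covers it."""
    return tuple(round(alpha * i + (1 - alpha) * f)
                 for i, f in zip(image_rgb, frame_rgb))
```

A low alpha lets the underlying game or video picture remain visible through the virtual finger image, as the transparency setting above intends.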
Specifically, the feature information of the touch object described herein includes at least one of: depth information of the touch object within a preset distance from the first screen, and position information of the touch object touching the first screen.
When the feature information of the touch object is depth information, a virtual image matching the touch object is displayed, for example the finger figure drawn with lighter-colored lines in fig. 5. When the feature information of the touch object is position information, a virtual image indicating the touch position of the touch object is displayed, for example the square in fig. 6.
When the touch object is the user's hand, the depth information of the touch object is the depth information of the hand's fingers, and the position information of the touch object touching the first screen is the position information of the part of the hand (such as a fingertip or a knuckle) touching the first screen.
In the embodiment of the present invention, the first screen and the second screen of the terminal are capacitive screens. The principle by which the screens recognize a hovering (non-contact) touch is explained below, taking a hand as an example:
as shown in fig. 4, although the finger does not actually touch the first screen, the finger may form a capacitor with the first screen due to being close to the first screen, and the capacitance of the capacitor formed is different due to the different distance between the finger and the first screen. Through the processing of the CPU, the software synthesizes a virtual finger image (the size ratio of the virtual finger image is the same as that of the human finger), and finally maps the virtual finger image to the corresponding position of the second screen.
Further, in one embodiment of the present invention, the target object includes: a first target object and a second target object. And the first target object and the second target object are displayed at a second position on the second screen in an overlapping manner.
When a user uses the terminal, for example while experiencing a game application, the terminal may receive a new message for another application and perform a message prompt (such as through a message prompt box), and the object performing the message prompt may exactly cover a virtual key in the game interface. For this situation, the embodiment of the present invention provides two solutions, described as follows:
The first mode: receiving a first pressing operation of the user at the first position on the first screen; and, in response to the first pressing operation, determining, according to the pressing force of the first pressing operation, the target object to be controlled corresponding to that pressing force, and executing the instruction corresponding to the target object to be controlled. The target object to be controlled is the first target object or the second target object.
The second mode: receiving a second pressing operation of the user at the second position on the second screen; and, in response to the second pressing operation, determining, according to the pressing force of the second pressing operation, the target object to be controlled corresponding to that pressing force, and executing the instruction corresponding to the target object to be controlled. The target object to be controlled is the first target object or the second target object.
As can be seen from the first and second modes, in the embodiment of the present invention, the user can select the target object to be manipulated according to differences in pressing force. For example, it may be preset that when the pressing force is smaller than a preset pressure value, the first target object is the target object selected for manipulation, and when the pressing force is greater than or equal to the preset pressure value, the second target object is the target object selected for manipulation. Thus, when the user's pressing force at the first position on the first screen (or at the second position on the second screen) is smaller than the preset pressure value, the first target object is determined as the selected target object and the instruction corresponding to the first target object is executed; when the pressing force is greater than or equal to the preset pressure value, the second target object is determined as the selected target object and the instruction corresponding to the second target object is executed.
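The force-based selection in both modes reduces to a single threshold comparison, sketched below (the threshold value and names are assumptions; per the embodiment, the actual threshold may be trained by the user):

```python
PRESET_PRESSURE = 0.5  # assumed preset pressure value (user-trainable)

def select_target(press_force, first_object, second_object,
                  threshold=PRESET_PRESSURE):
    """A light press (below the threshold) selects the first target object;
    a press at or above the threshold selects the second target object."""
    return first_object if press_force < threshold else second_object
```

The same function serves both modes, since only the screen receiving the press differs, not the selection rule.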
For better understanding of the above technical solution, the following is exemplified:
For example, when the message prompt box overlaps a virtual key in the game interface and a pressing operation is performed on the first screen, the following may be set: lightly pressing the position on the first screen corresponding to the overlapping position of the message prompt box and the virtual key displays, on the second screen, the result of responding to the WeChat function; heavily pressing that position displays, on the second screen, the result of responding to the game function. If the pressing operation is performed on the second screen, it may be set that heavily pressing the overlapping position displays the result of responding to the WeChat function, and lightly pressing it displays the result of responding to the game function. It is understood that the target objects corresponding to the light press and the heavy press may be exchanged and are not limited to the above.
The user may, according to personal usage habits, train and set the pressing intensities corresponding to light and heavy presses, and may choose whether to enable this function.
Preferably, in the embodiment of the present invention, when the user performs the touch operation on the first screen, the first screen is in a screen-off state; that is, the backlight of the first screen is off while its touch chip remains on, which saves the terminal's power.
In summary, in the embodiments of the present invention, a touch operation for generating an instruction is performed on one screen of the terminal, and an execution result of executing the instruction is displayed on another screen. Furthermore, in the embodiment of the present invention, the condition of the touch object on the first screen side can be displayed on the second screen, so that a user can better control the touch object to operate on the first screen.
According to another aspect of the embodiments of the present invention, a terminal is provided, which can implement the details of the operation control method described above and achieve the same effects.
The terminal includes at least a first screen and a second screen arranged opposite each other, located on different sides of the terminal; for example, the first screen is located on the front side of the terminal and the second screen on the back side, or the first screen is located on the back side and the second screen on the front side. The first screen and the second screen are respectively controlled by two independent touch chips, and the two touch chips are controlled by the same CPU. The touch chips of the first screen and the second screen are enabled simultaneously to capture touch information from the user.
As shown in fig. 7, the terminal further includes:
a first display module 701, configured to display the target object at a second position on the second screen.
The first receiving module 702 is configured to receive a touch operation of a user.
The touch operation includes: a first touch operation of the user at a first position on the first screen, and/or a second touch operation of the user at a second position on the second screen, where the first position on the first screen corresponds to the second position on the second screen.
The executing module 703 is configured to execute a first instruction corresponding to the target object in response to the touch operation received by the first receiving module 702.
And a second display module 704 for displaying the execution result of the first instruction on a second screen.
Further, the terminal further includes:
the detection module is used for detecting the characteristic information of the touch body positioned on the first screen side.
The third display module is configured to display, at a third position on the second screen, a virtual image matching the touch object according to the feature information of the touch object detected by the detection module.
The third position corresponds to a fourth position on the first screen, the fourth position being where the orthographic projection of the touch object on the first screen falls.
Further, the feature information of the touch object includes at least one of: depth information of the touch object within a preset distance from the first screen, and position information of the touch object touching the first screen, where the preset distance is greater than or equal to 0.
Wherein the third display module comprises:
the first display unit is used for displaying a virtual image of the touch control body matched with the touch control body when the characteristic information of the touch control body is depth information; and the second display unit is used for displaying a virtual image used for indicating the touch position of the touch body when the characteristic information of the touch body is position information.
Further, the target object includes a first target object and a second target object, which are displayed at the second position in an overlapping manner.
Further, the first receiving module 702 includes:
the first receiving unit is used for receiving a first pressing operation of a user at a first position on a first screen.
The execution module 703 includes:
and the first determining unit is used for responding to the first pressing operation and determining the target object to be controlled corresponding to the pressing degree according to the pressing degree of the first pressing operation received by the receiving unit.
Wherein the target object to be controlled is the first target object or the second target object;
and the first execution unit is used for executing the instruction corresponding to the target object to be controlled.
Further, the first receiving module 702 includes:
a second receiving unit for receiving a second pressing operation of the user at a second position on the second screen.
The execution module 703 includes:
and the second determining unit is used for responding to the second pressing operation received by the second receiving module and determining the target object to be controlled corresponding to the pressing degree of the second pressing operation according to the pressing degree of the second pressing operation.
Wherein the target object to be controlled is the first target object or the second target object;
and the second execution unit is used for executing the instruction corresponding to the target object to be controlled.
Further, the target object includes: at least one of a screen locking interface of the terminal, a main interface of the terminal, an application program interface, virtual keys and an object for performing message prompt.
Preferably, the projections of the first position and the second position on a first plane overlap, where the first plane is parallel to the first screen or the second screen.
Preferably, in the embodiment of the present invention, the first screen is in a screen-off state.
In the embodiment of the invention, the touch operation for generating the instruction is carried out on one screen of the terminal, and the execution result of executing the instruction is displayed on the other screen.
Fig. 8 is a schematic diagram of a hardware structure of a terminal for implementing various embodiments of the present invention.
The terminal 800 includes but is not limited to: a radio frequency unit 801, a network module 802, an audio output unit 803, an input unit 804, a sensor 805, a display unit 806, a user input unit 807, an interface unit 808, a memory 809, a processor 810, and a power supply 811. Those skilled in the art will appreciate that the terminal configuration shown in fig. 8 is not intended to be limiting, and that the terminal may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
And a processor 810, configured to, when the user input unit 807 receives a touch operation of the user after the display unit 806 controls to display the target object at the second position on the second screen, execute a first instruction corresponding to the target object in response to the touch operation, and display an execution result of the first instruction on the second screen through the display unit 806.
The touch operation described herein includes: a first touch operation of the user at a first position on the first screen, and/or a second touch operation of the user at a second position on the second screen, where the first position on the first screen corresponds to the second position on the second screen.
In the embodiment of the invention, the touch operation for generating the instruction is carried out on one screen of the terminal, and the execution result of executing the instruction is displayed on the other screen.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 801 may be used for receiving and sending signals during message transmission and reception or during a call; specifically, it receives downlink data from a base station and forwards it to the processor 810 for processing, and it transmits uplink data to the base station. In general, the radio frequency unit 801 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. Further, the radio frequency unit 801 can also communicate with a network and other devices through a wireless communication system.
The terminal provides wireless broadband internet access to the user through the network module 802, such as helping the user send and receive e-mails, browse webpages, access streaming media, and the like.
The audio output unit 803 may convert audio data received by the radio frequency unit 801 or the network module 802 or stored in the memory 809 into an audio signal and output as sound. Also, the audio output unit 803 may also provide audio output related to a specific function performed by the terminal 800 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 803 includes a speaker, a buzzer, a receiver, and the like.
The input unit 804 is used for receiving audio or video signals. The input unit 804 may include a graphics processing unit (GPU) 8041 and a microphone 8042; the graphics processor 8041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in video capture mode or image capture mode. The processed image frames may be displayed on the display unit 806. The image frames processed by the graphics processor 8041 may be stored in the memory 809 (or another storage medium) or transmitted via the radio frequency unit 801 or the network module 802. The microphone 8042 can receive sound and process it into audio data. In phone-call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 801 and output.
The terminal 800 also includes at least one sensor 805, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 8061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 8061 and/or the backlight when the terminal 800 is moved close to the ear. As one type of motion sensor, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes) and can detect the magnitude and direction of gravity when the terminal is stationary; it can be used to identify the terminal posture (such as horizontal/vertical screen switching, related games, and magnetometer posture calibration) and for vibration-identification-related functions (such as a pedometer or tapping). The sensors 805 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not described in detail herein.
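As an illustrative sketch (not part of the claimed embodiment), the posture identification mentioned above can be understood as reading the per-axis acceleration and checking which axis gravity dominates when the terminal is stationary. The axis conventions, the gravity constant, and the tolerance below are assumptions for illustration only:

```python
# Hypothetical sketch of accelerometer-based posture identification,
# e.g. for horizontal/vertical (landscape/portrait) screen switching.

G = 9.81  # approximate gravitational acceleration, m/s^2 (assumption)

def classify_posture(ax: float, ay: float, az: float) -> str:
    """Return a coarse posture label from per-axis acceleration (m/s^2).

    When the device is stationary, gravity dominates one axis:
    y-axis -> held upright (portrait), x-axis -> held sideways (landscape),
    z-axis -> lying flat on a surface.
    """
    magnitudes = {"portrait": abs(ay), "landscape": abs(ax), "flat": abs(az)}
    return max(magnitudes, key=magnitudes.get)

def is_stationary(ax: float, ay: float, az: float, tol: float = 1.0) -> bool:
    """The total acceleration of a stationary device is close to 1 g."""
    total = (ax * ax + ay * ay + az * az) ** 0.5
    return abs(total - G) < tol
```

A real implementation would additionally filter sensor noise and debounce posture changes before switching the screen orientation.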
The display unit 806 is used to display information input by the user or information provided to the user. The display unit 806 may include a display panel 8061, and the display panel 8061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like.
The user input unit 807 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the terminal. Specifically, the user input unit 807 includes a touch panel 8071 and other input devices 8072. The touch panel 8071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations performed on or near the touch panel 8071 using a finger, a stylus, or any other suitable object or accessory). The touch panel 8071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch-point coordinates, sends the coordinates to the processor 810, and receives and executes commands sent by the processor 810. In addition, the touch panel 8071 can be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 8071, the user input unit 807 can include other input devices 8072. In particular, the other input devices 8072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, a switch key, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
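The division of labor between the touch detection device and the touch controller can be sketched as follows (an illustrative model only, not the claimed implementation; the raw-value range and screen resolution are hypothetical):

```python
# Hypothetical sketch: a touch controller converting raw readings from the
# touch detection device into touch-point coordinates for the processor.

from dataclasses import dataclass

@dataclass
class RawTouch:
    # raw reading reported by the touch detection device (e.g. ADC counts)
    raw_x: int
    raw_y: int

class TouchController:
    """Converts raw touch signals into touch-point coordinates."""

    def __init__(self, raw_max: int = 4096,
                 screen_w: int = 1080, screen_h: int = 2340):
        # raw_max and the screen dimensions are illustrative assumptions
        self.raw_max = raw_max
        self.screen_w = screen_w
        self.screen_h = screen_h

    def to_coordinates(self, t: RawTouch) -> tuple:
        # scale raw sensor values to display pixel coordinates,
        # which would then be sent on to the processor
        x = t.raw_x * self.screen_w // self.raw_max
        y = t.raw_y * self.screen_h // self.raw_max
        return (x, y)
```

In a dual-screen terminal as described earlier, each screen's touch chip would run such a conversion independently before reporting to the shared CPU.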
Further, the touch panel 8071 can be overlaid on the display panel 8061. When the touch panel 8071 detects a touch operation on or near it, the touch operation is transmitted to the processor 810 to determine the type of the touch event, and the processor 810 then provides a corresponding visual output on the display panel 8061 according to the type of the touch event. Although in fig. 8 the touch panel 8071 and the display panel 8061 are shown as two independent components implementing the input and output functions of the terminal, in some embodiments the touch panel 8071 and the display panel 8061 may be integrated to implement the input and output functions of the terminal, which is not limited herein.
The interface unit 808 is an interface for connecting an external device to the terminal 800. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 808 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the terminal 800 or may be used to transmit data between the terminal 800 and external devices.
The memory 809 may be used to store software programs as well as various data. The memory 809 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data (such as audio data, a phonebook, etc.) created according to the use of the terminal, and the like. Further, the memory 809 can include high-speed random access memory, and can also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 810 is the control center of the terminal. It connects the various parts of the entire terminal using various interfaces and lines, and performs the various functions of the terminal and processes data by running or executing software programs and/or modules stored in the memory 809 and calling data stored in the memory 809, thereby monitoring the terminal as a whole. The processor 810 may include one or more processing units; preferably, the processor 810 may integrate an application processor, which mainly handles the operating system, user interfaces, application programs, and the like, and a modem processor, which mainly handles wireless communication. It will be appreciated that the modem processor may alternatively not be integrated into the processor 810.
The terminal 800 may also include a power supply 811 (e.g., a battery) for powering the various components. Preferably, the power supply 811 may be logically coupled to the processor 810 via a power management system, so that charging, discharging, and power-consumption management are handled by the power management system.
In addition, the terminal 800 includes some functional modules that are not shown, which are not described in detail herein.
Preferably, an embodiment of the present invention further provides a terminal, including a processor 810, a memory 809, and a computer program stored in the memory 809 and capable of running on the processor 810. When executed by the processor 810, the computer program implements each process of the above operation control method embodiment and can achieve the same technical effect; to avoid repetition, details are not described here again.
An embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored. When executed by a processor, the computer program implements each process of the operation control method embodiment and can achieve the same technical effect; to avoid repetition, details are not repeated here. The computer-readable storage medium may be a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and certainly can also be implemented by hardware; in many cases, however, the former is the better implementation. Based on such an understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the methods according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.