CN109558061B - An operation control method and terminal - Google Patents

An operation control method and terminal

Info

Publication number
CN109558061B
CN109558061B (application number CN201811452861.8A)
Authority
CN
China
Prior art keywords
screen
target object
touch
terminal
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811452861.8A
Other languages
Chinese (zh)
Other versions
CN109558061A (en
Inventor
龚贺
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201811452861.8A
Publication of CN109558061A
Application granted
Publication of CN109558061B

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract



Embodiments of the present invention provide an operation control method and a terminal. The operation control method includes: displaying a target object at a second position of the second screen; receiving a touch operation of a user, where the touch operation includes a first touch operation of the user at a first position of the first screen and/or a second touch operation of the user at a second position of the second screen, and the first position of the first screen corresponds to the second position of the second screen; executing a first instruction corresponding to the target object in response to the touch operation; and displaying an execution result of the first instruction on the second screen. In the embodiments of the present invention, a touch operation for generating an instruction is performed on one screen of the terminal, and the execution result of the instruction is displayed on another screen. In this way, the cooperation and interaction among the multiple screens of the terminal can be increased.


Description

Operation control method and terminal
Technical Field
The embodiment of the invention relates to the technical field of communication, in particular to an operation control method and a terminal.
Background
With the progress of science and technology and the rapid development of the communication industry, the functions and forms of terminals (such as mobile phones, tablet computers, and the like) are changing with each passing day. With the continuous development of manufacturing processes, terminal screens have gradually developed from ordinary capacitive screens to flexible screens, folding screens, and double screens. Double-screen mobile phones are becoming more and more common and provide people with richer screen interaction experiences. However, in the prior art, the operations of the two screens are separate; that is, the content displayed on each screen can only be operated on that screen itself, and cooperation and interaction between the two screens are lacking.
Disclosure of Invention
The embodiments of the present invention provide an operation control method and a terminal, aiming to solve the problem that a double-screen terminal in the prior art lacks cooperation and interaction between its screens.
In order to solve the above technical problem, the present invention adopts the following technical solutions:
In a first aspect, an embodiment of the present invention provides an operation control method applied to a terminal, where the terminal at least includes a first screen and a second screen that are arranged oppositely, and the method includes the following steps:
displaying a target object at a second location of the second screen;
receiving a touch operation of a user, wherein the touch operation comprises: a first touch operation of a user at a first position of the first screen and/or a second touch operation of the user at a second position of the second screen; wherein a first position of the first screen corresponds to a second position of the second screen;
responding to the touch operation, and executing a first instruction corresponding to the target object;
displaying an execution result of the first instruction on the second screen.
In a second aspect, an embodiment of the present invention provides a terminal, where the terminal at least includes a first screen and a second screen that are oppositely arranged, and the terminal further includes:
a first display module for displaying a target object at a second position of the second screen;
a first receiving module, configured to receive a touch operation of a user, where the touch operation includes: a first touch operation of a user at a first position of the first screen and/or a second touch operation of the user at a second position of the second screen; wherein a first position of the first screen corresponds to a second position of the second screen;
the execution module is used for responding to the touch operation and executing a first instruction corresponding to the target object;
and the second display module is used for displaying the execution result of the first instruction on the second screen.
In a third aspect, an embodiment of the present invention provides a terminal, including a processor, a memory, and a computer program stored on the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the operation control method described above.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the operation control method as described above.
In the embodiment of the invention, the touch operation for generating the instruction is performed on one screen of the terminal, and the execution result of the instruction is displayed on the other screen. In this way, the cooperation and interaction among the multiple screens of the terminal can be increased.
Drawings
FIG. 1 is a flow chart of an operation control method provided by an embodiment of the present invention;
FIG. 2 is a first schematic diagram of a display screen in a second screen according to an embodiment of the present invention;
FIG. 3 is a second schematic diagram of a display screen in a second screen according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of gesture recognition provided by embodiments of the present invention;
FIG. 5 is a third schematic diagram of a display screen in a second screen according to an embodiment of the invention;
FIG. 6 is a fourth schematic diagram illustrating a display screen of the second screen according to the embodiment of the invention;
FIG. 7 is a first structural block diagram of a terminal according to an embodiment of the present invention;
FIG. 8 is a second structural block diagram of a terminal according to an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the invention are shown in the drawings, it should be understood that the invention can be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
According to an aspect of an embodiment of the present invention, an operation control method is provided, which is applied to a terminal.
The terminal at least includes a first screen and a second screen that are arranged oppositely; that is, the first screen and the second screen are located on different sides of the terminal. For example, the first screen is located on the front side of the terminal and the second screen on the back side, or the first screen is located on the back side and the second screen on the front side, and so on. The first screen and the second screen are controlled by two independent touch chips, respectively, and the two touch chips are controlled by the same CPU. The touch chips of the first screen and the second screen are enabled simultaneously to capture the user's touch information.
In the embodiment of the present invention, the terminal may be a mobile terminal (e.g., a mobile phone, a tablet computer, a laptop computer, a palmtop computer, a desktop computer, a vehicle-mounted terminal, a wearable device, or a pedometer), a desktop computer, a smart television, and other electronic devices.
As shown in fig. 1, the operation control method includes:
step 101: the target object is displayed at a second position of the second screen.
The target object described herein includes, but is not limited to, at least one of: a screen-locking interface of the terminal, a main interface of the terminal, an application program interface, a virtual key, and an object for message prompting.
Step 102: and receiving touch operation of a user.
The touch operation described herein includes: a first touch operation performed by the user at a first position of the first screen, and/or a second touch operation performed by the user at a second position of the second screen. That is, in the embodiment of the present invention, the target object displayed on the second screen may be controlled either by input on the first screen or by input on the second screen.
In the embodiment of the present invention, a corresponding relationship is pre-established between the position coordinates in the first screen and the position coordinates in the second screen, and a display position (i.e., a second position) of the target object on the second screen has a corresponding position (i.e., a first position) on the first screen, i.e., the first position of the first screen corresponds to the second position of the second screen.
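The patent does not specify how this correspondence is implemented. A minimal illustrative sketch follows, assuming both screens have the same pixel resolution with the origin at the top-left of each screen as its own viewer sees it; since the two screens face opposite directions, one plausible convention is that corresponding points are horizontally mirrored in each screen's own coordinate frame. The function name and convention are assumptions, not from the patent.

```python
# Illustrative sketch (not from the patent): mapping a touch point on the
# first (back) screen to the corresponding point on the second (front) screen.
# Assumption: both screens report pixel coordinates with a top-left origin,
# and the back screen's x-axis is mirrored because the screens face opposite
# directions, so physically aligned points have mirrored x-coordinates.

def first_to_second(x, y, width):
    """Map a first-screen touch (x, y) to the corresponding second-screen position."""
    return (width - 1 - x, y)  # mirror horizontally, keep the vertical axis

# A touch near the left edge of the back screen corresponds to a point near
# the right edge of the front screen, and the mapping is its own inverse.
print(first_to_second(10, 200, 1080))
```

Because the mapping is an involution, applying it twice returns the original point, which is convenient when either screen may originate the touch.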
The touch operation includes but is not limited to: a slide operation on the screen, a click operation, a double click operation, a press operation, or the like. When these operations are performed, the operations may be single-point touch operations, such as a sliding operation, a single-click operation, a double-click operation, or a pressing operation on the screen with a single finger, or multi-point touch operations, such as a sliding operation, a single-click operation, a double-click operation, or a pressing operation on the screen with two fingers.
Step 103, responding to the touch operation, executing a first instruction corresponding to the target object.
When receiving the touch operation, the terminal determines whether the touch operation on the target object generates an instruction. If so, the terminal responds to the touch operation and executes the instruction (namely, the first instruction) corresponding to the target object; if not, the touch operation is ignored.
For example, if the target object is a screen-locking interface, it may be determined whether the touch operation on the screen-locking interface generates a corresponding instruction in the screen-locked state (e.g., an instruction to unlock the screen); if the target object is the main page of the terminal, it may be determined whether the touch operation on the main page generates a corresponding instruction in the main-page state (e.g., an instruction to switch the desktop page); if the target object is an application program interface, it may be determined whether the touch operation on the application program interface generates an instruction related to the application program.
In the embodiment of the invention, the target object displayed in the second screen can be subjected to touch operation through the first screen, and the cooperation and interaction among a plurality of screens of the terminal are increased.
And 104, displaying the execution result of the first instruction on a second screen.
In this step, the terminal executes the first instruction and controls to display an execution result of the first instruction on the second screen.
Continuing with the foregoing example, if it is determined that the first instruction is an instruction to unlock the screen, controlling the second screen to unlock; if the first instruction is an instruction for switching the desktop page, controlling the desktop page displayed by the second screen to be switched; and if the first instruction is to mark the unread message as a read message, controlling the unread message displayed in the application program interface in the second screen to be marked as a read message.
In order to further understand the method described in the above steps 101 to 104, the following explanation is continued by way of an example.
For example, the terminal has two screens, where the first screen is on the back side of the terminal and the second screen is on the front side. When the user uses the terminal, the front of the terminal faces the user and the back faces away; when the user holds the terminal, the thumb is located on the second-screen side and the remaining four fingers are located on the first-screen side. Assume that the user is reading a novel with application software in the terminal, and the novel interface (i.e., the target object) is displayed on the second screen. Generally, pages are turned by sliding left and right on the second screen. In the embodiment of the present invention, the touch right for page turning may be set on the first screen: when the user wants to turn a page, a finger on the first-screen side can slide left and right on the first screen to control the novel displayed on the second screen to turn pages. In this way, the situation that the content of the novel is blocked by the finger when the user turns pages on the second screen can be avoided.
For another example, when a user watches a video played on the second screen (in this case, the video interface is the target object), the screen brightness or volume is usually adjusted by performing a touch operation on the second screen. In the embodiment of the present invention, such an adjustment operation may instead be performed on the first screen, so that the played picture is not blocked by the user's finger.
In the embodiment of the invention, the touch operation for generating the instruction is performed on one screen of the terminal, and the execution result of the instruction is displayed on the other screen, which increases the cooperation and interaction among the multiple screens of the terminal.
Preferably, in the embodiment of the present invention, the projections of the first position and the second position on a first plane overlap, where the first plane is parallel to the first screen or the second screen.
For example, when the first screen and the second screen are the same size, parallel to each other, and aligned at the edges, the projections of the first position and the second position on the first plane overlap.
When the first screen and the second screen differ in size (e.g., same shape but different area), the projections of the first position and the second position on the first plane may still overlap. If the first screen has the same shape as the second screen but a larger area, a touch area may be set in the first screen that has the same area and shape as the second screen and whose edges are aligned with the edges of the second screen. The user can control the target object within this touch area, and in this case the projections of the first position and the second position on the first plane overlap.
Of course, it can be understood that when the first screen and the second screen differ in size, the projections of the first position and the second position on the first plane may also fail to overlap. If the first screen has the same shape as the second screen but a smaller area, and especially when the difference in area is large, it is difficult to determine a touch area in the first screen that has the same area and shape as the second screen with aligned edges. In this case, based on the correspondence established in advance between the position coordinates in the first screen and those in the second screen, the projections of the mutually corresponding first and second positions on the first plane may not overlap.
In any case, the correspondence between the first position and the second position is determined based on a correspondence previously established between the position coordinates in the first screen and the position coordinates in the second screen.
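For screens of different sizes, one simple form such a pre-established correspondence could take is a proportional scaling between the two coordinate ranges; with this mapping, corresponding points generally do not project onto the same spot. This sketch is an illustration under that assumption, not the patent's specified method.

```python
# Illustrative sketch (not from the patent): when the two screens differ in
# size, the pre-established correspondence may be a simple proportional
# mapping between their coordinate ranges. Corresponding points then need not
# have overlapping projections on a plane parallel to the screens.

def map_position(x, y, first_size, second_size):
    """Scale a first-screen coordinate into the second screen's coordinate range."""
    fw, fh = first_size
    sw, sh = second_size
    return (x * sw / fw, y * sh / fh)

# A touch at the centre of a 1000x2000 first screen maps to the centre of a
# 1080x2340 second screen.
print(map_position(500, 1000, (1000, 2000), (1080, 2340)))
```

A proportional map keeps relative gestures (e.g. a swipe across half the screen) meaningful even when the absolute positions of the two points differ.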
Further, the target object in the embodiment of the present invention may be not only a screen lock interface of the terminal, a main interface of the terminal, and an application interface, but also a virtual key, an object for performing message prompt (such as a message prompt box, a message prompt bar, a message prompt icon, and the like), and the like. To further understand this, the following examples are given as follows:
For example, sports game applications have high requirements for operability and generally provide at least eight virtual keys. These eight virtual keys are usually operated with the two thumbs, while the remaining fingers are idle. Assuming that the game interface is currently displayed on the second screen, in the embodiment of the present invention the touch right of at least one virtual key in the game interface may be set on the first screen. When such a virtual key is touched, a finger on the first-screen side is used directly, which improves the interactivity between the fingers and the screens. To help the user locate the virtual keys in the game interface, the virtual keys assigned to the first screen are displayed on the second screen; for example, the circular figures drawn with lighter lines in fig. 2 are the virtual keys on the first screen. Preferably, in order to distinguish them from the virtual keys whose touch right is on the second screen, the virtual keys whose touch right is on the first screen may be displayed differently, for example by adjusting their transparency or color, which may be designed according to actual requirements.
For another example, as shown in fig. 3, when the user is playing a game through the second screen, the terminal receives a new message for another application (e.g., WeChat) and performs a message prompt (e.g., through a message prompt box). In the embodiment of the invention, the message can be viewed by performing a touch operation on the first screen: if the user wants to view the received new message, a click operation can be performed at the position on the first screen corresponding to the position of the message prompt box displayed on the second screen; the terminal responds to the click operation, generates an instruction for displaying the message interface, and displays the message interface on the second screen for the user to view the message. In addition, the user can drag the message prompt box, message prompt bar, or message prompt icon displayed on the second screen through the first screen and move it to a position where it does not block the part of the game interface the user needs to watch. Of course, it can be understood that the message prompt box, message prompt bar, or message prompt icon may also be controlled by performing a touch operation on the second screen.
Preferably, since the user cannot see the touch object on the first-screen side, it may be difficult for the user to control the target object displayed on the second screen with that touch object, which reduces the accuracy of touch control. To solve this technical problem, in the embodiment of the present invention, feature information of the touch object located on the first-screen side may be detected, and a virtual image matching the touch object may be displayed at a third position on the second screen according to the detected feature information. When the position of the touch object changes, the virtual image changes correspondingly, so that the user can better control the touch object on the first-screen side to operate the target object according to the virtual image, which improves the accuracy of the user's touch on the first screen.
The touch object herein includes, but is not limited to, a user's hand, a stylus, and the like. The third position described herein corresponds to a fourth position on the first screen, where the fourth position is the position of the orthographic projection of the touch object on the first screen.
It should be noted that, when the virtual image is displayed on the second screen, the terminal maps the virtual image from the fourth position, where the orthographic projection of the touch object falls on the first screen, to the corresponding third position on the second screen for display, according to the correspondence between the position coordinates in the first screen and the position coordinates in the second screen.
Wherein, in order to avoid the virtual image from blocking the picture displayed by the second screen, the transparency of the virtual image can be set.
Specifically, the feature information of the touch object described herein includes at least one of the following: depth information of the touch object within a preset distance from the first screen, and position information of the touch object touching the first screen.
When the feature information of the touch object is the depth information, a virtual image matching the touch object is displayed, for example, the finger figure drawn with lighter lines in fig. 5. When the feature information of the touch object is the position information, a virtual image indicating the touch position of the touch object is displayed, for example, the square in fig. 6.
When the touch object is the user's hand, the depth information of the touch object is the depth information of the fingers, and the position information of the touch object touching the first screen is the position information of the part of the hand (such as a fingertip or a knuckle) touching the first screen.
In the embodiment of the present invention, the first screen and the second screen of the terminal are capacitive screens. The principle of hover (non-contact) touch recognition on the screens is explained below, taking a hand as an example:
as shown in fig. 4, although the finger does not actually touch the first screen, the finger may form a capacitor with the first screen due to being close to the first screen, and the capacitance of the capacitor formed is different due to the different distance between the finger and the first screen. Through the processing of the CPU, the software synthesizes a virtual finger image (the size ratio of the virtual finger image is the same as that of the human finger), and finally maps the virtual finger image to the corresponding position of the second screen.
Further, in one embodiment of the present invention, the target object includes: a first target object and a second target object. And the first target object and the second target object are displayed at a second position on the second screen in an overlapping manner.
When a user uses a terminal, for example, when the user is experiencing a game application, the terminal receives a new message to another application and performs message prompt (such as prompt through a message prompt box), and an object for performing message prompt just covers a virtual key in the game interface. For this situation, the embodiment of the present invention provides two solutions, which are specifically described as follows:
the first mode comprises the following steps: receiving a first pressing operation of a user at a first position on a first screen; and responding to the first pressing operation, determining a target object to be controlled corresponding to the pressing degree according to the pressing degree of the first pressing operation, and executing an instruction corresponding to the target object to be controlled. The target object to be controlled is a first target object or a second target object.
The second mode comprises the following steps:
receiving a second pressing operation of the user at a second position on the second screen; and responding to the second pressing operation, determining a target object to be controlled corresponding to the pressing force degree of the second pressing operation according to the pressing force degree, and executing an instruction corresponding to the target object to be controlled. The target object to be controlled is a first target object or a second target object.
As can be seen from the foregoing first and second manners, in the embodiment of the present invention, the user can select the target object that the user wants to manipulate according to the difference of the pressing force. For example, there are set in advance: when the pressing force is smaller than a preset pressure value, the first target object is a target object selected to be controlled; and when the pressing degree is greater than or equal to the preset pressure value, the second target object is the target object selected to be operated. In this way, when the pressing force of the user at the first position on the first screen (or at the second position on the second screen) is smaller than the preset pressure value, the first target object is determined as the target object selected to be manipulated, and the instruction corresponding to the first target object is executed; when the pressing force of the user at the first position on the first screen (or the second position on the second screen) is greater than or equal to the preset pressure value, the second target object is determined as the target object selected to be manipulated, and an instruction corresponding to the second target object is executed.
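The force-based selection described above amounts to a single threshold comparison; the sketch below illustrates it with an assumed normalized threshold value (the patent leaves the preset pressure value, and whether light or heavy selects which object, configurable).

```python
# Illustrative sketch (not from the patent): when two target objects overlap
# at one position, the pressing force decides which one receives the
# instruction. The threshold value is an assumption in normalized units.

PRESSURE_THRESHOLD = 0.5   # assumed preset pressure value

def select_target(force, first_target, second_target):
    """Light press selects the first target; firm press selects the second."""
    if force < PRESSURE_THRESHOLD:
        return first_target
    return second_target

print(select_target(0.2, "message_box", "game_key"))  # light press
print(select_target(0.8, "message_box", "game_key"))  # firm press
```

As the text notes, the assignment of light and heavy presses to the two targets may be swapped, which here is just a matter of exchanging the two arguments.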
For better understanding of the above technical solution, the following is exemplified:
For example, when the message prompt box overlaps with a virtual key in the game interface, if the pressing operation is performed on the first screen, the following may be set: lightly pressing the position on the first screen corresponding to the overlapping position displays the result of responding to the WeChat function on the second screen, and heavily pressing that position displays the result of responding to the game function on the second screen. If the pressing operation is performed on the second screen, it may be set that: heavily pressing the overlapping position displays the result of responding to the WeChat function on the second screen, and lightly pressing it displays the result of responding to the game function on the second screen. It is understood that the light press and the heavy press may be exchanged with respect to the corresponding target objects, and are not limited to the above.
The user can train and set the pressing intensities of the light press and the heavy press according to his or her own usage habits, and can choose whether to enable this function.
Preferably, in the embodiment of the present invention, when the user performs the touch operation on the first screen, the first screen is in a screen-off state; that is, the backlight of the first screen is off while its touch chip remains on, which saves the power of the terminal.
In summary, in the embodiments of the present invention, a touch operation for generating an instruction is performed on one screen of the terminal, and the execution result of the instruction is displayed on another screen, which increases the cooperation and interaction among the multiple screens of the terminal. Furthermore, in the embodiments of the present invention, the state of the touch object on the first-screen side can be displayed on the second screen, so that the user can better control the touch object to operate on the first screen.
According to another aspect of the embodiments of the present invention, a terminal is provided, which can implement the details of the operation control method described above and achieve the same effects.
The terminal includes at least a first screen and a second screen that are arranged oppositely, on opposite sides of the terminal: for example, the first screen is located on the front side of the terminal and the second screen on the back side, or the first screen on the back side and the second screen on the front side. The first screen and the second screen are controlled by two independent touch chips, and the two touch chips are controlled by the same CPU. The touch chips of both screens are enabled simultaneously to capture the user's touch information.
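The correspondence between a position on the back screen and a position on the front screen is not spelled out here. A minimal sketch, assuming both screens have the same resolution and each uses its own top-left origin, is that the physically aligned point has its x coordinate mirrored (the screens face opposite directions), while y is unchanged:

```python
def back_to_front(x, y, screen_width):
    # Hypothetical mapping for two screens on opposite faces of one
    # device, each with its own top-left origin and identical
    # resolution: the front-screen point directly behind (x, y) on the
    # back screen has a mirrored x coordinate; y is unchanged.
    return (screen_width - 1 - x, y)


# A touch near the left edge of the back screen lands near the right
# edge of the front screen, and the mapping is its own inverse.
assert back_to_front(10, 200, screen_width=1080) == (1069, 200)
assert back_to_front(1069, 200, screen_width=1080) == (10, 200)
```

The mirroring convention is an assumption; a real device would derive the transform from its actual panel geometry and coordinate frames.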
As shown in fig. 7, the terminal further includes:
a first display module 701, configured to display the target object at a second position on the second screen.
The first receiving module 702 is configured to receive a touch operation of a user.
The touch operation includes a first touch operation by the user at a first position on the first screen and/or a second touch operation by the user at a second position on the second screen, where the first position on the first screen corresponds to the second position on the second screen.
The executing module 703 is configured to execute a first instruction corresponding to the target object in response to the touch operation received by the first receiving module 702.
And a second display module 704 for displaying the execution result of the first instruction on a second screen.
Further, the terminal further includes:
The detection module is configured to detect feature information of a touch body located on the first screen side.
The third display module is configured to display, at a third position on the second screen, a virtual image matching the touch body according to the feature information detected by the detection module.
The third position corresponds to a fourth position on the first screen, and the fourth position is a position where the orthographic projection of the touch body on the first screen is located.
Further, the feature information of the touch body includes at least one of the following: depth information of the touch body within a preset distance from the first screen, and position information of the point at which the touch body touches the first screen. The preset distance is greater than or equal to 0.
Wherein the third display module comprises:
The first display unit is configured to display a virtual image of the touch body matching the touch body when the feature information is the depth information; the second display unit is configured to display a virtual image indicating the touch position of the touch body when the feature information is the position information.
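The dispatch between the two display units can be illustrated with a short sketch. The dictionary keys and the returned description strings are assumptions for illustration, not from any real SDK:

```python
def touch_feedback(feature):
    # feature: dict carrying "position" (contact point on the first
    # screen) and/or "depth" (hover distance within the preset range).
    # Contact takes precedence: a touching finger is rendered as a
    # touch-point marker, a hovering one as a virtual touch body.
    if "position" in feature:
        x, y = feature["position"]
        return f"touch marker at ({x}, {y})"
    if "depth" in feature:
        return f"virtual touch body hovering at {feature['depth']} mm"
    return None  # no touch body detected on the first screen side


assert touch_feedback({"position": (120, 340)}) == "touch marker at (120, 340)"
assert touch_feedback({"depth": 5}) == "virtual touch body hovering at 5 mm"
assert touch_feedback({}) is None
```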
Further, the target object includes a first target object and a second target object, which are displayed overlapping at the second position.
Further, the first receiving module 702 includes:
the first receiving unit is used for receiving a first pressing operation of a user at a first position on a first screen.
The execution module 703 includes:
The first determining unit is configured to, in response to the first pressing operation received by the first receiving unit, determine the target object to be controlled corresponding to the press force of the first pressing operation.
Wherein the target object to be controlled is the first target object or the second target object;
and the first execution unit is used for executing the instruction corresponding to the target object to be controlled.
Further, the first receiving module 702 includes:
a second receiving unit for receiving a second pressing operation of the user at a second position on the second screen.
The execution module 703 includes:
The second determining unit is configured to, in response to the second pressing operation received by the second receiving unit, determine the target object to be controlled corresponding to the press force of the second pressing operation.
Wherein the target object to be controlled is the first target object or the second target object;
and the second execution unit is used for executing the instruction corresponding to the target object to be controlled.
Further, the target object includes: at least one of a screen locking interface of the terminal, a main interface of the terminal, an application program interface, virtual keys and an object for performing message prompt.
Preferably, the projections of the first position and the second position onto a first plane overlap, where the first plane is parallel to the first screen or the second screen.
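Since the first plane is parallel to the screens, projecting onto it simply drops the depth axis, so the overlap condition reduces to equality of the in-plane coordinates. A sketch under that assumption, with both positions expressed in shared device coordinates:

```python
def positions_correspond(p_first, p_second, tol=0.0):
    # Project both 3-D positions onto a plane parallel to the screens
    # by dropping the depth (z) component, then compare the in-plane
    # parts within an optional tolerance.
    (x1, y1, _z1), (x2, y2, _z2) = p_first, p_second
    return abs(x1 - x2) <= tol and abs(y1 - y2) <= tol


# Points directly in front of / behind each other correspond;
# laterally offset points do not.
assert positions_correspond((100, 50, 0), (100, 50, 8))
assert not positions_correspond((100, 50, 0), (130, 50, 8))
```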
Preferably, in the embodiment of the present invention, the first screen is in a screen-off state.
In the embodiment of the invention, the touch operation for generating the instruction is carried out on one screen of the terminal, and the execution result of executing the instruction is displayed on the other screen.
Fig. 8 is a schematic diagram of a hardware structure of a terminal for implementing various embodiments of the present invention.
The terminal 800 includes but is not limited to: a radio frequency unit 801, a network module 802, an audio output unit 803, an input unit 804, a sensor 805, a display unit 806, a user input unit 807, an interface unit 808, a memory 809, a processor 810, and a power supply 811. Those skilled in the art will appreciate that the terminal configuration shown in fig. 8 is not intended to be limiting, and that the terminal may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The processor 810 is configured to: after the display unit 806 displays the target object at the second position on the second screen, and when the user input unit 807 receives a touch operation from the user, execute a first instruction corresponding to the target object in response to the touch operation, and display the execution result of the first instruction on the second screen through the display unit 806.
The touch operation described here includes a first touch operation by the user at a first position on the first screen and/or a second touch operation by the user at a second position on the second screen, where the first position on the first screen corresponds to the second position on the second screen.
In the embodiment of the invention, the touch operation for generating the instruction is carried out on one screen of the terminal, and the execution result of executing the instruction is displayed on the other screen.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 801 may be used for receiving and sending signals during message transmission and reception or during a call; specifically, it receives downlink data from a base station and forwards it to the processor 810 for processing, and it transmits uplink data to the base station. In general, the radio frequency unit 801 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. Furthermore, the radio frequency unit 801 can also communicate with a network and other devices through a wireless communication system.
The terminal provides wireless broadband internet access to the user through the network module 802, such as helping the user send and receive e-mails, browse webpages, access streaming media, and the like.
The audio output unit 803 may convert audio data received by the radio frequency unit 801 or the network module 802 or stored in the memory 809 into an audio signal and output as sound. Also, the audio output unit 803 may also provide audio output related to a specific function performed by the terminal 800 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 803 includes a speaker, a buzzer, a receiver, and the like.
The input unit 804 is used for receiving audio or video signals. The input unit 804 may include a Graphics Processing Unit (GPU) 8041 and a microphone 8042; the graphics processor 8041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in video capture mode or image capture mode. The processed image frames may be displayed on the display unit 806. The image frames processed by the graphics processor 8041 may be stored in the memory 809 (or other storage medium) or transmitted via the radio frequency unit 801 or the network module 802. The microphone 8042 can receive sound and process it into audio data. In the case of a phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station and output via the radio frequency unit 801.
The terminal 800 also includes at least one sensor 805, such as light sensors, motion sensors, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 8061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 8061 and/or the backlight when the terminal 800 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the terminal posture (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration identification related functions (such as pedometer, tapping), and the like; the sensors 805 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 806 is used to display information input by the user or information provided to the user. The Display unit 806 may include a Display panel 8061, and the Display panel 8061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 807 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the terminal. Specifically, the user input unit 807 includes a touch panel 8071 and other input devices 8072. The touch panel 8071, also referred to as a touch screen, can collect touch operations by a user on or near it (e.g., operations performed on or near the touch panel 8071 with a finger, a stylus, or any other suitable object or accessory). The touch panel 8071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch position, detects the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 810, and receives and executes commands sent by the processor 810. In addition, the touch panel 8071 can be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 8071, the user input unit 807 can include other input devices 8072. In particular, the other input devices 8072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail here.
Further, the touch panel 8071 can be overlaid on the display panel 8061, and when the touch panel 8071 detects a touch operation on or near the touch panel 8071, the touch operation is transmitted to the processor 810 to determine the type of the touch event, and then the processor 810 provides a corresponding visual output on the display panel 8061 according to the type of the touch event. Although in fig. 8, the touch panel 8071 and the display panel 8061 are two independent components to implement the input and output functions of the terminal, in some embodiments, the touch panel 8071 and the display panel 8061 may be integrated to implement the input and output functions of the terminal, which is not limited herein.
The interface unit 808 is an interface for connecting an external device to the terminal 800. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 808 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the terminal 800 or may be used to transmit data between the terminal 800 and external devices.
The memory 809 may be used to store software programs as well as various data. The memory 809 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 809 can include high speed random access memory, and can also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 810 is a control center of the terminal, connects various parts of the entire terminal using various interfaces and lines, and performs various functions of the terminal and processes data by operating or executing software programs and/or modules stored in the memory 809 and calling data stored in the memory 809, thereby integrally monitoring the terminal. Processor 810 may include one or more processing units; preferably, the processor 810 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 810.
The terminal 800 may also include a power supply 811 (e.g., a battery) for powering the various components, and preferably, the power supply 811 may be logically coupled to the processor 810 via a power management system to provide management of charging, discharging, and power consumption via the power management system.
In addition, the terminal 800 includes some functional modules that are not shown, and are not described in detail herein.
Preferably, an embodiment of the present invention further provides a terminal, including a processor 810, a memory 809, and a computer program stored in the memory 809 and capable of running on the processor 810, where the computer program, when executed by the processor 810, implements each process of the above operation control method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not described here again.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the operation control method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (6)

1. An operation control method, applied to a terminal, the terminal at least comprising a first screen and a second screen that are arranged oppositely, wherein the method comprises: displaying a target object at a second position on the second screen, wherein the target object comprises at least one of a virtual key and an object for message prompting; receiving a touch operation of a user, the touch operation comprising: a first touch operation of the user at a first position on the first screen, and/or a second touch operation of the user at a second position on the second screen; wherein the first position on the first screen corresponds to the second position on the second screen; in response to the touch operation, executing a first instruction corresponding to the target object; and displaying an execution result of the first instruction on the second screen; wherein the target object comprises a first target object and a second target object, the first target object and the second target object being displayed overlapping at the second position; the receiving a touch operation of a user comprises: receiving a first pressing operation of the user at the first position on the first screen; and the executing, in response to the touch operation, a first instruction corresponding to the target object comprises: in response to the first pressing operation, determining, according to a press force of the first pressing operation, a target object to be controlled corresponding to the press force, the target object to be controlled being the first target object or the second target object; and executing an instruction corresponding to the target object to be controlled; or, the receiving a touch operation of a user comprises: receiving a second pressing operation of the user at the second position on the second screen; and the executing, in response to the touch operation, a first instruction corresponding to the target object comprises: in response to the second pressing operation, determining, according to a press force of the second pressing operation, a target object to be controlled corresponding to the press force, the target object to be controlled being the first target object or the second target object; and executing an instruction corresponding to the target object to be controlled; the method further comprising: detecting feature information of a touch body located on the first screen side; and displaying, at a third position on the second screen, a virtual image matching the touch body according to the feature information of the touch body; wherein the third position corresponds to a fourth position on the first screen, the fourth position being the position of the orthographic projection of the touch body on the first screen; the feature information of the touch body comprising at least one of: depth information of the touch body within a preset distance from the first screen, and position information of the touch body touching the first screen; the displaying a virtual image matching the touch body comprising: when the feature information of the touch body comprises the depth information, displaying a virtual image of the touch body matching the touch body; and when the feature information of the touch body comprises the position information, displaying a virtual image indicating a touch position of the touch body; wherein the first screen and the second screen of the terminal are capacitive screens.
2. The method according to claim 1, wherein projections of the first position and the second position onto a first plane overlap, the first plane being parallel to the first screen or the second screen.
3. The method according to claim 1 or 2, wherein the first screen is in a screen-off state.
4. A terminal, at least comprising a first screen and a second screen that are arranged oppositely, wherein the terminal further comprises: a first display module, configured to display a target object at a second position on the second screen, wherein the target object comprises at least one of a virtual key and an object for message prompting; a first receiving module, configured to receive a touch operation of a user, the touch operation comprising: a first touch operation of the user at a first position on the first screen, and/or a second touch operation of the user at a second position on the second screen; wherein the first position on the first screen corresponds to the second position on the second screen; an executing module, configured to execute, in response to the touch operation, a first instruction corresponding to the target object; and a second display module, configured to display an execution result of the first instruction on the second screen; wherein the target object comprises a first target object and a second target object, the first target object and the second target object being displayed overlapping at the second position; the first receiving module comprises a first receiving unit, configured to receive a first pressing operation of the user at the first position on the first screen; and the executing module comprises: a first determining unit, configured to determine, in response to the first pressing operation received by the first receiving unit and according to a press force of the first pressing operation, a target object to be controlled corresponding to the press force, the target object to be controlled being the first target object or the second target object; and a first executing unit, configured to execute an instruction corresponding to the target object to be controlled; or, the first receiving module comprises a second receiving unit, configured to receive a second pressing operation of the user at the second position on the second screen; and the executing module comprises: a second determining unit, configured to determine, in response to the second pressing operation received by the second receiving unit and according to a press force of the second pressing operation, a target object to be controlled corresponding to the press force, the target object to be controlled being the first target object or the second target object; and a second executing unit, configured to execute an instruction corresponding to the target object to be controlled; the terminal further comprising: a detection module, configured to detect feature information of a touch body located on the first screen side; and a third display module, configured to display, at a third position on the second screen, a virtual image matching the touch body according to the feature information detected by the detection module; wherein the third position corresponds to a fourth position on the first screen, the fourth position being the position of the orthographic projection of the touch body on the first screen; the feature information of the touch body comprising at least one of: depth information of the touch body within a preset distance from the first screen, and position information of the touch body touching the first screen; the third display module comprising: a first display unit, configured to display a virtual image of the touch body matching the touch body when the feature information of the touch body is the depth information; and a second display unit, configured to display a virtual image indicating a touch position of the touch body when the feature information of the touch body is the position information; wherein the first screen and the second screen of the terminal are capacitive screens.
5. The terminal according to claim 4, wherein projections of the first position and the second position onto a first plane overlap, the first plane being parallel to the first screen or the second screen.
6. The terminal according to claim 4 or 5, wherein the first screen is in a screen-off state.
CN201811452861.8A 2018-11-30 2018-11-30 An operation control method and terminal Active CN109558061B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811452861.8A CN109558061B (en) 2018-11-30 2018-11-30 An operation control method and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811452861.8A CN109558061B (en) 2018-11-30 2018-11-30 An operation control method and terminal

Publications (2)

Publication Number Publication Date
CN109558061A CN109558061A (en) 2019-04-02
CN109558061B true CN109558061B (en) 2021-05-18

Family

ID=65868261

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811452861.8A Active CN109558061B (en) 2018-11-30 2018-11-30 An operation control method and terminal

Country Status (1)

Country Link
CN (1) CN109558061B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110007840A (en) * 2019-04-10 2019-07-12 网易(杭州)网络有限公司 Object control method, device, medium and electronic device
CN110083302A (en) * 2019-04-30 2019-08-02 维沃移动通信有限公司 A method, device and terminal for performing preset operations
CN110162262B (en) * 2019-05-08 2022-02-25 安徽华米信息科技有限公司 Display method and device, intelligent wearable device and storage medium
CN110362231B (en) * 2019-07-12 2022-05-20 腾讯科技(深圳)有限公司 Head-up touch device, image display method and device
CN110502182B (en) * 2019-08-28 2021-06-29 Oppo(重庆)智能科技有限公司 Operation processing method, device, mobile terminal, and computer-readable storage medium
CN111124243A (en) * 2019-12-18 2020-05-08 华勤通讯技术有限公司 Response method and device
CN113680047B (en) * 2021-09-08 2024-06-28 网易(杭州)网络有限公司 Terminal operation method, device, electronic equipment and storage medium
CN118170271A (en) * 2022-12-09 2024-06-11 普赞加信息科技南京有限公司 Multi-touch device, system and related method

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8943434B2 (en) * 2010-10-01 2015-01-27 Z124 Method and apparatus for showing stored window display
EP2767888A3 (en) * 2013-02-19 2014-10-01 Handscape Inc. Method for user input from alternative touchpads of a handheld computerized device
CN105094654B (en) * 2014-05-07 2020-02-07 ZTE Corporation Screen control method and device
CN107077255A (en) * 2017-01-19 2017-08-18 Shenzhen Goodix Technology Co., Ltd. A method and device for controlling intelligent terminal operation by pressing force
CN108153466A (en) * 2017-11-28 2018-06-12 Beijing Zhumulangma Mobile Communication Co., Ltd. Dual-screen-based operating method, mobile terminal and storage medium
CN108205419A (en) * 2017-12-21 2018-06-26 ZTE Corporation Dual-screen control method, apparatus, mobile terminal and computer-readable storage medium

Also Published As

Publication number Publication date
CN109558061A (en) 2019-04-02

Similar Documents

Publication Publication Date Title
CN109558061B (en) An operation control method and terminal
CN110737374B (en) Operation method and electronic device
CN108536411A (en) A kind of method for controlling mobile terminal and mobile terminal
CN110069178B (en) Interface control method and terminal equipment
CN108459797A (en) A kind of control method and mobile terminal of Folding screen
CN107728886B (en) One-handed operation method and device
CN110174993B (en) A display control method, terminal device and computer-readable storage medium
WO2021068885A1 (en) Control method and electronic device
CN109445656B (en) A screen manipulation method and terminal device
CN109407949B (en) A display control method and terminal
CN109521937B (en) Screen display control method and mobile terminal
CN108319386A (en) A kind of display screen false-touch prevention method and mobile terminal
CN107728923A (en) The processing method and mobile terminal of a kind of operation
CN110531915A (en) Screen operation method and terminal equipment
CN110324497A (en) A kind of method of controlling operation thereof and terminal
CN108509141A (en) A kind of generation method and mobile terminal of control
CN110221799A (en) A kind of control method, terminal and computer readable storage medium
CN108984099B (en) Man-machine interaction method and terminal
CN111124235B (en) Screen control method and flexible electronic device
CN108762648A (en) Screen operation control method and mobile terminal
CN108897477B (en) Operation control method and terminal equipment
CN109189514B (en) A terminal device control method and terminal device
CN111443860B (en) Touch method and electronic device
CN108540642A (en) Operation method of mobile terminal and mobile terminal
CN109814825B (en) Display screen control method and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant