Detailed Description
The following description of the embodiments of the present application is made clearly and completely with reference to the accompanying drawings. It is evident that the described embodiments are some, but not all, of the embodiments of the application. All other embodiments obtained by those skilled in the art based on the embodiments of the application without making any inventive effort are intended to be within the scope of the application.
The term "and/or" is merely an association relationship describing the associated object, and means that three relationships may exist, for example, a and/or B may mean that a exists alone, while a and B exist together, and B exists alone.
The terms "first", "second", and the like in the description and claims of embodiments of the application are used for distinguishing between different objects and not necessarily for describing a particular sequential order of objects. For example, a first target object and a second target object are used to distinguish between different target objects, and are not used to describe a particular order of target objects.
In embodiments of the application, words such as "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "such as" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
In the description of the embodiments of the present application, unless otherwise indicated, "a plurality" means two or more. For example, a plurality of processing units refers to two or more processing units, and a plurality of systems refers to two or more systems.
The term "User Interface (UI)" in the following embodiments of the present application is a media interface for interaction and information exchange between an application program or an operating system and a user, which enables conversion between an internal form of information and a form acceptable to the user. The user interface is a source code written in a specific computer language such as java, extensible markup language (extensible markup language, XML) and the like, and the interface source code is analyzed and rendered on the mobile device or the electronic device to finally be presented as content which can be identified by a user. A commonly used presentation form of a user interface is a graphical user interface (graphic user interface, GUI), which refers to a graphically displayed user interface that is related to computer operations. It may be a visual interface element of text, icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, widgets, etc., displayed in a display of a mobile or electronic device.
First, a communication system 1000 provided in an embodiment of the present application will be described. Fig. 1 is a schematic diagram of the communication system. Referring to fig. 1, the system includes a first electronic device and a second electronic device. In an embodiment of the present application, the first electronic device is a projector. In other embodiments, the first electronic device may also be another electronic device with camera and display functions, for example, a television or the like. The second electronic device is an electronic device such as a mobile phone, a tablet computer, a wearable device, a vehicle-mounted device, an augmented reality (AR)/virtual reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA), and the embodiment of the present application does not limit the specific type of the second electronic device.
Fig. 2 shows a schematic structural diagram of the second electronic device 100. It should be understood that the electronic device 100 shown in fig. 2 is only one example of an electronic device, and that the electronic device 100 may have more or fewer components than shown in the figures, may combine two or more components, or may have a different configuration of components. The various components shown in fig. 2 may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, a subscriber identity module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors.
The controller may be the nerve center and command center of the electronic device 100. The controller can generate operation control signals according to instruction operation codes and timing signals to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to reuse the instructions or data, they can be called directly from the memory, which avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, among others.
The I2C interface is a bi-directional synchronous serial bus comprising a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, a charger, a flash, the camera 193, etc., respectively, through different I2C bus interfaces. For example, the processor 110 may be coupled to the touch sensor 180K through an I2C interface, such that the processor 110 communicates with the touch sensor 180K through the I2C bus interface to implement a touch function of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, the processor 110 may contain multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface, to implement a function of answering a call through a Bluetooth headset.
The PCM interface may also be used for audio communication, to sample, quantize and encode analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface to implement a function of answering a call through a Bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communication. The bus may be a bi-directional communication bus. It converts data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is typically used to connect the processor 110 with the wireless communication module 160. For example, the processor 110 communicates with a Bluetooth module in the wireless communication module 160 through a UART interface to implement a Bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the UART interface, to implement a function of playing music through a Bluetooth headset.
The MIPI interface may be used to connect the processor 110 to peripheral devices such as the display 194 and the camera 193. The MIPI interfaces include a camera serial interface (CSI), a display serial interface (DSI), and the like. In some embodiments, the processor 110 and the camera 193 communicate through a CSI interface to implement the photographing functions of the electronic device 100. The processor 110 and the display 194 communicate via a DSI interface to implement the display functionality of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, an MIPI interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transfer data between the electronic device 100 and a peripheral device. It may also be used to connect a headset and play audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices, etc.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and is not meant to limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also employ different interfacing manners in the above embodiments, or a combination of multiple interfacing manners.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charge management module 140, and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor parameters such as battery capacity, battery cycle count, and battery health (leakage, impedance). In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example, the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering and amplifying on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the electronic device 100, including wireless local area network (WLAN) (e.g., wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), etc. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates and filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, the antenna 1 and the mobile communication module 150 of the electronic device 100 are coupled, and the antenna 2 and the wireless communication module 160 are coupled, such that the electronic device 100 may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques can include the global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may comprise a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The digital signal processor is used to process digital signals, and can process other digital signals in addition to digital image signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency point energy, and the like.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. Thus, the electronic device 100 may play or record video in a variety of encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, and the like.
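As a non-limiting sketch of how such a codec may be invoked on an Android device, the following Java example creates and starts an H.264/AVC decoder bound to an output surface; the MIME type, surface, and resolution parameters are illustrative assumptions rather than a prescribed implementation.

    import android.media.MediaCodec;
    import android.media.MediaFormat;
    import android.view.Surface;

    import java.io.IOException;

    public final class DecoderSketch {
        // Creates and starts a decoder for H.264/AVC video bound to an output surface.
        public static MediaCodec startAvcDecoder(Surface outputSurface, int width, int height)
                throws IOException {
            MediaFormat format =
                    MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
            MediaCodec codec = MediaCodec.createDecoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
            codec.configure(format, outputSurface, null, 0); // no crypto, decoder mode
            codec.start();
            return codec; // caller feeds input buffers and releases output buffers
        }
    }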
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent recognition of the electronic device 100, for example, image recognition, face recognition, voice recognition, text understanding, etc., can be realized through the NPU.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code, which includes instructions. The processor 110 executes the instructions stored in the internal memory 121 to perform various functional applications and data processing of the electronic device 100. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an operating system, an application program required for at least one function (such as a sound playing function, an image playing function, etc.), and the like. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, a phonebook, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 100 may listen to music, or to hands-free conversations, through the speaker 170A.
The receiver 170B, also referred to as an "earpiece", is used to convert an audio electrical signal into a sound signal. When the electronic device 100 is answering a telephone call or a voice message, voice may be received by placing the receiver 170B close to the human ear.
The microphone 170C, also referred to as a "mic", is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can speak with the mouth close to the microphone 170C, inputting a sound signal into the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which may implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may also be provided with three, four, or more microphones 170C to implement sound signal collection, noise reduction, sound source identification, directional recording functions, and the like.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are various types of pressure sensors 180A, such as a resistive pressure sensor, an inductive pressure sensor, and a capacitive pressure sensor. A capacitive pressure sensor may comprise at least two parallel plates of conductive material. The capacitance between the electrodes changes when a force is applied to the pressure sensor 180A, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic device 100 may also calculate the location of the touch based on the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch location but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is smaller than a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
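The threshold logic above may be illustrated with the following non-limiting Java sketch; the threshold value, the listener class, and the placeholder actions are assumptions made only for illustration, and the normalized pressure is read at the framework level rather than directly from the sensor hardware.

    import android.view.MotionEvent;
    import android.view.View;

    public final class PressureTouchHandler implements View.OnTouchListener {
        // Hypothetical first pressure threshold; a real value would be device-specific.
        private static final float FIRST_PRESSURE_THRESHOLD = 0.75f;

        @Override
        public boolean onTouch(View view, MotionEvent event) {
            if (event.getAction() == MotionEvent.ACTION_UP) {
                float pressure = event.getPressure(); // normalized pressure of the touch
                if (pressure < FIRST_PRESSURE_THRESHOLD) {
                    viewShortMessage();   // lighter press: view the short message
                } else {
                    createShortMessage(); // firmer press: create a new short message
                }
            }
            return true;
        }

        private void viewShortMessage() { /* placeholder: open the message list */ }

        private void createShortMessage() { /* placeholder: open the compose screen */ }
    }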
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for image stabilization during photographing. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance to be compensated by the lens module according to the angle, and makes the lens counteract the shake of the electronic device 100 through reverse motion, so as to realize anti-shake. The gyro sensor 180B may also be used for navigation and somatosensory game scenarios.
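A minimal, non-limiting Java sketch of reading the three-axis angular velocity on an Android device is given below; the sampling rate and the further anti-shake computation are assumptions left as placeholders.

    import android.content.Context;
    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;

    public final class GyroReader implements SensorEventListener {
        private final SensorManager sensorManager;

        public GyroReader(Context context) {
            sensorManager = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
        }

        public void start() {
            Sensor gyro = sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE);
            sensorManager.registerListener(this, gyro, SensorManager.SENSOR_DELAY_GAME);
        }

        @Override
        public void onSensorChanged(SensorEvent event) {
            // Angular velocity about the x, y and z axes, in rad/s; an anti-shake
            // algorithm could integrate these values to estimate the shake angle.
            float wx = event.values[0];
            float wy = event.values[1];
            float wz = event.values[2];
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { }
    }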
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude from barometric pressure values measured by barometric pressure sensor 180C, aiding in positioning and navigation.
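A minimal sketch of this altitude calculation is shown below; it assumes the pressure value (in hPa) has already been read from the air pressure sensor, and it uses the Android framework's standard-atmosphere conversion rather than any device-specific model.

    import android.hardware.SensorManager;

    public final class AltitudeHelper {
        // Converts a measured atmospheric pressure (hPa) into an altitude estimate in meters.
        public static float altitudeFromPressure(float pressureHpa) {
            return SensorManager.getAltitude(
                    SensorManager.PRESSURE_STANDARD_ATMOSPHERE, pressureHpa);
        }
    }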
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip cover using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip according to the magnetic sensor 180D. Features such as automatic unlocking upon opening the flip can then be set according to the detected open or closed state of the leather case or of the flip.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically along three axes). The magnitude and direction of gravity may be detected when the electronic device 100 is stationary. The acceleration sensor 180E can also be used to recognize the posture of the electronic device, and is applied to landscape/portrait switching, pedometers, and other applications.
A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, the electronic device 100 may range using the distance sensor 180F to achieve quick focus.
The proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light outward through the light emitting diode. The electronic device 100 detects infrared light reflected from nearby objects using the photodiode. When sufficient reflected light is detected, it may be determined that there is an object in the vicinity of the electronic device 100. When insufficient reflected light is detected, the electronic device 100 may determine that there is no object in its vicinity. The electronic device 100 can detect, by using the proximity light sensor 180G, that the user holds the electronic device 100 close to the ear, so as to automatically turn off the screen to save power. The proximity light sensor 180G may also be used in a holster mode or a pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense ambient light level. The electronic device 100 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. Ambient light sensor 180L may also cooperate with proximity light sensor 180G to detect whether electronic device 100 is in a pocket to prevent false touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may use the collected fingerprint features to implement fingerprint unlocking, application lock access, fingerprint photographing, fingerprint call answering, and the like.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 executes a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, in order to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device 100 heats the battery 142 to prevent a low temperature from causing the electronic device 100 to shut down abnormally. In other embodiments, when the temperature is below a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
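The decision logic described above may be summarized in the following non-limiting Java sketch; the threshold values and the placeholder actions are illustrative assumptions, not parameters of any particular device.

    public final class ThermalPolicySketch {
        // Hypothetical thresholds in degrees Celsius; real values are device-specific.
        private static final float HIGH_TEMP_THRESHOLD = 45.0f;
        private static final float LOW_TEMP_HEAT_THRESHOLD = 0.0f;
        private static final float LOW_TEMP_BOOST_THRESHOLD = -10.0f;

        public void onTemperatureReported(float celsius) {
            if (celsius > HIGH_TEMP_THRESHOLD) {
                reduceNearbyProcessorPerformance(); // thermal protection
            } else if (celsius < LOW_TEMP_BOOST_THRESHOLD) {
                boostBatteryOutputVoltage();        // avoid low-temperature shutdown
            } else if (celsius < LOW_TEMP_HEAT_THRESHOLD) {
                heatBattery();                      // avoid low-temperature shutdown
            }
        }

        private void reduceNearbyProcessorPerformance() { /* placeholder */ }

        private void heatBattery() { /* placeholder */ }

        private void boostBatteryOutputVoltage() { /* placeholder */ }
    }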
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen. The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the type of touch event. Visual output related to the touch operation may be provided through the display 194. In other embodiments, the touch sensor 180K may also be disposed on a surface of the electronic device 100 at a location different from that of the display 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of the vibrating bone block of the human voice part. The bone conduction sensor 180M may also contact the human pulse to receive a blood pressure beating signal. In some embodiments, the bone conduction sensor 180M may also be provided in a headset to form a bone conduction headset. The audio module 170 may parse out a voice signal based on the vibration signal of the vibrating bone block of the voice part obtained by the bone conduction sensor 180M, so as to implement a voice function. The application processor may parse heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, so as to implement a heart rate detection function.
The software system of the electronic device 100 may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In the embodiment of the application, taking an Android system with a layered architecture as an example, a software structure of the electronic device 100 is illustrated.
Fig. 3 is a software configuration block diagram of the electronic device 100 according to the embodiment of the present application.
The layered architecture of the electronic device 100 divides the software into several layers, each with a distinct role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers, which are, from top to bottom, an application layer, an application framework layer, an Android runtime and system libraries, and a kernel layer.
The application layer may include a series of application packages.
As shown in fig. 3, the application package may include applications such as cameras, gallery, calendar, phone calls, maps, navigation, WLAN, bluetooth, music, video, short messages, screen shots, etc.
In the embodiment of the application, a user can cast the video data in the electronic device through a screen-casting function in a video application, that is, cast the video picture to the projector side for display. Optionally, the user can also cast the currently displayed picture of the electronic device to the projector side for display through third-party screen projection software in the electronic device or screen projection software developed by the device manufacturer. The specific screen-casting manner can be set according to actual requirements, and is not limited in the present application.
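As a non-limiting illustration of one way the screen-casting flow could start at the application level, the following Java sketch requests permission to capture the current screen content using the Android MediaProjection mechanism; the request code is an arbitrary assumption, and encoding and transmission of the captured frames to the projector side are omitted.

    import android.app.Activity;
    import android.content.Context;
    import android.media.projection.MediaProjectionManager;

    public final class ScreenCastHelper {
        public static final int REQUEST_SCREEN_CAPTURE = 1001; // arbitrary request code

        // Asks the user for permission to capture the screen; the captured frames
        // could then be encoded and sent to the projector side for display.
        public static void requestScreenCapture(Activity activity) {
            MediaProjectionManager mpm = (MediaProjectionManager)
                    activity.getSystemService(Context.MEDIA_PROJECTION_SERVICE);
            activity.startActivityForResult(
                    mpm.createScreenCaptureIntent(), REQUEST_SCREEN_CAPTURE);
        }
    }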
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for the application of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 3, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used to manage window programs. The window manager can acquire the size of the display screen, judge whether there is a status bar, lock the screen, take screenshots, and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the electronic device 100, such as the management of call status (including connected, hung up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages, which may automatically disappear after a short stay without user interaction. For example, the notification manager is used to notify that a download is complete, to give message alerts, and so on. The notification manager may also present notifications in the form of a chart or scroll-bar text in the status bar at the top of the system, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is emitted, the electronic device vibrates, or an indicator light blinks.
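A minimal, non-limiting Java sketch of posting such a status-bar notification on Android is shown below; the channel identifier, icon, and text are illustrative assumptions.

    import android.app.Notification;
    import android.app.NotificationChannel;
    import android.app.NotificationManager;
    import android.content.Context;

    public final class DownloadNotifier {
        private static final String CHANNEL_ID = "downloads"; // illustrative channel id

        // Posts a "download complete" style notification to the status bar.
        public static void notifyDownloadComplete(Context context) {
            NotificationManager nm =
                    (NotificationManager) context.getSystemService(Context.NOTIFICATION_SERVICE);
            nm.createNotificationChannel(new NotificationChannel(
                    CHANNEL_ID, "Downloads", NotificationManager.IMPORTANCE_DEFAULT));
            Notification notification = new Notification.Builder(context, CHANNEL_ID)
                    .setSmallIcon(android.R.drawable.stat_sys_download_done)
                    .setContentTitle("Download complete")
                    .setContentText("The file has been downloaded.")
                    .build();
            nm.notify(1, notification); // appears in the status bar / notification panel
        }
    }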
The Android runtime includes a core library and a virtual machine. The Android runtime is responsible for scheduling and management of the Android system.
The core library comprises two parts: one part is the function libraries that the Java language needs to call, and the other part is the Android core library.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. Such as surface manager (surface manager), media library (Media Libraries), three-dimensional graphics processing library (e.g., openGL ES), 2D graphics engine (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files, and the like. The media library may support a variety of audio and video encoding formats, such as MPEG-4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a display driver, a camera driver, an audio driver, a sensor driver, a Bluetooth driver, and the like.
It is to be understood that the components contained in the framework layer, the system library, and the runtime layer shown in fig. 3 do not constitute a particular limitation of the electronic device 100. In other embodiments of the application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components.
Fig. 4 shows an exemplary hardware structure of the first electronic device 200. As shown in fig. 4, the first electronic device 200 may include a processor 201, a memory 202, a wireless communication processing module 203, a power switch 204, a high-definition multimedia interface (HDMI) communication processing module 205, a USB communication processing module 206, an image projection module 207, an audio module 208, and a camera 209 (which may also be referred to as an image acquisition module). The modules may be connected by a bus.
The processor 201 may be used to read and execute computer readable instructions. In a particular implementation, the processor 201 may mainly include a controller, an arithmetic unit, and registers. The controller is mainly responsible for instruction decoding and for sending out control signals for the operations corresponding to the instructions. The arithmetic unit is mainly responsible for performing fixed-point or floating-point arithmetic operations, shift operations, logic operations, and the like, and may also perform address operations and conversions. The registers are mainly responsible for temporarily storing register operands, intermediate operation results, and the like during instruction execution. In a specific implementation, the hardware architecture of the processor 201 may be an application-specific integrated circuit (ASIC) architecture, a MIPS architecture, an ARM architecture, an NP architecture, or the like.
In some embodiments, the processor 201 may be configured to parse signals received by the wireless communication processing module 203, such as a projection request or a projection instruction sent by the second electronic device, and so on. The processor 201 may be configured to perform corresponding processing operations according to the parsing result, such as driving the image projection module 207 to perform the projection operation according to the projection request or the projection instruction, and so on.
In some examples, the processor 201 includes a video codec for compressing or decompressing digital video. In an embodiment of the present application, the video codec may decompress the multimedia content from the second electronic device 100. The first electronic device 200 may support one or more video codecs, so that it can play video in one or more encoding formats, for example MPEG-1, MPEG-2, MPEG-3, MPEG-4, and so on. The processor 201 may be configured to drive the image projection module to perform display according to the decompression result of the video codec.
The wireless communication processing module 203 may include a Bluetooth (BT) communication processing module 203A, a Wi-Fi communication processing module 203B, an infrared communication module 203C, and so on.
In some embodiments, the wireless communication processing module 203 may be configured to establish a communication connection with the second electronic device 100 and receive encoded data transmitted by the second electronic device 100 based on the communication connection. For example, the Wi-Fi communication processing module 203B may be configured to establish a Wi-Fi direct communication connection with the second electronic device 100, and the Bluetooth (BT) communication processing module 203A may be configured to establish a Bluetooth communication connection with the second electronic device 100; that is, the wireless communication processing module 203 may support sharing of multimedia content between the first electronic device 200 and the second electronic device 100 via screen mirroring (e.g., Miracast). The infrared communication module 203C may be used to receive infrared signals transmitted by auxiliary devices (e.g., remote controls, game pads, etc.). That is, the wireless communication processing module 203 further supports the first electronic device 200 in receiving control signals (which may also be referred to as auxiliary information, auxiliary parameters, control parameters, control information, etc., which is not limited by the present application) sent by other auxiliary devices.
In one embodiment, the wireless communication processing module 203 may monitor signals transmitted by the second electronic device 100, such as a probe request or a scanning signal, discover the second electronic device 100, and establish a communication connection with the second electronic device 100. In another embodiment, the wireless communication processing module 203 may also transmit a signal, such as a probe request or a scanning signal, so that the second electronic device 100 may discover the first electronic device 200 and establish a communication connection (such as a Wi-Fi P2P connection) with the first electronic device 200.
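A non-limiting Java sketch of initiating such Wi-Fi P2P (Wi-Fi Direct) discovery and connection on an Android device is given below; it assumes the required Wi-Fi Direct permissions have already been granted, and the peer address is supplied by the caller for illustration.

    import android.content.Context;
    import android.net.wifi.p2p.WifiP2pConfig;
    import android.net.wifi.p2p.WifiP2pManager;

    public final class P2pConnector {
        private final WifiP2pManager manager;
        private final WifiP2pManager.Channel channel;

        public P2pConnector(Context context) {
            manager = (WifiP2pManager) context.getSystemService(Context.WIFI_P2P_SERVICE);
            channel = manager.initialize(context, context.getMainLooper(), null);
        }

        // Start scanning for nearby Wi-Fi Direct peers (e.g., the other device).
        public void discover() {
            manager.discoverPeers(channel, new WifiP2pManager.ActionListener() {
                @Override public void onSuccess() { /* peers are reported via broadcast */ }
                @Override public void onFailure(int reason) { /* handle failure */ }
            });
        }

        // Connect to a discovered peer identified by its MAC address.
        public void connectTo(String deviceAddress) {
            WifiP2pConfig config = new WifiP2pConfig();
            config.deviceAddress = deviceAddress;
            manager.connect(channel, config, new WifiP2pManager.ActionListener() {
                @Override public void onSuccess() { /* Wi-Fi P2P connection is being set up */ }
                @Override public void onFailure(int reason) { /* handle failure */ }
            });
        }
    }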
Memory 202 is coupled to processor 201 for storing various software programs and/or sets of instructions. In particular implementations, memory 202 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state storage devices. The memory 202 may store an operating system, such as an embedded operating system like uCOS, vxWorks, RTLinux, harmony, android. The memory 202 may also store a communication program that may be used to communicate with the second electronic device 100, one or more servers, or additional devices.
The power switch 204 may be used to control the power supplied by the power source to the first electronic device 200.
The HDMI communication processing module 205 may be used to communicate with other devices through an HDMI interface (not shown). In some embodiments, the first electronic device 200 may also include a serial interface such as an RS-232 interface. The serial interface may be coupled to an audio playback device such as a speaker, so that the display and the audio playback device cooperate to play audio and video. It is to be understood that the structure illustrated in fig. 4 does not constitute a specific limitation on the first electronic device 200. In other embodiments of the application, the first electronic device 200 may include more or fewer components than shown, or certain components may be combined, or certain components may be separated, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The USB communication processing module 206 may be used to communicate with other devices through a USB interface (not shown).
The image projection module 207 may have a light source (not shown), modulate light emitted from the light source according to image data, and project an image on a screen (or an object such as a wall surface). The image data may be an image generated by the first electronic device 200 according to local data content, or may be an image sent by the second electronic device 100 (which may be referred to as a screen-casting image, screen-casting data, etc., without limitation in the present application). Optionally, the embodiment of the present application only takes the first electronic device being a projector as an example; as described above, in other embodiments, other electronic devices such as a television may also implement the reverse control scheme in the present application, in which case the first electronic device may include a display screen.
The audio module 208 may be configured to output an audio signal via an audio output interface, which may enable the first electronic device 200 to support audio playback. The audio module 208 may also be used to receive audio data through an audio input interface. The audio module 208 may include, but is not limited to, a microphone, a speaker, a receiver, and the like.
The camera 209, which may also be referred to as an image acquisition module or the like, is used to capture still images or video. An object generates an optical image through the lens, and the optical image is projected onto the photosensitive element. The photosensitive element may be a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the first electronic device 200 may include 1 or N cameras 209, N being a positive integer greater than 1.
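As a non-limiting illustration of the final format conversion, the following Java sketch converts a single YUV pixel to packed RGB using commonly cited BT.601 full-range coefficients; the exact coefficients and value ranges used by a particular ISP/DSP are assumptions and may differ.

    public final class YuvToRgb {
        // Converts one pixel from YUV (full range, BT.601-style coefficients) to packed RGB.
        public static int toRgb(int y, int u, int v) {
            int r = clamp((int) (y + 1.402 * (v - 128)));
            int g = clamp((int) (y - 0.344136 * (u - 128) - 0.714136 * (v - 128)));
            int b = clamp((int) (y + 1.772 * (u - 128)));
            return (r << 16) | (g << 8) | b;
        }

        private static int clamp(int value) {
            return Math.max(0, Math.min(255, value));
        }
    }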
The software system of the first electronic device 200 may employ a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In particular, the software system of the first electronic device 200 may be an Android, Windows, Linux, or other operating system. The embodiment of the application takes an Android system with a layered architecture as an example to illustrate the software architecture of the first electronic device 200.
Fig. 5 is a software structural block diagram of the first electronic device 200 according to the embodiment of the present application.
The layered architecture of the first electronic device 200 divides the software into several layers, each layer having a distinct role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, namely an application layer, a framework layer, a system library and runtime layer and a kernel layer from top to bottom.
The application layer may include camera, gallery, calendar, gesture control, WLAN, bluetooth, music, video, reverse control module, etc. applications. It should be noted that the application program included in the application program layer shown in fig. 5 is only an exemplary illustration, and the present application is not limited thereto. It will be appreciated that the application included in the application program layer does not constitute a specific limitation on the first electronic device 200. In other embodiments of the present application, the first electronic device 200 may include more or fewer applications than the application program layer shown in fig. 5, and the first electronic device 200 may also include entirely different applications.
In the embodiment of the application, the reverse control module is used to provide a mode selection interface. The reverse control module may determine a target mode in response to a received user operation. After the reverse control module determines the target mode, it may drive the corresponding module (for example, the Bluetooth driver) to establish a communication connection with the second electronic device (which may also be referred to as a control data transmission channel, etc.), which is not limited by the present application. The reverse control module may transmit the reverse control information (which may also be referred to as a reverse control parameter, a reverse operation parameter, etc., without limitation in the present application) input by the control information generating module to the second electronic device through the control data transmission channel.
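As a non-limiting sketch of one possible control data transmission channel (a plain TCP socket is assumed here purely for illustration; the actual channel, addressing, and message format are not limited by the present application):

    import java.io.IOException;
    import java.io.OutputStream;
    import java.net.Socket;
    import java.nio.charset.StandardCharsets;

    public final class ControlChannelSketch {
        private final Socket socket;

        // The host and port of the second electronic device are illustrative placeholders.
        public ControlChannelSketch(String host, int port) throws IOException {
            socket = new Socket(host, port);
        }

        // Sends one reverse-control message, serialized here as a UTF-8 string
        // (e.g., "TAP x=0.42 y=0.61"), over the established control data channel.
        public void sendReverseControl(String message) throws IOException {
            OutputStream out = socket.getOutputStream();
            out.write(message.getBytes(StandardCharsets.UTF_8));
            out.flush();
        }

        public void close() throws IOException {
            socket.close();
        }
    }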
The framework layer provides an application programming interface (API) and a programming framework for the applications of the application layer, including various components and services to support the developer's Android development. The framework layer includes some predefined functions. As shown in fig. 5, the framework layer may include a view system, a window manager, a resource manager, a content provider, and the like. The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. A display interface may be composed of one or more views. The window manager is used to manage window programs. The window manager can acquire the size of the display screen, judge whether there is a status bar, lock the screen, take screenshots, and the like. The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like. The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, and the like.
The system library and runtime layer includes a system library and the Android runtime. The system library may include a plurality of functional modules, such as a browser kernel, a 3D graphics library (e.g., OpenGL ES), a font library, and the like. The browser kernel is responsible for interpreting the syntax of web pages (e.g., HTML, which is an application of the standard generalized markup language, and JavaScript) and rendering (displaying) the web pages. The 3D graphics library is used to implement three-dimensional graphics drawing, image rendering, synthesis, layer processing, and the like. The font library is used to implement the input of different fonts. The Android runtime includes a core library and a virtual machine. The Android runtime is responsible for scheduling and managing the Android system. The core library comprises two parts: one part is the function libraries that the Java language needs to call, and the other part is the Android core library. The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
It will be appreciated that the components contained in the framework layer, the system library, and the runtime layer illustrated in fig. 5 do not constitute a particular limitation of the first electronic device 200. In other embodiments of the application, the first electronic device 200 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a display driver, a camera driver, an audio driver, a sensor driver, a control information generating module, and the like.
In one possible implementation, the control information generating module may obtain an image collected by the camera. The control information generation module can identify the image and acquire the control information. Optionally, the control information generating module may further acquire the auxiliary information based on the received auxiliary signal sent by the auxiliary device. The control information generation module may output reverse control information to the reverse control module, and the reverse control information may include control information or the reverse control information may include control information and auxiliary information.
In another possible implementation manner, the control information generating module may acquire control information sent by the camera through the camera driver. That is, the camera may acquire control information based on the acquired image and transmit the control information to the control information generation module through the camera driver. Optionally, the control information generating module may further acquire the auxiliary information based on the received auxiliary signal sent by the auxiliary device. The control information generation module may output reverse control information to the reverse control module, and the reverse control information may include control information or the reverse control information may include control information and auxiliary information.
That is, in the embodiment of the present application, the process of identifying the image to obtain the control information may be performed by the control information generating module or may be performed by the camera, which is not limited by the present application.
Illustratively, the camera driver is used to abstract the camera so as to hide the particular channel of the camera, so that an application can access (or invoke) the camera. The camera driver may communicate with the camera based on a USB video class (UVC) protocol. The UVC protocol may also be understood as a protocol based on a UVC channel, i.e., the camera establishes a UVC connection with the camera driver and transmits messages conforming to the UVC protocol based on the UVC connection. The camera driver may also communicate with the camera based on a remote network driver interface specification (RNDIS) protocol. It should be noted that the RNDIS protocol may also be understood as a protocol based on a socket channel, that is, the camera and the camera driver establish a socket connection through the socket channel and transmit messages conforming to the RNDIS protocol based on the socket connection.
Optionally, the UVC channel may be used to transmit control instructions and video streams, and the Socket channel may be used to transmit control information and other information in the embodiment of the present application.
The camera of the first electronic device 200 may be an external camera and/or a built-in camera. The external camera may be connected to the USB interface of the first electronic device 200 through a USB cable. The built-in camera may be embedded in the first electronic device 200; inside the first electronic device 200, the built-in camera is connected to a USB interface of the first electronic device 200 through a USB cable.
Fig. 6 illustrates a schematic diagram of a connection between an electronic device 600 and a camera. As shown in fig. 6, hardware in electronic device 600 includes, but is not limited to, a USB interface and a camera. The camera can be connected with the USB interface through a USB cable and performs data interaction through USB connection. The USB interface can interact data with the upper layer. For example, the USB interface may output data or instructions (e.g., control information) input by the camera to the control information generating module, and may also output data or instructions input by other applications or modules to the camera.
Optionally, in an embodiment of the present application, the image acquisition range of the camera of the first electronic device is greater than or equal to the projection range of the image projection module. Optionally, the image acquisition range of the camera of the first electronic device may vary with the projection range of the image projection module. By way of example, the projection range is understood to be the size of the picture projected by the image projection module onto the screen (or an object such as a wall surface). The projection range may be adjusted automatically, or manually by the user (for example, via an adjustment button on the projector body or a remote controller); the present application is not limited thereto.
Fig. 7 is a schematic structural diagram of software and hardware of a camera according to an embodiment of the present application. As shown in fig. 7, the embodiment of the application takes a camera running a Linux system with a layered architecture as an example to illustrate the structure of the camera. The layered architecture divides the camera software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. From top to bottom, the architecture of the camera includes an application layer and a Kernel layer.
The application layer may include, but is not limited to, applications such as an image processing application. Optionally, the image processing application may include, but is not limited to, sub-functions (or sub-applications) such as video input (VI), video processing subsystem (VPSS), video encoding (VENC), and video graphics system (VGS). The image processing application is used for processing images, for example performing noise reduction, color correction, and similar processing on an image through one or more of its sub-functions, and then outputting the processed image to the electronic device. The applications included in the application layer shown in fig. 7 are only an exemplary illustration, and the present application is not limited in this regard. It will be appreciated that the applications included in the application layer do not constitute a specific limitation on the camera. In other embodiments of the present application, the camera may include more or fewer applications than those shown in the application layer of FIG. 7. Optionally, each application in the application layer may be pre-installed before shipment of the camera or of the electronic device including the camera, or may be installed when the camera or that electronic device is upgraded.
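Purely as a schematic Python sketch of the sub-function chain described above; the stage functions are placeholders named after the abbreviations in the text, not the concrete SDK calls of any particular camera platform.

def video_input(sensor_frame):
    """VI: capture the raw frame from the sensor (placeholder)."""
    return sensor_frame

def vpss_process(frame):
    """VPSS: noise reduction, color correction, and similar processing (placeholder)."""
    return frame

def video_encode(frame):
    """VENC: encode the processed frame before output to the electronic device (placeholder)."""
    return frame

def image_processing_pipeline(sensor_frame):
    # The order mirrors the VI -> VPSS -> VENC chain described in the text.
    return video_encode(vpss_process(video_input(sensor_frame)))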
The kernel layer is an intermediate layer between the hardware and the software (i.e., the application layer). It passes the requests of applications to the hardware and acts as the underlying driver addressing the various devices and components in the system. The kernel layer may be used to manage hardware devices and provide them for use by applications. The kernel layer includes one or more components, which may include, for example, a system call interface, process management, memory management, a virtual file system, a network stack, a vision processing module, an ISP driver, a sensor driver, and the like.
For example, as described above, the process of acquiring control information may be performed by a camera, in which case a vision processing module in the camera may acquire an ISP processed image, and the vision processing module may acquire control information based on the acquired image.
The hardware of the camera includes, for example, a CPU, an ISP, and a sensor. Optionally, the hardware may also include a memory or the like. The ISP is used for processing images and the video stream and outputting the processed video stream and the processed images over two paths. The CPU is merely illustrative; various microcontrollers, such as a microcontroller unit (MCU), or other devices functioning as a processor or microcontroller, may be used as an alternative to the CPU described above.
The ISP is provided with a processing circuit and transceiver pins. The processing circuit may control the transceiver pins to receive or transmit data or signals so as to communicate with the CPU of the camera and with the electronic device (e.g., the projector). The processing circuit may also support applications in the application layer, such as the image processing application, in implementing corresponding functions.
The CPU is arranged on a chip and is provided with a processing circuit and transceiver pins. Likewise, the processing circuit may control the transceiver pins to transmit or receive data or signals so as to communicate with the electronic device (e.g., the projector) and with the ISP of the camera, respectively. In addition, the processing circuit may also support an application, or a function within an application, in performing corresponding processing; for example, it may support a gesture recognition application, or a gesture recognition function in a video application, in performing corresponding processing.
In the embodiments of the present application, the description is made with each application (or module) as a main body for realizing each function. In practice, the functions of the respective application programs are implemented by processing circuits in the ISP or the CPU, and will not be repeated hereinafter.
The sensor is a photosensitive element of the camera and is used for collecting optical signals, converting the collected optical signals into electric signals, and then transmitting the electric signals to the ISP for processing and converting the electric signals into images or video streams.
Alternatively, the CPU and ISP may be integrated on a chip, or on a different chip, and connected via a bus. The CPU may output control signals to the ISP through a control channel with the ISP in response to a request from the electronic device to trigger corresponding processing circuits (also understood as modules) in the ISP. The CPU may also output data to the ISP through a data channel with the ISP, such as palm coordinates as described in the embodiments of the present application. The ISP may output data to the CPU through a data channel with the CPU. It should be noted that, the control channel and the data channel described above may refer to the same physical circuit, or may be different physical circuits, which is not limited by the present application.
Fig. 8 is a schematic view of an exemplary application scenario. Referring to fig. 8, a communication connection is established between a first electronic device (e.g., a projector) and a second electronic device (e.g., a tablet). The second electronic device triggers a screen projection function, such as starting a screen projection application, in response to a received user operation. The second electronic device sends screen projection data to the first electronic device. The screen projection data may also be called a screen projection image, and the present application is not limited thereto. The first electronic device receives the screen projection data, encodes and decodes it, and projects it onto a screen (or another object such as a wall surface). A screen projection picture is displayed on the screen, where the screen projection picture includes the screen projection data and, optionally, other images. Optionally, the other images may be generated by the first electronic device and may include, for example, time information, a screen projection frame, and the like; the present application is not limited thereto. In some examples, the screen projection image may be understood as the screen projection data; in the embodiment of the present application, the terms screen projection image and screen projection data may be used interchangeably, and the present application is not limited thereto.
In the scenario shown in fig. 8, the interaction between the user and the screen is typically done by a second electronic device, which may also be referred to as the content side or the content generation side. For example, if the user desires to adjust the video progress displayed in the screen-projection screen, the user needs to adjust at the second electronic device side, the second electronic device adjusts the video playing progress in response to the received user operation, generates corresponding screen-projection data based on the adjusted progress, and sends the screen-projection data to the first electronic device. The first electronic equipment projects the screen projection data onto a screen, and the projection picture on the screen displays a video picture with the adjusted progress.
That is, the user interaction in the screen-projection scene in the prior art is usually performed on the second electronic device side, and the interaction is limited to simple operations, such as progress adjustment, volume adjustment, and the like.
The application provides a control method, an electronic device, and a system. Fig. 9 is a schematic diagram illustrating an example. Referring to fig. 9, in this method, a first electronic device, acting as a virtual input device, establishes a communication connection with a second electronic device, where the communication connection may also be referred to as a control channel. The user may point to any location in the projected picture through a gesture, a pointing tool, a laser device, or the like. The first electronic device obtains control information (which may also be referred to as a control parameter, an operation parameter, etc., without limitation) based on the image collected by the camera, where the control information may be used to indicate the position pointed to by the user. The first electronic device transmits the control information to the second electronic device through the control channel. The second electronic device may generate screen projection data (which may also be referred to as a screen projection image, content data, etc., without limitation) based on the control information, and transmit the screen projection data to the first electronic device through a data transmission channel with the first electronic device. The first electronic device displays the received image; for example, the first electronic device projects the screen projection data onto a screen on which a screen projection picture is displayed. According to the application, the first electronic device is virtualized as an input device, so that a user can reversely control the second electronic device through the first electronic device. In this mode, the amount of control information data sent by the first electronic device to the second electronic device is small, so the transmission delay can be effectively reduced and less bandwidth is occupied. The second electronic device can generate corresponding screen projection data according to the control information in real time. Moreover, because the user performs the reverse control on the first electronic device side, an immersive interaction experience can be realized and the user experience is improved.
Fig. 10 is a schematic diagram illustrating communication connection between a first electronic device and a second electronic device, and referring to fig. 10, a first communication connection and a second communication connection are established between the first electronic device and the second electronic device, where the first communication connection may be referred to as a data transmission channel or a data communication connection, and is used for the second electronic device to transmit screen projection data to the first electronic device. The second communication connection may be referred to as a reverse control channel or a control information transmission channel for the first electronic device to transmit control information to the second electronic device. The control information, which may also be referred to as reverse control information, is generated by the first electronic device and transmitted to the second electronic device for reverse control of the second electronic device. In the embodiment of the application, the reverse control can be understood that the first electronic device is used as the virtual input end of the second electronic device to control the second electronic device, that is, the user can reversely control the second electronic device through the first electronic device. Optionally the first communication connection may also be used for transmitting control information sent by the second electronic device to the first electronic device. The control information may also be understood as forward control information, i.e. in the embodiment of the present application, the control information is divided into forward and reverse, and the control information sent by the second electronic device to the first electronic device is the forward control information. The control information sent by the first electronic device to the second electronic device is reverse control information.
The protocols supported by the first communication connection and the second communication connection in embodiments of the present application may be the same or different. Alternatively, the first communication connection may be wired or wireless. The second communication connection may be wired or wireless. The wired connection manner of the first communication connection may include, but is not limited to, type C, HDMI, USB, etc. The wired connection manner of the second communication connection may include, but is not limited to, type C, USB, etc. The wireless connection of the first communication connection may include, but is not limited to, wi-Fi, bluetooth, star flash, etc. The wireless connection of the second communication connection may include, but is not limited to, wi-Fi, bluetooth, star flash, etc.
Fig. 11A to 11D are schematic diagrams illustrating connection modes. It should be noted that the connections (including physical links and network connections) shown in the embodiments of the present application are merely illustrative examples, and the present application is not limited thereto. Referring to fig. 11A, an HDMI port of the first electronic device is connected to an HDMI port of the second electronic device through a first communication link (a physical link, for example, an HDMI cable). The first communication link may be used to support establishment of the first communication connection; that is, the data or instructions, such as the screen projection data, transmitted via the first communication connection described in the embodiments of the present application are all transmitted over the first communication link. The USB port of the first electronic device is connected to the USB port of the second electronic device via a second communication link (a physical link, e.g., a USB cable). The second communication link may be used to support establishment of the second communication connection (i.e., the reverse control information transmission channel); that is, data or instructions, such as reverse control information, transmitted via the second communication connection in the embodiments of the present application are transmitted over the second communication link. It should be understood that, in the embodiments of the present application, each "link" refers to a physical line, and a "connection" may be understood as a network connection established at an upper layer over the link.
Referring to fig. 11B, the HDMI port of the first electronic device is connected to the HDMI port of the second electronic device through a first communication link (a physical link, for example, an HDMI cable). The first communication link may be used to support establishment of the first communication connection; that is, the data or instructions, such as the screen projection data, transmitted via the first communication connection described in the embodiments of the present application are all transmitted over the first communication link. A Bluetooth communication link, i.e., a second communication link, may be established between the first electronic device and the second electronic device, which may also be understood as the second communication connection, for transmitting the reverse control information.
Referring to fig. 11C, the USB port of the first electronic device is connected to the USB port of the second electronic device through a communication link (a physical link, for example, a USB cable). The communication link supports the first electronic device in establishing both the first communication connection and the second communication connection with the second electronic device; that is, the data or instructions, such as the screen projection data, transmitted through the first communication connection in the embodiments of the present application are transmitted over this communication link, and the data or instructions, such as the reverse control information, transmitted over the second communication connection are also transmitted over this communication link. That is, the same physical link may support the upper layer in establishing two or more communication connections.
Referring to fig. 11D, the first electronic device and the second electronic device establish a first communication connection and a second communication connection, where the first communication connection and the second communication connection are both bluetooth connections. Alternatively, the second communication connection may be a bluetooth low energy connection.
The connection manners and connection types involved in the embodiments of the present application are only illustrative examples; in practical applications, the corresponding connections can be established according to the protocol types supported by the first electronic device and the second electronic device, and the present application is not limited thereto. For example, in other embodiments, the first communication connection may be a Bluetooth connection and the second communication connection may be a Wi-Fi connection, or the first communication connection may be a Wi-Fi connection and the second communication connection a Bluetooth connection, and so on; any combination of wired and wireless modes (wired-wired, wired-wireless, or wireless-wireless) may be adopted, and the present application is not limited thereto.
In one possible implementation, if the first communication connection and/or the second communication connection is established on the basis of a physical link (e.g., a USB connection), the user needs to configure the physical link before the first communication connection and the second communication connection are established; the first electronic device then establishes the corresponding network connection with the second electronic device over that link.
In the embodiment of the present application, the order in which the second communication connection and the first communication connection are established is not limited. In one example, while the first electronic device has established the first communication connection with the second electronic device and screen projection is in progress, the user can operate the first electronic device to establish the second communication connection with the second electronic device. In another example, the first electronic device may establish the second communication connection with the second electronic device before establishing the first communication connection with the second electronic device. In yet another example, the first electronic device may establish the first communication connection with the second electronic device without yet conducting the screen projection service; after the first electronic device and the second electronic device establish the second communication connection, the second electronic device performs the screen projection service with the first electronic device. It should be understood that, in the embodiment of the present application, the first communication connection and the second communication connection are independent and do not affect each other, and the screen projection service and the reverse control are likewise independent of each other.
Fig. 12 is a flow chart illustrating a control method according to an embodiment of the present application, please refer to fig. 12, which specifically includes but is not limited to the following steps:
S1201, the first electronic device determines a target mode based on the user operation.
For example, after the first electronic device is started, the first electronic device displays a preset screen, and the first electronic device initializes a camera (may also be referred to as an image sensing module).
Optionally, the preset screen may be a setting interface of the first electronic device, or may be a desktop image of the first electronic device, which is not limited by the present application. In the embodiment of the application, taking the first electronic device as a projector as an example, after the projector is started, a preset picture can be projected onto the wall surface in front of it. The preset screen may be the desktop of the projector. The desktop includes, but is not limited to, at least one application icon.
For example, the user may select the target mode by triggering a function key on the first electronic device, gesture control, or using a control device (e.g., a remote control). In the embodiment of the application, the target mode is used for indicating the first electronic device to serve as a virtual target type input device. The object type input device includes, but is not limited to, a mouse (may also be referred to as a virtual mouse), a keyboard (may be referred to as a virtual keyboard), a stylus (may be referred to as a virtual stylus), a joystick (may be referred to as a virtual game pad), and the like, and the present application is not limited thereto.
Fig. 13 is a schematic view of an exemplary application scenario, referring to fig. 13, a user may select a setting application using a remote controller, and the first electronic device displays a control screen in response to a received control signal. The control screen includes at least one mode selection option including, but not limited to, a mouse, a stylus, a keyboard, and more. The user can select a mouse as a target mode through the remote controller. Accordingly, in this scenario, the first electronic device may be connected to other devices as a virtual mouse, and may also be understood as an input device of a virtual mouse type, for example, to access other devices, such as the second electronic device in the embodiment of the present application.
Illustratively, the first electronic device determines the target mode in response to a received user operation (which may also be understood as a user instruction). It is understood that a first electronic device is virtualized as one input device, connecting with other electronic devices (e.g., a second electronic device).
Optionally, after the initialization of the camera is completed, image acquisition can be started. In one possible implementation, the user may control the first electronic device by other means, such as gestures. For example, a user's finger points to the "mouse" mode, the camera captures an image and performs image recognition on it, the first electronic device determines that the user gesture points to the "mouse" mode, and the mouse mode is initiated. The specific implementation is similar to that described below and is not repeated here. It may be understood that, before the first electronic device establishes the first communication connection with the second electronic device, that is, before the first electronic device is connected to the second electronic device as an input device (or after the control information transmission channel between the first electronic device and the second electronic device is disconnected), the first electronic device may acquire control information based on the acquired image and perform a corresponding operation based on the control information.
In one possible implementation, the setting application may include a virtualized input device launch option; after the user triggers this option, the first electronic device launches the target mode selection function, that is, displays the selection screen shown in fig. 13. In one example, the user may select a corresponding target mode in the manner shown in FIG. 13. In another example, after the virtualized input device launch option is triggered, the first electronic device may automatically select the target mode, e.g., select a preset mode as the target mode, where the preset mode may be preconfigured. The first electronic device may also take the most recently selected mode as the current target mode. Optionally, the first electronic device may further select a corresponding mode based on the display mode or device type of the second electronic device, as sketched below. For example, if the second electronic device is a mobile phone, a touch mode may be selected; if the second electronic device is a computer, a mouse or keyboard mode may be recommended.
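Purely as an illustrative sketch of such automatic mode selection (the device-type names, mode names, and the mapping itself are assumptions, not part of the embodiment):

from typing import Optional

# Hypothetical mapping from the peer device type to a default target mode.
DEFAULT_MODE_BY_DEVICE_TYPE = {
    "phone": "touch",
    "tablet": "touch",
    "pc": "mouse",
    "laptop": "mouse",
}

def select_target_mode(device_type: str, last_mode: Optional[str] = None) -> str:
    """Prefer the most recently selected mode; otherwise fall back to a device-type default."""
    if last_mode:
        return last_mode
    return DEFAULT_MODE_BY_DEVICE_TYPE.get(device_type, "mouse")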
In the embodiment of the present application, the user may modify the current mode at any time through the selection screen shown in fig. 13. Optionally, after the modification of the target mode, the second electronic device disconnects the second communication connection from the first electronic device, and the first electronic device reestablishes the second communication connection with the second electronic device based on the new target mode, so that the second electronic device recognizes the updated target type of input device.
S1202, the first electronic device serves as a virtual input device, and establishes a second communication connection (i.e., a control information transmission channel) with the second electronic device.
For example, after the first electronic device determines the target mode, it is virtualized as an input device of the target type (for example, a virtual mouse type) and establishes the second communication connection with the second electronic device, which can also be understood as establishing the control information transmission channel. Fig. 14 is a schematic view of an exemplary application scenario. As shown in fig. 14, the first electronic device (e.g., a projector) determines, in response to a received user operation, to be virtualized as a mouse and establishes the second communication connection with the second electronic device (e.g., a tablet); that is, to the second electronic device, the first electronic device corresponds to a mouse.
Optionally, taking a Bluetooth connection as an example of the second communication connection, fig. 15 is a schematic diagram of an exemplary Bluetooth connection establishment flow. Referring to fig. 15, the flow specifically includes, but is not limited to, the following steps:
S1501, the first electronic device sends a Discovery Request (probe request) message to the second electronic device.
The first electronic device stores, in advance, descriptors corresponding to different modes (which may also be referred to as target type descriptors); the present application is not limited thereto. After the first electronic device determines the target mode, it acquires the corresponding pre-stored descriptor. For example, after determining that the target mode is the "virtual mouse" mode, the first electronic device obtains the descriptor corresponding to the "virtual mouse" mode, for example "function = HIDfunction(gadget, mouse)", which is used to indicate that the current device is a mouse-type input device. The descriptor content is merely illustrative; different descriptors may be defined for different operating systems, and the application is not limited thereto.
Illustratively, the first electronic device transmits the Discovery Request message in a broadcast manner over a Bluetooth interface.
Optionally, the Discovery Request message includes, but is not limited to, identification information of the first electronic device, the Bluetooth address of the first electronic device, and the descriptor. The identification information of the first electronic device may be, for example, the device name of the first electronic device, and the application is not limited thereto.
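A minimal sketch of how such a message payload might be assembled, assuming a JSON representation for readability; the field names are illustrative, the descriptor strings for modes other than "mouse" are assumptions, and real Bluetooth advertising/SDP records use their own binary formats defined by the Bluetooth specification.

import json

def build_discovery_request(device_name: str, bt_address: str, target_mode: str) -> bytes:
    # Hypothetical pre-stored descriptors keyed by target mode; the "mouse" entry
    # mirrors the example descriptor given in the text.
    descriptors = {
        "mouse": "function = HIDfunction(gadget, mouse)",
        "keyboard": "function = HIDfunction(gadget, keyboard)",
        "stylus": "function = HIDfunction(gadget, stylus)",
    }
    payload = {
        "device_name": device_name,   # identification information of the first electronic device
        "bt_address": bt_address,     # Bluetooth address of the first electronic device
        "descriptor": descriptors[target_mode],
    }
    return json.dumps(payload).encode("utf-8")

# Example: the projector announces itself as a mouse-type input device.
packet = build_discovery_request("Projector-01", "AA:BB:CC:DD:EE:FF", "mouse")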
S1502, the second electronic device sends a Discovery Response (probe response) message to the first electronic device.
Upon receiving the Discovery Request message sent by the first electronic device, the second electronic device determines that a target-type input device, for example a mouse (i.e., a mouse-type input device), has been found.
Optionally, a prompt message such as "find mouse" or "pairing with mouse" may be displayed in the display screen of the second electronic device, so as to indicate that the second electronic device successfully finds the first electronic device as a virtual mouse device.
The second electronic device sends a Discovery Response message to the first electronic device, indicating that a Bluetooth connection is to be established with the first electronic device. Illustratively, the Discovery Response message includes, but is not limited to, identification information of the second electronic device, the Bluetooth address of the second electronic device, and the like.
S1503, the first electronic device establishes a Bluetooth connection (i.e., the second communication connection, which may also be understood as the control information transmission channel) with the second electronic device.
The first electronic device and the second electronic device exchange the information required for establishing the Bluetooth connection through a number of Bluetooth signaling interactions. For example, a Bluetooth encryption key may be negotiated in the interactions of S1503, which may be used to encrypt Bluetooth data during data transmission. For the specific interaction process and content, reference may be made to the description in the Bluetooth protocol, which is not repeated in the present application.
S1504, the first electronic device and the second electronic device perform data transmission.
For example, the first electronic device and the second electronic device may exchange data based on the established Bluetooth connection. For example, the first electronic device may send a Bluetooth data packet to the second electronic device, where the Bluetooth data packet includes, but is not limited to, the reverse control information.
It should be noted that, in the embodiment of the present application, the bluetooth connection is established between the second electronic device and the first electronic device, optionally, S1501 is taken as the starting time of the bluetooth connection process, and S1503 is taken as the completion time of bluetooth connection establishment. In other embodiments, the bluetooth connection between the second electronic device and the first electronic device may be established with S1502 or S1503 as the starting time, which is not limited by the present application.
Alternatively, the Bluetooth connection between the second electronic device and the first electronic device may be maintained via the Bluetooth Low Energy (BLE) protocol, or via the classic Bluetooth protocols, namely BR (basic rate) and EDR (enhanced data rate). It should be noted that a Bluetooth connection maintained by the BLE protocol supports transmission rates of 1 Mbps, 2 Mbps, 500 Kbps, and 125 Kbps, with a supported bandwidth of 1 MHz or 2 MHz, whereas a Bluetooth connection maintained by the BR/EDR protocol supports a transmission rate of up to 3 Mbps with a supported bandwidth of 1 MHz.
In the embodiment of the present application, only the establishment procedure of the bluetooth connection is taken as an example for explanation, and as described above, the second communication connection in the embodiment of the present application may support any protocol, and the specific establishment procedure may refer to the corresponding protocol content.
Fig. 16A is a schematic diagram illustrating an exemplary user interface. Referring to fig. 16A, the Bluetooth setting interface 1501 of the second electronic device includes, but is not limited to, a paired device list 1502. The paired device list 1502 includes at least one Bluetooth device with which the second electronic device has been successfully paired; a paired Bluetooth device may also be understood as a device that has established a Bluetooth connection with the second electronic device.
The paired devices in this example include a projector and a mouse. The projector and the mouse are both first electronic equipment. It may be understood that when the first electronic device establishes a first communication connection with the second electronic device, the descriptor sent by the first electronic device is used to indicate that the first electronic device is an output device of a projector type, and the second electronic device establishes the first communication connection with the first electronic device, and identifies an object (i.e. the first electronic device) to which the first communication connection is connected as a projector. When the first electronic device establishes a second communication connection with the second electronic device, the descriptor sent by the first electronic device is used for indicating that the first electronic device is an input device of a mouse type, the second electronic device establishes the second communication connection with the first electronic device, and an object (namely the first electronic device) connected with the second communication connection is identified as a mouse. In this scenario, the first communication connection and the second communication connection are taken as bluetooth connection as an example, and as described above, the first communication connection and the second communication connection may support any protocol, which is not limited by the present application.
It should be noted that, in the embodiment of the present application, the time for establishing the first communication connection may be before the second communication connection is established, or may be after the second communication connection is established, which is not limited by the present application. Optionally, after the first communication connection is established, the second electronic device may perform screen projection on the first electronic device through the first communication connection, or may not perform screen projection temporarily, and may perform a corresponding service according to an actual scenario.
Fig. 16B is a schematic diagram of an exemplary user interface. Referring to fig. 16B, after the first electronic device, acting as a virtual mouse input device, establishes the second communication connection with the second electronic device, the second electronic device recognizes the first electronic device as a virtual mouse. A cursor may be displayed in the display interface of the second electronic device (the cursor position may be a default position, which is not limited by the present application). In this example, the first electronic device is virtualized as a mouse merely by way of example; in the embodiment of the application, the first electronic device may also be virtualized as another input device such as a stylus or a game pad, and the second electronic device may display a corresponding icon or image on its display screen. For example, if the first electronic device is virtualized as a stylus, an image corresponding to the stylus, for example a paintbrush, may be displayed on the display screen of the second electronic device.
For example, fig. 17 is a schematic diagram of an application scenario that is exemplarily shown. Referring to fig. 17, assume that the first communication connection and the second communication connection have already been established between the first electronic device and the second electronic device, and that the second electronic device is sending screen projection data to the first electronic device through the first communication connection. In connection with the user interface schematic shown in fig. 16B, the second electronic device generates an image including a display image and a cursor. The second electronic device sends screen projection data to the first electronic device, where the screen projection data corresponds to the image currently displayed by the second electronic device, i.e., it includes the display image and the cursor. The display image is an image generated based on the video content. The first electronic device projects the screen projection data onto a screen, and a screen projection picture is displayed on the screen, where the picture includes the display image and the cursor. It is understood that the image displayed in the screen projection picture is consistent with the image displayed by the second electronic device. Of course, in the embodiment of the present application, the synchronous display of the image by the second electronic device is only illustrated as an example; in other embodiments, the second electronic device may be in a black screen state while projecting to the first electronic device, that is, it generates the corresponding screen projection data in the background.
S1203, the first electronic device acquires control information based on the image acquired by the camera, and sends the control information to the second electronic device through the second communication connection.
In the embodiment of the application, the first electronic equipment acquires the image acquired by the camera in real time. Optionally, the shooting range of the camera is greater than or equal to the display range of the projection screen.
For example, the user may implement reverse control through a gesture, a pointing tool, or the like at any position in front of the camera, i.e., between the camera and the projected picture. The user may also use a laser device (e.g., infrared) to effect the reverse control. Correspondingly, the image acquired by the camera includes a display image and a target object image. The display image refers to the screen projection picture, namely the picture projected by the first electronic device from the screen projection data sent by the second electronic device. The target object image refers to an image corresponding to a user gesture, a pointing tool, or a laser mapping point.
Taking a user gesture as an example. As shown in fig. 9, the finger of the user is in front of the camera of the first electronic device, i.e. between the camera of the first electronic device and the projection screen. Correspondingly, the image acquired by the camera of the first electronic device comprises a projection picture and an image corresponding to the gesture of the user.
Take a laser device as an example. Fig. 18A is a schematic view of an exemplary application scenario, please refer to fig. 18A, in which a user uses a laser device to emit laser, and the emitted laser is mapped into a projection screen on a screen, that is, a laser landing point in the projection screen. Optionally, a cursor is also included in the projection screen.
The first electronic device may perform image recognition on the acquired image, identify the target object image, and further acquire the absolute pointing position coordinate of the target object image in the screen projection picture by using an absolute pointing (AP) positioning technology; this coordinate may also be referred to as absolute pointing position information, a position parameter, and the like, which is not limited in the present application. It may be understood that the position information obtained by the first electronic device is the position (or position coordinate) indicated by the user in the coordinate system in which the display image is located. Compared with the relative pointing technology used in conventional mouse control (namely, the electronic device obtains the moving distance and direction of the mouse in space and maps them to the coordinate system of the displayed image), the application realizes high-precision positioning by utilizing the absolute pointing positioning technology, which can greatly improve the user experience and keeps what the user sees, feels, and controls consistent. For example, a cursor displayed in the image may follow the position to which the user's finger moves.
Fig. 19A is a schematic diagram illustrating module interaction. Referring to fig. 19A, the camera uploads the collected image to the control information generating module through the USB interface. The control information generating module recognizes the image and acquires the position coordinates (namely the absolute pointing position coordinates) of the laser landing point in the coordinate system corresponding to the screen projection picture. The corresponding position coordinates may also be obtained based on the relative position of the image of the laser landing point within the screen projection picture.
For example, as shown in fig. 20, the control information generating module performs coordinate conversion on the absolute pointing coordinates to obtain system coordinates. For example, the control information generating module calculates the spot center by brightness distinction as follows:
1. Assume that the highest brightness value of the display content (i.e., the screen projection data) projected by the projection optical engine into the projection area (i.e., the projection picture) is L0, and that the laser pointer projects a laser spot into the projection area whose brightness value L1 is higher than L0. The control information generating module presets a threshold value L2 (L0 + t0 <= L2 <= L1 - t1, where t0 and t1 are preset division coefficients used for noise resistance and robustness), records the set M of pixel coordinates whose gray values are larger than L2, and calculates the average of the coordinates in M to obtain the spot center (x0, y0). This value is the absolute pointing position coordinate of the laser mapping point. The pixel coordinates are defined in a coordinate system established on the projection picture; the specific coordinate system and the manner of establishing it can be set according to actual requirements, and the application is not limited thereto. (A sketch of this calculation, together with the homography step below, is given after the formula.)
2. Homography transformation.
Homography matrix and calibration: the pixel coordinates in the projection picture and the system coordinates of the content projected by the projector belong to two different coordinate systems. In computer vision, the perspective projection from one plane to another conforms to the constraints of a homography matrix A. For a detailed description of the homography matrix, reference may be made to the prior art, which is not repeated in the present application. The control information generating module may perform coordinate transformation on the absolute pointing position coordinate obtained in the previous step to obtain the coordinate position of the laser mapping point in the system. For example, the control information generating module may derive the transformed system cursor coordinates (mx, my) based on the following formula:
[mx,my]=A*[x0,y0]
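Putting the two steps above together, the following is a minimal NumPy sketch; the threshold choice, the synthetic test image, and the identity homography are illustrative assumptions, and in practice A would come from calibration (e.g., keystone correction). The shorthand formula above is applied in its usual homogeneous-coordinate form with a perspective divide.

import numpy as np

def spot_center(gray, l0, l1, t0=10.0, t1=10.0):
    """Step 1: find the laser spot center (x0, y0) by brightness distinction.

    gray   : single-channel grayscale image of the projection picture (H x W)
    l0     : highest brightness of the projected content itself (L0)
    l1     : brightness of the laser spot (L1 > L0)
    t0, t1 : preset division coefficients for noise resistance
    """
    l2 = l0 + t0                        # any threshold with L0+t0 <= L2 <= L1-t1 works
    assert l2 <= l1 - t1, "empty threshold range; adjust t0/t1"
    ys, xs = np.nonzero(gray > l2)      # pixel set M whose gray values exceed L2
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

def to_system_coords(A, x0, y0):
    """Step 2: map the pixel coordinate (x0, y0) to the system coordinate (mx, my)
    using the 3x3 homography A (homogeneous coordinates, then perspective divide)."""
    p = A @ np.array([x0, y0, 1.0])
    return p[0] / p[2], p[1] / p[2]

# Example with hypothetical values: an identity homography and a synthetic laser spot.
A = np.eye(3)
gray = np.zeros((720, 1280))
gray[300:303, 600:603] = 255.0
center = spot_center(gray, l0=180.0, l1=255.0)
if center is not None:
    mx, my = to_system_coords(A, *center)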
Illustratively, the information obtained by the control information generating module based on the acquired image may be referred to as position information, position parameters, etc., which is not limited by the present application.
In the embodiment of the application, the control information generating module may also acquire an auxiliary parameter, which may also be referred to as auxiliary information, based on an auxiliary signal sent by an auxiliary device used by the user, or by recognizing a gesture of the user. For example, the user may click a key on a game pad, click a remote control key, use an auxiliary touchpad, and so forth; the first electronic device receives the auxiliary signal sent by the auxiliary device and thereby obtains the corresponding auxiliary parameter. As another example, the control information generating module may perform gesture recognition on the gesture image to obtain gesture features. For the gesture recognition method, reference may be made to the prior art; the application is not limited thereto. The control information generating module may determine the corresponding auxiliary information based on the gesture features. For example, the control information generating module stores in advance the auxiliary information corresponding to different gesture features, e.g., the auxiliary information corresponding to a pinch gesture is "mouse click"; this can be set according to actual requirements, and the application is not limited thereto.
It will be appreciated that, for the mouse mode, the user may trigger different events, such as a single click or a double click, via a gesture or the auxiliary device. Likewise, for the stylus mode, the user may trigger press, move, lift, multi-select, and similar events via a gesture or the auxiliary device. That is, by moving an object such as a finger or a laser mapping point, the user controls the position of the mouse, stylus, or the like on the screen, and via gestures or the auxiliary device the user can further trigger different events.
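As a hedged illustration of such a pre-stored mapping from gesture features to auxiliary information (the gesture names and event names are assumptions; a real system would use whatever its gesture recognizer and event set define):

from typing import Optional

# Hypothetical mapping from recognized gesture features to auxiliary information (events).
GESTURE_EVENTS = {
    "pinch": "mouse_click",
    "double_pinch": "mouse_double_click",
    "fist": "stylus_press",
    "open_palm": "stylus_lift",
}

def auxiliary_info_for(gesture_feature: str) -> Optional[str]:
    """Return the auxiliary information for a recognized gesture, or None if unmapped."""
    return GESTURE_EVENTS.get(gesture_feature)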
Still referring to fig. 19A, the control information generating module sends control information to the reverse control module, which may also be referred to as reverse control information, reverse control parameters, etc., and the present application is not limited thereto. The control information includes, but is not limited to, location information and/or auxiliary information. That is, the position information may be included, the auxiliary information may be included, and the position information and the auxiliary information may be included.
Illustratively, the obtaining of the position information may also be performed by a vision processing module in the camera, as described above, and as shown in fig. 19B, the vision processing module may obtain the position information, and the camera sends the position information to the control information generating module through the USB interface. The control information generation module optionally acquires the auxiliary information. The control information generation module sends control information to the reverse control module. The control information includes, but is not limited to, location information and auxiliary information.
For example, the reverse control module may generate a corresponding data packet, such as a Bluetooth data packet, based on the connection type of the second communication connection between the first electronic device and the second electronic device; the present application is not limited thereto. The reverse control module may invoke a corresponding module (e.g., the Bluetooth driver) to send the data packet to the second electronic device over the second communication connection, the data packet including, but not limited to, the control information (i.e., the reverse control information).
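A minimal sketch of how the reverse control module might serialize the control information before handing it to the driver; the byte layout is an assumption for illustration, not a defined report format.

import struct

def pack_reverse_control(mx: float, my: float, event_code: int = 0) -> bytes:
    """Pack position information (system coordinates) and an optional auxiliary
    event code into a fixed-size payload for the second communication connection.

    Hypothetical layout: two little-endian 32-bit floats (mx, my) + one unsigned byte."""
    return struct.pack("<ffB", mx, my, event_code)

# Example: report the cursor position with a hypothetical "click" event code of 1.
payload = pack_reverse_control(640.0, 360.0, event_code=1)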
S1204, the first electronic device receives the screen projection image sent by the second electronic device through the first communication connection and displays the screen projection image.
The first electronic device sends the control information to the second electronic device via the second communication connection, and the second electronic device receives the control information. The second electronic device may generate corresponding screen projection data based on the position information and/or the auxiliary information in the control information, so that the cursor moves to the specified position. Specifically, after an application program in the second electronic device generates the display image based on the video content, the view system (see fig. 3; it may also be another module, which is not limited by the present application) may generate the cursor based on the position information. Of course, in the embodiment of the present application, the movement of the cursor is merely illustrated as an example; in other embodiments, the position information and the auxiliary information may trigger different events, and the manner in which the second electronic device generates the screen projection data differs accordingly, which can be implemented according to the specific scenario, and the present application is not limited thereto.
The second electronic equipment sends the screen projection data to the first electronic equipment through the first communication connection, the first electronic equipment receives the screen projection data and projects the screen projection data onto a screen, and a projection picture is displayed on the screen. Fig. 18B is a schematic view of an exemplary application scenario, please refer to fig. 18B in combination with fig. 18A, wherein in fig. 18A, a cursor is at a first position in a projection screen, and a laser mapping point is at a second position in the projection screen. In the screen shown in fig. 18B, the second electronic device will move the cursor to the specified position based on the position information, and correspondingly, in the projection screen, the specified position to which the cursor is moved is the second position where the laser mapping point is located.
In one possible implementation manner, the gesture of the user and the pointing tool (for example, pointer, etc.) may be located at any position between the camera of the projector and the projection screen, as shown in fig. 21A, in this scenario, the first electronic device may identify, based on the acquired image, a mapping point of the index finger (i.e. the tracking point) of the user on the projection screen, and further acquire location information corresponding to the mapping point. It will be appreciated that in this example, there may be a difference between the user gesture and the actual mapped point (which may also be understood to be the display point of the cursor). Alternatively, the user's finger and pointing tool may be attached to the screen, as shown in fig. 21B, with little difference between the corresponding mapping points and the user's finger.
In the embodiment of the application, the user can move the cursor in the projection picture by moving a gesture. For example, as shown in fig. 22, the user's finger moves laterally by a distance D1. The first electronic device periodically (the period may be the sampling interval of the camera, for example 13.3 ms, and may be set according to actual needs; the application is not limited thereto) acquires the image collected by the camera, and obtains the position information corresponding to the finger image (for example, the tip of the index finger, i.e., the tracking point) in each image frame. The first electronic device transmits each piece of position information to the second electronic device, and the second electronic device generates corresponding screen projection data based on each piece of position information and sends it to the first electronic device. The first electronic device continuously projects the successive screen projection data; correspondingly, from the user's perspective, the mouse cursor in the projection picture moves continuously, for example from mapping point 1 to mapping point 2, a distance D1 away. Optionally, in the mouse mode (or another mode), after the first electronic device sends the initial position information for the first time, the position information subsequently sent by the first electronic device to the second electronic device may instead be relative displacement information, which indicates the length and direction of the movement. That is, the first electronic device may further derive relative displacement information from the identified position information and transmit it to the second electronic device, which may draw the cursor based on the relative displacement information.
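The optional relative-displacement reporting described above amounts to differencing successive absolute positions; a sketch under the assumption that positions are already in system coordinates:

class RelativeReporter:
    """After the initial absolute position has been sent once, report only the
    per-frame displacement (dx, dy) of the tracking point."""

    def __init__(self):
        self.last = None

    def next_report(self, mx: float, my: float):
        if self.last is None:
            self.last = (mx, my)
            return ("absolute", mx, my)      # first report: absolute position
        dx, dy = mx - self.last[0], my - self.last[1]
        self.last = (mx, my)
        return ("relative", dx, dy)          # subsequent reports: displacement only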
In one possible implementation, the projector (i.e., the first electronic device) needs to be perpendicular to the projection area to obtain a better projection effect. If it is not perpendicular (in daily use the setup is rarely perfectly perpendicular), the picture exhibits keystone distortion. The projector can pre-correct the image by keystone correction (KC), which corrects the projected picture from a trapezoid to a rectangle. In the case of a planar projection surface, the pre-correction is constrained by a homography matrix (Homography); that is, a homography matrix (or equivalent correction information) is obtained during keystone correction.
The scheme provided by the embodiment of the application has been mainly introduced from the perspective of interaction between the network elements. It will be appreciated that, in order to implement the above-described functions, the control device comprises corresponding hardware structures and/or software modules for performing the respective functions. Those of skill in the art will readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is implemented as hardware or as computer-software-driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The embodiment of the application can divide the functional modules of the control device according to the method example, for example, each functional module can be divided corresponding to each function, or two or more functions can be integrated in one processing module. The integrated modules may be implemented in hardware or in software functional modules. It should be noted that, in the embodiment of the present application, the division of the modules is schematic, which is merely a logic function division, and other division manners may be implemented in actual implementation.
In the case of dividing the respective functional modules with the respective functions, fig. 23 shows a schematic diagram of one possible configuration of the control device 2300 related to the above embodiment, as shown in fig. 23, which may include a first acquisition module 2301, a second acquisition module 2303, a communication module 2302, and a display module 2304. The system comprises a first acquisition module 2301 for acquiring a target mode, a communication module 2302 for establishing a first communication connection with a second electronic device based on the target mode, a second acquisition module 2303 for acquiring a first reverse control parameter indicated by a user in the first image based on the first image acquired by a camera, a communication module 2302 for transmitting the first reverse control parameter to the second electronic device through the first communication connection, and a display module 2304 for displaying a second image transmitted by the second electronic device through the second communication connection, the second image being generated by the second electronic device based on the first reverse control parameter.
In one possible implementation manner, the first image comprises a first display image and an image of a target object, where the image of the target object is within the first display image; the first display image is an image generated by the first electronic device, or an image sent by the second electronic device through the second communication connection; and the image of the target object comprises at least one of an image corresponding to a gesture of the user, an image of a laser spot projected onto the first display image by the user using a laser device, and an image corresponding to an indication tool used by the user.
In one possible implementation, the second obtaining module 2303 is specifically configured to obtain absolute pointing position coordinates of the image of the target object in the first display image, and the first reverse control parameter is used to indicate the absolute pointing position coordinates.
In one possible implementation, the first reverse control parameter includes a position parameter and a first auxiliary parameter, and the second obtaining module 2303 is specifically configured to obtain absolute pointing position coordinates of the image of the target object in the first display image, where the position parameter is used to indicate the absolute pointing position coordinates, and to perform image recognition on the image of the target object to obtain the first auxiliary parameter.
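As a non-limiting sketch (the normalization scheme and the gesture label below are assumptions made for illustration, not the recognition algorithm of the embodiments), the position parameter and the first auxiliary parameter could be assembled as follows:

from typing import Tuple

def to_display_coordinates(tip_xy: Tuple[float, float],
                           display_bounds: Tuple[float, float, float, float]) -> Tuple[float, float]:
    """Normalize a fingertip position in the camera image to coordinates inside
    the first display image, given that image's bounding box (x, y, w, h) in the frame."""
    x0, y0, w, h = display_bounds
    tx, ty = tip_xy
    return (tx - x0) / w, (ty - y0) / h   # absolute pointing position, normalized to [0, 1]

def build_first_reverse_control_parameter(tip_xy, display_bounds, gesture_label: str) -> dict:
    # gesture_label would come from image recognition of the target object
    # (e.g. "click", "grab"); here it is simply passed in.
    pos = to_display_coordinates(tip_xy, display_bounds)
    return {"position": pos, "auxiliary": gesture_label}

param = build_first_reverse_control_parameter((800.0, 450.0), (160.0, 90.0, 1600.0, 900.0), "click")
print(param)   # {'position': (0.4, 0.4), 'auxiliary': 'click'}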
In one possible implementation, the second image includes a second display image and a cursor, where the second display image is generated by the second electronic device based on the display content, and the cursor is generated by the second electronic device based on the first reverse control parameter.
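Illustratively (using a plain numpy array as a stand-in for the second display image), the second electronic device could composite the cursor onto the display content at the indicated position roughly as follows:

import numpy as np

def draw_cursor(display_image: np.ndarray, norm_pos: tuple, size: int = 5) -> np.ndarray:
    """Overlay a simple square cursor at a normalized (x, y) position."""
    h, w = display_image.shape[:2]
    cx, cy = int(norm_pos[0] * w), int(norm_pos[1] * h)
    out = display_image.copy()
    x0, x1 = max(cx - size, 0), min(cx + size, w)
    y0, y1 = max(cy - size, 0), min(cy + size, h)
    out[y0:y1, x0:x1] = 255   # white square stands in for the cursor bitmap
    return out

frame = np.zeros((900, 1600), dtype=np.uint8)   # placeholder second display image
second_image = draw_cursor(frame, (0.4, 0.4))   # cursor drawn from the received parameter
print(second_image.sum() > 0)                   # True: cursor pixels were written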
In one possible implementation, the second obtaining module 2303 is further configured to obtain, based on a third image captured by the camera, a second reverse control parameter indicated by the user in the third image; the communication module 2302 is further configured to send the second reverse control parameter to the second electronic device through the first communication connection; the display module 2304 is further configured to display a fourth image sent by the second electronic device through the second communication connection, the fourth image being generated by the second electronic device based on the second reverse control parameter; and the second reverse control parameter is used to indicate that the cursor is to be moved from a first position in the second display image to a second position in the fourth display image.
In one possible implementation, the third image includes the second image and an image of the target object, and the second obtaining module 2303 is specifically configured to obtain the second reverse control parameter based on absolute pointing position coordinates of the image of the target object in the second display image.
In one possible implementation manner, the second obtaining module 2303 is further configured to obtain a second auxiliary parameter based on a second operation triggered by the user by means of the control device during the process of obtaining, based on the first image captured by the camera, the first reverse control parameter indicated by the user in the first image; and the communication module 2302 is further configured to send the second auxiliary parameter to the second electronic device through the first communication connection, where the second image is generated by the second electronic device based on the first reverse control parameter and the second auxiliary parameter.
In one possible implementation, the communication module 2302 is specifically configured to send a target descriptor to the second electronic device, where the target descriptor is used to indicate that the first electronic device is a target type input device.
In one possible implementation, the target type input device includes a mouse type, a keyboard type, a stylus type, or a handle type.
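By way of example only (the message layout below is an assumption made for illustration and is not an actual input-device descriptor format), a target descriptor indicating the input-device type could be as simple as:

from enum import Enum
import json

class TargetInputType(Enum):
    MOUSE = "mouse"
    KEYBOARD = "keyboard"
    STYLUS = "stylus"
    HANDLE = "handle"   # i.e. a game-controller-style handle

def build_target_descriptor(device_id: str, input_type: TargetInputType) -> bytes:
    return json.dumps({"device_id": device_id, "input_type": input_type.value}).encode()

descriptor = build_target_descriptor("first-electronic-device", TargetInputType.MOUSE)
print(descriptor)   # sent to the second electronic device over the first communication connection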
In another example, fig. 24 shows a schematic block diagram of a control device 2400 according to an embodiment of the present application. The control device may include a processor 2401 and transceiver/transceiving pins 2402 and, optionally, a memory 2403. The processor 2401 may be configured to perform the steps performed by the first electronic device in the methods of the foregoing embodiments, and to control the receiving pin to receive signals and the transmitting pin to transmit signals.
The various components of the control device 2400 are coupled together by a bus system 2404, which includes a power bus, a control bus, and a status signal bus in addition to a data bus. For clarity of illustration, the various buses are collectively labeled in the figure as the bus system 2404.
Optionally, the memory 2403 may be used to store the instructions of the foregoing method embodiments.
It should be understood that the control device 2400 according to an embodiment of the present application may correspond to the first electronic device in the methods of the foregoing embodiments, and that the above-mentioned and other management operations and/or functions of the respective elements in the control device 2400 are respectively intended to implement the corresponding steps of the foregoing methods; for brevity, they are not repeated herein.
For all relevant details of the steps in the above method embodiments, reference may be made to the functional description of the corresponding functional modules; details are not repeated herein.
Based on the same technical idea, the embodiments of the present application further provide a computer-readable storage medium storing a computer program, where the computer program includes at least one piece of code, and the at least one piece of code is executable by an electronic device so as to control the electronic device to implement the methods in the above-mentioned embodiments.
Based on the same technical idea, the embodiments of the present application also provide a computer program which, when executed by an electronic device, implements the methods in the above-mentioned embodiments.
The program may be stored in whole or in part on a storage medium that is packaged with the processor, or in part or in whole on a memory that is not packaged with the processor.
Based on the same technical concept, the embodiments of the present application also provide a processor, which is configured to implement the above-mentioned method embodiments. The processor may be a chip.
The steps of a method or algorithm described in connection with the present disclosure may be embodied in hardware, or may be embodied in software instructions executed by a processor. The software instructions may consist of corresponding software modules that may be stored in random access memory (Random Access Memory, RAM), flash memory, read-only memory (Read-Only Memory, ROM), erasable programmable read-only memory (Erasable Programmable ROM, EPROM), electrically erasable programmable read-only memory (Electrically Erasable Programmable ROM, EEPROM), registers, a hard disk, a removable disk, a compact disc read-only memory (CD-ROM), or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC.
Those skilled in the art will appreciate that, in one or more of the examples described above, the functions described in the embodiments of the present application may be implemented in hardware, software, firmware, or any combination thereof. When implemented in software, these functions may be stored on or transmitted over a computer-readable medium as one or more instructions or code. Computer-readable media include both computer storage media and communication media, the latter including any medium that facilitates transfer of a computer program from one place to another. A storage medium may be any available medium that can be accessed by a general-purpose or special-purpose computer.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above-described embodiments, which are merely illustrative and not restrictive. Many variations may be made by those of ordinary skill in the art without departing from the spirit of the present application and the scope of the claims, and all such variations fall within the protection of the present application.