TWI788651B - Control system for a touch device and method thereof - Google Patents


Publication number
TWI788651B
TWI788651B
Authority
TW
Taiwan
Prior art keywords
neural network
convolutional neural
control system
sensing
image
Prior art date
Application number
TW109111097A
Other languages
Chinese (zh)
Other versions
TW202042039A (en)
Inventor
楊學偉
張端穎
陳昶儒
包天雯
鍾炳榮
Original Assignee
義隆電子股份有限公司
Priority date
Filing date
Publication date
Application filed by 義隆電子股份有限公司 filed Critical 義隆電子股份有限公司
Priority to CN202010333138.9A priority Critical patent/CN111708448B/en
Priority to CN202311286508.8A priority patent/CN117234356A/en
Priority to US16/868,956 priority patent/US11320927B2/en
Publication of TW202042039A publication Critical patent/TW202042039A/en
Application granted granted Critical
Publication of TWI788651B publication Critical patent/TWI788651B/en


Landscapes

  • Image Analysis (AREA)
  • Electrotherapy Devices (AREA)
  • Electrical Discharge Machining, Electrochemical Machining, And Combined Machining (AREA)

Abstract

A control system for a touch device having a touch sensor includes: a sensing circuit for sensing the touch sensor to generate a plurality of sensing values; a processor for generating a sensing image according to the sensing values; and a convolutional neural network (CNN) for processing the sensing image to generate characteristic information and to generate identification information according to the characteristic information. The identification information is used to determine a state of the touch sensor.

Description

Control system and method for a touch device

The present invention relates to a control system and method for a touch device, and in particular to a control system and method that use a convolutional neural network (CNN).

A touch device needs a decision mechanism to identify the type of object touching it or the state of the device itself. FIGS. 1 to 4 show sensing images generated by the touch device in response to different objects or states; each sensing image comprises multiple sensing values corresponding to different positions on the touch device. FIG. 1 shows the sensing image when the object is a water droplet, FIG. 2 when the object is a Near-Field Communication (NFC) card, FIG. 3 when the object is a floating metal object, and FIG. 4 when the object is a finger and the touch device is in an abnormal, noisy state. The touch device must judge, from the sensed image, the type of the touching object or the state of the device and act accordingly: for example, ignore the operation when the object is determined to be a non-operating object, or re-execute the calibration procedure when the device is judged to be in an abnormal state.

The object of the present invention is to provide a control system and method for a touch device that use a convolutional neural network.

According to the present invention, a control system for a touch device includes: a sensing circuit for sensing a touch sensor of the touch device to generate a plurality of sensing values; a processor, connected to the sensing circuit, for generating a sensing image according to the sensing values; and a convolutional neural network for processing the sensing image to generate feature information and for generating identification information according to the feature information; wherein the processor judges the state of the touch sensor according to the identification information.

According to the present invention, a control system for a touch device includes: a sensing circuit for sensing a touch sensor of the touch device to generate a plurality of sensing values; a processor, connected to the sensing circuit, for generating a sensing image according to the sensing values and performing object segmentation on the sensing image to determine a sub-image; and a convolutional neural network for processing the sub-image to generate feature information and for generating identification information according to the feature information; wherein the processor judges an object type according to the identification information.

According to the present invention, a control system for a touch device includes: a sensing circuit for sensing a touch sensor of the touch device to generate a plurality of sensing values; a processor, connected to the sensing circuit, for generating a sensing image according to the sensing values; a convolutional neural network for processing the sensing image to generate feature information and for generating identification information according to the feature information; and a host, connected to the processor, for receiving the sensing image and judging the state of the touch sensor according to the identification information.

According to the present invention, a control system for a touch device includes: a sensing circuit for sensing a touch sensor of the touch device to generate a plurality of sensing values; a processor, connected to the sensing circuit, for generating a sensing image according to the sensing values; a convolutional neural network for processing a sub-image to generate feature information and for generating identification information according to the feature information; and a host, connected to the processor, for judging the object type according to the identification information; wherein the host or the processor performs object segmentation on the sensing image to generate the sub-image.

According to the present invention, a method for a touch device includes the following steps: obtaining a sensing image of a touch sensor of the touch device, the sensing image comprising a plurality of sensing values; processing the sensing image with a convolutional neural network to generate feature information and generating identification information according to the feature information; and judging the state of the touch sensor according to the identification information.

According to the present invention, a method for a touch device includes the following steps: obtaining a sensing image of a touch sensor of the touch device, the sensing image comprising a plurality of sensing values; performing object segmentation on the sensing image to determine a sub-image; processing the sub-image with a convolutional neural network to generate feature information and generating identification information according to the feature information; and judging an object type according to the identification information.

The present invention identifies the type of a contacting object or the state of the touch device through a convolutional neural network, and offers speed, convenience, and high recognition accuracy.

20: Touch device
22: Touch sensor
222: Sensing point
24: Controller
24A: Controller
24B: Controller
242: Sensing circuit
243: Memory
244: Processor
2442: Convolutional neural network program
245: Processor
245B: Processor
246: Memory
247: Convolutional neural network circuit
248: Memory
249: Memory
26: Host
262: Convolutional neural network program
30: Feature extraction part
32: Classification part
34: Image
40: Memory
42: Memory
44: Convolutional neural network circuit

FIG. 1 shows the sensing image when the object is a water droplet.

FIG. 2 shows the sensing image when the object is an NFC card.

FIG. 3 shows the sensing image when the object is a floating metal object.

FIG. 4 shows the sensing image when the object is a finger and the touch device is in a noisy abnormal state.

FIG. 5 shows a first embodiment of the control system of the present invention.

FIG. 6 shows a second embodiment of the control system of the present invention.

FIG. 7 shows the basic architecture of a convolutional neural network.

FIG. 8 shows the method of the present invention for identifying the state of the touch sensor.

FIG. 9 shows the method of the present invention for identifying the type of an object touching or approaching the touch sensor.

FIG. 10 shows a third embodiment of the control system of the present invention.

FIG. 11 shows a fourth embodiment of the control system of the present invention.

FIG. 5 illustrates a first embodiment of the control system of the present invention. In FIG. 5, the touch device 20 includes a touch sensor 22 and a control system comprising a controller 24 and a host 26. In one embodiment, the touch sensor 22 is a capacitive touch sensor with a plurality of electrodes TX1-TX4 and RX1-RX4, the intersections of which form sensing points 222. The electrode layout of TX1-TX4 and RX1-RX4 in FIG. 5 is only one embodiment of the touch sensor 22; the present invention is not limited thereto. The controller 24 includes a sensing circuit 242, a processor 244, a memory 246, and a memory 248. The sensing circuit 242 is connected to the touch sensor 22 and senses the capacitance values of the sensing points 222 on the touch sensor 22 to generate a plurality of sensing values dV. The processor 244, which is connected to the memories 246 and 248, generates a sensing image SP, such as those of FIGS. 1 to 4, from the sensing values dV provided by the sensing circuit 242. The processor 244 implements a convolutional neural network program 2442 in firmware; the program 2442 has inference capability. The memory 246 stores the parameters Dp required for the operation of the program 2442; it may be a ROM (Read-Only Memory) or a RAM (Random Access Memory) preloaded with initial values, but is not limited to ROM or RAM. The parameters Dp are generated in advance on a computer by a convolutional neural network training program having the same architecture as the program 2442, and different recognition functions of the program 2442 require different parameters Dp. The memory 248 is connected to the processor 244 and stores temporary information or data generated by the program 2442 during operation; it may be, but is not limited to, a RAM. In one embodiment, the memories 246 and 248 may be combined into a single memory. The host 26 may be the central processing unit (CPU) of an electronic device, such as the CPU of a notebook computer, an embedded controller (EC), or a keyboard controller (KBC).

FIG. 6 illustrates a second embodiment of the control system of the present invention. The controller 24A of FIG. 6 is largely the same as the controller 24 of FIG. 5. In the controller 24A, the sensing circuit 242 is connected to the touch sensor 22 and senses the sensing points 222 to generate a plurality of sensing values dV, and the processor 245 generates a sensing image SP from the sensing values dV. A convolutional neural network circuit 247 is connected to the processor 245 and to memories 243 and 249. The memory 243 stores the parameters Dp required for the operation of the circuit 247; it may be, but is not limited to, a ROM or a RAM preloaded with initial values. The parameters Dp are generated in advance on a computer by a convolutional neural network training program having the same architecture as the circuit 247, and different recognition functions of the circuit 247 require different parameters Dp. The memory 249 is connected to the circuit 247 and stores temporary information or data generated by the circuit 247 during operation; the memory 249 may be, but is not limited to, a RAM. In one embodiment, the memories 243 and 249 may be combined into a single memory.

The present invention uses a convolutional neural network to judge the state of the touch sensor 22, or the type of an object touching (or approaching) it. The convolutional neural network program 2442 of FIG. 5 implements the network in firmware; the convolutional neural network circuit 247 of FIG. 6 implements it as a hardware circuit. Both share the basic architecture shown in FIG. 7, which can be divided into a feature extraction part 30 and a classification part 32. The feature extraction part 30 performs convolution and subsampling operations: convolution mainly extracts features, while subsampling reduces the amount of image data while retaining the important information. The classification part 32 classifies according to the extracted feature information. As shown in FIG. 7, when an image 34 containing the digit 3 is input to the network, the feature extraction part 30 extracts the features of the image 34 to produce feature information DF. The feature information DF is supplied to the classification part 32, which classifies it to produce identification information DI; the identification information DI is used to determine that the digit in the image 34 is 3. Taking recognition of the digit 3 as an example, training the network to recognize the digit 3 requires feeding it many images of the digit 3; the network extracts the feature information of each image and places it in a digit feature group. From the feature information in these groups, the network can then recognize the digit 3 in an image. Convolutional neural networks are a mature technology, so their details are not described further here. The network used in the present invention may be a standard CNN architecture or an architecture extended or modified from one. The feature extraction part 30 and the classification part 32 may each be implemented in firmware or as hardware circuits.
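The convolution, subsampling, and classification stages described above can be sketched in a few lines of numpy. This is a minimal illustration of the FIG. 7 architecture, not the patent's implementation: the layer sizes, the single 3x3 kernel, and the random weights standing in for the trained parameters Dp are all assumptions.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D convolution: the feature-extraction step of part 30."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(feature, size=2):
    """Subsampling: shrink the map while keeping the strongest responses."""
    h, w = feature.shape
    h, w = h - h % size, w - w % size
    return feature[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def classify(sensing_image, kernel, weights, bias):
    # Feature extraction (part 30) produces the feature information DF ...
    feature = max_pool(np.maximum(conv2d(sensing_image, kernel), 0))
    # ... and the classification part 32 maps DF to identification info DI.
    return softmax(weights @ feature.ravel() + bias)

rng = np.random.default_rng(0)
sp = rng.normal(size=(8, 8))        # stand-in for a sensing image SP
kernel = rng.normal(size=(3, 3))    # stand-ins for trained parameters Dp
w, b = rng.normal(size=(3, 9)), np.zeros(3)
di = classify(sp, kernel, w, b)     # one probability per class
```

The output `di` plays the role of the identification information DI: a probability for each class the network was trained on.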

The methods of FIGS. 8 and 9 are described below with reference to the control system of FIG. 5.

FIG. 8 shows a method for identifying the state of the touch sensor. Steps S10 and S11 obtain a sensing image of the touch sensor 22 of the touch device 20. In step S10, the sensing circuit 242 of the controller 24 senses the touch sensor 22 to generate a plurality of sensing values dV. Then, in step S11, the processor 244 generates a sensing image SP from the sensing values dV provided by the sensing circuit 242; the sensing image SP contains the sensing value dV of each sensing point 222 of the touch sensor 22. After the sensing image SP is obtained, step S12 is performed.
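Step S11 amounts to arranging the per-sensing-point values dV into a 2-D array. A minimal sketch, assuming a 4x4 electrode grid as in the FIG. 5 example and a row-major scan order (neither is mandated by the text):

```python
import numpy as np

NUM_TX, NUM_RX = 4, 4  # electrode counts taken from the FIG. 5 example

def build_sensing_image(dv_values):
    """Arrange the sensing values dV into a 2-D sensing image SP.

    Assumes dv_values arrive in row-major scan order
    (TX1 paired with RX1..RX4, then TX2, and so on)."""
    if len(dv_values) != NUM_TX * NUM_RX:
        raise ValueError("expected one value per sensing point")
    return np.asarray(dv_values, dtype=float).reshape(NUM_TX, NUM_RX)

dv = [0.0] * 16
dv[5] = 12.5                  # a touch near the TX2/RX2 intersection
sp = build_sensing_image(dv)  # sensing image SP for the CNN
```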

In step S12, the convolutional neural network program 2442 identifies the state of the touch sensor 22 from the sensing image SP: it processes the sensing image SP to generate feature information DF1, and generates identification information DI1 according to the feature information DF1. In step S14, the processor 244 judges the state of the touch sensor 22 according to the identification information DI1. For example, suppose the identification information DI1 indicates a 10% probability that there is water on the touch sensor 22 and a 90% probability that the touch sensor 22 is disturbed by noise. Since the noise probability is clearly the higher, the processor 244 judges from DI1 that noise is present on the touch sensor 22. The processor 244 can then take a corresponding action, such as restricting operation to a single finger or changing the frequency of the driving signal applied to the touch sensor 22.
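The judgment in step S14 reduces to picking the most probable state from DI1. A sketch of one plausible decision rule; the state labels match the example in the text, but the 0.5 confidence threshold is an assumption, not something the patent specifies:

```python
def judge_state(identification_info, threshold=0.5):
    """Pick the most probable touch-sensor state from DI1.

    identification_info maps a state label to its probability.
    If no state clears the (assumed) threshold, report "unknown"."""
    state, prob = max(identification_info.items(), key=lambda kv: kv[1])
    return state if prob >= threshold else "unknown"

di1 = {"water": 0.10, "noise": 0.90}  # the example given in the text
state = judge_state(di1)              # "noise": the 90% case wins
```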

To carry out the embodiment of FIG. 8, a convolutional neural network training program CT1 having the same architecture as the program 2442 must first be provided on a computer; CT1 is likewise implemented in a programming language. For the program 2442 to recognize the various states of the touch sensor, such as noise interference, floating, or water droplets, the training program CT1 must be trained in advance to obtain the parameters the program 2442 needs for recognition. Taking recognition of the noise-disturbed state of the touch sensor 22 as an example, training involves applying noise to the touch sensor 22 many times, each time with a different position, intensity, or range. From these repeated disturbances the processor 244 obtains sensing images SP with a variety of sensing-value distributions. These sensing images SP are fed to the training program CT1, which produces the parameters Dp the program 2442 needs to obtain the feature information DF1 and the identification information DI1. The resulting parameters Dp are stored in the memory 246 for the program 2442 to use when identifying the state of the touch sensor 22. The program 2442 thereby gains the ability to recognize whether the touch sensor 22 is in a noisy state. The training program CT1 can also be trained to recognize other states of the touch sensor 22, giving the program 2442 the ability to identify more of its states; the process is much the same and is not repeated here.

The method of FIG. 9 identifies the type of an object touching or approaching the touch sensor 22. Steps S10 and S11 are the same as in FIG. 8. In step S16, the processor 244 performs object segmentation on the sensing image SP to determine at least one sub-image. Object segmentation determines one or more object regions from the sensing image SP and then determines a sub-image for each object region, each sub-image including the image of its object region. In other words, a sub-image is a portion of the sensing image SP and contains a plurality of sensing values. For example, when a sensing image SP produced by two objects touching the touch sensor 22 is segmented, the processor 244 defines two object regions in the image and determines the two corresponding sub-images, each including the image of one object region.
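The patent does not specify how step S16's segmentation is done, so the sketch below picks one plausible approach: threshold the sensing image and flood-fill 4-connected regions, returning the bounding-box crop of each region as a sub-image. The threshold value is an assumption.

```python
import numpy as np

def segment_objects(sp, threshold=1.0):
    """Split the sensing image SP into per-object sub-images.

    Flood fill over 4-connected cells whose sensing value exceeds a
    threshold; each connected region becomes one object region, and
    its bounding-box crop is the corresponding sub-image."""
    visited = np.zeros(sp.shape, dtype=bool)
    sub_images = []
    for i in range(sp.shape[0]):
        for j in range(sp.shape[1]):
            if sp[i, j] > threshold and not visited[i, j]:
                stack, cells = [(i, j)], []
                visited[i, j] = True
                while stack:
                    r, c = stack.pop()
                    cells.append((r, c))
                    for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                        if (0 <= nr < sp.shape[0] and 0 <= nc < sp.shape[1]
                                and sp[nr, nc] > threshold and not visited[nr, nc]):
                            visited[nr, nc] = True
                            stack.append((nr, nc))
                rows = [r for r, _ in cells]
                cols = [c for _, c in cells]
                sub_images.append(sp[min(rows):max(rows) + 1,
                                     min(cols):max(cols) + 1])
    return sub_images

sp = np.zeros((6, 6))
sp[0:2, 0:2] = 5.0   # first object region
sp[4:6, 3:6] = 3.0   # second object region
subs = segment_objects(sp)  # two sub-images, one per object
```

Each returned sub-image would then be fed to the CNN in step S17, exactly as the two-object example in the text describes.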

In step S17, the convolutional neural network program 2442 identifies the type of the object touching or approaching the touch sensor 22 from the sub-image determined in step S16: it processes the sub-image to generate feature information DF2 and generates identification information DI2 according to the feature information DF2. If there are two sub-images, the program 2442 processes both, producing two sets of feature information DF2 and two sets of identification information DI2. In step S18, the processor 244 judges an object type according to each set of identification information DI2. For example, suppose the identification information DI2 indicates a 90% probability that the object is water, a 7% probability that it is a finger, and a 3% probability that it is a stylus. Since the probability for water is by far the highest, the processor 244 judges from DI2 that the object on the touch sensor 22 is water. Likewise, if there are two sets of identification information DI2, the processor 244 judges the type of each object from its own DI2. The processor 244 can then take a corresponding action: for example, when the contacting object is water, it does not compute or output coordinates; when the contacting object is a stylus, it adjusts the gain of the sensing circuit 242.

To carry out the embodiment of FIG. 9, the convolutional neural network program 2442 must first be trained to recognize various object types, such as fingers, water droplets, and styluses. Taking recognition of water as an example, training involves dropping water droplets of various sizes onto the touch sensor 22 many times, each time at a different position and with a different shape, so that the processor 244 obtains sensing images SP with a variety of sensing-value distributions. These sensing images SP are fed to the convolutional neural network training program CT2, which produces the parameters Dp the program 2442 needs to obtain the feature information DF2 and the identification information DI2. The resulting parameters Dp are stored in the memory 246 for the program 2442 to use when identifying object types, giving it the ability to recognize water. The training program CT2 can also be trained to recognize other object types, giving the program 2442 the ability to identify more objects; the process is much the same and is not repeated here.

The methods of FIGS. 8 and 9 also apply to the architecture of FIG. 6. Step S12 may likewise be carried out by the processor 245 controlling the operation of the convolutional neural network circuit 247; step S12 should therefore be understood as processing the sensing image SP with a convolutional neural network to generate feature information DF1 and generating identification information DI1 according to DF1. Step S17 may likewise be carried out by the processor 245 controlling the circuit 247, and should be understood as processing a sub-image with a convolutional neural network to generate feature information DF2 and generating identification information DI2 according to DF2.

In one embodiment, after the sensing image SP is generated in step S11, the processor 244 (or 245) first preprocesses the sensing image SP. The preprocessing includes, but is not limited to, handling noise or compensating abnormal values. Step S12 or S16 is then performed on the preprocessed sensing image SP.
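The text names the goals of this preprocessing (noise handling, abnormal-value compensation) but not the filters. A sketch under those assumptions: clamp implausible readings, then smooth with a 3x3 mean filter. Both the spike limit and the choice of a mean filter are illustrative.

```python
import numpy as np

def preprocess(sp, spike_limit=100.0):
    """Illustrative preprocessing of the sensing image SP before the CNN.

    Both steps are assumptions: the patent does not fix the algorithms."""
    sp = sp.copy()
    # Compensate abnormal values: clamp readings beyond a plausible range.
    np.clip(sp, -spike_limit, spike_limit, out=sp)
    # Reduce noise with a 3x3 mean filter (edges padded by replication).
    padded = np.pad(sp, 1, mode="edge")
    out = np.zeros_like(sp)
    for i in range(sp.shape[0]):
        for j in range(sp.shape[1]):
            out[i, j] = padded[i:i + 3, j:j + 3].mean()
    return out

sp = np.zeros((4, 4))
sp[2, 2] = 1000.0        # an abnormal spike in the raw image
clean = preprocess(sp)   # spike clamped to 100, then averaged away
```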

In one embodiment, the controller 24 of FIG. 5 and the controller 24A of FIG. 6 may each be a single integrated-circuit device.

According to the present invention, as long as sufficient sensing images SP are provided in advance to train the convolutional neural network and the required parameters are stored in the memory (246 or 243) beforehand, the controller 24 (or 24A) can learn to identify the type of a contacting object or the state of the touch sensor from the sensing image. The present invention therefore has the advantages of simplicity and high recognition accuracy.

圖10顯示本發明的控制系統的第三實施例。圖10的控制系統包含控制器24B、主機26、記憶體40及記憶體42。控制器24B具有感測電路242與處理器245B。控制器24B可以是一顆積體電路裝置。感測電路242連接觸控感測器22,用以感測觸控感測器22上多個感應點222的電容值,以產生多個感應量dV。處理器245B根據來自感測電路242的多個感應量dV產生一感應圖像SP。主機26連接處理器245B、記憶體40及記憶體42。主機26具有用韌體實現的卷積神經網路程式262。記憶體40連接主機26,用以儲存卷積神經網路程式262的運作所需要的參數Dp,記憶體40可以是但不限於ROM或是預先載入初始值的RAM。參數Dp是經由預先在電腦上,以與卷積神經網路程式262的架構相同的卷積神經網路訓練程式所產生的。卷積神經網路程式262在執行不同識別功能時,所需的參數Dp也不相同。記憶體42連接主機26,用以儲存卷積神經網路程式262在運作過程中所產生的暫存資訊或數據,記憶體42可以是但不限於RAM。在一實施例中,記憶體40及42也可以合併成一個記憶體。在一實施例中,記憶體40可以是主機26中的ROM或快閃(flash)記憶體,記憶體42可以是主機26中的RAM。主機26可以是電子裝置的CPU、EC或KBC。 Fig. 10 shows a third embodiment of the control system of the present invention. The control system in FIG. 10 includes a controller 24B, a host 26 , a memory 40 and a memory 42 . The controller 24B has a sensing circuit 242 and a processor 245B. The controller 24B can be an integrated circuit device. The sensing circuit 242 is connected to the touch sensor 22 and is used for sensing capacitance values of a plurality of sensing points 222 on the touch sensor 22 to generate a plurality of sensing values dV. The processor 245B generates a sensing image SP according to a plurality of sensing values dV from the sensing circuit 242 . The host 26 is connected to the processor 245B, the memory 40 and the memory 42 . The host 26 has a convolutional neural network program 262 implemented in firmware. The memory 40 is connected to the host computer 26 for storing the parameters Dp required for the operation of the convolutional neural network program 262. The memory 40 can be but not limited to ROM or RAM pre-loaded with initial values. The parameter Dp is pre-generated by the convolutional neural network training program with the same structure as the convolutional neural network program 262 on the computer. When the convolutional neural network program 262 performs different recognition functions, the required parameter Dp is also different. 
The memory 42 is connected to the host 26 and stores temporary information or data generated by the convolutional neural network program 262 during operation; the memory 42 may be, but is not limited to, a RAM. In an embodiment, the memories 40 and 42 can also be combined into one memory. In an embodiment, the memory 40 may be a ROM or a flash memory in the host 26, and the memory 42 may be a RAM in the host 26. The host 26 may be the CPU, EC (embedded controller) or KBC (keyboard controller) of an electronic device.
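Purely as an illustration of the processor's role described above — and not code from the patent; the row-major layout and function name are assumptions — assembling the per-point sensing values dV into a two-dimensional sensing image SP can be sketched as:

```python
def build_sensing_image(dv_values, rows, cols):
    """Arrange per-sensing-point sensing values dV into a 2D sensing
    image SP, row by row (the row-major layout is an assumption)."""
    if len(dv_values) != rows * cols:
        raise ValueError("expected one dV per sensing point")
    return [list(dv_values[r * cols:(r + 1) * cols]) for r in range(rows)]

# Example: a touch sensor with 4 rows x 3 columns of sensing points
sp = build_sensing_image(list(range(12)), rows=4, cols=3)
print(sp[1])  # second row of the sensing image
```

The resulting 2D array is what the host-side convolutional neural network program would consume as its input image.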

圖11說明本發明的控制系統的第四實施例。圖11的控制系統與圖10同樣包括控制器24B、主機26、記憶體40及記憶體42，差異在於，圖11的控制系統還包括卷積神經網路電路44。卷積神經網路電路44以硬體電路的形式來實現卷積神經網路。卷積神經網路電路44連接主機26、記憶體40與42。記憶體40儲存卷積神經網路電路44的運作所需要的參數Dp。參數Dp是經由預先在電腦上，以與卷積神經網路電路44的架構相同的卷積神經網路訓練程式所產生的。卷積神經網路電路44在執行不同識別功能時，所需的參數Dp也不相同。記憶體42連接卷積神經網路電路44，用以儲存卷積神經網路電路44在運作過程中所產生的暫存資訊或數據。在一實施例中，卷積神經網路電路44可以整合至主機26中。 FIG. 11 illustrates a fourth embodiment of the control system of the present invention. Like FIG. 10, the control system in FIG. 11 includes the controller 24B, the host 26, the memory 40 and the memory 42; the difference is that the control system in FIG. 11 further includes a convolutional neural network circuit 44, which implements a convolutional neural network in the form of a hardware circuit. The convolutional neural network circuit 44 is connected to the host 26 and the memories 40 and 42. The memory 40 stores the parameters Dp required for the operation of the convolutional neural network circuit 44. The parameters Dp are generated in advance on a computer by a convolutional neural network training program having the same architecture as the convolutional neural network circuit 44. When the convolutional neural network circuit 44 performs different recognition functions, the required parameters Dp are also different. The memory 42 is connected to the convolutional neural network circuit 44 and stores temporary information or data generated by the convolutional neural network circuit 44 during operation. In an embodiment, the convolutional neural network circuit 44 may be integrated into the host 26.

圖10的卷積神經網路程式262及圖11的卷積神經網路電路44分別與圖5的卷積神經網路程式2442及圖6的卷積神經網路電路247類似。卷積神經網路程式262及卷積神經網路電路44的基本架構可參照圖7。 The convolutional neural network program 262 in FIG. 10 and the convolutional neural network circuit 44 in FIG. 11 are similar to the convolutional neural network program 2442 in FIG. 5 and the convolutional neural network circuit 247 in FIG. 6, respectively. The basic architecture of the convolutional neural network program 262 and the convolutional neural network circuit 44 can be found in FIG. 7.
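The patent defers the network's internals to FIG. 7, which is not reproduced here. Purely as a generic illustration of the convolution → pooling → fully-connected pipeline that such a program or circuit implements — all sizes, kernels and weights below are made up, not taken from the patent — a minimal forward pass might look like:

```python
def conv2d_valid(img, kernel):
    """2D valid convolution (no padding, stride 1)."""
    kh, kw = len(kernel), len(kernel[0])
    oh, ow = len(img) - kh + 1, len(img[0]) - kw + 1
    return [[sum(img[i + u][j + v] * kernel[u][v]
                 for u in range(kh) for v in range(kw))
             for j in range(ow)] for i in range(oh)]

def max_pool2(img):
    """2x2 max pooling, stride 2 (even dimensions assumed)."""
    return [[max(img[i][j], img[i][j + 1], img[i + 1][j], img[i + 1][j + 1])
             for j in range(0, len(img[0]), 2)]
            for i in range(0, len(img), 2)]

def relu(img):
    return [[max(0.0, x) for x in row] for row in img]

def forward(sp, kernel, fc_weights):
    """Toy feature extraction (cf. feature information DF) followed by a
    fully-connected scoring stage (cf. identification information DI)."""
    feat = max_pool2(relu(conv2d_valid(sp, kernel)))
    flat = [x for row in feat for x in row]
    return [sum(w * x for w, x in zip(ws, flat)) for ws in fc_weights]

# Toy 6x6 sensing image, one 3x3 kernel, two output classes.
sp = [[float((i + j) % 2) for j in range(6)] for i in range(6)]
kernel = [[0, 1, 0], [1, -4, 1], [0, 1, 0]]
scores = forward(sp, kernel, fc_weights=[[1.0] * 4, [-1.0] * 4])
print(len(scores))  # one score per class
```

In the actual embodiments the trained weights would correspond to the pre-stored parameters Dp, and the argmax over the class scores would yield the identification information.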

圖8與圖9所示的方法亦適用於圖10的架構。參照圖8及圖10，在步驟S10，控制器24B的感測電路242對觸控感測器22進行感測以產生多個感應量dV。接著處理器245B根據感測電路242提供的多個感應量dV產生感應圖像SP，如步驟S11所示。感應圖像SP包含觸控感測器22中各個感應點222的感應量dV。在取得感應圖像SP後，處理器245B將感應圖像SP傳送至主機26以進行步驟S12。 The methods shown in FIG. 8 and FIG. 9 are also applicable to the architecture of FIG. 10. Referring to FIG. 8 and FIG. 10, in step S10, the sensing circuit 242 of the controller 24B senses the touch sensor 22 to generate a plurality of sensing values dV. Next, the processor 245B generates a sensing image SP according to the plurality of sensing values dV provided by the sensing circuit 242, as shown in step S11. The sensing image SP includes the sensing value dV of each sensing point 222 in the touch sensor 22. After obtaining the sensing image SP, the processor 245B transmits the sensing image SP to the host 26 to perform step S12.

步驟S12是藉由卷積神經網路程式262根據感應圖像SP識別觸控感測器22的狀態。在步驟S12中，主機26的卷積神經網路程式262處理感應圖像SP以產生一特徵資訊DF1，以及根據該特徵資訊DF1產生一識別資訊DI1。在步驟S14，主機26根據該識別資訊DI1判斷觸控感測器22的狀態。卷積神經網路程式262判斷狀態的方式及訓練方式與圖5的卷積神經網路程式2442相同，故不再贅述。 Step S12 uses the convolutional neural network program 262 to identify the state of the touch sensor 22 according to the sensing image SP. In step S12, the convolutional neural network program 262 of the host 26 processes the sensing image SP to generate feature information DF1, and generates identification information DI1 according to the feature information DF1. In step S14, the host 26 judges the state of the touch sensor 22 according to the identification information DI1. The way the convolutional neural network program 262 judges the state and the way it is trained are the same as for the convolutional neural network program 2442 in FIG. 5, so the details are not repeated here.

在步驟S14判斷出觸控感測器22的狀態後，主機26可以將判斷出的狀態通知控制器24B，使得控制器24B可以根據觸控感測器22的狀態進行相應的處理。例如，在判斷出觸控感測器22上有水或雜訊時，控制器24B可以調整用以處理感應圖像SP的參數，或是送出指令至感測電路242以改變對觸控感測器22的掃描方式或掃描頻率。掃描方式包括但不限於自容式掃描及互容式掃描。 After the state of the touch sensor 22 is determined in step S14, the host 26 can notify the controller 24B of the determined state, so that the controller 24B can perform corresponding processing according to the state of the touch sensor 22. For example, when it is determined that there is water or noise on the touch sensor 22, the controller 24B can adjust the parameters used to process the sensing image SP, or send a command to the sensing circuit 242 to change the scanning mode or scanning frequency for the touch sensor 22. The scanning modes include, but are not limited to, self-capacitance scanning and mutual-capacitance scanning.
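The corresponding processing described above can be thought of as a simple dispatch on the identified state. The sketch below is only an illustration of that idea; the state names follow the claims (noise interference, floating, water drop), but the controller interface and method names are hypothetical, not from the patent:

```python
def react_to_state(state, controller):
    """Dispatch the host's identified state to a corresponding action.
    The `controller` methods are hypothetical placeholders."""
    if state == "water":
        controller.set_scan_mode("mutual")   # e.g. switch scanning mode
    elif state == "noise":
        controller.hop_scan_frequency()      # e.g. change scan frequency
    elif state == "floating":
        controller.adjust_image_parameters() # e.g. retune SP processing
    # otherwise: normal state, nothing to change

class FakeController:
    """Stand-in used only to demonstrate the dispatch."""
    def __init__(self):
        self.log = []
    def set_scan_mode(self, mode):
        self.log.append(("mode", mode))
    def hop_scan_frequency(self):
        self.log.append(("freq", "hop"))
    def adjust_image_parameters(self):
        self.log.append(("params", "adjusted"))

ctrl = FakeController()
react_to_state("water", ctrl)
print(ctrl.log)
```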

參照圖9及圖10，在步驟S10，控制器24B的感測電路242對觸控感測器22進行感測以產生多個感應量dV。接著處理器245B根據感測電路242提供的多個感應量dV產生感應圖像SP，如步驟S11所示。感應圖像SP包含觸控感測器22中各個感應點222的感應量dV。在步驟S16中，處理器245B在取得感應圖像SP後對該感應圖像SP進行物件分割處理以決定至少一子圖像，接著處理器245B再將至少一子圖像傳送至主機26。在一實施例中，步驟S16也可以是，處理器245B將感應圖像SP傳送至主機26後，由主機26對該感應圖像SP進行物件分割處理以決定至少一子圖像。物件分割處理的操作及原理如前所述，在此不再贅述。 Referring to FIG. 9 and FIG. 10, in step S10, the sensing circuit 242 of the controller 24B senses the touch sensor 22 to generate a plurality of sensing values dV. Next, the processor 245B generates a sensing image SP according to the plurality of sensing values dV provided by the sensing circuit 242, as shown in step S11. The sensing image SP includes the sensing value dV of each sensing point 222 in the touch sensor 22. In step S16, after obtaining the sensing image SP, the processor 245B performs object segmentation processing on the sensing image SP to determine at least one sub-image, and then transmits the at least one sub-image to the host 26. In an embodiment, step S16 may instead be performed by the host 26: after the processor 245B transmits the sensing image SP to the host 26, the host 26 performs the object segmentation processing on the sensing image SP to determine the at least one sub-image. The operation and principle of the object segmentation processing are as described above and are not repeated here.
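The patent does not spell out the object segmentation algorithm in this passage. One common way to realize it — shown here purely as a sketch, with a made-up threshold and 4-connectivity as assumptions — is to threshold the sensing image and group the remaining sensing points into connected components, each component yielding one sub-image:

```python
from collections import deque

def segment_objects(sp, threshold):
    """Split a sensing image into object regions by thresholding and
    4-connected flood fill. One illustrative realization, not the
    patent's specified algorithm."""
    rows, cols = len(sp), len(sp[0])
    seen = [[False] * cols for _ in range(rows)]
    objects = []
    for i in range(rows):
        for j in range(cols):
            if sp[i][j] >= threshold and not seen[i][j]:
                blob, q = [], deque([(i, j)])
                seen[i][j] = True
                while q:
                    r, c = q.popleft()
                    blob.append((r, c))
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        nr, nc = r + dr, c + dc
                        if (0 <= nr < rows and 0 <= nc < cols
                                and sp[nr][nc] >= threshold
                                and not seen[nr][nc]):
                            seen[nr][nc] = True
                            q.append((nr, nc))
                objects.append(blob)
    return objects

# Two separate touch regions in a 3x4 sensing image
sp = [[0, 9, 0, 0],
      [0, 9, 0, 8],
      [0, 0, 0, 8]]
subs = segment_objects(sp, threshold=5)
print(len(subs))  # 2
```

Each returned list of coordinates can then be cropped out of SP as the sub-image handed to the convolutional neural network.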

步驟S17是藉由卷積神經網路程式262根據步驟S16所決定的子圖像識別接觸或接近觸控感測器22的物件種類。在步驟S17中，卷積神經網路程式262處理該子圖像以產生一特徵資訊DF2，並且根據該特徵資訊DF2產生一識別資訊DI2。如果有兩個子圖像，則卷積神經網路程式262便會需要處理這兩個子圖像，以產生兩筆特徵資訊DF2與兩筆識別資訊DI2。在步驟S18，主機26根據每一識別資訊DI2判斷一物件種類。卷積神經網路程式262判斷物件種類的方式及訓練方式與圖5的卷積神經網路程式2442相同，故不再贅述。 Step S17 uses the convolutional neural network program 262 to identify the type of an object touching or approaching the touch sensor 22 according to the sub-image determined in step S16. In step S17, the convolutional neural network program 262 processes the sub-image to generate feature information DF2, and generates identification information DI2 according to the feature information DF2. If there are two sub-images, the convolutional neural network program 262 processes both sub-images to generate two pieces of feature information DF2 and two pieces of identification information DI2. In step S18, the host 26 determines an object type according to each piece of identification information DI2. The way the convolutional neural network program 262 judges the object type and the way it is trained are the same as for the convolutional neural network program 2442 in FIG. 5, so the details are not repeated here.

圖8與圖9所示的方法亦適用於圖11的架構。步驟S12亦可以藉由主機26控制卷積神經網路電路44的操作來實現。因此，步驟S12應被理解為藉由一卷積神經網路處理感應圖像SP以產生一特徵資訊DF1以及根據該特徵資訊DF1產生一識別資訊DI1。步驟S17亦可以藉由主機26控制卷積神經網路電路44的操作來實現。因此，步驟S17應被理解為藉由一卷積神經網路處理一子圖像以產生一特徵資訊DF2，以及根據該特徵資訊DF2產生一識別資訊DI2。 The methods shown in FIG. 8 and FIG. 9 are also applicable to the architecture of FIG. 11. Step S12 can also be implemented by the host 26 controlling the operation of the convolutional neural network circuit 44. Therefore, step S12 should be understood as processing the sensing image SP by a convolutional neural network to generate feature information DF1 and generating identification information DI1 according to the feature information DF1. Step S17 can likewise be implemented by the host 26 controlling the operation of the convolutional neural network circuit 44. Therefore, step S17 should be understood as processing a sub-image by a convolutional neural network to generate feature information DF2, and generating identification information DI2 according to the feature information DF2.

在一實施例中，在步驟S11產生感應圖像SP後，處理器245B先對感應圖像SP進行預處理。該預處理包括但不限於處理雜訊或對異常數值進行補償。然後，再藉由經過預處理後的感應圖像SP，進行步驟S12或S16。 In an embodiment, after the sensing image SP is generated in step S11, the processor 245B first performs preprocessing on the sensing image SP. The preprocessing includes, but is not limited to, removing noise or compensating for abnormal values. Step S12 or S16 is then performed on the preprocessed sensing image SP.
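As a rough sketch of such preprocessing — the clamping bounds and the 3-point averaging filter below are made-up illustrations, not values or methods specified by the patent — abnormal-value compensation and noise handling could look like:

```python
def preprocess(sp, lo=-50, hi=200):
    """Clamp abnormal sensing values into a plausible range
    (compensation), then apply a 3-point moving average along each
    row to suppress noise. Bounds and filter are illustrative only."""
    clamped = [[min(max(v, lo), hi) for v in row] for row in sp]
    out = []
    for row in clamped:
        smoothed = []
        for j in range(len(row)):
            window = row[max(0, j - 1):j + 2]
            smoothed.append(sum(window) / len(window))
        out.append(smoothed)
    return out

sp = [[0, 1000, 0, 0]]      # 1000 is an abnormal spike
print(preprocess(sp)[0])    # spike clamped, then averaged with neighbors
```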

根據本發明，只要預先提供充份的感應圖像SP訓練卷積神經網路程式262(或卷積神經網路電路44)，主機26就可以學會根據感應圖像識別出接觸物件的種類或者觸控感測器的狀態。因此，本發明具有簡便，而且辨識準確率高的優點。 According to the present invention, as long as sufficient sensing images SP are provided in advance to train the convolutional neural network program 262 (or the convolutional neural network circuit 44), the host 26 can learn to identify the type of a contacting object or the state of the touch sensor from the sensing image. Therefore, the present invention has the advantages of simplicity and high recognition accuracy.

以上對於本發明之較佳實施例所作的敘述係為闡明之目的，而無意限定本發明精確地為所揭露的形式，基於以上的教導或從本發明的實施例學習而作修改或變化是可能的，實施例係為解說本發明的原理以及讓熟習該項技術者以各種實施例利用本發明在實際應用上而選擇及敘述，本發明的技術思想企圖由之後的申請專利範圍及其均等來決定。 The above description of the preferred embodiments of the present invention is for the purpose of illustration and is not intended to limit the present invention to the precise form disclosed; modifications or variations based on the above teachings or learned from the embodiments of the present invention are possible. The embodiments were chosen and described in order to explain the principles of the present invention and to enable those skilled in the art to utilize the present invention in various embodiments for practical application. The technical concept of the present invention is intended to be defined by the following claims and their equivalents.

20:觸控裝置 20: Touch device

22:觸控感測器 22:Touch sensor

222:感應點 222: Sensing point

24:控制器 24: Controller

242:感測電路 242: Sensing circuit

244:處理器 244: Processor

2442:卷積神經網路程式 2442: Convolutional Neural Network Programs

246:記憶體 246: memory

248:記憶體 248: memory

26:主機 26: Host

Claims (32)

一種用於觸控裝置的控制系統,該觸控裝置包含一觸控感測器,該控制系統包括:一感測電路,用以感測該觸控感測器以產生多個感應量;一處理器,連接該感測電路,根據該多個感應量產生一感應圖像;以及一卷積神經網路,用以處理該感應圖像以產生一特徵資訊,以及根據該特徵資訊產生一識別資訊;其中該處理器根據該識別資訊判斷該觸控感測器的狀態為雜訊干擾、浮接或水滴。 A control system for a touch device, the touch device includes a touch sensor, the control system includes: a sensing circuit, used to sense the touch sensor to generate a plurality of inductive quantities; A processor, connected to the sensing circuit, generates a sensing image according to the plurality of sensing quantities; and a convolutional neural network, used to process the sensing image to generate feature information, and generate a recognition based on the feature information information; wherein the processor judges the state of the touch sensor as noise interference, floating or water drop according to the identification information. 如請求項1的控制系統,其中該卷積神經網路係以該處理器的韌體實現。 The control system of claim 1, wherein the convolutional neural network is implemented in firmware of the processor. 如請求項1的控制系統,其中該卷積神經網路係以硬體電路實現。 The control system according to claim 1, wherein the convolutional neural network is implemented by a hardware circuit. 如請求項2的控制系統,更包括一記憶體連接該處理器,用以儲存該卷積神經網路的運作所需的參數。 The control system of claim 2 further includes a memory connected to the processor for storing parameters required for the operation of the convolutional neural network. 如請求項3的控制系統,更包括一記憶體連接該卷積神經網路,用以儲存該卷積神經網路的運作所需的參數。 The control system of claim 3 further includes a memory connected to the convolutional neural network for storing parameters required for the operation of the convolutional neural network. 
如請求項1的控制系統,其中該處理器更包括對該感應圖像進行預處理,該預處理包括處理雜訊或補償異常數值,該處理器提供經過該預處理後的感應圖像給該卷積神經網路,以產生該識別資訊。 The control system according to claim 1, wherein the processor further includes preprocessing the sensing image, the preprocessing includes processing noise or compensating abnormal values, and the processor provides the preprocessed sensing image to the A convolutional neural network is used to generate the recognition information. 一種用於觸控裝置的控制系統,該觸控裝置包含一觸控感測器,該控制系統包括:一感測電路,用以感測該觸控感測器以產生多個感應量;一處理器,連接該感測電路,根據該多個感應量產生一感應圖像, 並且對該感應圖像進行物件分割處理以決定一子圖像;以及一卷積神經網路,用以處理該子圖像以產生一特徵資訊,以及根據該特徵資訊產生一識別資訊;其中該處理器根據該識別資訊判斷一物件種類。 A control system for a touch device, the touch device includes a touch sensor, the control system includes: a sensing circuit, used to sense the touch sensor to generate a plurality of inductive quantities; A processor, connected to the sensing circuit, generates a sensing image according to the plurality of sensing quantities, and performing object segmentation processing on the sensed image to determine a sub-image; and a convolutional neural network for processing the sub-image to generate feature information, and generate identification information based on the feature information; wherein the The processor determines an object type according to the identification information. 如請求項7的控制系統,其中該卷積神經網路係以該處理器的韌體實現。 The control system as claimed in claim 7, wherein the convolutional neural network is implemented in firmware of the processor. 如請求項7的控制系統,其中該卷積神經網路係以硬體電路實現。 The control system as claimed in claim 7, wherein the convolutional neural network is implemented by a hardware circuit. 如請求項8的控制系統,更包括一記憶體連接該處理器,用以儲存該卷積神經網路的運作所需的參數。 The control system of claim 8 further includes a memory connected to the processor for storing parameters required for the operation of the convolutional neural network. 
如請求項9的控制系統,更包括一記憶體連接該卷積神經網路,用以儲存該卷積神經網路的運作所需的參數。 The control system of claim 9 further includes a memory connected to the convolutional neural network for storing parameters required for the operation of the convolutional neural network. 如請求項7的控制系統,其中該處理器進行該物件分割處理之前,先對該感應圖像進行預處理,該預處理包括處理雜訊或補償異常數值。 The control system according to claim 7, wherein the processor performs preprocessing on the sensed image before performing the object segmentation processing, and the preprocessing includes processing noise or compensating abnormal values. 一種用於觸控裝置的控制系統,該觸控裝置包含一觸控感測器,該控制系統包括:一感測電路,用以感測該觸控感測器以產生多個感應量;一處理器,連接該感測電路,根據該多個感應量產生一感應圖像;一卷積神經網路,用以處理該感應圖像以產生一特徵資訊,以及根據該特徵資訊產生一識別資訊;以及一主機,連接該處理器以接收該感應圖像,並且根據該識別資訊判斷該觸控感測器的狀態為雜訊干擾、浮接或水滴。 A control system for a touch device, the touch device includes a touch sensor, the control system includes: a sensing circuit, used to sense the touch sensor to generate a plurality of inductive quantities; A processor, connected to the sensing circuit, generates a sensing image according to the plurality of sensing quantities; a convolutional neural network is used to process the sensing image to generate feature information, and generate identification information according to the feature information and a host connected to the processor to receive the sensing image, and judge the state of the touch sensor as noise interference, floating or water drop according to the identification information. 如請求項13的控制系統,其中該主機是中央處理器、嵌入式控制器或鍵盤控制器。 The control system according to claim 13, wherein the host is a central processing unit, an embedded controller or a keyboard controller. 如請求項13的控制系統,其中該卷積神經網路係以該主機的韌體實現。 The control system according to claim 13, wherein the convolutional neural network is implemented with firmware of the host. 如請求項13的控制系統,其中該卷積神經網路係以硬體電路實現。 The control system according to claim 13, wherein the convolutional neural network is implemented by a hardware circuit. 
如請求項16的控制系統,其中該卷積神經網路係整合在該主機中。 The control system according to claim 16, wherein the convolutional neural network is integrated in the host. 如請求項15的控制系統,更包括一記憶體連接該主機,用以儲存該卷積神經網路的運作所需的參數。 The control system according to claim 15 further includes a memory connected to the host for storing parameters required for the operation of the convolutional neural network. 如請求項16的控制系統,更包括一記憶體連接該卷積神經網路,用以儲存該卷積神經網路的運作所需的參數。 The control system of claim 16 further includes a memory connected to the convolutional neural network for storing parameters required for the operation of the convolutional neural network. 如請求項13的控制系統,其中該處理器更包括對該感應圖像進行預處理,該預處理包括處理雜訊或補償異常數值,該處理器提供經過該預處理後的感應圖像給該卷積神經網路,以產生該識別資訊。 The control system according to claim 13, wherein the processor further includes pre-processing the sensing image, the pre-processing includes processing noise or compensating abnormal values, and the processor provides the pre-processed sensing image to the A convolutional neural network is used to generate the recognition information. 一種用於觸控裝置的控制系統,該觸控裝置包含一觸控感測器,該控制系統包括:一感測電路,用以感測該觸控感測器以產生多個感應量;一處理器,連接該感測電路,根據該多個感應量產生一感應圖像;一卷積神經網路,用以處理一子圖像以產生一特徵資訊,以及根據該特徵資訊產生一識別資訊;以及一主機,連接該處理器並且根據該識別資訊判斷該物件種類;其中該主機或該處理器對該感應圖像進行物件分割處理後產生該子圖像。 A control system for a touch device, the touch device includes a touch sensor, the control system includes: a sensing circuit, used to sense the touch sensor to generate a plurality of inductive quantities; A processor, connected to the sensing circuit, generates a sensing image according to the plurality of sensing quantities; a convolutional neural network is used to process a sub-image to generate feature information, and generate identification information according to the feature information and a host connected to the processor and judging the type of the object according to the identification information; wherein the host or the processor generates the sub-image after performing object segmentation processing on the sensed image. 
如請求項21的控制系統,其中該主機是中央處理器、嵌入式控制器或鍵盤控制器。 The control system according to claim 21, wherein the host computer is a central processing unit, an embedded controller or a keyboard controller. 如請求項21的控制系統,其中該卷積神經網路係以該主機的韌體實現。 The control system as in claim 21, wherein the convolutional neural network is implemented with firmware of the host. 如請求項21的控制系統,其中該卷積神經網路係以硬體電路實現。 The control system according to claim 21, wherein the convolutional neural network is implemented by a hardware circuit. 如請求項24的控制系統,其中該卷積神經網路係整合在該主機中。 The control system of claim 24, wherein the convolutional neural network is integrated in the host. 如請求項23的控制系統,更包括一記憶體連接該主機,用以儲存該卷積神經網路的運作所需的參數。 The control system of claim 23 further includes a memory connected to the host for storing parameters required for the operation of the convolutional neural network. 如請求項24的控制系統,更包括一記憶體連接該卷積神經網路,用以儲存該卷積神經網路的運作所需的參數。 The control system of claim 24 further includes a memory connected to the convolutional neural network for storing parameters required for the operation of the convolutional neural network. 如請求項21的控制系統,其中在該感應圖像進行該物件分割處理之前,該處理器先對該感應圖像進行預處理,該預處理包括處理雜訊或補償異常數值。 The control system according to claim 21, wherein before the object segmentation process is performed on the sensed image, the processor firstly performs preprocessing on the sensed image, and the preprocessing includes processing noise or compensating abnormal values. 一種用於觸控裝置的方法,包括下列步驟:a.獲得該觸控制裝置的觸控感測器的一感應圖像,該感應圖像包括多個感應量;b.藉由一卷積神經網路處理該感應圖像以產生一特徵資訊以及根據該特徵資訊產生一識別資訊;以及c.根據該識別資訊判斷該觸控感測器的狀態為雜訊干擾、浮接或水滴。 A method for a touch control device, comprising the following steps: a. Obtaining a sensing image of a touch sensor of the touch control device, the sensing image including a plurality of sensing values; b. The network processes the sensing image to generate feature information and generate identification information according to the feature information; and c. 
judging the state of the touch sensor as noise interference, floating or water drop according to the identification information. 如請求項29的方法,其中在該步驟b之前,更包括對該感應圖像進行預處理,該預處理包括處理雜訊或補償異常數值。 The method according to claim 29, wherein before the step b, it further includes preprocessing the sensed image, and the preprocessing includes processing noise or compensating abnormal values. 一種用於觸控裝置的方法,包括下列步驟:a.獲得該觸控裝置的觸控感測器的一感應圖像,該感應圖像包括多個 感應量;b.對該感應圖像進行物件分割處理以決定一子圖像;c.藉由一卷積神經網路處理該子圖像以產生一特徵資訊,根據該特徵資訊產生一識別資訊;以及d.根據該識別資訊判斷一物件種類。 A method for a touch device, comprising the following steps: a. obtaining a sensing image of a touch sensor of the touch device, the sensing image including a plurality of Sensing amount; b. performing object segmentation processing on the sensing image to determine a sub-image; c. processing the sub-image by a convolutional neural network to generate a feature information, and generating a recognition information based on the feature information and d. judging an object type according to the identification information. 如請求項31的方法,其中更包括在該步驟b之前,對該感應圖像進行預處理,該預處理包括處理雜訊或補償異常數值。 The method according to claim 31, further comprising preprocessing the sensed image before step b, the preprocessing including processing noise or compensating abnormal values.
TW109111097A 2019-05-08 2020-04-01 Control system for a touch device and method thereof TWI788651B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202010333138.9A CN111708448B (en) 2019-05-08 2020-04-24 Control system and method for touch device
CN202311286508.8A CN117234356A (en) 2019-05-08 2020-04-24 Control system and method for touch device
US16/868,956 US11320927B2 (en) 2019-05-08 2020-05-07 Control system for a touch device and method thereof

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201962844744P 2019-05-08 2019-05-08
US62/844,744 2019-05-08

Publications (2)

Publication Number Publication Date
TW202042039A TW202042039A (en) 2020-11-16
TWI788651B true TWI788651B (en) 2023-01-01

Family

ID=74201218

Family Applications (1)

Application Number Title Priority Date Filing Date
TW109111097A TWI788651B (en) 2019-05-08 2020-04-01 Control system for a touch device and method thereof

Country Status (1)

Country Link
TW (1) TWI788651B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI871540B (en) * 2022-11-10 2025-02-01 義隆電子股份有限公司 Control method of a touchpad

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW201626188A (en) * 2014-09-30 2016-07-16 惠普發展公司有限責任合夥企業 Determining unintended touch rejection
CN107656644A (en) * 2017-09-26 2018-02-02 努比亚技术有限公司 Grip recognition methods and corresponding mobile terminal
CN107797751A (en) * 2017-10-26 2018-03-13 努比亚技术有限公司 The recognition methods of mobile terminal grip, mobile terminal and readable storage medium storing program for executing
CN108960405A (en) * 2017-05-18 2018-12-07 电装It研究所 Identifying system, generic features value extraction unit and identifying system constructive method
TWI654541B (en) * 2018-04-13 2019-03-21 矽統科技股份有限公司 Method and system for identifying tapping events on a touch panel, and terminal touch products


Also Published As

Publication number Publication date
TW202042039A (en) 2020-11-16
