JPS60195613A - Robotic teaching device with verification function - Google Patents

Robotic teaching device with verification function

Info

Publication number
JPS60195613A
Authority
JP
Japan
Prior art keywords
robot
point
motion
view
teaching device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP59049068A
Other languages
Japanese (ja)
Inventor
Takashi Yajima (敬士 矢島)
Mikio Ueyama (幹夫 植山)
Kosuke Shinnai (新内 浩介)
Kenjiro Kumamoto (熊本 健二郎)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Ltd
Original Assignee
Hitachi Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Ltd filed Critical Hitachi Ltd
Priority to JP59049068A priority Critical patent/JPS60195613A/en
Publication of JPS60195613A publication Critical patent/JPS60195613A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Program-control systems
    • G05B19/02Program-control systems electric
    • G05B19/42Recording and playback systems, i.e. in which the program is recorded from a cycle of operations, e.g. the cycle of operations being manually controlled, after which this record is played back on the same machine
    • G05B19/425Teaching successive positions by numerical control, i.e. commands being entered to control the positioning servo of the tool head or end effector
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40478Graphic display of work area of robot, forbidden, permitted zone

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Numerical Control (AREA)
  • Manipulator (AREA)

Abstract

(57) [Abstract] This publication contains application data filed before electronic filing, so no abstract data is recorded.

Description

DETAILED DESCRIPTION OF THE INVENTION

[Field of Application of the Invention]

The present invention relates to a work-point teaching method for robots, and in particular to a work-point teaching device that makes it possible to specify, easily and without error, work points that the robot can actually execute.

[Background of the Invention]

In conventional robot motion teaching methods that use a teaching box, or a computer together with a graphic display device and a coordinate input device, the robot's range of motion is not shown explicitly. If a point outside the movable region is specified, the target point must be specified all over again, and there is no guide for selecting target points.

[Object of the Invention]

An object of the present invention is to provide a work-point teaching device that shows the robot's movable area explicitly on the graphic input device, so that the robot's motion trajectory can be entered easily in an interactive dialogue with the computer, and that displays the trajectory in three dimensions so that the motion can be verified easily and accurately.

[Summary of the Invention]

To achieve the above object, the present invention displays on the graphic display device at least two orthographic projection views and one three-dimensional projection view, chosen to suit the object, with the robot's movable area shown explicitly. When an arbitrary point inside the movable area is specified on one orthographic view, the corresponding projection lines are displayed on the other views; a work point can then be determined easily by specifying a point on one of these projection lines. Furthermore, for a mobile robot, displaying the manipulator's motion range, the work trajectory it follows when the given task is actually performed, and the deviation between the two makes it easy to determine how far the robot body must move.

It is also possible to provide a robot simulator and, by running a simulation each time an input is made, to detect singular points and exclude them.

[Embodiments of the Invention]

An embodiment of the present invention will now be described in detail.

FIG. 1 is a system configuration diagram of a robot motion teaching device embodying the present invention. A CRT 2, a light pen 3, and a tablet or digitizer 4 are connected to a computer 1. When the tablet or digitizer 4 is pressed with the light pen 3, the corresponding point is displayed on the CRT 2.

FIG. 2 shows the layout of the CRT screen. Screen 5 consists of at least two of a top view 6, a front view 7, and a side view 8, drawn by orthographic projection, together with a three-dimensional projection view 9. The projection views may also be displayed individually, and arbitrary coordinate axes can be set on each view.

FIG. 3 illustrates how a three-dimensional position is determined on the two-dimensional projection views. For example, when an arbitrary point on the front view of screen 10 is specified with the light pen, the specified point 15 is displayed on that view, while on the top view 11 and the side view 13 it appears as line segments 16 and 17, respectively. Next, when a point on either of these line segments, for example a point on line segment 16, is specified with the light pen, point 18 is displayed on that view.

At the same time, the specified point is displayed as point 19 on the side view 13 and as point 20 on the three-dimensional projection view 14. A point in three-dimensional space can thus be determined with these few operations. For the first point, whichever of the top, front, and side views is most convenient can be chosen, and either of the two line segments displayed on the other projection views can then be used.

FIGS. 4 and 5 illustrate the robot work-point teaching method of the present invention. FIG. 4 shows an example of a robot 21 together with its possible motion range 22. Every actual robot has a limited movable area, and the operator must specify operable points within it (the robot's movable region minus its stationary singular points).

Screen 30 in FIG. 5 consists of a top view 31, a front view 32, and a side view 33, each showing the robot's movable area explicitly. Using these projection views and the specification method described above, only work points inside the robot's movable area can be specified.

Furthermore, when a mobile robot or the like repeats the same task at each work location, the amount by which the robot must move has to be taught to it.

As shown in FIG. 6, the motion range 42 of the robot manipulator, the work trajectory 43, and the deviation 44 between the motion range 42 and the work trajectory 43 are displayed on the CRT screen.

The operator reads the deviation 44 and applies a translation or a rotation to the work trajectory, easily correcting it so that the work trajectory 43 falls inside the motion range 42. Reference numeral 45 in FIG. 6 denotes the work trajectory after correction.

Modification

An actual robot has dynamic characteristics and therefore mechanical limitations, such as a maximum operating speed. Because of these limitations, some points remain unexecutable even after the robot's motion trajectory has been determined, for example when a joint angle would have to change abruptly. We call such points motion singular points. To exclude motion singular points from the motion range, the following system is configured.

FIG. 7 shows the system configuration. The simulator 50 calculates each joint angle of the robot corresponding to the motion determined by the trajectory teaching method described above, and displays the angles on the CRT as functions of time. An example is shown in FIG. 8.

When a joint angle θ changes abruptly (time t1 in FIG. 8), that point is displayed on the CRT screen. The operator sees this display and corrects the robot's motion trajectory.

[Effects of the Invention]

According to the present invention, only work points at which the robot can actually operate can be selected, so the robot's motion can be taught efficiently and without waste.

Furthermore, for a mobile robot, the deviation between the manipulator's motion range and the work trajectory can be seen directly, so the amount by which the robot must move can be taught easily.

[Brief Description of the Drawings]

FIG. 1 is a system configuration diagram of the robot motion teaching device; FIG. 2 is a configuration diagram of the CRT screen; FIG. 3 is an explanatory diagram of a method for determining a three-dimensional position on the two-dimensional projection views; FIG. 4 is a diagram showing an example of a robot and its movable area; FIG. 5 is a diagram showing an example configuration of a CRT screen according to the present invention; FIG. 6 is a diagram showing an example of a new effect of the present invention; FIG. 7 is a block diagram of the simulator; and FIG. 8 is a graph showing joint angle as a function of time.

Claims (1)

[Claims]

1. A robot teaching device with a verification function, comprising: means for interactively inputting and visually confirming three-dimensional coordinate points; a motion function model of the robot and means for analyzing its movable range; means for verifying motion target points; and means for converting a group of created control commands into execution commands.
JP59049068A 1984-03-16 1984-03-16 Robotic teaching device with verification function Pending JPS60195613A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP59049068A JPS60195613A (en) 1984-03-16 1984-03-16 Robotic teaching device with verification function

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP59049068A JPS60195613A (en) 1984-03-16 1984-03-16 Robotic teaching device with verification function

Publications (1)

Publication Number Publication Date
JPS60195613A true JPS60195613A (en) 1985-10-04

Family

ID=12820758

Family Applications (1)

Application Number Title Priority Date Filing Date
JP59049068A Pending JPS60195613A (en) 1984-03-16 1984-03-16 Robotic teaching device with verification function

Country Status (1)

Country Link
JP (1) JPS60195613A (en)


Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS62165213A (en) * 1986-01-17 1987-07-21 Agency Of Ind Science & Technol Work environment teaching device
JPS6374585A (en) * 1986-09-12 1988-04-05 三菱重工業株式会社 Operation simulator device for robot arm
WO1989000094A1 (en) * 1987-06-29 1989-01-12 Fanuc Ltd Method of robot arrangement determination
US4979128A (en) * 1987-06-29 1990-12-18 Fanuc Ltd. Method of deciding robot layout
JPH06238582A (en) * 1992-08-28 1994-08-30 Samsung Electronics Co Ltd Robot travel path measuring device
JP2017529929A (en) * 2014-09-30 2017-10-12 270 Vision Ltd Trajectory mapping of the anatomical part of the human or animal body
US10561346B2 (en) 2014-09-30 2020-02-18 270 Vision Ltd. Mapping the trajectory of a part of the anatomy of the human or animal body
US11337623B2 (en) 2014-09-30 2022-05-24 270 Vision Ltd. Mapping the trajectory of a part of the anatomy of the human or animal body
JP2018111155A (en) * 2017-01-11 2018-07-19 セイコーエプソン株式会社 Robot control device, robot and robot system
CN107866813A (en) * 2017-11-07 2018-04-03 龚土婷 A kind of intelligently guiding robot
JP2020082274A (en) * 2018-11-26 2020-06-04 キヤノン株式会社 Image processing apparatus, control method thereof, and program
US11590657B2 (en) 2018-11-26 2023-02-28 Canon Kabushiki Kaisha Image processing device, control method thereof, and program storage medium

Similar Documents

Publication Publication Date Title
CN110238831B (en) Robot teaching system and method based on RGB-D image and teaching device
US10751877B2 (en) Industrial robot training using mixed reality
US9984178B2 (en) Robot simulator, robot teaching apparatus and robot teaching method
CN110385694B (en) Robot motion teaching device, robot system, and robot control device
US9958862B2 (en) Intuitive motion coordinate system for controlling an industrial robot
JP5113666B2 (en) Robot teaching system and display method of robot operation simulation result
US20150151431A1 (en) Robot simulator, robot teaching device, and robot teaching method
JPH0772844B2 (en) Robot teaching device
JPS63196388A (en) Teaching device for remote control robot
JPS6179589A (en) Operating device for robot
Gong et al. Projection-based augmented reality interface for robot grasping tasks
JPS60195613A (en) Robotic teaching device with verification function
JP2001216015A (en) Operation teaching device for robot
Rastogi et al. Telerobotic control with stereoscopic augmented reality
JPS60195615A (en) Posture teaching method for articulated robots
JP3076841B1 (en) Teaching program creation method for real environment adaptive robot
JPS6097409A (en) Robot motion teaching method
JP2868343B2 (en) Off-line teaching method of 3D laser beam machine
Owens WORKSPACE-a microcomputer-based industrial robot simulator and off-line programming system
JPS6292003A (en) Position designating system for 3-dimensional simulator
JPS61127007A (en) Interference check method between robot and workpiece
JPH0752367B2 (en) Robot teaching device
JPS59216210A (en) Robot motion teaching method
JPH0752068A (en) Remote control system
Bejczy et al. Sensor fusion in telerobotic task controls