1.3 Commercial Robot Controllers

Commercial robot controllers are specialized multiprocessor computing systems that provide four basic processes allowing integration of the robot into an automation system: Motion Trajectory Generation and Following, Motion/Process Integration and Sequencing, Human User integration, and Information Integration.

Motion Trajectory Generation and Following. There are two important controller-related aspects of industrial robot motion generation. One is the extent of manipulation that can be programmed; the other is the ability to execute controlled programmed motion. A unique aspect of each robot system is its real-time servo-level motion control. The details of real-time control are typically not revealed to the user, both for safety reasons and to protect proprietary information. Each robot controller, through its operating system programs, converts digital data from higher-level coordinators into coordinated arm motion through precise computation and high-speed distribution and communication of the individual axis motion commands, which are executed by the individual joint servo-controllers. Most commercial robot controllers operate at a sample period of 16 msec. The real-time motion controller invariably uses classical independent-joint proportional-integral-derivative (PID) control or simple modifications of PID. This makes commercially available controllers suitable for point-to-point motion, but most are not suitable for following continuous position/velocity profiles or exerting prescribed forces without considerable programming effort, if at all.
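To make the independent-joint structure concrete, the sketch below shows a discrete PID update running at a fixed sample period; the gains and the commented-out read/apply interface functions are hypothetical and do not correspond to any particular commercial controller.

```python
# Sketch of independent-joint discrete PID servo control at a fixed sample
# period. Gains, joint count, and the commented interface calls are
# hypothetical; real controllers implement this inside proprietary firmware.

T = 0.016  # sample period in seconds (the 16 msec figure quoted above)

class JointPID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, desired, measured):
        """One servo cycle; returns the commanded joint torque (or voltage)."""
        error = desired - measured
        self.integral += error * T
        derivative = (error - self.prev_error) / T
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# One controller per joint, each ignoring coupling with the other joints.
controllers = [JointPID(kp=200.0, ki=5.0, kd=10.0) for _ in range(6)]

# Typical servo loop (interface functions are hypothetical):
# for j, ctrl in enumerate(controllers):
#     tau = ctrl.update(desired_position[j], read_position(j))
#     apply_torque(j, tau)
```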

Recently, more advanced controllers have appeared. The Adept Windows family of automation controllers (http://www.adept.com) integrates robotics, motion control, machine vision, force sensing, and manufacturing logic in a single control platform compatible with Windows 98 and Windows NT/2000. Adept motion controllers can be configured to control other robots and custom mechanisms, and are standard on a variety of systems from OEMs.

Figure 1.2.7: Parallel-link robot (courtesy of ABB Robotics).

Motion/Process Integration and Sequencing. Motion/process integration involves coordinating manipulator motion with process sensors or other process controller devices. The most primitive process integration is through discrete digital input/output (I/O). For example, a machine controller external to the robot controller might send a one-bit signal indicating that it is ready to be loaded by the robot. The robot controller must have the ability to read the digital signal and to perform logical operations (if-then, wait-until, do-until, etc.) using the signal. That is, some robot controllers have some programmable logic controller (PLC) functions built in. Coordination with sensors (e.g. vision) is also often provided.
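As an illustration of this kind of discrete I/O sequencing, the sketch below expresses the logic in Python; the channel number and function names are hypothetical, since each vendor exposes such primitives through its own robot programming language.

```python
import time

def read_input(channel: int) -> bool:
    """Placeholder: on a real controller this reads one bit of discrete I/O."""
    return True

def wait_until(channel: int, value: bool, poll_period: float = 0.01) -> None:
    """Block until a one-bit process signal reaches the requested value."""
    while read_input(channel) != value:
        time.sleep(poll_period)

MACHINE_READY = 3   # hypothetical input channel wired to the machine controller

# Example sequence: wait for the machine's "ready to load" bit, then load it.
wait_until(MACHINE_READY, True)
# move_to(load_point)   # hypothetical motion command issued after the handshake
```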

Human Integration. The controller’s human interfaces are critical to the expeditious setup and programming of robot systems. Most robot controllers offer two types of human interface: computer-style CRT/keyboard terminals for writing and editing program code off-line, and teach pendants, which are portable manual input terminals used to command motion in a telerobotic fashion via touch keys or joysticks. Teach pendants are usually the most efficient means available for positioning the robot, and a memory in the controller makes it possible to play back the taught positions to execute motion trajectories. With practice, human operators can quickly teach a series of points, which are chained together in playback mode. Most robot applications currently depend on the integration of human expertise during the programming phase for the successful planning and coordination of robot motion. These interface mechanisms are effective in unobstructed workspaces where no changes occur between programming and execution. They do not allow human interaction during execution or adaptation to changing environments.

More recent advanced robot interface techniques are based on behavior-based programming, where various specific behaviors are programmed into the robot controller at a low level (e.g. pick up piece, insert in machine chuck). The behaviors are then sequenced and their specific motion parameters specified by a higher-level machine supervisor as prescribed by the human operator. Such an approach was used in [Mireles and Lewis 2001].
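The following minimal Python sketch illustrates only the general sequencing idea; the behavior names and parameters are invented, and this is not the matrix-based discrete-event controller of [Mireles and Lewis 2001].

```python
# Low-level behaviors are written once; a higher-level supervisor chains them
# and supplies the task-specific parameters. Behavior names and parameters are
# illustrative only.

def pick_up_piece(location):
    print(f"picking up piece at {location}")

def insert_in_chuck(machine_id):
    print(f"inserting piece into chuck of machine {machine_id}")

# The supervisor prescribes the sequence, as specified by the human operator.
task_sequence = [
    (pick_up_piece, {"location": "feeder_1"}),
    (insert_in_chuck, {"machine_id": "lathe_A"}),
]

for behavior, params in task_sequence:
    behavior(**params)
```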

Information Integration. Information integration is becoming more important as the trend toward increasing flexibility and agility impacts robotics. Many commercial robot controllers now support information integration functions by employing integrated PC interfaces for remote monitoring and control. There are many techniques for this, the most convenient of which is LabVIEW 6.1, which does not require programming in Java.

1.4 Sensors

Much of the information in this section was prepared by Kok-Meng Lee [Lewis 1998]. Sensors and actuators [Tzou and Fukuda 1992] function as transducers, the devices through which high-level workcell Planning, Coordination, and Control (PC&C) systems interface with the hardware components that make up the workcell. Sensors are a vital element, as they convert the states of physical devices into signals appropriate for input to the workcell PC&C control system; inappropriate sensors can introduce errors that make proper operation impossible no matter how sophisticated or expensive the PC&C system, while innovative selection of sensors can make the control and coordination problem much easier.

Sensors are of many different types and have many distinct uses. By analogy with biological systems, proprioceptors are sensors internal to a device that yield information about the internal state of that device (e.g. robot arm joint-angle sensors), while exteroceptors yield information about other hardware external to a device. Sensor outputs are either analog or digital; digital sensors often provide information about the status of a machine or resource (gripper open or closed, machine loaded, job complete). Sensor outputs are required at all levels of the PC&C hierarchy, including uses for:

• servo-level feedback control (usually analog proprioceptors)

• process monitoring and coordination (often digital exteroceptors or part inspection sensors such as vision)

• failure and safety monitoring (often digital—e.g. contact sensor, pneumatic pressure-loss sensor)

• quality control inspection (often vision or scanning laser).

Sensor output data must often be processed to convert it into a form meaningful for PC&C purposes. The sensor plus its required signal processing is shown as a Virtual Sensor. It functions as a data abstraction: a set of data plus operations on that data (e.g. camera, plus framegrabber, plus signal processing algorithms such as image enhancement, edge detection, segmentation, etc.). Some sensors, including the proprioceptors needed for servo-level feedback control, are integral parts of their host devices, so processing of sensor data and use of the data occur within that device; the sensor data is then incorporated at the servocontrol level or Machine Coordination level. Other sensors, often vision systems, rival the robot manipulator in sophistication and are coordinated by a Job Coordinator, which treats them as valuable shared resources whose use is assigned to the jobs that need them by some priority assignment (e.g. dispatching) scheme. An interesting coordination problem is posed by so-called active sensing, where, e.g., a robot may hold a scanning camera and the camera effectively takes charge of the motion coordination problem, directing the robot where to move to effect the maximum reduction in entropy (increase in information) with subsequent images.
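As a minimal sketch of the virtual-sensor idea as a data abstraction, the following Python fragment bundles a raw acquisition function with an ordered list of processing operations; the camera-related names in the commented example are placeholders, not a real vision API.

```python
# A "virtual sensor": raw hardware readout plus an ordered chain of processing
# operations, presented to the rest of the workcell as a single object.

class VirtualSensor:
    def __init__(self, acquire, processing_steps):
        self.acquire = acquire                    # function returning raw data
        self.processing_steps = processing_steps  # e.g. enhancement, edge detection

    def read(self):
        data = self.acquire()
        for step in self.processing_steps:
            data = step(data)
        return data

# Example wiring (all names are placeholders, not a real vision API):
# vision = VirtualSensor(grab_frame, [enhance, detect_edges, segment])
# parts = vision.read()
```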

Types of Sensors

This section summarizes sensors from an operational point of view. More information on functional and physical principles can be found in [Fraden 1993], [Fu et al. 1987], [Snyder 1985].

Tactile Sensors. Tactile sensors rely on physical contact with external objects.

Digital sensors such as limit switches, microswitches, and vacuum devices give binary information on whether contact occurs or not. Sensors are also available to detect the onset of slippage. Analog sensors such as spring-loaded rods give more information. Tactile sensors based on rubberlike carbon- or silicon-based elastomers with embedded electrical or mechanical components can provide very detailed information about part geometry, location, and more. Elastomers can contain resistive or capacitive elements whose electrical properties change as the elastomer compresses. Designs based on LSI technology can produce tactile grid pads with, e.g., 64×64 ‘forcel’ points on a single pad. Such sensors produce ‘tactile images’ that have properties akin to digital images from a camera and require similar data processing.
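To illustrate the remark that tactile images can be processed like camera images, the short sketch below thresholds a synthetic 64×64 forcel array to locate the contact footprint; the pad size, threshold value, and data are arbitrary, and NumPy is assumed to be available.

```python
import numpy as np

# Treat a 64x64 forcel pad reading like a digital image: threshold it to find
# the contact footprint. Pad size, threshold, and data are arbitrary.
pad = np.zeros((64, 64))
pad[20:30, 25:40] = 1.5            # pretend a part presses on this region

contact = pad > 0.5                # binary contact map (image-style thresholding)
rows, cols = np.nonzero(contact)
print("contact area (forcels):", int(contact.sum()))
print("bounding box:", rows.min(), rows.max(), cols.min(), cols.max())
```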

Additional tactile sensors fall under the classification of ‘force sensors’ discussed subsequently.

Proximity and Distance Sensors. The noncontact proximity sensors include devices based on the Hall effect or inductive devices based on the electromagnetic effect that can detect ferrous materials within about 5 mm. Such sensors are often digital, yielding binary information about whether or not an object is near. Capacitance-based sensors detect any nearby solid or liquid with ranges of about 5 mm. Optical and ultrasound sensors have longer ranges.

For 360 deg. coverage in navigation applications for mobile robots, both scanning sonars and ring-mounted multiple sonars are available. Sonar is typically noisy, with spurious readings, and requires low-pass filtering and other data processing aimed at reducing the false alarm rate. The more expensive laser rangefinders are extremely accurate in distance and have very high angular resolution.
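As one simple example of the data processing mentioned above, spurious sonar returns are often suppressed with a sliding median filter before any low-pass filtering; the window length and range data below are arbitrary illustrative values.

```python
def median_filter(readings, window=5):
    """Sliding median over a short window; rejects isolated spurious returns."""
    half = window // 2
    out = []
    for i in range(len(readings)):
        lo, hi = max(0, i - half), min(len(readings), i + half + 1)
        out.append(sorted(readings[lo:hi])[(hi - lo) // 2])
    return out

ranges = [2.01, 2.02, 9.90, 2.00, 1.99, 0.05, 2.03]   # two spurious echoes (m)
print(median_filter(ranges))                          # outliers are suppressed
```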

Position, Velocity, and Acceleration Sensors. Linear position-measuring devices include linear potentiometers and the sonar and laser rangefinders just discussed. Linear velocity sensors may be laser- or sonar-based Doppler-effect devices.

Figure 1.4.1: Optical Encoders, (a) Incremental optical encoder, (b) Absolute optical encoder with n=4 using Grey code. (Snyder, W.E., 1985. Industrial Robots, Prentice-Hall, NJ, with permission.)

Joint-angle position and velocity proprioceptors are an important part of the robot arm servocontrol drive axis. Angular position sensors include potentiometers, which use dc voltage, and resolvers, which use ac voltage and have accuracies of 15 min. Optical encoders can provide extreme accuracy using digital techniques. Incremental optical encoders use three optical sensors and a single ring of alternating opaque/clear areas, Figure 1.4.1(a), to provide angular position relative to a reference point and angular velocity information; commercial devices may have 1200 slots per turn. More expensive absolute optical encoders, Figure 1.4.1(b), have n concentric rings of alternating opaque/clear areas and require n optical sensors. They offer increased accuracy and minimize errors associated with data reading and transmission, particularly if they employ the Grey code, where only one bit changes between two consecutive sectors. Accuracy is 360°/2^n, with commercial devices having n = 12 or so.
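The benefit of the Grey code can be made concrete with a small decoding example: converting a Grey-coded reading back to a sector index and then to an angle at the 360°/2^n resolution noted above. The n = 4 case matches Figure 1.4.1(b); the code itself is an illustrative sketch, not taken from the text.

```python
# Decoding an absolute encoder that uses the Grey code (n = 4 rings, as in
# Figure 1.4.1(b)): only one bit changes between adjacent sectors, so a reading
# taken exactly on a boundary is off by at most one sector.

def grey_to_binary(grey: int) -> int:
    """Convert a Grey-coded reading to an ordinary binary sector index."""
    binary = grey
    shift = grey >> 1
    while shift:
        binary ^= shift
        shift >>= 1
    return binary

n = 4
for sector in range(2 ** n):
    grey = sector ^ (sector >> 1)                    # ring pattern for this sector
    angle = 360.0 / (2 ** n) * grey_to_binary(grey)  # resolution 360/2^n degrees
    print(f"grey {grey:04b} -> sector {grey_to_binary(grey):2d}, angle {angle:6.2f} deg")
```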

Gyros have good accuracy if repeatability problems associated with drift are compensated for. Directional gyros have accuracies of about 1.5 deg. Vertical gyros have accuracies of 0.5 deg and are available to measure multiaxis motion (e.g. pitch and roll). Rate gyros measure velocities directly, with thresholds of 0.05 deg/sec or so.

Various sorts of accelerometers are available based on strain gauges (next paragraph), gyros, or crystal properties. Commercial devices are available to measure accelerations along three axes. A popular new technology involves microelectromechanical systems (MEMS), which are either surface or bulk micromachined devices. MEMS accelerometers are very small, inexpensive, robust, and accurate. MEMS sensors have especially been used in the automotive industry [Eddy 1998].

Force and Torque Sensors. Various torque sensors are available, though they are often not required; for instance, the internal torques at the joints of a robot arm can be computed from the motor armature currents. Torque sensors on a drilling tool, for instance, can indicate when tools are becoming dull.

Linear force can be measured using load cells or strain gauges. A strain gauge is an elastic sensor whose resistance is a function of applied strain or deformation. The piezoelectric effect, the generation of a voltage when a force is applied, may also be used for force sensing. Other force sensing techniques are based on vacuum diodes, quartz crystals (whose resonant frequency changes with applied force), etc.

Robot arm force-torque wrist sensors are extremely useful in dexterous manipulation tasks. Commercially available devices can measure both force and torque along three perpendicular axes, providing full information about the Cartesian force vector F. Standard transformations allow computation of forces and torques in other coordinates. Six-axis force-torque sensors are quite expensive.
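The “standard transformations” mentioned above amount to rotating the measured force into the new frame and adding a moment-arm term to the measured torque when the wrench is re-expressed about a different point. The sketch below shows this with made-up numbers for the sensor-to-tool rotation and offset, assuming NumPy is available.

```python
import numpy as np

# Re-expressing a measured wrench (force f, torque m) in another frame:
# R rotates sensor coordinates into the new frame, p is the position of the
# sensor origin expressed in the new frame. All numbers are made up.

def transform_wrench(f, m, R, p):
    f_new = R @ f
    m_new = R @ m + np.cross(p, f_new)   # torque picks up a moment-arm term
    return f_new, m_new

R = np.eye(3)                            # sensor axes aligned with the tool frame
p = np.array([0.0, 0.0, 0.10])           # sensor origin 10 cm from the tool point
f = np.array([0.0, 5.0, 0.0])            # 5 N lateral force measured at the sensor
m = np.zeros(3)

f_tool, m_tool = transform_wrench(f, m, R, p)
print(f_tool, m_tool)                    # the offset induces a torque about x
```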

Photoelectric Sensors. A wide variety of photoelectric sensors is available, some based on fiber-optic principles. These have speeds of response in the neighborhood of 50 microsec with ranges up to about 45 mm, and are useful for detecting parts and labeling, scanning optical bar codes, confirming part passage in sorting tasks, etc.

Other Sensors. Various sensors are available for measuring pressure, temperature, fluid flow, etc. These are useful in closed-loop servo-control applications for some processes such as welding, and in job coordination and/or safety interrupt routines in others.

Sensor calibration may require significant experimentation, computation, and tuning after installation. Manufacturers often provide calibration procedures, though in some cases, including vision, such procedures may not be obvious, requiring reference to the published scientific literature. Time-consuming recalibration may be needed after any modifications to the system.

Figure 1.4.2: Signal Processing using FSM for Optical Encoders (a) Phase relations in incremental optical encoder output, (b) Finite state machine to decode encoder output into angular position. (Snyder 1985).

Figure 1.4.3: Hardware design from FSM. (a) FSM for sonar transducer control on a mobile robot, (b) Sonar driver control system from FSM.

Particularly for more complex sensors such as optical encoders, significant sensor signal conditioning and processing is required. This might include amplification of signals, noise rejection, conversion of data from analog to digital or from digital to analog, and so on. Hardware for such purposes is usually provided by the manufacturer and should be considered part of the sensor package in robot workcell design. The sensor, along with its signal processing hardware and software algorithms, may be considered a data abstraction and is called the ‘virtual sensor’.

If signal processing does need to be addressed, it is often very useful to use finite state machine (FSM) design. A typical signal from an incremental optical encoder is shown in Figure 1.4.2(a); an FSM for decoding this into the angular position is given in Figure 1.4.2(b). FSMs are very easy to convert directly to hardware in terms of logical gates. An FSM for sequencing a sonar is given in Figure 1.4.3(a); the sonar driver hardware derived from this FSM is shown in Figure 1.4.3(b).
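A software analogue of such an FSM decoder, using the two quadrature channels (A, B) of an incremental encoder, is sketched below; it follows the generic quadrature state cycle rather than reproducing the exact machine of Figure 1.4.2(b).

```python
# State-machine quadrature decoder for an incremental encoder: the two channels
# (A, B) step through 00 -> 01 -> 11 -> 10 in one direction and the reverse
# cycle in the other.

FORWARD = [(0b00, 0b01), (0b01, 0b11), (0b11, 0b10), (0b10, 0b00)]
REVERSE = [(b, a) for a, b in FORWARD]
STEP = {**{t: +1 for t in FORWARD}, **{t: -1 for t in REVERSE}}

def decode(samples):
    """Accumulate a position count from successive (A, B) bit-pair samples."""
    count = 0
    prev = samples[0]
    for state in samples[1:]:
        count += STEP.get((prev, state), 0)   # 0 for repeats or illegal jumps
        prev = state
    return count

# Two slots traversed in the forward direction (4 counts per slot in quadrature):
print(decode([0b00, 0b01, 0b11, 0b10, 0b00, 0b01, 0b11, 0b10, 0b00]))   # 8
```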

A particular problem is obtaining angular velocity from angular position measurements. All too often the position measurements are simply differenced using a small sample period to compute velocity. This is guaranteed to lead to problems if there is any noise in the signal. It is almost always necessary to employ a low-pass-filtered derivative, where velocity samples v_k are computed from position measurement samples p_k using, e.g.,

v_k = (1 − α) v_{k−1} + (α/T)(p_k − p_{k−1}),

where T is the sample period and α is a small filtering coefficient. A similar approach is needed to compute acceleration.
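A minimal implementation of this filtered difference is sketched below; the values of T and α are arbitrary, and the synthetic data simply checks that the estimate settles near the true slope despite measurement noise.

```python
import random

T = 0.016       # sample period (s); arbitrary example value
alpha = 0.1     # small filtering coefficient; arbitrary example value

def filtered_velocity(positions):
    """Low-pass-filtered difference of position samples, as in the rule above."""
    v = 0.0
    out = []
    prev_p = positions[0]
    for p in positions[1:]:
        v = (1 - alpha) * v + (alpha / T) * (p - prev_p)
        prev_p = p
        out.append(v)
    return out

# Noisy ramp: true velocity 1.0 rad/s plus small measurement noise.
positions = [1.0 * k * T + 0.001 * random.random() for k in range(200)]
print(filtered_velocity(positions)[-1])   # settles near 1.0 despite the noise
```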

Vision Systems, Cameras, and Illumination. Typical commercially available vision systems conform to the RS-170 standard of the 1950s, so that frames are acquired through a framegrabber board at a rate of 30 frames/sec. Images are scanned; in a popular US standard, each complete scan or frame consists of 525 lines, of which 480 contain image information. This sample rate and image resolutions of this order are adequate for most applications, with the exception of vision-based robot arm servoing. Robot vision system cameras are usually TV cameras: either the solid-state charge-coupled device (CCD), which is responsive to wavelengths of light from below 350 nm (ultraviolet) to 1100 nm (near infrared) and has peak response at approximately 800 nm, or the charge injection device (CID), which offers a similar spectral response and has a peak response at approximately 650 nm. Both line-scan CCD cameras, with resolutions ranging between 256 and 2048 elements, and area-scan CCD cameras are available. Medium-resolution area-scan cameras yield images of 256×256 pixels, though high-resolution devices of 1024×1024 are now available. Line-scan cameras are suitable for applications where parts move past the camera, e.g. on conveyor belts. Framegrabbers often support multiple cameras (a common number is four) and may support black-and-white or color images.

If left to chance, illumination of the robotic workcell will probably result in severe problems in operation. Common problems include low-contrast images, specular reflections, shadows, and extraneous details. Such problems can be corrected by overly sophisticated image processing, but all of them are better avoided by proper lighting design. Common techniques include backlighting (which produces easily processed silhouettes), structured lighting (which provides additional depth information and simplifies object detection and interpretation), and directional lighting.

REFERENCES

[Decelle 1988] Decelle, L.S., “Design of a Robotic Workstation for Component Insertions,” AT&T Technical Journal, vol. 67, no. 2, pp. 15–22, March/April 1988.

[Eddy 1998] Eddy, D.S., and D.R. Sparks, “Application of MEMS technology in automotive sensors and actuators,” Proc. IEEE, vol. 86, no. 8, pp. 1747–1755, Aug. 1998.

[Fraden 1993] Fraden, J., AIP Handbook of Modern Sensors: Physics, Design, and Applications, American Institute of Physics, 1993.

[Fu et al. 1987] Fu, K.S., R.C. Gonzalez, and C.S.G. Lee, Robotics, McGraw-Hill, New York, 1987.

[Jamshidi et al. 1992] Jamshidi, M., R. Lumia, J. Mullins, and M. Shahinpoor, Robotics and Manufacturing: Recent Trends in Research, Education, and Applications, Vol. 4, ASME Press, New York, 1992.

[Lewis and Fitzgerald 1997] Lewis, F.L., M. Fitzgerald, and K. Liu, “Robotics,” in The Computer Science and Engineering Handbook, A.B. Tucker, Jr., ed., Chapter 33, CRC Press, 1997.

[Lewis 1998] Lewis, F.L., “Robotics,” in Handbook of Mechanical Engineering, F. Kreith, ed., Chapter 14, CRC Press, 1998.

[Liu and Lewis 1993] Liu, K., F.L. Lewis, G. Lebret, and D. Taylor, “The singularities and dynamics of a Stewart platform manipulator,” J. Intelligent and Robotic Systems, vol. 8, pp. 287–308, 1993.

[Mireles and Lewis 2001] Mireles, J., and F.L. Lewis, “Intelligent material handling: development and implementation of a matrix-based discrete-event controller,” IEEE Trans. Industrial Electronics, vol. 48, no. 6, pp. 1087–1097, Dec. 2001.

[Pugh 1983] Pugh, A., ed., Robotic Technology, IEE Control Engineering Series 23, Peregrinus, London, 1983.

[Snyder 1985] Snyder, W.E., Industrial Robots: Computer Interfacing and Control, Prentice-Hall, Englewood Cliffs, NJ, 1985.

[Tzou and Fukuda 1992] Tzou, H.S., and T. Fukuda, Precision Sensors, Actuators, and Systems, Kluwer Academic, 1992.

Chapter 2

Introduction to Control
