Ruilai Yang¹ and Jonathan Sahagun² (¹USA, ²California State University, USA)
Chess-playing robots represent an ideal testbed for integrating computer vision, robotic manipulation, and artificial intelligence into cohesive autonomous systems [1]. This project addresses the challenge of creating an affordable, accessible chess-playing robot arm capable of competing against human opponents. The proposed solution integrates four core technologies: servo motor control via the Adafruit ServoKit library for precise arm manipulation; inverse kinematics via TinyIK for position-to-angle calculations; OpenCV-based computer vision for chessboard detection and move recognition; and the Stockfish chess engine for AI-powered move computation. Key challenges included achieving reliable board detection under varying lighting conditions, computing accurate joint angles for a 4-DOF arm, and synchronizing the arm's physical movements with the game state [2]. Experimental evaluation demonstrates 94% board detection accuracy and sub-centimeter positioning precision. The system successfully enables human-robot chess gameplay with configurable difficulty levels, contributing to accessible robotics education and human-robot interaction research.
Robotics, Computer Vision, Human–Robot Interaction, Artificial Intelligence
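To illustrate the position-to-angle calculation the abstract attributes to TinyIK, here is a minimal sketch of closed-form inverse kinematics for a planar 2-link arm. This is an assumption-laden simplification for illustration only, not the paper's 4-DOF implementation; the function names (`two_link_ik`, `forward`) and default link lengths are hypothetical.

```python
import math

def two_link_ik(x, y, l1=1.0, l2=1.0):
    """Closed-form inverse kinematics for a planar 2-link arm (illustrative sketch).

    Returns (shoulder, elbow) angles in radians that place the end effector
    at (x, y), or raises ValueError if the target is out of reach.
    """
    r2 = x * x + y * y
    # Law of cosines gives the elbow angle from the target distance.
    cos_elbow = (r2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if not -1.0 <= cos_elbow <= 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)  # elbow-down solution
    shoulder = math.atan2(y, x) - math.atan2(
        l2 * math.sin(elbow), l1 + l2 * math.cos(elbow)
    )
    return shoulder, elbow

def forward(shoulder, elbow, l1=1.0, l2=1.0):
    # Forward kinematics, used here only to verify an IK solution.
    x = l1 * math.cos(shoulder) + l2 * math.cos(shoulder + elbow)
    y = l1 * math.sin(shoulder) + l2 * math.sin(shoulder + elbow)
    return x, y
```

In the actual system, computed joint angles would be converted to degrees and written to the servos (ServoKit exposes this as `kit.servo[i].angle`); TinyIK generalizes this closed-form approach to longer joint chains via numerical optimization.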