Support Vector Machine

Note: this post is meant to help clarify tutorial question number 2 for COMP 9417 – Week 9, School of Computer Science and Engineering, UNSW (s1 – 2017)


Support Vector Machine (SVM) is essentially an approach to learning linear classifiers that maximises the margin. Here is a picture, inspired by Flach (Fig. 7.6–7.7), that shows the difference between the decision boundary produced by an SVM and those produced by other linear classifiers (such as linear regression or the perceptron).

To achieve that, SVM uses the objective function below, seeking the values of alpha_1, ..., alpha_n that maximise it.
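(The original post shows this objective as an image. Assuming it is the standard dual form of the max-margin problem, as presented in Flach, it can be written as follows:)

```latex
\max_{\alpha_1,\dots,\alpha_n} \;\sum_{i=1}^{n} \alpha_i
  \;-\; \frac{1}{2} \sum_{i=1}^{n} \sum_{j=1}^{n} \alpha_i \alpha_j y_i y_j \, \mathbf{x}_i \cdot \mathbf{x}_j
\qquad \text{subject to} \qquad \alpha_i \ge 0 \;\; \forall i, \quad \sum_{i=1}^{n} \alpha_i y_i = 0 .
```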

Continue reading

Algorithm Independent Aspects of Machine Learning

Note: this post is meant to help clarify tutorial questions 1, 3, 4 and 5 for COMP 9417 – Week 11, School of Computer Science and Engineering, UNSW (s1 – 2017)

Regardless of which machine learning algorithm we choose, we may still want to know the following (slide – page 6):

  • How many training examples does a learner need before it converges to the correct hypothesis? (sample complexity; a standard bound is sketched after this list)
  • How large is the hypothesis space? How complex is the learner’s hypothesis? (hypothesis complexity)
  • How many mistakes is a learner allowed to make before it finally converges to a successful hypothesis? (mistake bounds)
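For sample complexity, one standard result (stated here from memory, assuming a finite hypothesis space H and a consistent learner in the PAC setting; the course slides may phrase it slightly differently) is that

```latex
m \;\ge\; \frac{1}{\epsilon}\left( \ln|H| + \ln\frac{1}{\delta} \right)
```

training examples suffice to guarantee error at most epsilon with probability at least 1 − delta.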

Let’s take a look at the first aspect. Continue reading

How smart is a robot?

Recently, there has been hype about smart robots that will replace humans in many areas. These robots usually have a certain degree of artificial intelligence. But are most of these robots truly intelligent? What is intelligence anyway? This post aims to give a simple explanation of intelligent robots from an artificial intelligence (AI) perspective.

An example of “an intelligent robot”

Consider the example of a robot that competes in a firefighting robot contest. In this competition, the robot must explore a maze, with many obstacles, to find a fire and extinguish it. Although the robot might seem intelligent, its intelligence is very limited. For instance, what if the source of the fire is not inside the maze but outside of it? What if the robot does not see the fire, but sees a lot of smoke? Suddenly, the robot does not seem so smart anymore when its environment, or its task, changes.

Continue reading

Getting started with Baxter Simulator

To learn robotics properly, one needs a robot to play with. Building a robot is costly and not simple, so a simulated robot can be used in place of a physical one. Most research robots currently run the Robot Operating System (ROS) because it is flexible, modular, and open source, so mastering ROS is a must for robotics researchers and enthusiasts nowadays. The Baxter robot, created by Rethink Robotics, is a good medium for learning ROS and robotics.

Although Baxter is a great robot with unique features, it is reasonably expensive. For learning purposes, it can be replaced with the Baxter simulator. Previously, the simulator was only available to organisations that had bought a Baxter, and it did not have any grippers attached to Baxter’s arms. Since the end of 2015 the simulator has been updated; one of the biggest changes is the addition of grippers, and it is now open source, so everyone can use it. This post will guide you through learning to use the Baxter simulator in a more systematic way.

Why Baxter simulator?

Here are several reasons to use the Baxter simulator:

  • It is a fully functioning, open-source simulator.
  • Baxter is ROS-ready. It provides a user-friendly Python API that wraps ROS interfaces in Python classes (a short sketch of this API follows this list). The Baxter simulator itself is built on Gazebo, another open-source program.
  • While one can learn about ROS using the TurtleBot simulator, the TurtleBot is only a mobile robot. Baxter, on the other hand, has two manipulators, so one can learn arm control through simple joint commands, inverse kinematics (IK) solving, or more complex trajectory planning. Many other advanced extensions can be built on Baxter.
  • The Baxter robot is well documented; this post just fills a small gap in that excellent documentation.
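As a taste of that Python API, here is a minimal sketch of joint control with baxter_interface. The node name and joint angle values are purely illustrative, and details may differ between SDK versions, so treat this as an outline rather than copy-paste code.

```python
# Minimal sketch: reading and commanding joint angles on Baxter's right arm.
# Joint angle values are illustrative only; consult the Baxter SDK docs.
import rospy
import baxter_interface

rospy.init_node('simple_joint_control')

# Baxter must be enabled before it will accept motion commands
baxter_interface.RobotEnable().enable()

# Handle to the right limb; print its current joint angles
limb = baxter_interface.Limb('right')
print(limb.joint_angles())

# Command a joint configuration; this call blocks until the pose is reached
target = {
    'right_s0': 0.0, 'right_s1': -0.55,
    'right_e0': 0.0, 'right_e1': 0.75,
    'right_w0': 0.0, 'right_w1': 1.26, 'right_w2': 0.0,
}
limb.move_to_joint_positions(target)
```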

Continue reading

Building a 3D Printed Mini Rescue Robot (based on OARK)

Most robot kits available on the market, such as LEGO Mindstorms, Fischertechnik, and Bioloid, are limited in flexibility and durability. There may be no parts available that suit a particular need, and those robots cannot traverse the irregular, cluttered terrain that is common, for example, in rescue areas. The Open Academic Robot Kit (OARK) is an open-source robot kit that exploits the advent of 3D printing technology. It was created by Dr. Raymond Sheh (Curtin University) to lower the barrier for everyone who wants to enter the robotics research field, particularly rescue robotics. The robot also has a nickname, Emu Mini 2, as it is a mini version of the Emu, an autonomous rescue robot at the School of Computer Science and Engineering, UNSW. The official web page of OARK is http://oarkit.intelligentrobots.org/.

Here is a compilation of the Emu Mini 2, which I demonstrated in a mini rescue arena at RoboCup 2015 in Hefei, China.

I will describe the main components of the robot that we built here at CSE – UNSW based on OARK:

  • 7 Dynamixel AX-12 servo motors from Robotis
    • 4 motors drive the mobile base and the other 3 move the arm
  • USB2Dynamixel from Robotis
    • It is used to connect the motors to the Raspberry Pi via a USB port (rather than the GPIO pins, as is more usual)
    • An OpenCM 9.04 board can also be used, see here
  • Raspberry Pi 2 and a micro SD card
    • Although the first version can be used, the Raspberry Pi 2 has a much faster processor and 4 USB ports, which is useful in our system
  • Raspberry Pi Camera and a long camera cable (50 cm)
    • A long cable is needed because the camera is mounted on the articulated arm
  • Wi-Pi dongle from Element 14
    • A Wi-Fi dongle for the Raspberry Pi
  • Logitech F710 wireless gamepad
    • A PS3 DualShock controller can also be used, see here
  • Rechargeable LiPo battery (output: 12 V, 1750 mAh)
    • To power the servo motors
  • Power bank (output: 5 V, 1 A), or a voltage regulator to convert the 12 V from the battery down to 5 V
    • To power the Raspberry Pi

Here is a not-so-complete list of the robot’s materials and where to get them.

There are 7 steps to building this robot:

  1. Preparing the mechanical robot parts
  2. Preparing the Raspberry Pi 2
  3. Preparing the connection between the Raspberry Pi and the Dynamixel AX-12 motors
  4. Controlling the Dynamixel AX-12 motors using the PyDynamixel library
  5. Interfacing a wireless joystick with the Raspberry Pi (sketched below)
  6. Intuitive motor control using the wireless joystick
  7. Video streaming via the Pi Camera
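As a rough illustration of step 5, here is what polling the gamepad on the Raspberry Pi can look like. This sketch uses pygame; the build described in the full post may use a different library, and the axis indices are assumptions that depend on the gamepad and its mode switch.

```python
# Sketch: polling a wireless gamepad (e.g. the Logitech F710) with pygame.
# Axis indices are assumptions; they vary with the gamepad and its mode.
import pygame

pygame.init()
pygame.joystick.init()

joystick = pygame.joystick.Joystick(0)  # first connected gamepad
joystick.init()

try:
    while True:
        pygame.event.pump()              # refresh the joystick state
        forward = -joystick.get_axis(1)  # left stick up/down -> drive forward/back
        turn = joystick.get_axis(0)      # left stick left/right -> turn
        print("forward: %+.2f  turn: %+.2f" % (forward, turn))
        pygame.time.wait(50)             # poll at roughly 20 Hz
except KeyboardInterrupt:
    pygame.quit()
```

In step 6, values like these would then be mapped to speed commands for the four base motors.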

Continue reading