Robots and automated manufacturing machinery have become an inseparable part of modern industry. However, robotic systems are generally limited to operating in highly structured environments. Although sensors such as laser trackers, indoor GPS, and 3D metrology and tracking systems are used for positioning and tracking in manufacturing and assembly tasks, these devices impose strong limits on the working environment and the speed of operation, and they are generally very expensive. Integrating vision sensors with robotic systems, that is, visual servoing, allows robots to work in unstructured spaces by providing non-contact measurements of the working area. However, the camera's projection of the 3D scene onto a 2D image plane loses one dimension of data, which is the central challenge of vision-based control. Moreover, the nonlinearities and complex structure of a manipulator robot make the problem more challenging.

This project aims to develop new, reliable visual servoing methods suitable for real robotic tasks. Its main contributions are in two parts: a visual servoing controller and a trajectory planning algorithm. In the first part, a new image-based visual servoing controller, called Augmented Image-Based Visual Servoing (AIBVS), is presented. A proportional-derivative (PD) controller is developed that generates acceleration as the control command of the robot. The stability of the controller is analyzed using Lyapunov theory. The developed controller has been tested on a 6-DOF Denso robot. Experimental results on point features and image moment features demonstrate the performance of the proposed AIBVS and show that a damped response can be achieved using a PD controller with acceleration output. Moreover, smoother feature and robot trajectories are observed than with conventional IBVS controllers.
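The idea of a PD law on the image-feature error that outputs an acceleration screw, rather than the velocity screw of classic IBVS, can be sketched as follows. This is a minimal illustration using the standard point-feature interaction matrix; the gains, the feature stacking, and the use of a plain least-squares pseudoinverse are assumptions for the sketch, not the thesis's exact AIBVS derivation.

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """Standard interaction (image Jacobian) matrix for one normalized
    point feature (x, y) at depth Z, mapping the 6-DOF camera velocity
    screw to the feature's image velocity."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x**2), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y**2, -x * y, -x],
    ])

def pd_acceleration(s, s_star, s_dot, Z, kp=1.0, kd=0.5):
    """PD law on the feature error e = s - s* that returns a camera
    acceleration screw (6-vector). Gains kp, kd are illustrative."""
    # Stack one 2x6 interaction matrix per point feature.
    L = np.vstack([interaction_matrix(x, y, z)
                   for (x, y), z in zip(s.reshape(-1, 2), Z)])
    e = s - s_star        # feature error
    e_dot = s_dot         # error rate (desired features constant)
    # Least-squares inversion of the stacked interaction matrix.
    return -np.linalg.pinv(L) @ (kp * e + kd * e_dot)

# Four point features forming a square, all at depth 1 m.
s = np.array([0.1, 0.1, -0.1, 0.1, -0.1, -0.1, 0.1, -0.1])
a = pd_acceleration(s, s_star=s + 0.05, s_dot=np.zeros(8), Z=[1.0] * 4)
```

The derivative term is what allows the damped response reported above: near the goal, a residual feature velocity opposes the proportional term and slows the approach.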
This controller is subsequently applied to catching a moving object. Visual servoing controllers have difficulty stabilizing the system globally. Hence, in the second part of the project, a trajectory planning algorithm is developed to achieve global stability. Trajectory planning is carried out by parameterizing the camera's velocity screw with time-based profiles. The profile parameters are then determined, by minimizing the error between the initial and desired image features, so that the velocity profile guides the robot to its desired position. This method provides a reliable path for the robot that respects all robotic constraints. The developed algorithm is tested on a Denso robot. The results show that the trajectory planning algorithm can perform visual servoing tasks that are unstable when performed with visual servoing controllers.
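The planning step described above can be sketched as an offline optimization: pick a time-based velocity-screw profile, simulate how the features evolve under it, and tune the profile parameters to minimize the final feature error. The sinusoidal profile (smooth, zero at both endpoints), the constant-depth assumption, the Euler integration, and the Powell optimizer are all assumptions made for this sketch, not the thesis's specific choices.

```python
import numpy as np
from scipy.optimize import minimize

def interaction_matrix(x, y, Z):
    """Standard point-feature interaction matrix (2x6)."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x**2), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y**2, -x * y, -x],
    ])

def simulate_features(s0, Z, amps, T=1.0, steps=50):
    """Integrate ds/dt = L(s) v(t) under the time-based profile
    v(t) = amps * sin(pi t / T). Depths Z are held constant here
    for simplicity."""
    s, dt = s0.copy(), T / steps
    for k in range(steps):
        v = amps * np.sin(np.pi * (k * dt) / T)
        L = np.vstack([interaction_matrix(x, y, z)
                       for (x, y), z in zip(s.reshape(-1, 2), Z)])
        s = s + dt * (L @ v)
    return s

def plan_profile(s0, s_star, Z):
    """Choose the six profile amplitudes that minimize the squared
    error between the simulated final features and the desired ones."""
    cost = lambda amps: np.sum((simulate_features(s0, Z, amps) - s_star) ** 2)
    return minimize(cost, np.zeros(6), method="Powell").x
```

Because the whole camera path is fixed before execution, joint limits or velocity bounds could be imposed directly on the profile parameters, which is what makes this approach attractive when closed-loop servoing is unstable.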