In typical (forward) optimization, the goal is to obtain optimal values for the decision variables given known values of the optimization model parameters. In practice, however, it may be challenging to determine appropriate values for these parameters. Assuming the availability of historical observations that represent past decisions made by an optimizing agent, the goal of inverse optimization is to impute the unknown model parameters that would make these observations optimal (or approximately optimal) solutions to the forward optimization problem. Inverse optimization has many applications in areas such as geology, healthcare, transportation, and production planning. In this dissertation, we study inverse optimization with one or more integer observations, focusing on the cost coefficients as the unknown parameters. Furthermore, we demonstrate an application of inverse optimization to recommender systems.

First, we address inverse optimization with a single imperfect integer observation. The aim is to impute a cost vector that makes the given imperfect observation approximately optimal by minimizing the optimality error. We develop a cutting plane algorithm for this problem; results show that it performs well on small instances. To reduce computational time, we propose an LP relaxation heuristic method. To obtain an optimal solution more quickly, we also combine the two methods into a hybrid approach that initializes the cutting plane algorithm with the heuristic solution.

In the second study, we generalize this approach to inverse optimization with multiple imperfect integer observations that are all feasible solutions to a single optimization problem. We propose a cutting plane algorithm and compare it with an LP relaxation heuristic method. The results show the value of using multiple data points rather than a single observation.

Finally, we apply the proposed methods in the setting of recommender systems. Using past user preferences as observations, we apply inverse optimization to impute the unknown model parameters that minimize an aggregate of the optimality errors over multiple data points. Once the unknown parameters are imputed, the recommender system can recommend the best items to users. The advantage of using inverse optimization is that, when users act as optimizers of their own decisions, imputing the recommender system's model parameters does not require access to a large amount of data. We demonstrate the accuracy of our approach on a real data set for a restaurant recommender system.
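To make the single-observation setting concrete, the following is a minimal sketch of the inverse problem, assuming a forward problem $\min\{c^{\top}x : x \in X\}$ with an integer feasible set $X$, an observed (possibly imperfect) solution $\hat{x} \in X$, and a normalization constraint on the cost vector; the particular normalization $\|c\|_{1} = 1$ is an assumption for illustration rather than the formulation used in the dissertation:
\[
\min_{c}\; \Bigl( c^{\top}\hat{x} \;-\; \min_{x \in X} c^{\top}x \Bigr)
\qquad \text{s.t.} \qquad \|c\|_{1} = 1,
\]
where the objective is the optimality error of $\hat{x}$ under the imputed cost vector $c$; it equals zero exactly when $\hat{x}$ is optimal for the forward problem with costs $c$.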