This book provides a coherent and unifying view of how logic and representation learning can contribute to knowledge graph (KG) reasoning and yield better computational tools for integrating both worlds. To this end, logic and deep neural network models are studied together as integrated models of computation. This book is written for readers who are interested in KG reasoning and the emerging perspective of neuro-symbolic integration, and who have prior knowledge of neural networks and deep learning. The authors first provide a preliminary introduction to logic and to background knowledge closely related to the surveyed techniques, including knowledge graphs, ontological schemas, and the technical foundations of first-order logic learning. Reasoning techniques for knowledge graph completion are then presented from three perspectives: representation learning, logic-based reasoning, and neuro-symbolic integration. The book goes on to explore question answering on KGs, with a specific focus on multi-hop and complex logical query answering, before outlining work that addresses the rule learning problem. The final chapters highlight the foundations of ontological schemas and introduce their usage in KGs, closing with open research questions and a discussion of potential future directions for the field.