The term "Big Data" was first used back in the 1990s to refer to data in "huge, overwhelming, and uncontrollable amounts". Such humongous data can support decision making through techniques of data analysis. However, in the last two decades, the volume of data and the speed at which it is generated have grown beyond measure. IDC (International Data Corporation) estimated that newly created data would reach close to 35 zettabytes (35 trillion gigabytes) by 2020; by 2018 it had already reached 33 zettabytes, leading IDC to predict that 175 zettabytes (175 trillion gigabytes) of new data will be created around the world in 2025.

The need to process and analyse these increasingly large (and unstructured) data sets transformed traditional data analysis into 'Big Data Analytics' over the last decade. This shift reshaped organizational processes and strategic decision making, and also drove the evolution of newer technologies and tools for handling and servicing Big Data Architecture and Big Data Analytics.

This book has been designed for students (both graduates and postgraduates) and practitioners who want to understand the concepts related to Artificial Intelligence, Big Data, and Big Data Analytics, along with the related terminologies, technologies, and use cases in various domains. The examples and use cases provided in the book do not require a technical or conceptual skill set, only an interest in the domain concerned, to appreciate the problems and the results and advantages achieved through the application of Intelligent Techniques. The book covers the concepts from the standpoint of a novice, so readers need no prior knowledge of any tool or technology before reading this book. However, readers should have an inclination towards Statistics and Analysis to appreciate and understand the concepts.