
Airlines & Dynamic Pricing
Made by:
Mohamed Selim Maazouz
Mohamed Tegedi
Fatma Sakka
Index
1 Dynamic Pricing
2 What is Spark?
3 Spark's Interest in Big Data
4 How To Use It For Functional Programming
5 Transformations and Actions Examples
6 Demo
7 Conclusion
Dynamic Pricing
Dynamic Pricing is a pricing strategy in which companies set flexible prices for products or
services based on current market demands.

Factors that Help Determine a “Dynamic” Price:

Time of day or night
Customer location
Day of the week
Level of demand
Competitor pricing
Purchase method/stage
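As a toy illustration, a handful of these factors can be combined into a price multiplier. Everything below (the Context fields, the weights, the 10% competitor cap) is an invented assumption for the sketch, not a real airline model:

```scala
// Hypothetical pricing sketch: field names and weights are illustrative only.
case class Context(demandLevel: Double,      // 0.0 (empty flight) to 1.0 (nearly full)
                   competitorPrice: Double,  // best competing fare on the same route
                   daysToDeparture: Int)

def dynamicPrice(base: Double, ctx: Context): Double = {
  val demandFactor  = 1.0 + 0.5 * ctx.demandLevel                // high demand raises the fare
  val urgencyFactor = if (ctx.daysToDeparture < 7) 1.2 else 1.0  // late bookings pay more
  val raw = base * demandFactor * urgencyFactor
  math.min(raw, ctx.competitorPrice * 1.1)                       // stay within 10% of competitors
}

println(dynamicPrice(100.0, Context(0.8, 150.0, 3)))  // capped by the competitor price
```

A real system would fit such weights from historical booking data rather than hard-coding them.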
What is Spark?
Apache Spark is a powerful open-source distributed computing system used for processing
large-scale data sets. It is designed to be highly performant and flexible, making it ideal for
big data processing tasks that require fast data processing and analysis.
Spark's Interest in Big Data
Spark's interest in big data stems from the need to process and analyze vast amounts of data
generated by modern applications and systems. Traditional data processing tools and
technologies are often unable to handle the volume, velocity, and variety of big data. Spark
addresses these challenges by providing a scalable, distributed computing platform that
enables fast and flexible processing of large-scale data sets, making it a popular choice for big
data processing tasks.
How To Use It For Functional Programming
Functional programming is a programming paradigm that emphasizes the use of pure
functions, immutability, and higher-order functions. In Spark, functional programming is
achieved through the use of its built-in APIs, such as the RDD (Resilient Distributed
Dataset) API and the DataFrame API.
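The functional style is easiest to see on plain Scala collections first, since Spark's RDD API deliberately mirrors their map/filter/reduce shape; the pure functions below would transfer to an RDD unchanged (the fare values are made up for the sketch):

```scala
// Pure functions over an immutable collection; the same code works on a Spark RDD.
val prices = List(120.0, 95.5, 310.0, 180.0)

val withTax   = prices.map(p => p * 1.2)   // map: pure function, no side effects
val expensive = withTax.filter(_ > 150.0)  // filter: pure predicate
val total     = expensive.reduce(_ + _)    // reduce: combine with a pure function
```

Because each step returns a new immutable collection rather than mutating state, the same pipeline parallelizes safely when Spark distributes it across a cluster.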
Difference between a Dataset and a DataFrame?
In Spark, a Dataset is a distributed collection of strongly typed JVM objects, checked at compile time. A DataFrame is a Dataset of generic Row objects organized into named columns, like a table in a relational database; in Scala, DataFrame is simply a type alias for Dataset[Row]. DataFrames give up compile-time type checking in exchange for convenient SQL-style operations.
To use Spark for functional programming, you can follow these general steps:
1. Create a Spark session: This is typically done using a SparkSession object, which provides a
unified entry point to Spark functionality.
2. Load data: You can load data into Spark from a variety of sources, including Hadoop
Distributed File System (HDFS), local file systems, and cloud storage systems like Amazon
S3.
3. Create RDDs or DataFrames: Depending on whether you need low-level control or a tabular, SQL-friendly view of the data, create RDDs or DataFrames from the loaded data.
4. Apply transformations: Use transformations to perform operations on RDDs or DataFrames, such as filtering, mapping, reducing, and joining.

5. Execute actions: Actions trigger the computation of RDDs or DataFrames and return results to the driver program. Examples of actions include count, collect, and save.

6. Clean up: When you are finished with your Spark job, make sure to release any resources that were used, for example by calling stop() on the session.
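Put together, the six steps might look like the sketch below, run in local mode. The file path and column positions are illustrative assumptions carried over from the demo later in this presentation; this shows the shape of a job, not a tuned implementation:

```scala
import org.apache.spark.sql.SparkSession

// Pure helper for step 4: parse one booking.csv line into (flight_number, price).
def parseBooking(line: String): (String, Double) = {
  val cols = line.split(",")
  (cols(1), cols(2).toDouble)
}

val spark = SparkSession.builder()                  // 1. create a Spark session
  .appName("pricing-sketch")
  .master("local[*]")
  .getOrCreate()

val lines    = spark.sparkContext.textFile("C:/booking.csv")        // 2. load data
val bookings = lines.map(parseBooking)                              // 3-4. RDD + transformation
val pricey   = bookings.filter { case (_, price) => price > 200.0 } // 4. transformation
println(pricey.count())                                             // 5. action triggers the work
spark.stop()                                                        // 6. clean up
```

Note that nothing is read from disk until the count() action runs; the two transformations before it only build up a lazy execution plan.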
Transformations and Actions Examples
Difference between Transformations and Actions?
Transformations create a new dataset from an existing one; they are evaluated lazily.
Actions return a value to the driver program after running a computation on the dataset.
Transformations:
1. map: Applies a function to each element of an RDD or DataFrame and returns a new RDD or
DataFrame with the results.
2. filter: Selects elements from an RDD or DataFrame that satisfy a given predicate and returns a
new RDD or DataFrame with the selected elements.
3. flatMap: Similar to map, but each input element can be mapped to zero or more output
elements, resulting in a new RDD or DataFrame with more elements than the original.
4. groupBy: Groups elements of an RDD or DataFrame by a given key and returns a new RDD or
DataFrame where the values are grouped together for each key.
5. join: Joins two RDDs or DataFrames based on a common key.
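All five transformations have the same names and shapes on plain Scala collections (join can be written as a for-comprehension), which makes a local list a convenient place to prototype before moving to an RDD; the sample fares and routes below are invented:

```scala
val fares  = List(("TU710", 120.0), ("TU710", 95.0), ("AF448", 310.0))
val routes = List(("TU710", "TUN-CDG"), ("AF448", "CDG-JFK"))

val mapped   = fares.map { case (f, p) => (f, p * 1.1) }          // map: add a 10% markup
val cheap    = fares.filter { case (_, p) => p < 200.0 }          // filter: fares under 200
val flat     = List("TU710 AF448", "LH400").flatMap(_.split(" ")) // flatMap: one-to-many
val byFlight = fares.groupBy(_._1)                                // groupBy: flight -> its fares

// join on flight number, written out by hand for a local collection
val joined = for ((f1, p) <- fares; (f2, r) <- routes if f1 == f2) yield (f1, p, r)
```

On RDDs of key-value pairs the last step would simply be `fares.join(routes)`.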
Actions:
1. collect: Returns all the elements of an RDD or DataFrame to the driver
program as an array.
2. count: Returns the number of elements in an RDD or DataFrame.
3. reduce: Applies a function to the elements of an RDD or DataFrame to
reduce them to a single value.
4. save: Saves the RDD or DataFrame to a file or storage system (e.g. saveAsTextFile for RDDs, or write.save for DataFrames).
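The first three actions also have local stand-ins on Scala collections (collections are eager where RDDs are lazy, but the method shapes match); save has no collection analog, so it is only noted in a comment. The sample prices are invented:

```scala
val prices = List(250.0, 180.0, 320.0)

val all   = prices.toArray        // like collect(): bring every element to the driver
val n     = prices.size           // like count()
val total = prices.reduce(_ + _)  // like reduce(_ + _): fold down to a single value
// save: on an RDD this would be e.g. saveAsTextFile("out/"); no local analog here
```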
Demo
Commands used:
val flight = sc.textFile("C:/flights.csv")
val booking = sc.textFile("C:/booking.csv")
flight.collect()
booking.collect()
flight.foreach(println)
booking.foreach(println)
val flight = spark.read.format("csv").load("C:/flights.csv")
val flights = flight.toDF("flight_number", "departure", "arrival", "date", "number_of_passengers")
flights.show()
Rest of the Commands used:
val booking = spark.read.format("csv").load("C:/booking.csv")
val bookings = booking.toDF("passport_id", "flight_number", "price")
bookings.show()
val joined = bookings.join(flights, "flight_number")
joined.show()
flights.createOrReplaceTempView("table_of_flights")
bookings.createOrReplaceTempView("table_of_bookings")
spark.sql("select min(price) from table_of_bookings").show()
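What the final SQL query computes can be checked on a local collection; the three sample rows below are invented but mirror the (passport_id, flight_number, price) columns of booking.csv, and the per-flight minimum is the natural next step for a dynamic-pricing comparison:

```scala
val bookings = List(
  ("P1001", "TU710", 250.0),
  ("P1002", "TU710", 199.0),
  ("P1003", "AF448", 480.0)
)

// select min(price) from table_of_bookings
val minPrice = bookings.map(_._3).min

// select flight_number, min(price) from table_of_bookings group by flight_number
val minPerFlight = bookings
  .groupBy(_._2)
  .map { case (flight, rows) => (flight, rows.map(_._3).min) }
```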
Conclusion
In conclusion, dynamic pricing has emerged as a valuable tool for businesses to optimize
revenue in the era of big data. By leveraging large datasets and advanced analytics
techniques, businesses can dynamically adjust their pricing strategies in real-time, based on
factors such as consumer demand, competitor pricing, and market trends. However,
implementing dynamic pricing requires careful consideration of ethical concerns and
transparency to ensure fairness to consumers. As big data continues to evolve, dynamic
pricing will likely become even more prevalent, and businesses will need to adapt and
innovate to stay competitive.
Thank you for your attention
