
Helicopter Racing League (HRL) case study

Helicopter Racing League is a global sports league for competitive helicopter racing. Every year they hold the world championship, and several regional league competitions where teams compete with each other to earn a spot in the world championship.

-offer a paid service to stream the races all over the world, with live telemetry and predictions throughout each race.
-already have a streaming service that serves viewers all over the world.
-means they have customers all over the world, but providing live streaming of the races alone is not enough for them.
-provide other race-related data points as well, for example a player's ranking over the last year, their winning rate in the last few competitions, and information about the helicopter the player flies, like its manufacturing company and its technical specifications.


-predictions like the likelihood of any player winning, based on data related to the races that happened in the past.

-They expect two things from Google Cloud:


-want to use managed services for their machine learning and AI.
-not interested in writing custom ML logic and taking on the headache of managing the AI-related infrastructure backing custom models.

For organizations that do not have the required bandwidth for such work, using managed AI/ML services is generally preferred.
-want to provide both real-time streaming and recorded video to their customers. It means they need to store the race videos as well, so that their customers can watch the recorded races later at any point in time.
-while serving the content to their customers, they want to ensure that the content is served from the location nearest to the customer.
-better access to content for their customers.
This points to a CDN, because CDN services provide various points of presence across multiple geographies: they cache your data at those locations and can then serve the content to your customers from the nearest location. Also, video streaming data is unstructured data.
-such data should be stored in object-based storage.

Existing technical architecture


-some critical applications are already deployed in the cloud.
-means they are already using some cloud provider for hosting their existing services.
-prefer to use public cloud for most of their services.

They have viewers across the globe, and any such sporting business usually has very high traffic during a big tournament or live competition.

Once the tournament is over, traffic volume goes down to a great extent for this kind of business, so they always need highly, dynamically scalable services.

-video recording and video editing happen at the racetrack, and then the content is encoded and transcoded.
In the cloud, where needed, they use various VMs to perform the different types of encoding and transcoding.
Video encoding and transcoding into various formats is required to support the different types of devices that their viewers can use to watch the race videos.

Different devices have different resolutions and different applications running on them.
-each device may need the video in a different format, so it is always good to convert a video into the required formats up front and then save them. This way, multiple copies of the same video are generated to support the different devices.
But it saves a lot of computation power for the company, because this way it will not need to perform this encoding and transcoding for each client at the time of the request.

They store all this encoded content in an object storage service on their existing public cloud provider.

They provide enterprise-grade connectivity and local compute via truck-mounted mobile data centers at the racetrack.

-an innovative way to provide good connectivity and compute to their team at the racetrack.
Their race prediction services are hosted exclusively on their existing public cloud provider, and these predictions are performed using TensorFlow running on VMs in that public cloud provider.
-means, in their current architecture, they have created custom machine learning models and custom AI code that runs using TensorFlow.

Business requirements
-want to expand their predictive capabilities and reduce latency for their viewers in emerging markets.
-want to support the ability to expose the predictive models to their partners.

-they are not only interested in doing predictions related to the races, but they also want to expose to their partners the models behind these predictions.

-want to increase their predictive capabilities, both before the race and during the race.
-want to do predictions related to the results, like who is going to win the race.
-want to predict the mechanical failures that can occur, using the telemetry data that they collect.
-want to predict the crowd sentiment, or any event that can occur during the race.
-want to measure fan engagement with the new predictions, and they have a requirement to create a merchandising revenue stream.

Once they have crowd sentiment and fan engagement data, they can decide what products to advertise on the stream, and when to show any particular advertisement. They also want to capture more telemetry data to get additional insights.

-want to enhance the global availability and quality of their broadcasts and increase the number of concurrent viewers.
-target to reduce their operational complexity.
-ensure compliance with regulations.

Technical requirements
-want to maintain or increase prediction throughput and accuracy.
-reduce viewer latency and increase transcoding performance.
-create real-time analytics of viewer consumption patterns and engagement.
-create a data mart to enable the processing of large volumes of race data.

-focused on increasing the viewership of their races.


-Currently they can only predict the results before the race actually starts; they do not have the capability to do real-time predictions during the race, and they want to build this capability.
This prediction capability can help them achieve higher fan engagement, because nowadays viewers are not only interested in the live video; they want something extra, in terms of predictions, to keep them more engaged.

-want to build a data processing pipeline that can process the large amount of full-season data.
Solution proposal

To store video content and to serve this content from the location nearest to the viewer:
-use Cloud Storage to store the video files, and use Cloud CDN to distribute and cache this content globally at the various point-of-presence locations. Since most of this data is cached on the CDN, they can serve it to the viewers from the nearest location, as in the sketch below.
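As a rough illustration of the storage side, here is a minimal Python sketch using the google-cloud-storage client. The bucket and object names are made-up placeholders, and enabling Cloud CDN itself happens on a backend bucket behind an external HTTP(S) load balancer, not in this code:

    # Sketch: upload an encoded race video to Cloud Storage.
    # Bucket and object names are hypothetical placeholders.
    from google.cloud import storage

    client = storage.Client()
    bucket = client.bucket("hrl-race-videos")  # assumed bucket name

    blob = bucket.blob("2024/race-07/race-07-1080p.mp4")
    blob.upload_from_filename("race-07-1080p.mp4")

    # Cloud CDN would then be enabled on a backend bucket pointing at
    # "hrl-race-videos" behind an external HTTP(S) load balancer, so the
    # content gets cached at Google's edge points of presence.
    print(f"Uploaded to gs://{bucket.name}/{blob.name}")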

-to build a data model that can provide the capability to store and process the very large volume of data captured during the whole season, and to use this data for analysis and prediction. For this requirement:


-create a data processing pipeline for the batch data. Here, we can use Cloud Storage to store data from the whole racing season, Dataflow for the data processing, and BigQuery as the data mart, as sketched below.
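Here is a minimal Apache Beam (Python) sketch of such a batch pipeline, reading season telemetry CSVs from Cloud Storage and loading them into BigQuery. Every resource name and the telemetry schema are assumed placeholders:

    # Sketch of a batch pipeline: Cloud Storage (CSV) -> Dataflow -> BigQuery.
    # All resource names and the telemetry schema are assumptions.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def parse_row(line):
        # Assumed CSV layout: race_id, helicopter_id, lap, speed_kmh
        race_id, heli_id, lap, speed = line.split(",")
        return {
            "race_id": race_id,
            "helicopter_id": heli_id,
            "lap": int(lap),
            "speed_kmh": float(speed),
        }

    options = PipelineOptions(runner="DataflowRunner", project="hrl-demo",
                              region="us-central1",
                              temp_location="gs://hrl-temp/bq")
    with beam.Pipeline(options=options) as p:
        (p
         | "ReadSeasonData" >> beam.io.ReadFromText("gs://hrl-season-data/*.csv",
                                                    skip_header_lines=1)
         | "Parse" >> beam.Map(parse_row)
         | "WriteToBQ" >> beam.io.WriteToBigQuery(
               "hrl-demo:racing.telemetry",
               schema=("race_id:STRING,helicopter_id:STRING,"
                       "lap:INTEGER,speed_kmh:FLOAT"),
               write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
               create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED))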

-use managed machine learning and AI based tools to get more insight into this data, since they are more interested in using managed services for their machine learning and AI requirements.

A couple of options in terms of managed AI and ML services in Google Cloud are BigQuery ML, AI Platform, and Cloud AutoML.

-BigQuery ML: by using BigQuery, they can serve their machine learning use cases just by using SQL-like queries, but it works on structured data only, as in the example below.
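Here is a minimal BigQuery ML sketch, run through the google-cloud-bigquery Python client. The dataset, table, and label column names are assumed placeholders, and logistic regression is just one plausible model type for a win/lose prediction:

    # Sketch: train a simple win-prediction model with BigQuery ML.
    # Dataset/table/column names are assumptions for illustration.
    from google.cloud import bigquery

    client = bigquery.Client(project="hrl-demo")

    create_model_sql = """
    CREATE OR REPLACE MODEL `racing.win_predictor`
    OPTIONS (model_type = 'logistic_reg', input_label_cols = ['won_race']) AS
    SELECT
      avg_speed_kmh,
      helicopter_model,
      pilot_rank,
      won_race            -- assumed BOOL label column
    FROM `racing.season_results`
    """
    client.query(create_model_sql).result()  # waits for training to finish

    # Predict with the trained model, again using plain SQL.
    predict_sql = """
    SELECT predicted_won_race, predicted_won_race_probs
    FROM ML.PREDICT(MODEL `racing.win_predictor`,
                    (SELECT * FROM `racing.upcoming_race_features`))
    """
    for row in client.query(predict_sql).result():
        print(dict(row))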

They can also easily launch Compute Engine VM instances preinstalled with TensorFlow (Deep Learning VM images).

It will be an easy lift-and-shift approach for their completely custom models for the AI use cases, like race prediction, predicting the potential mechanical failures, and predicting the crowd sentiment; a hedged sketch of launching such a VM follows.
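As a rough sketch of that lift-and-shift step, the google-cloud-compute client can create a VM from a Deep Learning VM image. The project, zone, machine type, and especially the image family are assumptions; the current family names should be checked in the deeplearning-platform-release image project:

    # Sketch: launch a Compute Engine VM from a Deep Learning VM image with
    # TensorFlow preinstalled, for a lift-and-shift of the custom models.
    # Project/zone and the exact image family are assumptions.
    from google.cloud import compute_v1

    def create_tf_vm(project="hrl-demo", zone="us-central1-a",
                     name="tf-training-vm"):
        instance = compute_v1.Instance(
            name=name,
            machine_type=f"zones/{zone}/machineTypes/n1-standard-8",
            disks=[compute_v1.AttachedDisk(
                boot=True,
                auto_delete=True,
                initialize_params=compute_v1.AttachedDiskInitializeParams(
                    # Image family name may vary; verify before use.
                    source_image=("projects/deeplearning-platform-release/"
                                  "global/images/family/tf-latest-cpu")))],
            network_interfaces=[compute_v1.NetworkInterface(
                network="global/networks/default")],
        )
        op = compute_v1.InstancesClient().insert(
            project=project, zone=zone, instance_resource=instance)
        op.result()  # wait for the create operation to finish
        return name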

The executives clearly mentioned that they also want to build the capability to do real-time predictions.

-need to ingest data in real time, and Cloud Pub/Sub is the best choice: a highly available and highly scalable service for real-time data ingestion. With this data pipeline they can send live telemetry data from their truck-mounted mobile data centers to Cloud Pub/Sub in real time, as in the sketch below.
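Here is a minimal Pub/Sub publisher sketch that the trackside systems could run. The project ID, topic name, and telemetry payload shape are assumed placeholders:

    # Sketch: publish live telemetry from the trackside systems to Pub/Sub.
    # Project ID, topic name, and payload fields are assumptions.
    import json
    from google.cloud import pubsub_v1

    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path("hrl-demo", "race-telemetry")

    reading = {
        "race_id": "race-07",
        "helicopter_id": "heli-42",
        "speed_kmh": 287.4,
        "engine_temp_c": 93.1,
    }

    # Data must be bytes; attributes (here: race_id) allow filtering downstream.
    future = publisher.publish(topic_path,
                               data=json.dumps(reading).encode("utf-8"),
                               race_id=reading["race_id"])
    print(f"Published message ID: {future.result()}")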

The rest of the pipeline will remain the same as explained for the batch processing data pipeline. For real-time video analysis, we can use the Video Intelligence API along with the AIStreamer ingestion library. The AIStreamer ingestion library supports real-time video streaming using various protocols, like HTTP Live Streaming (HLS), Real-Time Streaming Protocol (RTSP), or Real-Time Messaging Protocol (RTMP), and this library can connect to the Video Intelligence API; a rough sketch follows.
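Here is a rough sketch of streaming label detection with the Video Intelligence streaming API (v1p3beta1 client); the exact surface may differ across library versions, and reading a local file stands in for a live feed. In production, the AIStreamer ingestion library feeds HLS/RTSP/RTMP streams into this same endpoint:

    # Sketch: streaming label detection on a video feed.
    # The local file is a stand-in for a live stream from AIStreamer.
    from google.cloud import videointelligence_v1p3beta1 as vi

    client = vi.StreamingVideoIntelligenceServiceClient()

    config_request = vi.StreamingAnnotateVideoRequest(
        video_config=vi.StreamingVideoConfig(
            feature=vi.StreamingFeature.STREAMING_LABEL_DETECTION))

    def requests(video_chunks):
        # First request carries the config; the rest carry raw video bytes.
        yield config_request
        for chunk in video_chunks:
            yield vi.StreamingAnnotateVideoRequest(input_content=chunk)

    def read_chunks(path, chunk_size=1024 * 1024):
        with open(path, "rb") as f:
            while chunk := f.read(chunk_size):
                yield chunk

    for response in client.streaming_annotate_video(
            requests(read_chunks("race.mp4"))):
        for annotation in response.annotation_results.label_annotations:
            print(annotation.entity.description)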

To improve the transcoding performance:


-For video transcoding, one option is to invoke the Transcoder API in a pipeline like this:
-First, the original video is stored in Cloud Storage. They can then have multiple Cloud Functions perform the transcoding by invoking the Transcoder API, and these Cloud Functions can be configured to be invoked on the new-object storage event. With this approach, multiple parts of the transcoding can be triggered in parallel, and that too without any infrastructure maintenance headache.
Once video transcoding is complete, the transcoded video can be stored back in Cloud Storage, but in a different bucket this time, so that the transcoding Cloud Functions are not triggered again. A rough sketch of such a function follows.
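Here is a minimal Cloud Functions sketch of this trigger, using the Transcoder API Python client. The project, region, buckets, and the chosen preset are assumed placeholders:

    # Sketch of a Cloud Function (Cloud Storage "finalize" trigger) that kicks
    # off a Transcoder API job for each newly uploaded raw video.
    # Project, region, output bucket, and preset are assumptions.
    from google.cloud.video import transcoder_v1

    PROJECT = "hrl-demo"
    REGION = "us-central1"
    OUTPUT_BUCKET = "hrl-transcoded-videos"  # different bucket, avoids re-trigger

    def transcode_on_upload(event, context):
        """Background function: `event` describes the new Cloud Storage object."""
        input_uri = f"gs://{event['bucket']}/{event['name']}"
        output_uri = f"gs://{OUTPUT_BUCKET}/{event['name']}/"

        client = transcoder_v1.TranscoderServiceClient()
        job = transcoder_v1.Job(
            input_uri=input_uri,
            output_uri=output_uri,
            template_id="preset/web-hd",  # built-in preset; custom configs work too
        )
        response = client.create_job(
            parent=f"projects/{PROJECT}/locations/{REGION}", job=job)
        print(f"Created transcoding job: {response.name}")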

Another option, if they do not plan on putting this layer of Cloud Functions here: they can run a transcoding application inside App Engine, which will do the transcoding by invoking the Transcoder API, and users can invoke the application running inside App Engine via a REST API, as in the sketch below.
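Here is a minimal Flask sketch of such an App Engine service, reusing the same Transcoder API call. All names are assumed placeholders, and request validation and authentication are omitted:

    # Sketch: App Engine service exposing transcoding over REST.
    # Deployed with an app.yaml; project/region names are assumptions.
    from flask import Flask, jsonify, request
    from google.cloud.video import transcoder_v1

    app = Flask(__name__)
    client = transcoder_v1.TranscoderServiceClient()

    @app.route("/transcode", methods=["POST"])
    def transcode():
        body = request.get_json()  # expects {"input_uri": ..., "output_uri": ...}
        job = transcoder_v1.Job(
            input_uri=body["input_uri"],
            output_uri=body["output_uri"],
            template_id="preset/web-hd",
        )
        response = client.create_job(
            parent="projects/hrl-demo/locations/us-central1", job=job)
        return jsonify({"job": response.name}), 202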
