
Machine learning

Machine learning methods are playing an increasingly important role in data analysis because they can deal with massive amounts of data; in fact, the more data the better. Most machine learning methods construct hypotheses from data. Our growing abilities to store large amounts of data in rapid-access computer memory and to compute with these data have enabled techniques that store and use all the data as they are needed.
These insights yield connections between deep learning and diverse physical and mathematical topics, including random landscapes, spin glasses, jamming, dynamical phase transitions, chaos, Riemannian geometry, random matrix theory, free probability, and non-equilibrium statistical mechanics. Indeed, the fields of statistical mechanics and machine learning have long enjoyed a rich history of strongly coupled interactions, and recent advances at the intersection of statistical mechanics and deep learning suggest these interactions will only deepen going forward.
Linear regression

Linear regression analysis is often used by life scientists. For example, the
equation for the regression of one variable on another may suggest
hypotheses about why the two variables are functionally related. More
practically, regression can be used in situations where the dependent
variable is difficult, expensive or impossible to measure, but its values can
be predicted from another easily measured variable to which it is
functionally related.
Linear regression analysis gives an equation for a line that describes the functional relationship between two variables and tests whether the statistics that describe this line are significantly different from zero. The simplest functional relationship between a dependent and an independent variable is a straight line, Y = a + bX. Only two statistics, the intercept a (which is the value of Y when X is zero) and the slope of the line b, are needed to uniquely describe where that line lies on a graph.
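
As a minimal sketch (not from the original text), the snippet below fits a straight line Y = a + bX to a small made-up data set using scipy.stats.linregress; the data values and variable names are hypothetical, and the reported p-value corresponds to the test of whether the slope differs from zero.

```python
# Minimal sketch: fitting a straight line Y = a + bX to sample data.
# The data values here are made up purely for illustration.
import numpy as np
from scipy import stats

# Hypothetical measurements: X is easy to measure, Y is the variable
# we would like to predict from X.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

result = stats.linregress(x, y)

print(f"slope b     = {result.slope:.3f}")      # estimated slope of the line
print(f"intercept a = {result.intercept:.3f}")  # value of Y when X is zero
print(f"p-value     = {result.pvalue:.3g}")     # test of whether the slope differs from zero
```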
Python: the programming language of the future

Python's growth rate was not easy to achieve. The Python programming language is well suited to app development, web development, game development, scientific computing, and system administration.
Python is an easy-to-learn and expressive language, making it suitable for almost anyone to use. It is based on object-oriented programming and is an interpreted, cross-platform, easy-to-use language.
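
As a small hypothetical illustration (not taken from the original text) of the expressive, object-oriented style mentioned above, the class and helper function below are invented purely for demonstration.

```python
# A hypothetical example of Python's concise, object-oriented style.
from dataclasses import dataclass


@dataclass
class Measurement:
    label: str
    value: float


def summarize(samples):
    """Return the average value of a list of Measurement objects."""
    return sum(s.value for s in samples) / len(samples)


samples = [Measurement("a", 1.2), Measurement("b", 3.4), Measurement("c", 2.9)]
print(f"mean = {summarize(samples):.2f}")  # mean = 2.50
```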
Python supports a wide range of applications and will keep doing so in the future. These include web applications, software-based applications, desktop GUIs, console-based programs, scientific and numeric computing, business and enterprise systems, image processing, and more.
The future of Python is bright: it plays a vital role in backend web development and has risen to become one of the most popular languages. Python is updated frequently, and new features keep improving its efficiency, so its growth looks promising. Even top companies that have long relied on Java now treat Python as a trending technology, both today and for the future. As a result, Python has become a core language used for research, production, and development. Small companies, large companies, and start-ups alike choose Python to meet their customers' requirements, and it is frequently voted a favorite language ahead of C, C++, and R. For writing scripts and testing mobile devices, it is among the most in-demand languages in the IT industry, and it remains a great tool at large scale. Companies in India are looking for highly skilled Python developers, and statistics show a continuous rise in their salaries. Python also offers excellent memory management.
Design and microarchitecture of the IBM System z10 microprocessor
In addition to the high-frequency pipeline that runs at 4.4 GHz, other
distinctive innovations within the z10 core have also been described.
These innovations address various aspects of a microprocessor design.
The enhanced branch prediction reduces misprediction penalties and
initiates I-cache prefetching. The L1.5 cache and the support for both
software cache management and hardware data prefetching reduce the
overall cache-miss penalties. The second-level TLB and the large page
provision reduce overall TLB-miss latencies and software overhead. New
instructions have been added to support software optimization. In addition,
decimal floating-point operations are done in hardware, and COP
functionalities are enhanced. Finally, many power-saving techniques are
incorporated for an energy-efficient design suitable for a mainframe
system.
Computing Game Design with Automata Theory
The basic theme of applying automata theory to real-world problems remains intact when it is applied to game design. The limitation of automata-theoretic tools for modelling computing games is that a non-deterministic finite state automaton cannot be programmed directly in a language until it has been converted to a deterministic finite state automaton, because the epsilon edges of a non-deterministic automaton cannot be translated into programming-language commands. A computer is a deterministic machine that is instructed to perform a task, whereas at the level of non-determinism it is the human brain that works better. To translate the logic of a non-deterministic finite state automaton (NDFSA), unification and minimization may be applied using Kleene's theorem [9] to obtain a regular expression, which may then be translated into a deterministic finite state automaton that is ideal for programming in any computing language. Despite this limitation, 'applied automata theory' is a discipline, still in its inception as an applied field, in which applied research may yield advances in many areas, not limited to computing but extending to the social and medical sciences.
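
As a minimal sketch under stated assumptions (the automaton below is hypothetical and not from the paper), the following code removes epsilon edges and non-determinism with the standard subset construction, which is one common way to obtain the deterministic automaton the text describes; the regular-expression route via Kleene's theorem is an alternative path to the same result.

```python
# A minimal sketch of converting an NFA with epsilon edges into an
# equivalent DFA via the standard subset construction.
from collections import deque

def epsilon_closure(states, eps):
    """All states reachable from `states` using only epsilon edges."""
    closure, stack = set(states), list(states)
    while stack:
        s = stack.pop()
        for t in eps.get(s, set()):
            if t not in closure:
                closure.add(t)
                stack.append(t)
    return frozenset(closure)

def nfa_to_dfa(start, accept, delta, eps, alphabet):
    """delta: dict (state, symbol) -> set of states; eps: dict state -> set of states."""
    start_set = epsilon_closure({start}, eps)
    dfa_delta, seen, queue = {}, {start_set}, deque([start_set])
    while queue:
        current = queue.popleft()
        for a in alphabet:
            nxt = set()
            for s in current:
                nxt |= delta.get((s, a), set())
            nxt = epsilon_closure(nxt, eps)
            dfa_delta[(current, a)] = nxt
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    dfa_accept = {S for S in seen if S & accept}
    return start_set, dfa_accept, dfa_delta

# Hypothetical NFA accepting strings over {a, b} that end in "ab".
delta = {(0, "a"): {0, 1}, (0, "b"): {0}, (1, "b"): {2}}
eps = {}
d_start, d_accept, d_delta = nfa_to_dfa(0, {2}, delta, eps, {"a", "b"})
print(len({S for (S, _) in d_delta}), "DFA states reachable")
```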
Understanding the Internet AS topology and its applications
Autonomous Systems (ASes) in the Internet use BGP to perform inter-domain routing. A set of import and export policies at an AS makes up its routing table. Since AS relationships are not publicly available, several studies have proposed heuristic algorithms for inferring AS relationships from publicly available BGP data. Content Delivery Network (CDN) servers placed around the world cater to the needs of the clients that access their content. Since the majority of Internet traffic today is content-delivery traffic, it is important to study the efficiency of the routing paths from users to content servers, paths which are not under the control of the content providers. Netflix and Akamai are two major CDN providers, and the user experience depends on the performance of their CDN servers. Hence, it is important for a CDN to choose the ideal server when a user requests content from its network. Due to the lack of route authentication in BGP, prefixes are prone to being hijacked by ASes to which they do not belong. The mechanisms currently used to address this detect a hijack after it has happened and then react to it; a preventive mechanism is needed to stop hijacks from happening in the first place. Recent work proposed a list of serial hijackers that could enable such a solution, but unfortunately the available ground truth on serial hijackers is very small.
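
As an illustrative sketch only (this is not the specific algorithm of the studies mentioned above), the code below applies a simple degree-based heuristic, in the spirit of Gao's classic relationship-inference algorithm, to a few hypothetical AS paths; real inputs would come from public BGP route collectors.

```python
# A minimal sketch of a degree-based heuristic for inferring AS relationships
# from BGP AS paths. The paths and AS numbers below are hypothetical.
from collections import defaultdict

# Hypothetical AS paths observed in BGP routing tables (origin AS is last).
as_paths = [
    [7018, 3356, 2914, 64512],
    [3356, 2914, 64513],
    [2914, 3356, 7018, 64514],
]

# Step 1: approximate each AS's degree by counting its distinct neighbors.
neighbors = defaultdict(set)
for path in as_paths:
    for a, b in zip(path, path[1:]):
        neighbors[a].add(b)
        neighbors[b].add(a)
degree = {asn: len(n) for asn, n in neighbors.items()}

# Step 2: in each path, the highest-degree AS is assumed to be the "top
# provider"; links before it are labeled customer-to-provider (c2p) and
# links after it provider-to-customer (p2c).
relationships = {}
for path in as_paths:
    top = max(range(len(path)), key=lambda i: degree[path[i]])
    for i, (a, b) in enumerate(zip(path, path[1:])):
        relationships[(a, b)] = "c2p" if i < top else "p2c"

for (a, b), rel in sorted(relationships.items()):
    print(f"AS{a} -> AS{b}: {rel}")
```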
