1. IT vs IS
Information technology is a subset of information systems. Information systems consist of people, processes, machines and information technology.
The current global and competitive business environment constantly demands innovation: the existing knowledge base quickly becomes obsolete, and companies must continuously strive for process improvement. The learning curve is always put to the test, and every company is striving to remain ahead of it. This shift in the way business is conducted has produced a new reality of ever-shortening product and service life cycles. More and more companies are coming out with customized products and finding ways to differentiate themselves from the competition.
A recent survey highlighted that the changes in the business environment can be summarized as follows:
• Globalization and the opening up of markets have not only increased competition but also allowed companies to operate in markets previously considered forbidden.
• The inclusion of information technology as an integral part of the business environment has ensured that companies are able to process, store and retrieve huge amounts of data at ever-dwindling costs.
• Globalization has encouraged the free movement of capital, goods and services across countries.
• Business environments are complex as well as dynamic, because they depend for sustenance upon political, economic, legal, technological, social and other factors.
• The business environment affects companies in different industries in its own unique way. For example, importers may favor a lower exchange rate while exporters may favor a higher exchange rate.
• With change in the business environment, some fundamental effects are short-term in nature while others are felt over a period of time.
Outsourcing has helped companies reduce their overhead expenses, improve productivity, shorten innovation cycles, encourage new market penetration and improve customer experience. India has seen tremendous growth in the BPO industry in functions like customer care, finance/accounts, payroll, high-end financial services, human resources, etc.
Emerging Trends
The recent explosion of information technology has produced a few significant emerging trends, for example mobile platforms for doing business, cloud computing, technology to handle large volumes of data, etc.
These fresh technologies and platforms offer numerous opportunities for companies to drive strategic business advantage and stay ahead of the competition. Companies need to work on new plans so as to maintain flexibility and deliver products and services that satisfy customers.
Hardware consists of input/output devices, processors, operating systems and media devices. Software consists of various programs and procedures. A database consists of data organized in the required structure. A network consists of hubs, communication media and network devices. People consist of device operators, network administrators and system specialists.
Information processing consists of input, processing, storage, output and control. During the input stage, data and instructions are fed to the system; during the processing stage they are worked upon by software programs and other queries. During the output stage, data is presented in a structured format and as reports.
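As a sketch of these stages, the hypothetical Python fragment below pushes a couple of made-up sales records through input, processing, storage and output. All names and records are illustrative, not from any real system:

```python
# Minimal sketch of the input -> process -> storage -> output stages.

storage = []  # the "data storage" stage: records kept for later retrieval

def input_stage(raw_records):
    """Input: raw data fed into the system (here, cleaned of stray spaces)."""
    return [r.strip() for r in raw_records]

def process_stage(records):
    """Process: software works on the data (parse item name and amount)."""
    parsed = [r.split(",") for r in records]
    return [(item, float(amount)) for item, amount in parsed]

def output_stage(processed):
    """Output: data presented in a structured report; also persisted."""
    storage.extend(processed)
    total = sum(amount for _, amount in processed)
    return {"line_items": processed, "total": total}

report = output_stage(process_stage(input_stage(["widget, 10.0 ", "gadget, 5.5"])))
print(report["total"])  # 15.5
```

The control stage, not shown here, would monitor each step (e.g. rejecting malformed records) and feed corrections back into the cycle.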
In any given organization, information systems can be classified based on the usage of the information. Therefore, an information system in an organization can be divided into the operations support system and the management support system.
The purpose of the operations support system is to facilitate business transactions, control production, support internal as well as external communication and update the organization's central database. The operations support system is further divided into the transaction processing system, process control system and enterprise collaboration system.
In a manufacturing organization, there are several types of transactions across departments. Typical organizational departments are Sales, Accounts, Finance, Plant, Engineering, Human Resources and Marketing. Across these, the following transactions may occur: sales orders, sales returns, cash receipts, credit sales, credit slips, material accounting, inventory management, depreciation accounting, etc.
These transactions can be categorized into batch transaction processing, single transaction processing and real-time transaction processing.
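The difference between batch and real-time processing can be sketched as a toy Python illustration; `ledger`, `post` and the transaction records are invented for this example:

```python
# Toy contrast of real-time vs batch transaction processing.

ledger = []  # the organization's central record of posted transactions

def post(txn):
    """Apply a single transaction to the ledger."""
    ledger.append(txn)

def process_realtime(txn):
    """Real-time: each transaction is posted the moment it arrives."""
    post(txn)

batch_queue = []

def queue_for_batch(txn):
    """Batch: transactions accumulate until a scheduled run."""
    batch_queue.append(txn)

def run_batch():
    """Post all queued transactions together, e.g. at end of day."""
    while batch_queue:
        post(batch_queue.pop(0))

process_realtime({"type": "cash_receipt", "amount": 100})
queue_for_batch({"type": "sales_order", "amount": 250})
queue_for_batch({"type": "sales_return", "amount": -40})
run_batch()
print(len(ledger))  # 3
```

Single transaction processing would be the degenerate case of a batch of one, submitted and posted on demand.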
In a manufacturing organization, certain decisions are made by a computer system without any manual intervention. In this type of system, critical information is fed to the system on a real-time basis, thereby enabling process control. This kind of system is referred to as a process control system.
In recent times, there is more stress on team effort or collaboration across different functional
teams. A system which enables collaborative effort by improving communication and sharing of
data is referred to as an enterprise collaboration system.
Information Technology
Every day, knowingly or unknowingly, everyone is utilizing information technology. It has grown rapidly and covers many areas of our day-to-day life, like movies, mobile phones, the internet, etc.
Information technology greatly enhances the performance of the economy; it provides an edge in solving social issues as well as making information systems affordable and user-friendly.
Information technology has brought about big changes in our daily life, be it in education, life at home, the workplace, communication or even the functioning of government.
Origin: Information systems have been in existence since the pre-mechanical era in the form of books, drawings, etc. However, the origin of information technology is mostly associated with the invention of computers.
Development: Information systems have undergone a great deal of evolution, i.e. from manual record keeping to current cloud storage systems. Similarly, information technology is seeing constant change, with ever-faster processors and constantly shrinking storage devices.
Business Application: Businesses have used information systems in forms ranging from manual books of accounts to modern software such as Tally. The mode of communication has also undergone big changes, for example from letters to email. Information technology has helped drive efficiency across organizations with improved productivity and precision manufacturing.
Information systems have been known to mankind in one form or another as a resource for decision making. However, with the advent of information technology, information systems have become sophisticated, and their usage has proliferated across all walks of life. Information technology has helped organize large amounts of data into useful and valuable information.
The truly productive employees are those who neither multitask nor spend endless hours watching game scores or news and event updates from around the world. Indeed, one of the reasons investment bankers and consultants are much sought after is that they have learnt to set aside short-term and ephemeral trends and instead detect longer-term trends and extrapolations from existing information that are meaningful and make business sense.
Cloud Computing
One of the most talked-about concepts in information technology is cloud computing. Cloud computing is defined as the utilization of computing services, i.e. software as well as hardware, as a service over a network. Typically, this network is the internet.
Cloud computing offers three broad types of service: Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and Software as a Service (SaaS).
Some issues concerning cloud computing are privacy, compliance, security, legal exposure, abuse, IT governance, etc.
Mobile Application
Another emerging trend within information technology is mobile applications (software applications on smartphones, tablets, etc.).
The mobile application, or mobile app, has been a success since its introduction. Apps are designed to run on smartphones, tablets and other mobile devices. They are available as downloads from the stores of various mobile operating system vendors like Apple, BlackBerry, Nokia, etc. Some mobile apps are available free, whereas others involve a download cost. The revenue collected is shared between the app distributor and the app developer.
User Interfaces
The user interface has undergone a revolution since the introduction of the touch screen. Touch screen capability has revolutionized the way end users interact with applications. A touch screen enables the user to interact directly with what is displayed and removes the need for any intermediate hand-held device like a mouse.
Touch screen capability is utilized in smartphones, tablets, information kiosks and other information appliances.
Analytics
The field of analytics has grown manyfold in recent years. Analytics is a process which helps in discovering informational patterns within data. The field of analytics is a combination of statistics, computer programming and operations research.
The field of analytics has shown growth in data analytics, predictive analytics and social analytics.
Data analytics is a tool used to support the decision-making process. It converts raw data into meaningful information.
Predictive analytics is a tool used to predict future events based on current and historical information.
Social media analytics is a tool used by companies to understand and accommodate customer needs.
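As a minimal illustration of the first two of these, the following Python sketch (standard library only, on made-up monthly sales figures) derives summary information from raw data and naively extrapolates the next value. Social media analytics is not shown, since it depends on external data sources:

```python
import statistics

sales = [100, 110, 120, 130]  # hypothetical monthly sales figures

# Data analytics: turn raw data into meaningful information.
summary = {"mean": statistics.mean(sales), "stdev": statistics.stdev(sales)}

# Predictive analytics: project the next value from the historical trend,
# here a naive extrapolation of the average month-on-month change.
diffs = [b - a for a, b in zip(sales, sales[1:])]
forecast = sales[-1] + statistics.mean(diffs)

print(summary["mean"], forecast)
```

Real predictive analytics would use proper models (regression, time-series methods); the point here is only the distinction between describing past data and projecting future values.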
The concept of big data has become reality with the development of high-capacity storage devices.
Portability: advances in information technology have made all electronic gadgets portable.
Speed: computing is now done at speeds at which earlier generations of supercomputers worked.
Miniaturization: another innovation is in the form of hand-held computing devices as well as information systems, like GPS units, smartphones, iPads, etc.
Connectivity: information technology has transformed communication capability.
Entertainment: the proliferation of multimedia and digital information has been tremendous.
User Interface: advancement in information technology has changed the way users interact with computing devices. The advent of the touch screen has made computing intuitive and interactive.
7S Framework
The 7S framework consists of 7 factors which affect organizational effectiveness. These 7 factors are strategy, organizational structure, IT systems, shared values, employee skills, management style and staff. They can be broadly categorized into hard elements (Strategy, Structure, Systems) and soft elements (Shared Values, Skills, Style and Staff). The hard elements are the ones under the direct control of management. The soft elements are not under the direct control of management and are driven by internal culture.
1. Strategy: It is defined as an action plan working towards the organization's defined objective.
2. Structure: It is defined as the design of organization-employee interaction to meet the defined objective.
3. Systems: It is defined as the information systems in which the organization has invested to fulfill its defined objective.
4. Staff: It is defined as the workers employed by the organization.
5. Style: It is defined as the approach adopted by the leadership to interact with employees, suppliers and customers.
6. Skills: It is defined as the characteristics of the employees associated with the organization.
7. Shared Values: It is the centerpiece of the whole 7S framework. It is the concept on the basis of which the organization has decided to achieve its objective.
Usage of 7S Framework
The basis of the 7S framework is that for an organization to meet its objective, it is essential that all seven elements are in sync and mutually reinforcing. The model is used to identify which of the 7 factors need to be rebalanced to align with change in the organization.
The 7S framework is helpful in identifying the pain points which are creating hurdles to the organization's growth.
DBMS:
A database management system (DBMS) is essentially a computerized data-keeping system. Users of the system are given facilities to perform several kinds of operations on such a system, either to manipulate the data in the database or to manage the database structure itself.
Some DBMS examples include MySQL, PostgreSQL, Microsoft Access, SQL Server, FileMaker, Oracle, dBase, Clipper and FoxPro.
Two types of database structure
Databases typically have one of two basic forms: single-file (flat) databases and multi-file relational (structured) databases.
A document management system (DMS) consists of hardware and software that manage and archive electronic documents and also convert paper documents into e-documents. Besides capturing and storing documents, a DMS takes care of indexing, which facilitates searching for documents in the repository.
Transaction data always has a time dimension, a numerical value and refers to one or more
objects (i.e., the reference data).
Typical transactions are:
Financial: orders, invoices, payments
Work: plans, activity records
Logistics: deliveries, storage records, travel records, etc.
Typical data management problems that a DBMS helps address include:
• Redundancy
• Inconsistency
• Security issues
(Source: https://www.guru99.com/dbms-keys.html)
Super Key – A super key is a group of single or multiple keys which identifies rows in a table.
Primary Key – is a column or group of columns in a table that uniquely identify every row in that
table.
Candidate Key – is a set of attributes that uniquely identify tuples in a table. Candidate Key is a
super key with no repeated attributes.
Alternate Key – is a candidate key that is not chosen as the primary key; like the primary key, it uniquely identifies every row in the table.
Foreign Key – is a column that creates a relationship between two tables. The purpose of Foreign
keys is to maintain data integrity and allow navigation between two different instances of an
entity.
Compound Key – has two or more attributes that allow you to uniquely recognize a specific
record. It is possible that each column may not be unique by itself within the database.
Composite Key – is a combination of two or more columns that uniquely identify rows in a table.
The combination of columns guarantees uniqueness, though individual uniqueness is not
guaranteed.
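These key types can be illustrated with a small SQLite sketch. Table and column names are invented for the example, and note that SQLite needs `PRAGMA foreign_keys = ON` before it will enforce foreign keys:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only if enabled

conn.execute("""
    CREATE TABLE customers (
        customer_id INTEGER PRIMARY KEY,  -- primary key: uniquely identifies a row
        email TEXT UNIQUE                 -- candidate/alternate key: also unique
    )""")

conn.execute("""
    CREATE TABLE orders (
        customer_id INTEGER REFERENCES customers(customer_id),  -- foreign key
        order_no INTEGER,
        PRIMARY KEY (customer_id, order_no)  -- composite key: unique in combination
    )""")

conn.execute("INSERT INTO customers VALUES (1, 'a@example.com')")
conn.execute("INSERT INTO orders VALUES (1, 100)")

# An order for a non-existent customer violates the foreign key constraint.
try:
    conn.execute("INSERT INTO orders VALUES (99, 1)")
except sqlite3.IntegrityError as e:
    print("rejected:", e)
```

Here `(customer_id, order_no)` is a composite key: neither column need be unique on its own, but their combination is.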
Data models are fundamental entities for introducing abstraction in a DBMS. Data models define how data is connected and how it is processed and stored inside the system. The very first data model was the flat data model, where all the data used is kept in the same plane.
Analytical processing involves the analysis of accumulated data. Analytical processing, sometimes
referred to as business intelligence, includes data mining, decision support systems (DSS), querying, and
other analysis activities. These analyses place strategic information in the hands of decision makers to
enhance productivity and make better decisions, leading to greater competitive advantage.
Metadata has been identified as a key success factor in data warehouse projects. It captures all kinds
of information necessary to extract, transform and load data from source systems into the data
warehouse, and afterwards to use and interpret the data warehouse contents.
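As a toy illustration of metadata-driven ETL, the sketch below uses a hypothetical metadata dictionary (not any real tool's format) to record how each source field is extracted and transformed before being loaded into the warehouse:

```python
# Metadata describing, for each warehouse column, which source field to
# extract and how to transform it. Field names are invented for the example.
metadata = {
    "amount": {"source_field": "amt", "transform": lambda v: round(float(v), 2)},
    "region": {"source_field": "rgn", "transform": str.upper},
}

def etl(source_rows):
    """Extract, transform and load source rows as dictated by the metadata."""
    warehouse = []
    for row in source_rows:
        warehouse.append({
            target: spec["transform"](row[spec["source_field"]])
            for target, spec in metadata.items()
        })
    return warehouse

loaded = etl([{"amt": "19.999", "rgn": "emea"}])
print(loaded)
```

Because the mappings live in metadata rather than code, the same `etl` routine can later be repointed at new source systems by editing the dictionary, which is the essence of the "key success factor" role metadata plays in warehouse projects.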
TCP/IP stands for Transmission Control Protocol/Internet Protocol and is a suite of communication
protocols used to interconnect network devices on the internet. TCP/IP is also used as a
communications protocol in a private computer network (an intranet or extranet).
TCP/IP is a two-layered program: the higher layer (TCP) disassembles message content into small "data packets" and reassembles them at the receiving end, while the lower layer (IP) handles the addressing of each packet so that it reaches the right destination.
TELNET is also used for chat operations, while FTP is used for downloading files. Remote login is necessary with TELNET, but is not necessary with FTP.
There are various types of connectivity to get hooked on to the Internet. They can all be broadly classified into the following categories:
(i) Gateway Access
(ii) Dial-up Connection
(iii) Leased Connection
(iv) DSL
(v) Cable Modem Connection
(vi) VSAT
What Is SEO?
Search engine optimization (SEO) is the practice of getting targeted traffic to a website from a search
engine’s organic rankings. Common tasks associated with SEO include creating high-quality content,
optimizing content around specific keywords, and building backlinks.
SEO is all about improving a site’s rankings in the organic (non-paid) section of the search results. As
an Internet marketing strategy, SEO considers how search engines work, the computer-
programmed algorithms that dictate search engine behavior, what people search for, the actual search
terms or keywords typed into search engines, and which search engines are preferred by their targeted audience.
Search engine marketing (SEM) is a form of Internet marketing that involves the promotion
of websites by increasing their visibility in search engine results pages (SERPs) primarily through paid
advertising. Search engine marketing gives you the framework, tools, and processes to gain more visibility
in search engines either by getting higher positions in organic results or better positions for your ads. The
two main types of SEM, SEO, and PPC can work together in harmony and maximize your results.
Big Data
Big data is a term that describes large, hard-to-manage volumes of data, both structured and unstructured, that inundate businesses on a day-to-day basis. Big data can be analyzed for insights that improve decisions and give confidence for making strategic business moves.
When big data is distilled and analyzed in combination with traditional enterprise data,
enterprises can develop a more thorough and insightful understanding of their business.
It leads to enhanced productivity, a stronger competitive position and greater innovation.
NETWORK
In today's world, two devices are in a network if a process in one device is able to exchange information with a process in another device. Networks are known as a medium of connection between nodes (sets of devices) or computers. A network consists of a group of computer systems, servers and networking devices linked together to share resources, such as a printer or a file server. The connections are established using either cable media or wireless media.
A Local Area Network (LAN) is a privately owned computer network covering a small geographical area, like a home, office or group of buildings, e.g. a school network. A LAN is used to connect computers and other network devices so that they can communicate with each other and share resources. The resources to be shared can be a hardware device like a printer, software like an application program, or data. The size of a LAN is usually small. The various devices in a LAN are connected to a central device called a hub or switch using cables. Nowadays LANs are also being installed using wireless technologies; such a system makes use of access points (APs) to transmit and receive data. One of the computers in a network can become a server, serving all the remaining computers, which are called clients.
MAN stands for Metropolitan Area Network, one of a number of types of networks. A MAN is a relatively new class of network. It is larger than a local area network and, as its name implies, covers the area of a single city. MANs rarely extend beyond 100 km and frequently comprise a combination of different hardware and transmission media. A MAN can be a single network, such as a cable TV network, or a means of connecting several LANs into a larger network so that resources can be shared LAN to LAN as well as device to device.
A wide area network (WAN) is a telecommunication network. A wide area network is simply a LAN of LANs, or a network of networks. WANs connect LANs that may be on opposite sides of a building, across the country or around the world. WANs are characterized by the slowest data communication rates and the largest distances. WANs can be of two types: enterprise WANs and global WANs.
Hubs
A hub is a physical layer networking device which is used to connect multiple devices in a network. They
are generally used to connect computers in a LAN.
A hub has many ports in it. A computer to be connected to the network is plugged into one of these ports. When a data frame arrives at a port, it is broadcast to every other port, without considering whether it is destined for a particular destination or not.
Switches
A switch is a data link layer networking device which connects devices in a network and uses packet
switching to send and receive data over the network. Like a hub, a switch also has many ports, to which
computers are plugged in. However, when a data frame arrives at any port of a network switch, it
examines the destination address and sends the frame to the corresponding device(s). Thus, it supports
both unicast and multicast communications.
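The forwarding difference between the two devices can be sketched as a toy simulation. Ports, MAC addresses and the frame are invented for the example; a real switch would also learn addresses from incoming frames and flood frames with unknown destinations:

```python
def hub_forward(frame, ports, in_port):
    """A hub repeats the frame to every port except the one it came in on."""
    return [p for p in ports if p != in_port]

def switch_forward(frame, mac_table, in_port):
    """A switch looks up the destination MAC and forwards to that port only."""
    dest = frame["dst"]
    if dest in mac_table:
        return [mac_table[dest]]
    return []  # unknown destination (a real switch would flood here)

ports = [1, 2, 3, 4]
mac_table = {"aa:bb": 3}  # learned mapping: device aa:bb sits on port 3
frame = {"src": "cc:dd", "dst": "aa:bb"}

print(hub_forward(frame, ports, in_port=1))         # every other port
print(switch_forward(frame, mac_table, in_port=1))  # only the matching port
```

The hub's broadcast behavior wastes bandwidth and exposes frames to every attached device, which is why switches have largely replaced hubs in modern LANs.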
A network protocol is a set of established rules that dictate how to format, transmit and receive data so
that computer network devices -- from servers and routers to endpoints -- can communicate, regardless
of the differences in their underlying infrastructures, designs or standards. To successfully send and
receive information, devices on both sides of a communication exchange must accept and follow protocol
conventions. In networking, support for protocols can be built into software, hardware or both. Without
computing protocols, computers and other devices would not know how to engage with each other. As a
result, except for specialty networks built around a specific architecture, few networks would be able to
function, and the internet as we know it wouldn't exist. Virtually all network end users rely on network
protocols for connectivity.
A set of cooperating network protocols is called a protocol suite. The Transmission Control Protocol/Internet Protocol (TCP/IP) suite, which is typically used in client-server models, includes numerous protocols across layers -- such as the data link, network, transport and application layers -- working together to enable internet connectivity. These include the following:
TCP, which uses a set of rules to exchange messages with other internet points at the information packet
level.
User Datagram Protocol, or UDP, which acts as an alternative communication protocol to TCP and
is used to establish low-latency and loss-tolerating connections between applications and the
internet.
IP, which uses a set of rules to send and receive messages at the level of IP addresses.
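A minimal loopback sketch of UDP from this suite is shown below: a datagram is sent to a local socket and read straight back with no connection setup, which is exactly what TCP would add (a connect/accept handshake before any data flows). The addresses and payload are illustrative only:

```python
import socket

# UDP (SOCK_DGRAM): connectionless, low-latency, loss-tolerating.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
receiver.settimeout(5)            # avoid blocking forever if delivery fails
addr = receiver.getsockname()

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"hello", addr)     # no connection setup needed with UDP

data, _ = receiver.recvfrom(1024)
print(data)  # b'hello'

sender.close()
receiver.close()
```

With TCP sockets (`SOCK_STREAM`), the same exchange would require `listen`/`accept` on the receiver and `connect` on the sender, in return for ordered, reliable delivery.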
1st Answer:
In general, business analysts must create reports and communicate project findings to business stakeholders and clients in order to build a better understanding.
Here, IHH plans to provide pocket-size wireless tablet devices to all its doctors for data access, entry and communication, with the collected data sent to and stored in a database management system. A multi-file relational (structured) database helps here: a relational database contains multiple tables of data with rows and columns that relate to each other through special key fields.
With a tablet-based system, IHH hospitals can relocate devices wherever they are required, whether at the usual check-in desk, pre-stage areas, or pop-up temporary hospital tents and drive-thru screening locations. The mobility of tablet-based systems allows hospitals to reconfigure quickly and flexibly as needs change, with the added benefit of having a solution that can be disinfected between patients.
Many mobile (tablet) apps are now available to help hospitals with a variety of important tasks,
including: information and time management; health record maintenance and access;
communications and consulting; reference and information gathering; patient management and
monitoring; clinical decision-making; and medical education and training. Here, ideally, IHH doctors require access to many types of resources in a clinical setting, including:
Paperless Check In: Paper registration forms can be frustrating for hospital staff. Making
sure they are filled out completely, scanning them, manually entering the data into the
EMR … dealing with these types of forms is a time- and labor-intensive process that is prone to error and takes valuable time away from patients.
Faster Insurance Verification: Insurance ID, coverage and required co-pay details come next. Manual scanning and entry of insurance cards is time-consuming, tedious and prone to error.
Electronic Record Keeping: A patient checking in can experience check-in delays simply
due to searching for misfiled paper-based records. Scanning, barcoding, and data entry
consume a lot of your time when using paper forms.
Communication capabilities—voice calling, video conferencing, text, and e-mail
Hospital information systems (HISs)—electronic health records (EHRs), electronic medical
records (EMRs), clinical decision support systems (CDSSs), picture archiving and
communication systems (PACSs), and laboratory information systems
Informational resources—textbooks, guidelines, medical literature, drug references
Clinical software applications—disease diagnosis aids, medical calculators.
Five Ways to Improve Access to Care:
See your own patients. Good care comes from access to the same person or team who knows a patient's history.
Make it easy to schedule an appointment.
Offer to see patients the day they call.
Manage patient demand.
Use e-mail with patients.
RFID technology enables hundreds of healthcare applications to improve patient safety, control
surgical instruments, assist staff and patient workflow, automate restocking and invoicing,
authenticate quality and sterilising procedures, and maintain medical records.
After logging into the doctor's account, the patient ID may be retrieved by scanning the RFID
card. The doctor has access to and may update the patient's medical information and
medications. This method helps emergency departments enhance efficiency while also
improving patient care.
3rd Answer:
As the central focus of HIPAA is data security and privacy, the main rules and regulations consist of three major components:
HIPAA Privacy Rule
Security Rule
Breach Notification Rule
The main data security aspects to consider when an enterprise is deployed onto the cloud are:
Data Breaches: A data breach is a security violation in which sensitive, protected or
confidential data is copied, transmitted, viewed, stolen or used by an individual
unauthorized to do so.
Hijacking of Accounts: The growth and implementation of the cloud in many organizations has opened a whole new set of issues in account hijacking.
Attackers now have the ability to use your (or your employees') login information to remotely access sensitive data stored on the cloud; additionally, attackers can falsify and manipulate information through hijacked credentials.
Insider Threat: An attack from inside your organization may seem unlikely, but the insider
threat does exist. Employees can use their authorized access to an organization’s cloud-
based services to misuse or access information such as customer accounts, financial
forms, and other sensitive information.
Malware Injection: Malware injections are scripts or code embedded into cloud services
that act as “valid instances” and run as SaaS to cloud servers. This means that malicious
code can be injected into cloud services and viewed as part of the software or service
that is running within the cloud servers themselves.
Abuse of Cloud Services: The expansion of cloud-based services has made it possible for both small and enterprise-level organizations to host vast amounts of data easily. However, the cloud's unprecedented storage capacity has also allowed both hackers and authorized users to easily host and spread malware, illegal software, and other digital properties.
Insecure APIs: Application Programming Interfaces (APIs) give users the opportunity to customize their cloud experience, but if the interfaces are insecure, anyone who gains access to them can view and edit data.
Denial of Service Attacks
Insufficient Due Diligence
Shared Vulnerabilities
Cloud security is a shared responsibility between the provider and the client.
This partnership between client and provider requires the client to take preventative
actions to protect their data. While major providers like Box, Dropbox, Microsoft, and
Google do have standardized procedures to secure their side, fine grain control
is up to you, the client.
Data Loss: Data on cloud services can be lost through a malicious attack, natural disaster, or a data wipe by the service provider. Losing vital information can be devastating to businesses that don't have a recovery plan.
2nd Answer:
Authorized users can access a web-based solution with customized content. IHH intends to launch a new online page to give information about its enhanced products as part of its IT renovation strategy. For creating any web portal, the most important thing is to maintain the web host and web presence after creation.
Web Hosting – Check points
• Hosting in-house vs Outsourced • Selection of the hosting platform based on the technology. • Easy
Maintenance • Security • Support
Web Presence – Life Cycle
• Domain Registration • Identifying the web site hosting provider • Website software and content
development • Deployment of website on the hosting server • Promotion of the website • Maintenance
Preparation: determining the target audience, analyzing competitors, defining the main goals
Planning: it is necessary to outline requirements for every element of the platform
Design: you should define the visual style of the web portal
Content writing: you try to define the topics that may be interesting to your consumers
Coding: here all the previous elements are combined into a completely functional web portal
Testing and Launching: it is important to make sure your code is valid
Maintenance and Updating: adding new features, updating content, processing feedback, etc.
4th Answer:
Considering the typical operations of IHH and the specific technical changes it is planning, the proposed project can be regarded as a "Big Data" problem. This can be justified by the following challenges:
Lack of knowledge Professionals: To run these modern technologies and large Data tools,
companies need skilled data professionals. These professionals will include data scientists, data
analysts, and data engineers to work with the tools and make sense of giant data sets.
Lack of proper understanding of Massive Data: Companies struggle to succeed in their Big Data
projects due to a lack of knowledge. Employees may not understand what data is, how it is
stored, processed, and where it comes from. Others may not have a clear picture of what's going
on, even if data specialists do.
Data Growth Issues: The appropriate storage of these large collections of data is one of the most serious issues of big data. The amount of data being kept in data centers and company databases is continuously rising.
Confusion during Big Data tool selection: Companies frequently become perplexed when deciding
on the best tool for large-scale data analysis and storage. They are prone to making
bad judgments and choosing ineffective technology.
Integrating Data from a Spread of Sources: In a company, data comes from a variety of places,
including social media sites, ERP systems, customer logs, financial reports, e-mails,
presentations, and employee-created reports. Combining all of this information to create reports
may be a difficult process.
Securing data: One of the biggest difficulties of Big Data is securing these massive data sets.
Companies are frequently so preoccupied with understanding, storing, and
analyzing their data sets that data security is pushed to the back burner.
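The data-integration challenge listed above can be sketched in a few lines. The `integrate_records` helper and the sample CRM/ERP records below are invented for illustration; real pipelines also need conflict resolution, cleansing, and provenance tracking.

```python
def integrate_records(*sources):
    """Merge records from several systems keyed on customer_id.
    Later sources fill in fields missing from earlier ones (toy example)."""
    merged = {}
    for source in sources:
        for record in source:
            row = merged.setdefault(record["customer_id"], {})
            for key, value in record.items():
                row.setdefault(key, value)  # first value seen wins
    return merged

crm = [{"customer_id": 1, "name": "Asha", "segment": "retail"}]
erp = [{"customer_id": 1, "orders": 12}, {"customer_id": 2, "orders": 3}]
combined = integrate_records(crm, erp)
print(combined[1])  # {'customer_id': 1, 'name': 'Asha', 'segment': 'retail', 'orders': 12}
```

Even this toy version shows why combining sources is hard: the systems must agree on a join key, and the merge policy (here, "first value wins") is a design decision in itself.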
IHH might also encounter additional challenges while making these specific technical changes.
PwC predicts that AR and VR will positively impact 23 million jobs by 2030. The technology finds
applicability in training, meetings, and even business functions such as customer service.
Both AR and VR technologies are among the most exciting trends today for various reasons. On the one
hand, AR applications let you superimpose digital content, such as images, text, and sounds, onto
real-world locations.
And on the other, VR transforms your vision into a simulated experience that can be similar or
completely different from the real world — only limited by your imagination.
The pandemic has disrupted business operations across industries, leading to changes such as remote
working. It made collaboration among teams tougher than ever, especially in roles like operations,
finance and IT where security is paramount.
The adoption of digital approaches, including the implementation of AR, has helped businesses improve
the productivity of their workforces.
Specific use cases include real-time interaction among teams worldwide, real-time access to
information, and a controlled training environment — a necessity to execute change management
during the pandemic.
The technologies have successfully come together — from recruitment and prototyping to creating an
Extended Reality (XR) environment, thus boosting your organization’s resilience.
When you check out the five pros mentioned below, you will realize how AR and VR have helped
implement change management, enabling employees to learn faster and retain the information for
longer. Here is how offices can benefit from the two technologies:
AR devices such as smart goggles can help your workforce access crucial data in real-time. This is
extremely useful in the case of manufacturing units where handling equipment on assembly lines or
other critical spots across the factory floor has to be done properly.
Workers wearing smart goggles can quickly spot any issues and fix them in time, thus avoiding
accidents. Easy access to real-time data during meetings can help in informed decision-making. This
improves productivity by ensuring the relevant data is used while framing policies or designing solutions.
The use of Augmented Reality in the workplace can be beneficial for the safety of workers and
equipment. It provides workers with a realistic training scenario of the situations they would face if an
emergency arose.
For instance, there is nothing better than using virtual simulations of a live wire obstruction or fire
breakout. It allows the employees to practice handling possible hazards before things actually go wrong.
AR also allows employees to access relevant information even from critical places like near a boiler or on
top of machinery.
Companies are adopting newer processes across functions, thereby increasing the skills gap. HR
managers often face the challenge of finding the right people for a complex task. AR helps them train
new hires through interactive training sessions.
The learning curve is shortened for the new employees, thereby saving training time. The recruits can
also be guided step-by-step through non-repetitive tasks.
Virtual prototyping speeds up the process by eliminating unnecessary steps. Virtual testing allows
optimization of the design, reducing the cost and time of product development.
5. Improving collaboration
Collaboration among remote teams has been the most significant challenge during the pandemic.
Deploying AR technology allows each teammate to be a part of the experience while saving travel costs
and keeping all safe. Visuals and data can also be shared in an easily consumable fashion with
employees for fruitful discussions.
Now that you know the benefits the two technologies bring to the table, let us see how your business
can support their deployment for addressing a change:
Introducing innovative technologies such as visual assistance requires a smooth roll-out for a quick
return on your investment. By adopting established change management strategies, you empower your
employees to leverage the new tech quickly.
A crucial step of change management is identifying leaders and change agents. Identifying key
individuals and teams helps create an environment that makes others comfortable with the change,
since change is otherwise easy to resist.
Leaders can communicate changes internally and motivate teams. They help to reduce resistance to
new processes. It would be best if you shared the benefits of adopting AR/VR support to make your
employees realize how using it will help them every day.
Acceptance of change increases once employees relate to how it eliminates mundane tasks and
achieves faster results. You can consider stakeholder feedback to determine the best ways to
communicate change and its benefits.
This can range from creating video tutorials demonstrating how to use the software to having
face-to-face interactions that explain how the adoption of AR/VR will reduce common pain points and
make work easier for employees.
Teams respond to rewards and recognition with quicker adoption of the change. Healthy competition
works best to gear up teams to change. Take help from leaders and change agents to set clear
expectations of the new process.
More resistant teams can be paired up with established champions to motivate them and unfreeze
acceptance behavior. Gamification also helps motivate employees to use new processes and tools.
You can reward them by setting up different levels of the process as challenges and tallying performance
on a leaderboard. Take the challenge up by rewarding achievers with gift cards, lunches, or personalized
accessories.
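The leaderboard idea can be sketched in a few lines of Python; the names and point values below are invented for illustration.

```python
def leaderboard(points):
    """Sort team members by accumulated adoption points, highest first."""
    return sorted(points.items(), key=lambda item: item[1], reverse=True)

# Hypothetical points earned by completing process "challenges"
points = {"Asha": 120, "Ben": 90, "Chitra": 150}
for rank, (name, score) in enumerate(leaderboard(points), start=1):
    print(f"{rank}. {name}: {score}")
```

Tallying performance this way makes the reward criteria transparent, which is part of what makes gamified adoption feel fair rather than coercive.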
Companies can use various tactics to drive the adoption of new technologies, like setting clear
expectations and compliance targets. However, formal processes such as withholding bonuses or
promotions for non-adherence add pressure to an already sensitive situation.
Your employees are already aligned with your business processes, so you must connect the new
technology with existing ones. Instead of redefining the entire process, you can integrate visual
assistance into them.
It is vital to ensure that the AR/VR technology you plan to use is easily embedded into your
infrastructure. A new interface or duplication of work can lead to resistance from teams.
Easy deployment is another benefit of identifying where visual assistance fits your existing business
processes. Identify where the new technology fits and how it can improve existing processes. Identify
common problems and resolve them to reduce employee frustration.
Create documentation for the change management process steps and decide how and when to initiate
the adoption. Plan training and support sessions to minimize repetition.
Plan your roll-out strategy well. A successful and lasting implementation starts
with confident training on using the tools at hand.
When you are looking to train your team on a new system, it is best not just to provide them with the
tool. Instead, it would be best if you eased their minds about why the change is happening and what will
happen at each stage.
Set a clear path to adoption of the new technology and easier enablement. When your team arrives for
training, they should know why the organization has invested in visual support and how they will be
trained.
Send them introductory videos, so they turn up for the training well-prepared. Allow time during
sessions to experiment with the technology with friends or family members to put users at ease with the
technology.
It gives new users hands-on experience while they learn how the technology works. Discuss their
experience and point them to the correct person for any queries they might have later.
What is Quantum Computing?
This non-classical model of computation uses quantum mechanical principles like quantum
entanglement and superposition. Quantum computers can perform certain tasks that classical
computers cannot complete within a reasonable timeline. The world of quantum is indeed mysterious. Unlike
classical bits, quantum bits (qubits) can store information as zero, one, or a superposition of both.
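A minimal sketch of superposition, assuming a single qubit represented by two complex amplitudes: applying a Hadamard gate to the basis state |0> yields a state that measures as 0 or 1 with equal probability. This is a classical simulation for intuition only, not quantum hardware.

```python
import math

def hadamard(state):
    """Apply the Hadamard gate to a single-qubit state (alpha, beta)."""
    alpha, beta = state
    s = 1 / math.sqrt(2)
    return (s * (alpha + beta), s * (alpha - beta))

def probabilities(state):
    """Measurement probabilities for |0> and |1> (Born rule: |amplitude|^2)."""
    alpha, beta = state
    return abs(alpha) ** 2, abs(beta) ** 2

zero = (1.0, 0.0)            # the definite basis state |0>
superposed = hadamard(zero)  # equal superposition of |0> and |1>
print(probabilities(superposed))  # approximately (0.5, 0.5)
```

The catch, and the reason quantum computers are hard to simulate, is that an n-qubit state needs 2^n amplitudes, so this direct approach becomes intractable on classical machines very quickly.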
Several tech giants like IBM, Intel, Microsoft, and Google (along with NASA and USRA) have already
started investing tons of money into this field.
Computational chemistry, financial applications, optimization, quantum machine learning, etc., are
generally good targets for quantum computing applications. However, quantum computing has spread
its roots far and wide, from insurance to agriculture.
Quantum computing is a rising phenomenon in the Gartner Hype Cycle. The hype cycle is a graphical
representation of technologies’ maturity, adoption & applications across industries, and the benefits
they bring to the businesses. It is expected to become one of the greatest disruptions of the age.
Quantum computing can process huge datasets in a fraction of a second that would have previously
taken days and weeks. It can also calculate almost any kind of risks, such as the impact of an
approaching hurricane.
Mitch Wein and Tom Kramer offer various use cases of quantum computing in “Quantum Computing
and Insurance: Overview and Potential Players.” However, this technology isn’t yet available for
commercial use, unlike AI.
While the insurance industry is simplifying many of its back- and front-office operations through AI, it is
still restricted by the barriers of binary computing. Quantum computing can unlock and change the entire
dynamics of how insurance companies carry out complex calculations. Insurtech companies are creating
and testing solutions around this approach and its effects will soon be visible.
With the introduction of quantum computers, the process of analyzing the data will become much faster
and easier, making it possible to deliver a highly personalized experience to customers. Microsoft
envisions a future where quantum computing is available to a broad audience, scaling as needed to
solve some of the world’s toughest challenges.
The quantum approach in retail begins within a tool we are acquainted with, such as Visual Studio. It
provides development resources to build and promote quantum solutions, and it continues with
deployment through Azure for a streamlined combination of both quantum and classical processing.
Quantum computing could provide unprecedented power and speed of processing as well as novel and
fundamentally different algorithmic search and data homogenization strategies.
From the healthcare perspective, quantum computing technology can lead to “dramatic” accelerations
in speed and performance.
With increased computing available, clinicians could easily review CT scans over time and quickly
identify changes and anomalies. Similarly, it can accelerate precision medicine. With quantum
computing’s enhanced data processing abilities, medical practitioners can quickly identify targeted
chemotherapy protocols with greater customization.
A quantum computer in agriculture could help in detecting weeds through an invasive weed optimization
algorithm. Farmers can hence craft fertilizers more effectively.
Almost all fertilizers contain ammonia, so more efficient manufacturing of ammonia or a
substitute would result in cheaper and less energy-intensive fertilizer production. However, there hasn’t
been substantial progress because the number of possible catalyst combinations is enormous. A
quantum computer could quickly analyze the possibilities and come up with a catalytic combination, a
task beyond the abilities of our largest supercomputers.
In the past few years, governments worldwide and giant tech companies such as Google, IBM,
Microsoft, Alibaba, and many others have been investing heavily in quantum computing.
With quantum supremacy, a quantum computer can perform operations that a classical computer
practically cannot. Despite tremendous progress toward this far-reaching goal, significant technical
barriers must be surmounted before quantum computing achieves its potential. This will require stable
hardware, a commercial platform for software development, and cloud computing capabilities to
unleash ultimate quantum supremacy.
IDE: What is different about quantum computing now, as we enter 2022, and why should business
take notice? Is there urgency?
Ruane: There is a wide gap between understanding the complexities of quantum computing and
harnessing the physical capabilities. Researchers have been working on this for decades, and we
shouldn’t expect 2022 to be the year it is seen everywhere; it’s an evolution. It is slowly emerging, and
we hope substantial progress will follow. Even so, businesses need to take note now.
One sign that things are heating up is the huge amount of resources being applied. Tremendous levels of
investment, private-sector competition, and scientific talent are focused on quantum research. Venture
capital funding grew by 500% from 2015 to 2020, according to CB Insights. Research and development
heavyweights Google, Amazon, Honeywell, IBM, and Intel are also in the race to deliver the next
quantum breakthrough.
When we see the public sector, academia, and industry getting involved and investing in a technology at
this rate, breakthroughs often happen. There is no physics-based reason why quantum progress is not
achievable. Many of the processes we are trying to harness already exist in the natural world – that’s
why it is such an exciting space.
Executives need to think about how new computing models will spur digital investment, reshape
industries, and spark innovation. A solid understanding of quantum applications today is crucial for
positioning a company to reap the benefits—and avoid potential catastrophe—during the next decade.
IDE: Discuss the cybersecurity gains and risks posed by quantum computing and the timeframe for
implementation.
Ruane: Today, most encrypted data -- whether at rest or in transit -- would be useless to a nefarious
third party who got access to it. Data safety is ensured because it would take an unfeasible amount of time to
break the security protocols using classical computers, not because it is theoretically impossible to break
them. Quantum computers, on the other hand, have the potential to break important elements of
current cryptography. Even though quantum machines of the requisite scale are a long way off, any data
that is captured today could be stored and eventually decrypted once quantum computers are widely
available.
IDE: What should business leaders consider today? Can you offer some specific recommendations?
Ruane: Quantum computers won’t be widely available this decade, but it takes time to realize the
impact of a new paradigm like this and how it will be used. Businesses can start building teams to
understand what data changes will be needed and how new architecture will work with current data,
systems, and computers. One immediate approach is to run small internal experiments and to build
relationships with quantum cloud providers.
There is already a shortage of talent. Up-skilling of data scientists to be quantum-capable is a great way
to address this within an organization. Application developers will have to be retrained, too, and
executive buy-in is needed. Advocates must get the conversation started while speaking realistically
about the benefits.
What is virtualization?
Virtualization is the creation of a virtual -- rather than actual -- version of something, such as an
operating system (OS), a server, a storage device or network resources.
Virtualization uses software that simulates hardware functionality to create a virtual system. This
practice allows IT organizations to operate multiple operating systems, more than one virtual system and
various applications on a single server. The benefits of virtualization include greater efficiencies and
economies of scale.
OS virtualization is the use of software to allow a piece of hardware to run multiple operating system
images at the same time. The technology got its start on mainframes decades ago, allowing
administrators to avoid wasting expensive processing power.
A key use of virtualization technology is server virtualization, which uses a software layer -- called
a hypervisor -- to emulate the underlying hardware. This often includes the CPU's memory,
input/output (I/O) and network traffic.
Hypervisors take the physical resources and separate them so they can be utilized by the virtual
environment. They can sit on top of an OS or they can be directly installed onto the hardware. The latter
is how most enterprises virtualize their systems.
The Xen hypervisor is an open source software program that is responsible for managing the low-level
interactions that occur between virtual machines (VMs) and the physical hardware. In other words, the
Xen hypervisor enables the simultaneous creation, execution and management of various virtual
machines in one physical environment.
With the help of the hypervisor, the guest OS, normally interacting with true hardware, is now doing so
with a software emulation of that hardware; often, the guest OS has no idea it's on virtualized hardware.
While the performance of this virtual system is not equal to the performance of the operating system
running on true hardware, the concept of virtualization works because most guest operating systems
and applications don't need the full use of the underlying hardware.
This allows for greater flexibility, control and isolation by removing the dependency on a given hardware
platform. While initially meant for server virtualization, the concept of virtualization has spread to
applications, networks, data and desktops.
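The resource partitioning described above can be sketched with a hypothetical `ToyHypervisor` class. Real hypervisors schedule and emulate hardware; this sketch only models the bookkeeping of carving physical resources into isolated virtual machines.

```python
class ToyHypervisor:
    """Minimal sketch of a hypervisor partitioning physical resources
    among virtual machines (illustration only, not real virtualization)."""

    def __init__(self, cpus, memory_gb):
        self.free_cpus = cpus
        self.free_memory_gb = memory_gb
        self.vms = {}

    def create_vm(self, name, cpus, memory_gb):
        # A VM can only be given resources the physical host still has free
        if cpus > self.free_cpus or memory_gb > self.free_memory_gb:
            raise RuntimeError("insufficient physical resources")
        self.free_cpus -= cpus
        self.free_memory_gb -= memory_gb
        self.vms[name] = {"cpus": cpus, "memory_gb": memory_gb}

host = ToyHypervisor(cpus=16, memory_gb=64)
host.create_vm("web", cpus=4, memory_gb=16)
host.create_vm("db", cpus=8, memory_gb=32)
print(host.free_cpus, host.free_memory_gb)  # 4 16
```

The guest systems never see each other's allocations, which is the isolation property the surrounding text describes.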
Types of virtualization
You probably know a little about virtualization if you have ever divided your hard drive into different
partitions. A partition is the logical division of a hard disk drive to create, in effect, two separate hard
drives.
3. Server virtualization is the masking of server resources -- including the number and identity of
individual physical servers, processors and operating systems -- from server users. The intention
is to spare the user from having to understand and manage complicated details of server
resources while increasing resource sharing and utilization and maintaining the capacity to
expand later.
4. Data virtualization is abstracting the traditional technical details of data and data management,
such as location, performance or format, in favor of broader access and more resiliency tied to
business needs.
5. Desktop virtualization is virtualizing a workstation load rather than a server. This allows the user
to access the desktop remotely, typically using a thin client at the desk. Since the workstation is
essentially running in a data center server, access to it can be both more secure and portable.
The operating system license still needs to be accounted for, as well as the infrastructure.
6. Application virtualization is abstracting the application layer away from the operating system.
This way, the application can run in an encapsulated form without depending on the
operating system underneath. This can allow a Windows application to run on Linux and vice
versa, in addition to adding a level of isolation.
Advantages of virtualization
Lower costs. Virtualization reduces the amount of hardware servers necessary within a company
and data center. This lowers the overall cost of buying and maintaining large amounts of
hardware.
Easier disaster recovery. Disaster recovery is very simple in a virtualized environment. Regular
snapshots provide up-to-date data, allowing virtual machines to be feasibly backed up and
recovered. Even in an emergency, a virtual machine can be migrated to a new location within
minutes.
Easier testing. Testing is less complicated in a virtual environment. Even if a large mistake is
made, the test does not need to stop and go back to the beginning. It can simply return to the
previous snapshot and proceed with the test.
Quicker backups. Backups can be taken of both the virtual server and the virtual
machine. Automatic snapshots are taken throughout the day to guarantee that all data is up-to-
date. Furthermore, virtual machines can be easily migrated between hosts and
efficiently redeployed.
Improved productivity. Fewer physical resources result in less time spent managing and
maintaining the servers. Tasks that can take days or weeks in a physical environment can be
done in minutes. This allows staff members to spend the majority of their time on more
productive tasks, such as raising revenue and fostering business initiatives.
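The snapshot-and-rollback workflow behind the easier-testing and quicker-backup points above can be sketched with a hypothetical `ToyVM` that deep-copies its state. Real hypervisors snapshot disks and memory, but the logic is the same: capture, experiment, restore.

```python
import copy

class ToyVM:
    """Sketch of snapshot/rollback: VM state is captured with a deep copy
    and restored after a failed test (illustration only)."""

    def __init__(self):
        self.state = {"packages": ["base"], "config": {"debug": False}}
        self._snapshots = []

    def snapshot(self):
        self._snapshots.append(copy.deepcopy(self.state))

    def rollback(self):
        self.state = self._snapshots.pop()

vm = ToyVM()
vm.snapshot()                                      # capture a known-good state
vm.state["packages"].append("experimental-build")  # a risky test change
vm.rollback()                                      # the mistake is undone in one step
print(vm.state["packages"])  # ['base']
```

The deep copy matters: a shallow copy would share the inner lists, and the "snapshot" would silently mutate along with the live state.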
Benefits of virtualization
Virtualization provides companies with the benefit of maximizing their output. Additional benefits for
both businesses and data centers include the following:
Expedited deployment and redeployment. When a physical server crashes, the backup server
may not always be ready or up to date. There also may not be an image or clone of the server
available. If this is the case, then the redeployment process can be time-consuming and tedious.
However, if the data center is virtualized, then the process is quick and fairly simple. Virtual
backup tools can expedite the process to minutes.
Reduced heat and improved energy savings. Companies that use a lot of hardware servers risk
overheating their physical resources. The best way to prevent this from happening is to decrease
the number of servers used for data management, and the best way to do this is through
virtualization.
Better for the environment. Companies and data centers that utilize copious amounts of
hardware leave a large carbon footprint; they must take responsibility for the pollution they are
generating. Virtualization can help reduce these effects by significantly decreasing the necessary
amounts of cooling and power, thus helping clean the air and the atmosphere. As a result,
companies and data centers that virtualize will improve their reputation while also enhancing
the quality of their relationship with customers and the planet.
Limitations of virtualization
Before converting to a virtualized environment, it is important to consider the various upfront costs. The
necessary investment in virtualization software, as well as hardware that might be required to make the
virtualization possible, can be costly. If the existing infrastructure is more than five years old, an initial
renewal budget will have to be considered.
Fortunately, many businesses have the capacity to accommodate virtualization without spending large
amounts of cash. Furthermore, the costs can be offset by collaborating with a managed service
provider that provides monthly leasing or purchase options.
There are also software licensing considerations to take into account when creating a virtualized
environment. Companies must ensure that they have a clear understanding of how their vendors view
software use within a virtualized environment. This is becoming less of a limitation as more software
providers adapt to the increased use of virtualization.
Converting to virtualization takes time and may come with a learning curve. Implementing and
controlling a virtualized environment requires each IT staff member to be trained and to possess expertise
in virtualization. Furthermore, some applications do not adapt well when brought into a virtual
environment. The IT staff will need to be prepared to face these challenges and should address them
prior to converting.
There are also security risks involved with virtualization. Data is crucial to the success of a business and,
therefore, is a common target for attacks. The chances of experiencing a data breach significantly
increase while using virtualization.
Finally, in a virtual environment, users lose control of what they can do because there are several links
that must collaborate to perform the same task. If any part is not working, then the entire operation will
fail.
Today, ERP systems are critical for managing thousands of businesses of all sizes and in all
industries. To these companies, ERP is as indispensable as the electricity that keeps the lights on.
What is an ERP system?
These solutions manage an organization's day-to-day business activities, such as
accounting, finance, procurement, project management, supply chain, and manufacturing.
Enterprise resource planning systems are complete, integrated platforms, either on-premises or
in the cloud, managing all aspects of a production-based or distribution business. Furthermore,
ERP systems support all aspects of financial management, human resources, supply chain
management, and manufacturing with your core accounting function.
ERP systems will also provide transparency into your complete business process by tracking all
aspects of production, logistics, and financials. These integrated systems act as a business's
central hub for end-to-end workflow and data, allowing a variety of departments to access it.
ERP systems and software support multiple functions across enterprise, mid-sized, and small
businesses, including customizations for your industry.
ERP fundamentals
ERP systems are designed around a single, defined data structure (schema) that typically has a
common database. This helps ensure that the information used across the enterprise is
normalized and based on common definitions and user experiences. These core constructs are
then interconnected with business processes driven by workflows across business departments
(e.g. finance, human resources, engineering, marketing, and operations), connecting systems
and the people who use them. Simply put, ERP is the vehicle for integrating people, processes,
and technologies across a modern enterprise.
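A minimal sketch of the single-schema idea, using an in-memory SQLite database. The `orders` table and the "department" functions are invented for illustration; the point is that every function reads and writes the same normalized data, just as ERP departments share one common database.

```python
import sqlite3

# One shared schema: sales, operations, and finance all work
# against the same orders table, so they see identical data.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL, shipped INTEGER)")

def record_sale(amount):                       # "sales department"
    db.execute("INSERT INTO orders (amount, shipped) VALUES (?, 0)", (amount,))

def ship_order(order_id):                      # "operations department"
    db.execute("UPDATE orders SET shipped = 1 WHERE id = ?", (order_id,))

def total_revenue():                           # "finance department"
    return db.execute("SELECT SUM(amount) FROM orders").fetchone()[0]

record_sale(250.0)
record_sale(100.0)
ship_order(1)
print(total_revenue())  # 350.0
```

Without the shared schema, each department would keep its own copy of the orders and the figures would drift apart, which is exactly the integration problem ERP exists to solve.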
To the suppliers −
o Helps in giving clear-cut instructions
o Online data transfer reduces paperwork
Inventory Economy −
o Low cost of handling inventory
o Lower cost of stock outages by deciding the optimum size of replenishment
orders
o Achieve excellent logistical performance, such as just-in-time delivery
Distribution Point −
o Satisfied distributors and wholesalers ensure that the right products
reach the right place at the right time
o Clear business processes subject to fewer errors
o Easy accounting of stock and cost of stock
Channel Management −
o Reduce total number of transactions required to provide product
assortment
o Organization is logically capable of performing customization
requirements
Financial management −
o Low cost
o Realistic analysis
Operational performance −
o It involves delivery speed and consistency.
External customer −
o Conformance of product and services to their requirements
o Competitive prices
o Quality and reliability
o Delivery
o After sales services
To employees and internal customers −
o Teamwork and cooperation
o Efficient structure and system
o Quality work
o Delivery
CUSTOMER RELATIONSHIP MANAGEMENT
CRM is an enterprise application module that manages a company's interactions with current and
future customers by organizing and coordinating sales and marketing, and by providing better
customer service along with technical support.
Atul Parvatiyar and Jagdish N. Sheth provide an excellent definition for customer relationship
management in their work titled - 'Customer Relationship Management: Emerging Practice,
Process, and Discipline' −
Customer Relationship Management is a comprehensive strategy and process of acquiring, retaining,
and partnering with selective customers to create superior value for the company and the customer. It
involves the integration of marketing, sales, customer service, and the supply-chain functions of the
organization to achieve greater efficiencies and effectiveness in delivering customer value.
Why CRM?
To keep track of all present and future customers.
To identify and target the best customers.
To let the customers know about the existing as well as the new products and
services.
To provide real-time and personalized services based on the needs and habits of the
existing customers.
To provide superior service and consistent customer experience.
To implement a feedback system.
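Identifying and targeting the best customers, as listed above, can be sketched as a simple ranking over interaction records. The `best_customers` helper and the sample data are hypothetical; real CRM analytics also weigh recency, frequency, and retention.

```python
def best_customers(interactions, top_n=2):
    """Rank customers by total purchase value to target the most
    valuable ones (toy sketch of CRM analytics)."""
    totals = {}
    for customer, amount in interactions:
        totals[customer] = totals.get(customer, 0) + amount
    ranked = sorted(totals.items(), key=lambda item: item[1], reverse=True)
    return ranked[:top_n]

# Hypothetical (customer, purchase amount) interaction log
interactions = [("Asha", 500), ("Ben", 120), ("Asha", 300), ("Chitra", 700)]
print(best_customers(interactions))  # [('Asha', 800), ('Chitra', 700)]
```

Once the best customers are identified this way, the personalized service and targeted communication goals above have a concrete segment to act on.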
SCOPE OF CRM
Marketing and fulfillment
Customer service and support
Retention and loyalty
Store front and field service
Sales: cross-selling, telesales, up-selling
Advantages of CRM
Provides better customer service and increases customer revenues.
Discovers new customers.
Cross-sells and up-sells products more effectively.
Helps sales staff to close deals faster.
Makes call centers more efficient.
Simplifies marketing and sales processes.
Disadvantages of CRM
Sometimes record loss is a major problem.
Overhead costs.
Training employees is an issue in small organizations.
Types of Computers
Supercomputers
Mainframe computers
Minicomputers
Personal computers (PCs) or microcomputers
Supercomputers
– a powerful computer that can process large amounts of data and do a great amount of
computation very quickly.
Science
Engineering
Education
Defence
Aerospace
Supercomputers are useful for applications involving very large databases or that require a great
amount of computation.
Weather forecasting
Climate research
Scientific simulation
Oil and gas exploration
Quantum mechanics
Cryptanalysis
Mainframe computers
– a large, powerful computer used by organizations for critical applications and bulk data
processing. Typical users include:
Institutions
Research
Academics
Health care
Libraries
Large businesses
Financial institutions
Stock brokerage firms
Insurance agencies
Census taking
Industry and consumer statistics
Enterprise resource planning
Transaction processing
e-business and e-commerce
“Minicomputer” is a term that is no longer used much. In recent years, minicomputers are
often referred to as small or midsize servers (a server is a central computer that provides
information to other computers).
Personal computer (PC) – a small computer designed for use by a single user at a time.
A PC or microcomputer uses a single chip (microprocessor) for its central processing unit
(CPU).
“Microcomputer” is now primarily used to mean a PC, but it can refer to any kind of
small computer, such as a desktop computer, laptop computer, tablet, smartphone, or
wearable.
RACK SERVER
A rack server, or rack-mounted server, is any server that is built specifically to be mounted within a
server rack. Rack servers are a general-purpose machine that can be configured to support a wide range
of requirements. They are most commonly found in data center environments but can also be used in
smaller computer closets. Unlike traditional tower servers that look much like a PC, a rack server is
wide and flat so it can be secured into the rack using mounting screws or rails, depending on the design.
If you only require a small number of servers, rack servers are the best choice economically due to their
lower upfront costs.
The height, or the number of rack units the system takes up, can vary quite a bit depending on
what is required from the system. Larger servers allow for additional CPUs, memory, or other
components. The servers themselves are mounted one on top of the other within a rack to help
minimize the amount of space used.
Power – Rack servers are typically built with all the needed components to operate as a stand-
alone system. They can be very powerful and are used to run high end applications.
Convenience – Having the ability to easily mount a server within a rack is convenient and saves a
lot of space, especially when compared to a traditional tower style server.
Cooling – Cooling a rack server is easier than most others. They are usually equipped with
internal fans and placing them in a rack increases airflow.
Ideal for Lower Quantity – Rack servers are best suited when you need more than one server
(but fewer than about 10) because they don’t require a massive chassis.
BLADE SERVER (virtualization)
A blade server is a modular server that allows multiple servers to be housed in a smaller area. These
servers are physically thin and typically only have CPUs, memory, integrated network controllers, and
sometimes storage drives built in. Any video cards or other components that are needed are provided
by the server chassis, which the blades slide into. Blade servers are often seen in large data centers
due to their ability to fit so many servers into a single rack and to provide high processing power.
In most cases, one large chassis such as HPE’s BladeSystem will be mounted into a server rack and then
multiple blade servers slide into the chassis. The chassis can then provide the power, manage
networking, and more. This allows each blade server to operate more efficiently and requires fewer
internal components.
Blade servers are generally used when there is a high computing requirement with some type of
Enterprise Storage System: Network Attached Storage (NAS) or a Storage Area Network (SAN). They
maximize available space by providing the highest processor per RU availability. Blade Servers also
provide rapid serviceability by allowing components to be swapped out without taking the machine
offline. You will be able to scale to a much higher processor density using the Blade architecture. The
facility will need to support a much higher thermal and electrical load per square foot.
Hot Swappable – Blade servers can be configured to be hot swappable so if one blade has a
problem, it can be pulled and replaced much more easily. This helps to facilitate redundancy.
Less Need for Cables – Rather than having to run individual cables for each server, blade servers
can have one cable (often fiber) run to the chassis, thus reducing the total cable requirements.
Processing Power – Blade Servers can provide an extremely high processing power while taking
up minimal space
RFID
The use of RFID for inventory management requires a scanner that uses radio waves to
communicate with an RFID tag. The tag itself contains a microchip that allows the reader to read
data and also write data to the tag for real-time updating in place. Each tag is wrapped in a
material like plastic or paper for protection and can be affixed to a variety of surfaces for
tracking. Most tags used for inventory tracking are passive RFID tags, meaning they contain no
battery and are powered by the waves from the readers. Active tags are powered, come at a
higher cost, and are used for long-range tracking of machinery such as trucks and railway cars.
Using RFID tags for inventory management offers several benefits, such as reduced labor costs
and faster scanning. Here’s a look at how RFID tags can be a benefit in the inventory
management process.
Improved visibility and faster scanning. Since RFID tags do not require a “line-of-
sight” scan like barcodes, it is possible to read them at a distance for fast inventory
processing. They can also be read in any orientation and give you improved
visibility into your inventory with the potential for more frequent updates and scanning
locations.
Reduced labor costs. With labor costs accounting for as much as 50-80% of
distribution center costs, RFID offers potential benefits in this area. Inventory check-in,
counting, and shipment verification can be done very quickly and automatically in a few
scans without the need for multiple employees to process them. These savings must be
weighed against the cost of investing in an RFID inventory solution, which we’ll discuss
in more detail below.
Tracking of returnable assets. For those companies that utilize a returnable fleet of
assets such as containers and pallets, there is often a significant capital investment to
protect. Utilizing RFID allows you to track these assets through the entire supply chain
loop and provide increased visibility on inventory locations. This has the added benefit of
improving returns and reducing theft or neglect.
While there are some benefits of using RFID tags for inventory management, the technology also
comes with several disadvantages that hinder usability and introduce other concerns, such as
security. Here’s a look at the distinct disadvantages of using RFID tags for inventory
management.
Inability to use cell phones as scanners. Even though there are fixed and remote RFID
readers available, it is not possible to use a phone to scan them, as can be done with
barcodes. This is especially limiting as it requires drivers or employees in the field to
carry specific RFID readers to do any scans, and phones cannot be used as a backup if the
provided readers fail.
Prohibitive costs when scaling. RFID tags cost significantly more than barcode labels.
In addition, they utilize specific readers that must be purchased from the limited number
of RFID equipment manufacturers. This can add significant costs when scaling these
solutions with the requirement for additional specialized scanners and RFID tags.
Demanding infrastructure needs. Setup for these systems requires the integration of the
readers, tags, inventory management system, network, and building wiring that can take a
significant amount of time and resources to set up. In some cases, companies may need
to update their inventory management system entirely, as some software platforms do not
support RFID. Also, if real-time asset tracking is required, the RFID-enabled system will
need to utilize GPS and cellular data to communicate, which can put a significant burden
on your system.
Security concerns. While RFID systems continue to update and improve their data
security, they can still be vulnerable to hacking. Remote devices, including cell phones,
can sometimes be used to scan tags at close range and copy tag data. This could later be
used to create a cloned tag or copy the information to another tag, a risk of particular
concern in the retail industry.
While the use of RFID tags in inventory management offers some compelling and tangible
benefits, there is a great deal of work to be done to streamline these systems. Much of the
challenge involves scaling this solution in a cost-effective way while updating infrastructure
enough to be able to capitalize on its greatest benefits.
RFID can be useful for some applications, but for most companies looking for an accurate, user-
friendly, and cost-effective solution for inventory management, barcode labels are a proven and
trusted solution. Some barcode labels, such as Camcode’s Metalphoto® inventory tags, are
durable enough to withstand harsh environments in both indoor and outdoor applications and
offer excellent resistance to chemicals, solvents, and abrasion.
Compared to RFID tags, barcodes are just as accurate, if not more so, and they can be affixed to
any surface material without impacting accuracy. In contrast, materials like metal can interfere
with an RFID tag’s ability to transmit data, and liquid can hinder an RFID tag’s signal. While it’s
possible to use RFID tags on metal surfaces or items, it requires the use of a special type of tag
with an RFID block to prevent interference, adding to the overall cost.
The main difference between an active and passive RFID inventory system is the way in which
the tags are powered during operation, but the basic workflow in a warehouse is the same for
both configurations. Before a shipment is sent to the warehouse it will have had an RFID tag, or
chip, attached to individual items or an entire pallet. This RFID tag stores important information
about the item.
When the shipment arrives at the destination, each RFID tag will transmit its information to
readers installed within the warehouse. These readers will have been placed in strategic locations
within the receiving and storage areas to pick up the best possible signal. The data is transmitted
via electromagnetic waves and is relayed from the readers to a central warehouse management
system. From there, information can be modified and sent back to the RFID tags for later recall
at any time. This gives warehouse operators the ability to perform tasks such as real-time asset
counts and advanced inventory transactions.
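The read/write workflow described above can be sketched in Python. This is a minimal model, not a real RFID stack; every class, field, and value here is hypothetical:

```python
class RFIDTag:
    """A passive tag storing item data; readable and writable in place."""
    def __init__(self, tag_id, item_data):
        self.tag_id = tag_id
        self.data = dict(item_data)


class WarehouseSystem:
    """Central management system that readers relay tag transmissions to."""
    def __init__(self):
        self.inventory = {}

    def receive(self, tag):
        # A reader powers the passive tag and relays its stored data here.
        self.inventory[tag.tag_id] = dict(tag.data)

    def update_tag(self, tag, **changes):
        # Information can be modified and written back to the tag for later recall.
        tag.data.update(changes)
        self.inventory[tag.tag_id] = dict(tag.data)

    def count(self, sku):
        # Real-time asset count by SKU across all tags seen so far.
        return sum(1 for d in self.inventory.values() if d.get("sku") == sku)


wms = WarehouseSystem()
for i in range(3):
    wms.receive(RFIDTag(f"TAG-{i}", {"sku": "WIDGET-01", "location": "receiving"}))
print(wms.count("WIDGET-01"))  # 3
```

The key point the sketch captures is that the tag itself carries the data: the central system only mirrors what readers relay, and writes go back to the tag.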
Using an RFID inventory system has been shown to improve inventory accuracy by up to
13% compared to traditional inventory tracking methods and manual inventory checks in some
situations. Particularly in the retail industry, where maintaining inventory accuracy is an ongoing
challenge, there are some benefits to implementing a more automated system. In a warehouse
environment, the implementation costs may outweigh the benefits in some cases. It can be
beneficial in situations where a very high inventory accuracy rate is required, but there is always
a balance between hardware and labor costs that should be considered.
For companies looking to adopt an inventory tracking solution, it’s important to conduct a
thorough analysis of the differences between established technologies such as barcode labels
and tags designed for inventory control and newer solutions like RFID to ensure that you will
achieve a desirable ROI for your investment. For most companies, barcode labels are smart and
practical choices for inventory management.
Applications of SQL
As mentioned before, SQL is one of the most widely used query languages for databases. A few of
its applications are listed here:
Allows users to access data in the relational database management systems.
Allows users to describe the data.
Allows users to define the data in a database and manipulate that data.
Allows SQL to be embedded within other languages using SQL modules, libraries & pre-compilers.
Allows users to create and drop databases and tables.
Allows users to create view, stored procedure, functions in a database.
Allows users to set permissions on tables, procedures and views
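Several of the capabilities listed above (defining data, manipulating it, creating views) can be demonstrated with Python's built-in sqlite3 module; the table, columns, and data here are illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Define the data: create a table.
cur.execute("CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT, dept TEXT)")

# Manipulate the data: insert rows.
cur.executemany("INSERT INTO employees (name, dept) VALUES (?, ?)",
                [("Alice", "IT"), ("Bob", "HR"), ("Carol", "IT")])

# Create a view over the data.
cur.execute("CREATE VIEW it_staff AS SELECT name FROM employees WHERE dept = 'IT'")

# Access the data through the view.
cur.execute("SELECT name FROM it_staff ORDER BY name")
names = [row[0] for row in cur.fetchall()]
print(names)  # ['Alice', 'Carol']
conn.close()
```

The same CREATE/INSERT/SELECT statements would run largely unchanged against MySQL or SQL Server, which is the portability the list above is describing.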
MySQL
MySQL is an open-source SQL database, developed by the Swedish company MySQL AB.
MySQL is pronounced as "my ess-que-ell," in contrast with SQL, pronounced "sequel."
MySQL supports many different platforms, including Microsoft Windows, the major Linux
distributions, UNIX, and Mac OS X.
MySQL has free and paid versions, depending on its usage (non-commercial/commercial) and
features. MySQL comes with a very fast, multi-threaded, multi-user and robust SQL database
server.
Features
High Performance.
High Availability.
Scalability and Flexibility.
Robust Transactional Support.
Web and Data Warehouse Strengths.
Strong Data Protection.
Comprehensive Application Development.
Management Ease.
Open Source Freedom and 24 x 7 Support.
Lowest Total Cost of Ownership.
MS ACCESS
This is one of the most popular Microsoft products. Microsoft Access is an entry-level database
management software. MS Access database is not only inexpensive but also a powerful database
for small-scale projects.
MS Access uses the Jet database engine, which utilizes a specific SQL language dialect (sometimes
referred to as Jet SQL).
MS Access comes with the Professional edition of the MS Office package. MS Access has an easy-to-use,
intuitive graphical interface.
Features
Users can create tables, queries, forms and reports and connect them together with
macros.
Option of importing and exporting the data to many formats including Excel,
Outlook, ASCII, dBase, Paradox, FoxPro, SQL Server, Oracle, ODBC, etc.
There is also the Jet Database format (MDB or ACCDB in Access 2007), which can
contain the application and data in one file. This makes it very convenient to
distribute the entire application to another user, who can run it in disconnected
environments.
Microsoft Access offers parameterized queries. These queries and Access tables can
be referenced from other programs like VB6 and .NET through DAO or ADO.
The desktop editions of Microsoft SQL Server can be used with Access as an
alternative to the Jet Database Engine.
Microsoft Access is a file server-based database. Unlike the client-server relational
database management systems (RDBMS), Microsoft Access does not implement
database triggers, stored procedures or transaction logging.
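The parameterized-query pattern mentioned in the features above can be sketched with Python's built-in sqlite3 module (reaching Access itself would go through DAO, ADO, or ODBC; the table and values here are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT, price REAL)")
cur.execute("INSERT INTO products (name, price) VALUES (?, ?)", ("Keyboard", 25.0))
cur.execute("INSERT INTO products (name, price) VALUES (?, ?)", ("Mouse", 12.5))

# The '?' placeholders keep the query text fixed while the parameter varies,
# avoiding string concatenation and the SQL injection that comes with it.
max_price = 20.0
cur.execute("SELECT name FROM products WHERE price <= ?", (max_price,))
cheap = [row[0] for row in cur.fetchall()]
print(cheap)  # ['Mouse']
conn.close()
```

This is the same idea as an Access parameterized query: the query is defined once and re-run with different parameter values from the calling program.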
EMERGING TRENDS IN INFORMATION TECHNOLOGY
Due to the impact of the pandemic and changing business environments, we can see
substantial growth in technological transformation in the Information Technology field.
Robotics: a significant industry that is gaining momentum with time. The field of
robotics offers a lot to the world, including smart factories, agile robotics, and cobots
(robots that can work together with humans). Companies in this space include Boston
Dynamics and Universal Robots. The global robotics market was valued at 27.73 billion dollars
in 2020 and is projected to reach 74.1 billion dollars.
Real life example – South Korea's top wireless carrier SK Telecom has introduced a virtual
human model to promote its AI assistant-based platform. The female virtual model is
named SUA. Sua will make her debut as an SK Telecom model in commercials for the
company's 'A.' service, which operates around 10 channels for movies, dramas, and series.
Virtual Reality / Augmented Reality: Augmented reality blends real-world
elements with virtual ones, while VR creates an environment that is fully virtual. The technology
does not stop with the entertainment industry; it continues to grow in the military, sports,
medicine, and education. Top companies using AR and VR include Microsoft, Apple, Google,
Qualcomm, and Samsung.
Real life example – Lenskart, and aviation training and maintenance.
Big data analytics: the use of cutting-edge technologies to store, process, analyse, and
make sense of large collections of data. It helps in every sector, for example improving the
quality of health care and detecting fraud.
Companies using these tools include Netflix and Amazon.
Quantum computing: it uses the properties of quantum physics to perform
calculations and simulations that would not be possible on a traditional machine. These tools
help in fields such as biology and climate research. Companies working on this include IBM and Intel.
Real life example – The most beneficial impact of quantum computing on human life
may be in healthcare. Better internal imaging simulations will detect and diagnose the
early stages of cancer and other diseases. Quantum computing also has the
potential to play a crucial role in national security, where quantum computers can be used
for defence purposes such as developing superior materials for military vehicles.
5G Networks: we live in a world where massive amounts of data must be transmitted
quickly. With the increasing demand for reliable connectivity and better bandwidth, it is
estimated that by 2025 5G will have more than 1.7 billion subscribers across the globe. It is all
about better connectivity that is super-fast with low latency (the delay between sending
information and receiving a response), and it will power new services for a wide range of verticals.
Real life example - Nearly five years after India embarked on the road to launching the
next generation of mobile telephony, Prime Minister Narendra Modi on 1 Oct,2022
launched 5G services in select cities, ushering in what promises to be an era of ultra-high-
speed Internet on mobile phones and devices.
Telecom companies launching the services are Bharti Airtel, Reliance Jio, and Vodafone India.
IoT: IoT (the Internet of Things) is fast becoming an integral part of our lives. It is a futuristic
technology recognised for its social and technological marvels. It means accessing and controlling
everyday equipment and devices using the Internet. The most important features of IoT
are connectivity, analysing, integrating, and active engagement. Companies
using it include IBM and Verizon.
Real life example - Smart appliances (stoves, refrigerators, washers and dryers, coffee
machines, slow cookers), Smart security systems, smart locks, and smart doorbells, Smart
home hubs (that control lighting, home heating and cooling, etc.), Smart assistants (like
Amazon Alexa or Apple’s Siri), Fitness trackers, sleep trackers, and smart scales.
In business and industry, IoT gives companies a real-time glimpse into the inner
workings of their systems. From the factory floor to the customer’s door, IoT
delivers insights into everything from machine performance to supply chain and
logistics operations.
CYBER SECURITY: several organisations have fallen prey to cyber-attacks, including
big names like LinkedIn and Facebook. A few cyber security companies are Cisco,
McAfee, and IBM. Cyber security is the activity by which information and other
communication systems are protected from and/or defended against unauthorized use,
modification, exploitation, or even theft.
BLOCK CHAIN: A blockchain is a growing list of records, called blocks, which are
linked using cryptography. Each block contains a cryptographic hash of the previous
block, a timestamp, and transaction data. Blockchain has generated a lot of buzz these days,
mainly because it is the backbone of the most famous cryptocurrency in the world,
Bitcoin. Many governments and leading banks have decided to base many of their
conventional transactions on the Blockchain concept. The applications and potential of
this framework are huge and are changing the way transactions are made in various
domains.
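The block structure described above (a hash of the previous block, a timestamp, and transaction data) can be sketched in a few lines of Python. This is a toy illustration of the hash-linking idea, not a real blockchain implementation:

```python
import hashlib
import json

def block_hash(block):
    # Hash the block's contents deterministically.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def make_block(prev_block, transactions, timestamp):
    # Each block stores the cryptographic hash of the previous block.
    return {
        "prev_hash": block_hash(prev_block) if prev_block else "0" * 64,
        "timestamp": timestamp,
        "transactions": transactions,
    }

def is_valid(chain):
    # Verify every block's stored prev_hash matches the previous block's hash.
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = [make_block(None, ["coinbase"], 0)]
chain.append(make_block(chain[-1], ["Alice pays Bob 5"], 1))
chain.append(make_block(chain[-1], ["Bob pays Carol 2"], 2))
print(is_valid(chain))  # True

chain[1]["transactions"] = ["Alice pays Bob 500"]  # tamper with one block
print(is_valid(chain))  # False
```

Because every block's hash covers the previous block's hash, altering any historical transaction breaks all the links after it, which is why blockchain records are considered tamper-evident.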
Real life example – Supply chain management: the blockchain promotes overall
efficiency in supply chain management. In healthcare, patients have the right to
accurate information, which makes the security and privacy of patient health data crucial;
you can track batch numbers of drugs using a blockchain. Digital IDs: Microsoft, for
instance, is focused on creating digital identification for thousands of refugees and
impoverished people, to connect them to the financial sector through an app.
Digital voting: Voting frauds are commonplace but with the blockchain, these will no
longer be a cause for concern. The blockchain helps governments by making votes
transparent. Any change to this network will immediately be detected by regulators.
Royalty protection is one of the biggest benefits of blockchain technology. Ownership
records for blogs, music, videos, etc. are necessary and can be stored in a blockchain.
This protects the artist or content creator’s rights and ensures that they get a fair share of
the profits.