Friday, July 10, 2015

BCS/CER/CNT/OSI 7 LAYERS


Definition: The Open Systems Interconnection (OSI) reference model and how its seven layers of functions provide vendors and developers with a common language for discussing how messages should be transmitted between any two points in a telecommunications network.


OSI (Open Systems Interconnection) is a reference model for how applications can communicate over a network. A reference model is a conceptual framework for understanding relationships.

The purpose of the OSI reference model is to guide vendors and developers so the digital communication products and software programs they create will interoperate, and to facilitate clear comparisons among communications tools. Most vendors involved in telecommunications make an attempt to describe their products and services in relation to the OSI model. And although useful for guiding discussion and evaluation, OSI is rarely actually implemented, as few network products or standard tools keep all related functions together in well-defined layers as related to the model. The TCP/IP protocols, which define the Internet, do not map cleanly to the OSI model.

The seven Open Systems Interconnection layers are:

Layer 7: The application layer. This is the layer at which communication partners are identified (Is there someone to talk to?), network capacity is assessed (Will the network let me talk to them right now?), and the data to be sent is created or the data received is opened. (This layer is not the application itself; it is the set of services an application should be able to make use of directly, although some applications may perform application-layer functions.)

Layer 6: The presentation layer. This layer is usually part of an operating system and converts incoming and outgoing data from one presentation format to another (for example, from clear text to encrypted text at one end and back to clear text at the other).

Layer 5: The session layer. This layer sets up, coordinates and terminates conversations. Services include authentication and reconnection after an interruption.

Layer 4: The transport layer. This layer manages packetization of data, then the delivery of the packets, including checking for errors in the data once it arrives. 

Layer 3: The network layer. This layer handles the addressing and routing of the data (sending it in the right direction to the right destination on outgoing transmissions and receiving incoming transmissions at the packet level). IP is the network layer for the Internet.

Layer 2: The data-link layer. This layer sets up links across the physical network, putting packets into network frames. It has two sub-layers: the Logical Link Control layer and the Media Access Control layer. Ethernet is the most widely used data-link layer protocol.

Layer 1: The physical layer. This layer conveys the bit stream through the network at the electrical, optical or radio level. It provides the hardware means of sending and receiving data on a carrier network.
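As a rough illustration of how the layers divide the work, the following minimal Python sketch opens a TCP connection and sends a plain HTTP request: the application code works at roughly layers 5-7, while the operating system's TCP/IP stack and the network hardware take care of layers 1-4. The host name and request text are placeholders used only for this example.

import socket

HOST = "example.com"   # placeholder server name, used only for this sketch
PORT = 80

# Layer 4 and below: the OS sets up the TCP connection and moves the bytes.
with socket.create_connection((HOST, PORT)) as sock:
    # Layers 5-7: the application decides what to say and how to encode it.
    request = "GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n"
    sock.sendall(request.encode("ascii"))
    reply = sock.recv(4096)   # bytes delivered back up through the stack
    print(reply.decode("ascii", errors="replace")[:200])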


Friday, June 19, 2015

BCS/PGD/ITE/Climate Change

Climate Change



"Climate change is a change in global or regional climate patterns, in particular a change apparent from the mid to late 20th century onwards and attributed largely to the increased levels of atmospheric carbon dioxide produced by the use of fossil fuels"

Natural Causes

The Earth’s climate can be affected by natural factors that are external to the climate system, such as changes in volcanic activity, solar output, and the Earth's orbit around the Sun. Of these, the two factors relevant on timescales of contemporary climate change are changes in volcanic activity and changes in solar radiation. In terms of the Earth’s energy balance, these factors primarily influence the amount of incoming energy. Volcanic eruptions are episodic and have relatively short-term effects on climate. Changes in solar irradiance have contributed to climate trends over the past century, but since the Industrial Revolution the effect of additions of greenhouse gases to the atmosphere has been about ten times that of changes in the Sun’s output.

Human Causes

Climate change can also be caused by human activities, such as the burning of fossil fuels and the conversion of land for forestry and agriculture. Since the beginning of the Industrial Revolution, these human influences on the climate system have increased substantially. In addition to other environmental impacts, these activities change the land surface and emit various substances to the atmosphere. These in turn can influence both the amount of incoming energy and the amount of outgoing energy and can have both warming and cooling effects on the climate.  The dominant product of fossil fuel combustion is carbon dioxide, a greenhouse gas. The overall effect of human activities since the Industrial Revolution has been a warming effect, driven primarily by emissions of carbon dioxide and enhanced by emissions of other greenhouse gases.
The build-up of greenhouse gases in the atmosphere has led to an enhancement of the natural greenhouse effect.  It is this human-induced enhancement of the greenhouse effect that is of concern because ongoing emissions of greenhouse gases have the potential to warm the planet to levels that have never been experienced in the history of human civilization. Such climate change could have far-reaching and/or unpredictable environmental, social, and economic consequences.

Short-lived and long-lived climate forcers

Carbon dioxide is the main cause of human-induced climate change. It has been emitted in vast quantities from the burning of fossil fuels and it is a very long-lived gas, which means it continues to affect the climate system during its long residence time in the atmosphere. 
However, fossil fuel combustion, industrial processes, agriculture, and forestry-related activities emit other substances that also act as climate forcers. Some, such as nitrous oxide, are long-lived greenhouse gases like carbon dioxide, and so contribute to long-term climate change. 
Other substances have shorter atmospheric lifetimes because they are removed fairly quickly from the atmosphere. Therefore, their effect on the climate system is similarly short-lived. Together, these short-lived climate forcers are responsible for a significant amount of current climate forcing from anthropogenic substances. 
A number of short-lived climate forcers have climate warming effects and together are the most important contributors to the human enhancement of the greenhouse effect after carbon dioxide. This includes methane and tropospheric ozone – both greenhouse gases – and black carbon, a small solid particle formed from the incomplete combustion of carbon-based fuels (coal, oil and wood for example).

Other short-lived climate forcers have climate cooling effects, most notably sulphate aerosols. Fossil fuel combustion emits sulphur dioxide into the atmosphere (in addition to carbon dioxide) which then combines with water vapour to form tiny droplets (aerosols) which reflect sunlight. Sulphate aerosols remain in the atmosphere for only a few days (washing out in what is referred to as acid rain), and so do not have the same long-term effect as greenhouse gases. 

Monday, April 20, 2015

BCS/PGD/TIME TABLE



PGD LEVEL TIME TABLE



5th May, Tuesday       6.30 p.m. - 8.00 p.m.    IT Environment
6th May, Wednesday     6.30 p.m. - 8.00 p.m.    CSM
7th May, Thursday      6.30 p.m. - 8.00 p.m.    MIS
8th May, Friday        6.30 p.m. - 8.00 p.m.    SE2

9th May, Saturday      1.30 p.m. - 3.00 p.m.    IT Environment
9th May, Saturday      3.00 p.m. - 4.30 p.m.    CSM
9th May, Saturday      4.30 p.m. - 6.00 p.m.    MIS
9th May, Saturday      6.00 p.m. - 7.30 p.m.    SE2

Monday, March 16, 2015

BCS/DIP/ITPM/ Role and Responsibility of project team

Project Roles and Responsibilities

There are many groups of people involved in both the project and project management lifecycles.
The Project Team is the group responsible for planning and executing the project. It consists of a Project Manager and a variable number of Project Team members, who are brought in to deliver their tasks according to the project schedule.
·         The Project Manager is the person responsible for ensuring that the Project Team completes the project. The Project Manager develops the Project Plan with the team and manages the team’s performance of project tasks. It is also the responsibility of the Project Manager to secure acceptance and approval of deliverables from the Project Sponsor and Stakeholders. The Project Manager is responsible for communication, including status reporting, risk management, escalation of issues that cannot be resolved in the team, and, in general, making sure the project is delivered on budget, on schedule, and within scope.

·         The Project Team Members are responsible for executing tasks and producing deliverables as outlined in the Project Plan and directed by the Project Manager, at whatever level of effort or participation has been defined for them.

·         On larger projects, some Project Team members may serve as Team Leads, providing task and technical leadership, and sometimes maintaining a portion of the project plan.


The Executive Sponsor is a manager with demonstrable interest in the outcome of the project who is ultimately responsible for securing spending authority and resources for the project. Ideally, the Executive Sponsor should be the highest-ranking manager possible, in proportion to the project size and scope. The Executive Sponsor acts as a vocal and visible champion, legitimizes the project’s goals and objectives, keeps abreast of major project activities, and is the ultimate decision-maker for the project. The Executive Sponsor provides support for the Project Sponsor and/or Project Director and Project Manager and has final approval of all scope changes, and signs off on approvals to proceed to each succeeding project phase. The Executive Sponsor may elect to delegate some of the above responsibilities to the Project Sponsor and/or Project Director.
The Project Sponsor and/or Project Director is a manager with demonstrable interest in the outcome of the project who is responsible for securing spending authority and resources for the project. The Project Sponsor acts as a vocal and visible champion, legitimizes the project’s goals and objectives, keeps abreast of major project activities, and is a decision-maker for the project. The Project Sponsor will participate in and/or lead project initiation, including the development of the Project Charter. He or she will participate in project planning (high level) and the development of the Project Initiation Plan. The Project Sponsor provides support for the Project Manager; assists with major issues, problems, and policy conflicts; removes obstacles; is active in planning the scope; approves scope changes; signs off on major deliverables; and signs off on approvals to proceed to each succeeding project phase. The Project Sponsor generally chairs the steering committee on large projects. The Project Sponsor may elect to delegate any of the above responsibilities to other personnel either on or outside the Project Team.
The Steering Committee generally includes management representatives from the key organizations involved in the project oversight and control, and any other key stakeholder groups that have special interest in the outcome of the project. The Steering Committee acts individually and collectively as a vocal and visible project champion throughout their representative organizations; generally they approve project deliverables, help resolve issues and policy decisions, approve scope changes, and provide direction and guidance to the project. Depending on how the project is organized, the Steering Committee can be involved in providing resources, assisting in securing funding, acting as liaison to executive groups and sponsors, and filling other roles as defined by the project.
Customers comprise the business units that identified the need for the product or service the project will develop. Customers can be at all levels of an organization. Since it is frequently not feasible for all the Customers to be directly involved in the project, the following roles are identified:
·         Customer Representatives are members of the Customer community who are identified and made available to the project for their subject matter expertise. Their responsibility is to accurately represent their business units’ needs to the Project Team, and to validate the deliverables that describe the product or service that the project will produce. Customer Representatives are also expected to bring information about the project back to the Customer community. Towards the end of the project, Customer Representatives will test the product or service the project is developing, using and evaluating it while providing feedback to the Project Team.
·         Customer Decision-Makers are those members of the Customer community who have been designated to make project decisions on behalf of major business units that will use, or will be affected by, the product or service the project will deliver. Customer Decision-Makers are responsible for achieving consensus of their business unit on project issues and outputs, and communicating it to the Project Manager. They attend project meetings as requested by the Project Manager, review and approve process deliverables, and provide subject matter expertise to the Project Team. On some projects they may also serve as Customer Representatives or be part of the Steering Committee.
Stakeholders are all those groups, units, individuals, or organizations, internal or external to our organization, which are impacted by, or can impact, the outcomes of the project. This includes the Project Team, Sponsors, Steering Committee, Customers, and Customer co-workers who will be affected by the change in Customer work practices due to the new product or service; Customer managers affected by modified workflows or logistics; Customer correspondents affected by the quantity or quality of newly available information; and other similarly affected groups.
Key Stakeholders are a subset of Stakeholders who, if their support were to be withdrawn, would cause the project to fail.

Vendors are contracted to provide additional products or services the project will require and may also be considered members of the Project Team.

Wednesday, March 11, 2015

BCS/ PGD/ MIS/ CSM short notes

Testing
Unit testing tests the minimal software unit, typically a module or program. Each unit (basic component) of the software is tested to verify that the detailed design for the unit has been correctly implemented.
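As a minimal sketch of what a unit test looks like in practice, the example below uses Python's built-in unittest module; discount_price is a made-up unit under test, shown only to illustrate verifying one small component in isolation.

import unittest

def discount_price(price, percent):
    # Hypothetical unit under test: apply a percentage discount to a price.
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class DiscountPriceTest(unittest.TestCase):
    # Each test verifies one small piece of behaviour of the unit in isolation.
    def test_typical_discount(self):
        self.assertEqual(discount_price(200.0, 10), 180.0)

    def test_zero_discount_leaves_price_unchanged(self):
        self.assertEqual(discount_price(99.99, 0), 99.99)

    def test_invalid_percent_is_rejected(self):
        with self.assertRaises(ValueError):
            discount_price(100.0, 150)

if __name__ == "__main__":
    unittest.main()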

Integration testing exposes defects in the interfaces and interaction between integrated components (modules or programs). Progressively larger groups of tested software components, corresponding to elements of the architectural design, are integrated and tested until the software works as a system.

Functional testing tests at any level (class, module, interface, or system) for proper functionality as defined in the specification.
• System testing tests a completely integrated system to verify that it meets its requirements.
• System integration testing verifies that a system is integrated to any external or third party systems defined in the system requirements.

Acceptance testing can be conducted by the end-user, customer or client to validate whether or not to accept the product. Acceptance testing may be performed as part of the hand-off process between any two phases of development. Operational testing will occur at this point.

Alpha testing is simulated or actual operational testing by potential users/customers or by an independent test team.
Beta testing comes after alpha testing. Versions of the software, known as beta versions, are released to a wider group of users beyond the development team. The software is released to groups of people so that further testing can ensure the product has few faults or bugs.
Although both Alpha and Beta are referred to as testing, the rigours that are applied are often unsystematic and many basic tenets of the testing process are not used.
The Alpha and Beta period provides an insight into environmental and utilisation conditions that can impact on the software.

Regression tests - After modifying the software, either for a change in functionality or to fix defects, a regression test re-runs previously passed tests on the modified software to ensure that the modifications have not unintentionally caused a regression of previous functionality. Regression testing can be performed at any or all of the above test levels.

Implementation/ Changeover methods

As technologies change, many businesses find themselves needing to change over their computer information systems. Upgrading these systems helps them optimize their efficiency and remain competitive. Common changeover areas include security systems, database systems, accounting systems and managerial information systems. Deciding which changeover technique will work best for a particular company depends on the type of changeover and degree of risk for the company.

Parallel Changeover

In a parallel changeover, the new system runs simultaneously with the old for a given period of time. Of all the techniques, this tends to be the most popular, mainly because it carries the lowest risk. If something goes wrong at any point, the entire system can be reverted to its original state. A primary disadvantage of running two systems at the same time is higher cost. The parallel changeover process also can be quite time-consuming.

Direct Changeover

Direct changeover, also referred to as immediate replacement, tends to be the least favorite of the changeover techniques. In a direct changeover, the entire system is replaced in an instant. Basically, as soon as the new system is powered up, the old system is shut down. This type of changeover carries the most risk because, if something goes wrong, reverting to the old system is usually impossible. Using the direct changeover technique tends to work best in situations where a system failure isn't critical enough to result in a disaster for the company.

Phased Changeover

The phased changeover technique is considered a compromise between parallel and direct changeovers. In a phased changeover, the new system is implemented one stage at a time. As an example, consider a company working toward installing a new financial system. Implementing the new system one department at a time, the company converts accounts receivable, accounts payable, payroll, and so on. Advantages to phased changeovers are their low cost and isolated errors. The main disadvantage is the process takes a long time to complete because phases need to be implemented separately.

Pilot Changeover

With a pilot changeover, the new system is tried out at a test site before launching it company-wide. For example, a bank may first test the system at one of its branches. This branch is referred to as the pilot, or beta, site for the program. Since parallel changeovers tend to be expensive, using the pilot changeover technique allows companies to run the new system next to their old but on a much smaller scale. This makes the pilot changeover method much more cost-effective. After the kinks are worked out of the system at the test site, companies usually opt to use the direct changeover technique to launch the system company-wide.

ISO 9001

As an ISO 9001 certified organisation you will have implemented Quality Management System requirements for all areas of the business, including:

·         Facilities
·         People
·         Training
·         Services
·         Equipment
Some of the benefits to any organisation:


·         Provides senior management with an efficient management process
·         Sets out areas of responsibility across the organisation
·         Mandatory if you want to tender for some public sector work
·         Communicates a positive message to staff and customers
·         Identifies and encourages more efficient and time saving processes
·         Highlights deficiencies
·         Reduces your costs
·         Provides continuous assessment and improvement
·         Marketing opportunities
Some of the benefits to customers:

·         Improved quality and service
·         Delivery on time
·         Right first time attitude
·         Fewer returned products and complaints
·         Independent audit demonstrates commitment to quality

ISO 9001 - Principles

1 – Customer focus Principle
Organizations depend on their customers and therefore should understand current and future customer needs, should meet customer requirements and strive to exceed customer expectations.
2 – Leadership Principle
Leaders establish unity of purpose and direction of the organization. They should create and maintain the internal environment in which people can become fully involved in achieving the organization’s objectives.
3 – Involvement of people Principle
People at all levels are the essence of an organization and their full involvement enables their abilities to be used for the organization’s benefit.
4 – Process approach Principle
A desired result is achieved more efficiently when activities and related resources are managed as a process.
5 – System approach to management Principle
Identifying, understanding and managing interrelated processes as a system contributes to the organization’s effectiveness and efficiency in achieving its objectives.
6 – Continual improvement Principle
Continual improvement of the organization’s overall performance should be a permanent objective of the organization.
7 – Factual approach to decision making Principle
Effective decisions are based on the analysis of data and information.
8 – Mutually beneficial supplier relationships Principle
An organization and its suppliers are interdependent and a mutually beneficial relationship enhances the ability of both to create value.

DDM

DDM (Dynamic Decision Making) is a simulation-based decision support system for AHP (the Analytic Hierarchy Process). It is applicable to dynamic decision scenarios where probabilistic interactions exist between the factors in the AHP hierarchy. Decision scenarios are generated using probability information specified by the user. The output of the simulation is the relative frequency with which each alternative is selected, rather than a single final selection. The decision maker evaluates the distribution histogram and makes the final selection based on his or her own disposition to risk. DDM can be used for forecasting or for evaluating strategic planning options.
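The Python sketch below shows the general simulation idea described above, not the DDM software itself: each trial perturbs the factor scores within user-specified ranges, and the output is the relative frequency with which each alternative comes out on top. The alternatives, criteria weights and score ranges are all invented for illustration.

import random
from collections import Counter

alternatives = ["Option A", "Option B", "Option C"]

# Hypothetical criterion weights and per-alternative score ranges (low, high).
weights = {"cost": 0.5, "quality": 0.3, "risk": 0.2}
score_ranges = {
    "Option A": {"cost": (6, 9), "quality": (5, 7), "risk": (4, 8)},
    "Option B": {"cost": (4, 7), "quality": (7, 9), "risk": (5, 9)},
    "Option C": {"cost": (5, 8), "quality": (6, 8), "risk": (3, 6)},
}

def run_simulation(trials=10_000):
    wins = Counter()
    for _ in range(trials):
        # One decision scenario: draw a score for every criterion and alternative.
        totals = {
            alt: sum(w * random.uniform(*score_ranges[alt][crit])
                     for crit, w in weights.items())
            for alt in alternatives
        }
        wins[max(totals, key=totals.get)] += 1
    # Relative frequency of selection, rather than a single final answer.
    return {alt: wins[alt] / trials for alt in alternatives}

print(run_simulation())   # e.g. {'Option A': 0.31, 'Option B': 0.52, 'Option C': 0.17}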


 Document Image Processing

In document image processing, paper documents are first scanned and stored on a hard disk or other storage location. Document image processing can be summed up as scanning, storing, retrieving and managing. The final outcome is a compatible electronic format, which makes documents easier and quicker to access. Document image processing comprises a set of simple techniques and procedures used to work on the images of documents and convert them from pixel information into a format that can be read by a computer.

Document Image Processing Techniques

Basically, document image processing using OCR is divided into two steps. The first step captures the text, based on the information in the document: it identifies the orientation, tables, words and their colors, font sizes and other textual matter in the file. The second step involves graphical processing, which works on drawings, dividing lines between paragraphs and sections, logos and other pictorial representations. Because images are one of the most important components of a document, it is important to process them rather than just locate them in the document.
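A minimal sketch of the first (text-capture) step is shown below, assuming the Tesseract OCR engine plus the pytesseract and Pillow packages are installed; the file name is a placeholder.

from PIL import Image
import pytesseract

image = Image.open("scanned_page.png")      # scanned document stored as pixels
image = image.convert("L")                  # simple pre-processing: convert to greyscale
text = pytesseract.image_to_string(image)   # pixel information -> machine-readable text
print(text)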

Advantages of Document Image Processing

Once these steps are complete, the original document is converted into a semantic representation that is more compact than the original. Document image processing is also effective when a document contains handwritten text or graphics. These kinds of documents are not directly usable by most software, so document image processing is needed to make them compatible. Document image processing also finds applications in categorizing mail and arranging books in a library, where the information is converted into an electronic format that is easily portable. Thus, document image processing techniques will become more widely used as computer-based and handwritten material is converted into electronic documents. These electronic documents are easier to deal with and can be preserved for many years in an electronic environment.
Data Dictionary

A data dictionary is a collection of descriptions of the data objects or items in a data model for the benefit of programmers and others who need to refer to them. A first step in analyzing a system of objects with which users interact is to identify each object and its relationship to other objects. This process is called data modeling and results in a picture of object relationships. After each data object or item is given a descriptive name, its relationship is described (or it becomes part of some structure that implicitly describes relationship), the type of data (such as text or image or binary value) is described, possible predefined values are listed, and a brief textual description is provided. This collection can be organized for reference into a book called a data dictionary.
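As a small sketch of what individual data dictionary entries might record; the object names, types and descriptions below are invented for illustration.

# Each entry describes one data item: its name, type, allowed values,
# relationships to other objects, and a brief textual description.
customer_entries = [
    {
        "name": "customer_id",
        "type": "integer",
        "allowed_values": "positive, system-generated",
        "related_to": ["order.customer_id"],
        "description": "Unique identifier assigned to each customer.",
    },
    {
        "name": "customer_status",
        "type": "text",
        "allowed_values": ["active", "dormant", "closed"],
        "related_to": [],
        "description": "Current standing of the customer account.",
    },
]

for entry in customer_entries:
    print(entry["name"], "(" + entry["type"] + "):", entry["description"])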

End-user computing (EUC) refers to systems in which non-programmers can create working applications.
EUC is a group of approaches to computing that aim to better integrate end users into the computing environment. These approaches attempt to realize the potential for high-end computing to perform problem-solving in a trustworthy manner.
End-user computing can range in complexity from users simply clicking a series of buttons, to writing scripts in a controlled scripting language, to being able to modify and execute code directly.

PRINCE2

PRINCE2 (an acronym for PRojects IN Controlled Environments) is a de facto process-based method for effective project management. Used extensively by the UK Government, PRINCE2 is also widely recognised and used in the private sector, both in the UK and internationally. The PRINCE2 method is in the public domain, and offers non-proprietary best-practice guidance on project management.

Key features of PRINCE2:

·         Focus on business justification
·         Defined organisation structure for the project management team
·         Product-based planning approach
·         Emphasis on dividing the project into manageable and controllable stages
·         Flexibility that can be applied at a level appropriate to the project.

Strategies

• Benefits management strategy
• Information management strategy
• Issue resolution strategy
• Monitoring and control strategy
• Quality management strategy
• Resource management strategy
• Stakeholder engagement strategy

Cloud Computing

Cloud computing enables companies to consume compute resources as a utility -- just like electricity -- rather than having to build and maintain computing infrastructures in-house. 
Cloud computing promises several attractive benefits for businesses and end users. Three of the main benefits of cloud computing include:

• Self service provisioning: End users can spin up computing resources for almost any type of workload on-demand.
• Elasticity: Companies can scale up as computing needs increase and then scale down again as demands decrease.
• Pay per use: Computing resources are measured at a granular level, allowing users to pay only for the resources and workloads they use.
Cloud computing services can be private, public or hybrid. 
Private cloud services are delivered from a business' data center to internal users. This model offers versatility and convenience, while preserving management, control and security. Internal customers may or may not be billed for services through IT chargeback. 

Although cloud computing has changed over time, it has always been divided into three broad service categories:

1.       Infrastructure as a service (IaaS)- Infrastructure as a Service (IaaS) is a form of cloud computing that provides virtualized computing resources over the Internet. In an IaaS model, a third-party provider hosts hardware, software, servers, storage and other infrastructure components on behalf of its users. IaaS providers also host users' applications and handle tasks including system maintenance, backup and resiliency planning. 
IaaS platforms offer highly scalable resources that can be adjusted on-demand. This makes IaaS well-suited for workloads that are temporary, experimental or change unexpectedly.
Other characteristics of IaaS environments include the automation of administrative tasks, dynamic scaling, desktop virtualization and policy-based services.
E.g. Amazon Web Services (AWS), Windows Azure, Google Compute Engine, Rackspace Open Cloud, and IBM SmartCloud Enterprise. (A small provisioning sketch follows the three service models below.)

2.       Platform as a service (PaaS) -Platform as a service (PaaS) is a cloud computing model that delivers applications over the Internet. In a PaaS model, a cloud provider delivers hardware and software tools -- usually those needed for application development -- to its users as a service. A PaaS provider hosts the hardware and software on its own infrastructure. As a result, PaaS frees users from having to install in-house hardware and software to develop or run a new application.
PaaS platforms for software development and management include Appear IQ, Mendix, Amazon Web Services (AWS) Elastic Beanstalk, Google App Engine and Heroku.

3.       Software as a service (SaaS) - Software as a service (or SaaS) is a way of delivering applications over the Internet as a service. Instead of installing and maintaining software, you simply access it via the Internet, freeing yourself from complex software and hardware management.

SaaS applications are sometimes called Web-based software, on-demand software, or hosted software. Whatever the name, SaaS applications run on a SaaS provider’s servers. The provider manages access to the application, including security, availability, and performance.
SaaS customers have no hardware or software to buy, install, maintain, or update. Access to applications is easy: You just need an Internet connection.
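As promised above, here is a minimal provisioning sketch for the IaaS model, using the AWS SDK for Python (boto3). It assumes AWS credentials are already configured; the region, machine image ID and tag values are placeholders rather than real resources.

import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")   # placeholder region

# Self-service provisioning: request one small virtual server on demand.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder machine image ID
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "Name", "Value": "demo-iaas-instance"}],
    }],
)

print(response["Instances"][0]["InstanceId"])   # identifier of the newly requested server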

Customer Relationship management (CRM)

Customer relationship management (CRM) refers to the practices, strategies and technologies that companies use to manage, record and evaluate customer interactions in order to drive sales growth by deepening and enriching relationships with their customer bases.

Customer relationship management (CRM) is a term that refers to practices, strategies and technologies that companies use to manage and analyze customer interactions and data throughout the customer lifecycle, with the goal of improving business relationships with customers, assisting in customer retention and driving sales growth. CRM systems are designed to compile information on customers across different channels -- or points of contact between the customer and the company -- which could include the company's website, telephone, live chat, direct mail, marketing materials and social media. CRM systems can also give customer-facing staff detailed information on customers' personal information, purchase history, buying preferences and concerns.

CRM software

CRM software consolidates customer information and documents into a single CRM database so business users can more easily access and manage it. The other main functions of this software include recording various customer interactions (over email, phone calls, social media or other channels, depending on system capabilities), automating various workflow processes such as tasks, calendars and alerts, and giving managers the ability to track performance and productivity based on information logged within the system.
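A small sketch of the consolidation idea: interactions arriving over different channels are logged against one customer record in a single place. The class names, channels and sample data are invented for illustration.

from dataclasses import dataclass, field
from datetime import date

@dataclass
class Interaction:
    when: date
    channel: str   # e.g. "email", "phone", "live chat", "social media"
    note: str

@dataclass
class CustomerRecord:
    customer_id: int
    name: str
    interactions: list = field(default_factory=list)

    def log(self, when, channel, note):
        # Every touch point is recorded against the same customer record.
        self.interactions.append(Interaction(when, channel, note))

record = CustomerRecord(1001, "Example Customer")
record.log(date(2015, 3, 2), "email", "Requested a product brochure.")
record.log(date(2015, 3, 9), "phone", "Follow-up call about pricing.")

for item in record.interactions:
    print(item.when, item.channel, "-", item.note)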
Common features of CRM software include:
Marketing automation: CRM tools with marketing automation capabilities can automate repetitive tasks to enhance marketing efforts to customers at different points in the lifecycle. For example, as sales prospects come into the system, the system might automatically send them marketing materials, typically via email or social media, with the goal of turning a sales lead into a full-fledged customer.
Sales force automation: Also known as sales force management, sales force automation is meant to prevent duplicate efforts between a salesperson and a customer. A CRM system can help achieve this by automatically tracking all contact and follow-ups between both sides.
Contact center automation: Designed to reduce tedious aspects of a contact center agent's job, contact center automation might include pre-recorded audio that assists in customer problem-solving and information dissemination. Various software tools that integrate with the agent's desktop tools can handle customer requests in order to cut down the time of calls and simplify customer service processes.
Geolocation technology, or location-based services: Some CRM systems include technology that can create geographic marketing campaigns based on customers' physical locations, sometimes integrating with popular location-based GPS apps. Geolocation technology can also be used as a networking or contact management tool in order to find sales prospects based on location.



Wednesday, February 25, 2015

BCS/PGD/MIS/DIP/ITSM/CER/IS/TPS

Transaction Processing and Management Reporting Systems


Functions of Transaction Processing Systems
A transaction is an elementary activity conducted during business operations. Transaction processing systems (TPS) process the company's business transactions and thus support the operations of an enterprise. A TPS records a non-inquiry transaction itself, as well as all of its effects, in the database and produces documents relating to the transaction.
TPS are necessary to conduct business in almost any organization today. TPS bring data into the organizational databases; these systems are also the foundation on which management-oriented information systems rest.

Transaction Processing Modes 
Transaction processing may be accomplished in one of two modes:
1. On-line mode
2. Batch mode

Characteristics of on-line transaction processing:

1. Each transaction is completely processed immediately upon entry.
2. On-line transaction processing (OLTP) is the most common mode in use today.
3. More costly than batch processing.
4. The database is always up to date.
5. Requires fast secondary storage such as magnetic disks.

Characteristics of batch transaction processing:

1. Relies on accumulating transaction data over a period of time and then processing the entire batch at once.
2. Batch processing is usually cyclic: a daily, weekly, or monthly run cycle is established depending on the nature of the transactions.
3. Cheaper than on-line processing.
4. Easier to control than on-line processing.
5. The database is out of date between processing runs.
6. Batches of transactions are now usually accumulated in disk files before processing.
A short sketch contrasting the two modes is given below.
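In the sketch below, apply_to_database() stands in for whatever routine updates the organizational database; the transactions themselves are invented for illustration.

from queue import Queue

def apply_to_database(transaction):
    # Stand-in for the real database update.
    print("database updated with:", transaction)

# On-line mode: each transaction is processed completely as soon as it arrives,
# so the database is always up to date.
def process_online(transaction):
    apply_to_database(transaction)

# Batch mode: transactions accumulate (here in a queue standing in for a disk
# file) and the whole batch is processed on a cycle, e.g. in a nightly run.
pending = Queue()

def record_for_batch(transaction):
    pending.put(transaction)

def run_nightly_batch():
    while not pending.empty():
        apply_to_database(pending.get())

process_online({"type": "sale", "amount": 120.00})
record_for_batch({"type": "sale", "amount": 75.50})
record_for_batch({"type": "refund", "amount": 20.00})
run_nightly_batch()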

 Transaction Processing Subsystems in a Firm

Overall transaction processing, also known as data processing, reflects the principal business activities of a firm. The principal transaction processing subsystems in a firm are those supporting:
·         Sales
·         Production
·         Inventory
·         Purchasing
·         Shipping
·         Receiving
·         Accounts payable
·         Billing
·         Accounts receivable
·         Payroll

·         General ledger

PGD/ MIS/DIP/ITSM/ Decision Support Systems (DSS)


Decision Support Systems (DSS)



Decision Support Systems (DSS) help executives make better decisions by using historical and current data from internal Information Systems and external sources. By combining massive amounts of data with sophisticated analytical models and tools, and by making the system easy to use, they provide a much better source of information to use in the decision-making process.
Decision Support Systems (DSS) are a class of computerized information systems that support decision-making activities. DSS are interactive computer-based systems and subsystems intended to help decision makers use communications technologies, data, documents, knowledge and/or models to successfully complete decision process tasks.
While many people think of decision support systems as a specialized part of a business, most companies have actually integrated this system into their day to day operating activities. For instance, many companies constantly download and analyze sales data, budget sheets and forecasts and they update their strategy once they analyze and evaluate the current results. Decision support systems have a definite structure in businesses, but in reality, the data and decisions that are based on it are fluid and constantly changing.
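As a small sketch of that kind of routine analysis, the example below uses the pandas library (assumed installed); the regions, sales figures and budgets are invented for the example.

import pandas as pd

sales = pd.DataFrame({
    "region": ["North", "North", "South", "South", "West", "West"],
    "month":  ["Jan", "Feb", "Jan", "Feb", "Jan", "Feb"],
    "actual": [120000, 135000, 90000, 88000, 60000, 72000],
    "budget": [110000, 130000, 95000, 95000, 65000, 70000],
})

# Compare actual sales against budget and rank regions by shortfall.
summary = (sales
           .assign(variance=sales["actual"] - sales["budget"])
           .groupby("region")[["actual", "budget", "variance"]]
           .sum()
           .sort_values("variance"))

print(summary)   # regions with the largest shortfall appear first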

Types of Decision Support Systems (DSS)

1.             Data-Driven DSS take the massive amounts of data available through the company’s TPS and MIS systems and cull from it useful information which executives can use to make more informed decisions. They don’t have to have a theory or model but can “free-flow” the data. The first generic type of Decision Support System is a Data-Driven DSS. These systems include file drawer and management reporting systems, data warehousing and analysis systems, Executive Information Systems (EIS) and Spatial Decision Support Systems. Business Intelligence Systems are also examples of Data-Driven DSS. Data-Driven DSS emphasize access to and manipulation of large databases of structured data and especially a time-series of internal company data and sometimes external data. Simple file systems accessed by query and retrieval tools provide the most elementary level of functionality. Data warehouse systems that allow the manipulation of data by computerized tools tailored to a specific task and setting or by more general tools and operators provide additional functionality. Data-Driven DSS with Online Analytical Processing (OLAP) provide the highest level of functionality and decision support that is linked to analysis of large collections of historical data.
2.             Model-Driven DSS A second category, Model-Driven DSS, includes systems that use accounting and financial models, representational models, and optimization models. Model-Driven DSS emphasize access to and manipulation of a model. Simple statistical and analytical tools provide the most elementary level of functionality. Some OLAP systems that allow complex analysis of data may be classified as hybrid DSS systems providing modelling, data retrieval and data summarization functionality. Model-Driven DSS use data and parameters provided by decision-makers to aid them in analyzing a situation, but they are not usually data intensive. Very large databases are usually not needed for Model-Driven DSS. Model-Driven DSS were isolated from the main Information Systems of the organization and were primarily used for the typical “what-if” analysis. That is, “What if we increase production of our products and decrease the shipment time?” These systems rely heavily on models to help executives understand the impact of their decisions on the organization, its suppliers, and its customers.
3.             Knowledge-Driven DSS The terminology for this third generic type of DSS is still evolving. Currently, the best term seems to be Knowledge-Driven DSS. Adding the modifier “driven” to the word knowledge maintains a parallelism in the framework and focuses on the dominant knowledge base component. Knowledge-Driven DSS can suggest or recommend actions to managers. These DSS are personal computer systems with specialized problem-solving expertise. The “expertise” consists of knowledge about a particular domain, understanding of problems within that domain, and “skill” at solving some of these problems. A related concept is Data Mining. It refers to a class of analytical applications that search for hidden patterns in a database. Data mining is the process of sifting through large amounts of data to produce data content relationships.
4.             Document-Driven DSS A new type of DSS, a Document-Driven DSS or Knowledge Management System, is evolving to help managers retrieve and manage unstructured documents and Web pages. A Document-Driven DSS integrates a variety of storage and processing technologies to provide complete document retrieval and analysis. The Web provides access to large document databases including databases of hypertext documents, images, sounds and video. Examples of documents that would be accessed by a Document-Based DSS are policies and procedures, product specifications, catalogs, and corporate historical documents, including minutes of meetings, corporate records, and important correspondence. A search engine is a powerful decision aiding tool associated with a Document-Driven DSS.
5.             Communications-Driven and Group DSS. Group Decision Support Systems (GDSS) came first, but now a broader category of Communications-Driven DSS or groupware can be identified. This fifth generic type of Decision Support System includes communication, collaboration and decision support technologies that do not fit within the DSS types identified above. Therefore, we need to identify these systems as a specific category of DSS. A Group DSS is a hybrid Decision Support System that emphasizes both the use of communications and decision models. A Group Decision Support System is an interactive computer-based system intended to facilitate the solution of problems by decision-makers working together as a group. Groupware supports electronic communication, scheduling, document sharing, and other group productivity and decision support enhancing activities. We have a number of technologies and capabilities in this category in the framework – Group DSS, two-way interactive video, White Boards, Bulletin Boards, and Email.

Components of DSS


Traditionally, academics and MIS staff have discussed building Decision Support Systems in terms of four major components:
·                     The user interface
·                     The database
·                     The models and analytical tools and
·                     The DSS architecture and network

This traditional list of components remains useful because it identifies similarities and differences between categories or types of DSS. The DSS framework is primarily based on the different emphases placed on DSS components when systems are actually constructed.
Data-Driven, Document-Driven and Knowledge-Driven DSS need specialized database components. A Model-Driven DSS may use a simple flat-file database with fewer than 1,000 records, but the model component is very important. Experience and some empirical evidence indicate that design and implementation issues vary for Data-Driven, Document-Driven, Model-Driven and Knowledge-Driven DSS.
Multi-participant systems like Group and Inter-Organizational DSS also create complex implementation issues. For instance, when implementing a Data-Driven DSS a designer should be especially concerned about the user’s interest in applying the DSS in unanticipated or novel situations. Despite the significant differences created by the specific task and scope of a DSS, all Decision Support Systems have similar technical components and share a common purpose, supporting decision-making.

A Data-Driven DSS database is a collection of current and historical structured data from a number of sources that have been organized for easy access and analysis. We are expanding the data component to include unstructured documents in Document-Driven DSS and “knowledge” in the form of rules or frames in Knowledge-Driven DSS. Supporting management decision-making means that computerized tools are used to make sense of the structured data or documents in a database.
Mathematical and analytical models are the major component of a Model-Driven DSS. Each Model-Driven DSS has a specific set of purposes and hence different models are needed and used. Choosing appropriate models is a key design issue. Also, the software used for creating specific models needs to manage needed data and the user interface. In Model-Driven DSS the values of key variables or parameters are changed, often repeatedly, to reflect potential changes in supply, production, the economy, sales, the marketplace, costs, and/or other environmental and internal factors. Information from the models is then analyzed and evaluated by the decision-maker.
Knowledge-Driven DSS use special models for processing rules or identifying relationships in data. The DSS architecture and networking design component refers to how hardware is organized, how software and data are distributed in the system, and how components of the system are integrated and connected. A major issue today is whether DSS should be available using a Web browser on a company intranet and also available on the Global Internet. Networking is the key driver of Communications-Driven DSS.