SQL Server Reporting Services – A Few Important Concepts and an Overview

SQL Server Reporting Services, thanks to its robust yet user-friendly architecture, is an obvious choice for enterprise or in-house reporting for product management, sales, human resources, and finance departments. Its flexibility also makes it ideal for use inside applications (see "Deliver User-Friendly Reports from Your Application with SQL Server Reporting Services", MSDN Magazine, August 2004). Reporting Services offers a variety of delivery methods, from FTP to email, and several rendering formats, which makes business-to-business reporting much easier. The same flexibility, combined with an industry-standard security model, makes extranet and secure internet reporting easily achievable.

The reporting system comprises the following main components.

Main Components of SSRS

Report Server & Report Server Database

The Report Server is a web service that controls report generation and management. The Report Server database is a SQL Server database that acts as the data dictionary for reports (catalogue, groups) and as a cache store. SQL Server Agent is used for scheduling reports.

Report Manager
An ASP.NET-based web interface for managing reports and setting security and user permissions.

Report Designer
Microsoft's documentation treats Report Designer as part of Reporting Services, but because RDL (Report Definition Language) is an XML-based open standard, any vendor can implement it, so no single tool can be called "the" report designer. Microsoft does provide a graphical report authoring tool with Visual Studio .NET 2003 and later, which generates the RDL markup behind the scenes.
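Because RDL is plain XML, report definitions can be inspected or generated with ordinary XML tooling. The following Python sketch assumes a hypothetical local file named SalesSummary.rdl and simply walks the document, printing the named data sources, datasets, and report items it finds; the RDL namespace URI differs between SQL Server versions, so the namespace is stripped before comparing element names.

# Minimal sketch: inspecting an RDL file as plain XML with the standard library.
# "SalesSummary.rdl" is a hypothetical file name.
import xml.etree.ElementTree as ET

def local_name(tag):
    """Drop the '{namespace}' prefix ElementTree puts on qualified tags."""
    return tag.rsplit('}', 1)[-1]

tree = ET.parse('SalesSummary.rdl')

for element in tree.getroot().iter():
    name = local_name(element.tag)
    if name in ('DataSource', 'DataSet', 'Table', 'Matrix', 'Chart', 'Subreport'):
        # Most RDL items carry their name in a Name attribute.
        print(name, element.get('Name'))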

Reporting Services was designed with disparate data sources in mind. A single report can retrieve data from multiple heterogeneous databases and render it as if it came from a single source. Windows authentication security is built in, and a custom security extension can be written to fit particular enterprise needs.

With SQL Server Reporting Services, multiple delivery methods and formats work like a charm. You design a report once and the report server takes care of exporting it to HTML, Excel, PDF, web archive (MHTML), TIFF, CSV, and XML. As always, a custom rendering extension can be written for any other format.
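To give a feel for how little client code an export takes, here is a minimal Python sketch that pulls a PDF through the report server's URL access interface. The server URL, report path, Month parameter, and credentials are placeholder assumptions; rs:Command=Render and rs:Format are the standard URL access arguments that select the command and the output format.

# Minimal sketch: rendering a report to PDF through URL access.
import requests
from requests_ntlm import HttpNtlmAuth  # pip install requests-ntlm

report_server = 'http://reportserver/ReportServer'  # assumed report server URL
report_path = '/Sales/MonthlySales'                  # assumed report path

# The report path is the first (key-less) query argument; rs: arguments steer the
# server, and anything else is passed to the report as a parameter.
url = f'{report_server}?{report_path}&rs:Command=Render&rs:Format=PDF&Month=2008-06'

response = requests.get(url, auth=HttpNtlmAuth(r'DOMAIN\reportuser', 'password'))
response.raise_for_status()

with open('MonthlySales.pdf', 'wb') as fh:
    fh.write(response.content)  # the rendered PDF bytes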

Reporting Services Delivery Formats

Reporting Services provides four distinct ways of delivering reports, known as subscriptions in the Reporting Services arena: individual (standard) subscriptions, data-driven subscriptions, SMTP (email) delivery, and file share directory (FTP) based subscriptions.

Reporting Services Architecture

(Diagram: Reporting Services architecture)

The diagram above schematically illustrates the Reporting Services architecture.

Besides the code segments that can be written within a report in VB.NET, a set of APIs makes Reporting Services even more programmable; a sketch of calling the Web service API follows the list below. The application programming interfaces can be classified into the following categories.

  1. Data processing extension application programming interface (API)
  2. Delivery extension API
  3. Rendering extension API
  4. Security extension API
  5. Web service API
  6. Windows Management Instrumentation (WMI) configuration API
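As a small illustration of the Web service API, the sketch below uses the zeep SOAP library in Python to browse the report catalogue through the management endpoint (ReportService2005.asmx) and its ListChildren method. The server URL and credentials are placeholder assumptions; any SOAP-capable language can call the same endpoint.

# Minimal sketch: browsing the report catalogue via the SSRS SOAP management endpoint.
from requests import Session
from requests_ntlm import HttpNtlmAuth      # pip install requests-ntlm
from zeep import Client                     # pip install zeep
from zeep.transports import Transport

wsdl = 'http://reportserver/ReportServer/ReportService2005.asmx?wsdl'  # assumed URL

session = Session()
session.auth = HttpNtlmAuth(r'DOMAIN\reportuser', 'password')          # assumed account

client = Client(wsdl, transport=Transport(session=session))

# ListChildren walks the catalogue; Recursive=True descends into folders.
for item in client.service.ListChildren(Item='/', Recursive=True):
    print(item.Type, item.Path)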

Interactive interfaces are another salient feature of SQL Server Reporting Services. Reports designed in Reporting Services support charts, document maps, freeform layouts, cross-tab matrices, subreports, and tables. Reports can also be parameterized and event driven (they support actions). Management is one of the most important parts of any reporting system, and SQL Server Reporting Services has it all planned: it manages jobs from a user-friendly console, provides personalization through "My Reports", tracks report history, manages shared data sources, and offers search, subscription, and snapshot features with shared subscriptions from a one-stop shop, the management console. Reporting Services also supports report caching and stores report execution data in execution logs, which can be queried as in the sketch below.
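For example, the execution log can be read straight from the ReportServer catalogue database to see how long each report spent retrieving data, processing, and rendering. The connection details below are placeholders, and the ExecutionLog view and Catalog table reflect the SQL Server 2005 schema, so names may differ in later releases.

# Minimal sketch: reading recent report executions from the ReportServer database.
import pyodbc  # pip install pyodbc

conn = pyodbc.connect(
    'DRIVER={SQL Server};SERVER=reportserver;DATABASE=ReportServer;Trusted_Connection=yes;'
)

sql = """
SELECT TOP 20 c.Path, e.UserName, e.Format, e.TimeStart,
       e.TimeDataRetrieval, e.TimeProcessing, e.TimeRendering, e.Status
FROM ExecutionLog AS e
JOIN Catalog AS c ON c.ItemID = e.ReportID
ORDER BY e.TimeStart DESC
"""

for row in conn.cursor().execute(sql):
    # Per-stage timings are recorded in milliseconds.
    print(row.Path, row.Format, row.TimeDataRetrieval, row.TimeProcessing, row.TimeRendering)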

Report Generation and Publishing

(Diagram: report generation and publishing)

As shown in the diagram above, the process of report generation and publishing consists of the following main steps.

  1. The report server engine (the Report Processor) receives a request for a particular report. The request includes parameters and formatting instructions.
  2. The Report Processor retrieves the report definition (RDL) that corresponds to the request.
  3. For that RDL, the Report Processor retrieves the report data from the specified data sources.
  4. The Report Processor performs transformations on the report data and sends the document data, along with its schema, to the rendering engine (rendering extension).
  5. The rendering extension publishes the final rendered report.

These steps are the basics of how Reporting Services works. The extensions (data processing extensions, rendering extensions, and so on) can be custom built and wrapped around the existing APIs to provide extended functionality; a simplified sketch of the flow follows.
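To make the flow concrete, here is a deliberately simplified Python sketch that mirrors the steps above: look up a definition, fetch the data, apply the parameters, and hand the result to a "rendering extension". Every name in it (the report path, the dataset, the CSV renderer) is invented for illustration and does not correspond to the real Report Processor internals.

from dataclasses import dataclass

@dataclass
class ReportRequest:
    report_path: str
    parameters: dict
    render_format: str

# Step 2: stand-in for the report definition store (real SSRS stores RDL).
REPORT_DEFINITIONS = {
    '/Sales/MonthlySales': {'dataset': 'monthly_sales', 'columns': ['Month', 'Total']},
}

# Step 3: stand-in for the configured data sources.
SAMPLE_DATA = {
    'monthly_sales': [('2008-05', 120000), ('2008-06', 95000)],
}

def render_csv(columns, rows):
    # Step 5: a trivial "rendering extension" that emits CSV text.
    lines = [','.join(columns)]
    lines += [','.join(str(value) for value in row) for row in rows]
    return '\n'.join(lines)

def process(request):
    definition = REPORT_DEFINITIONS[request.report_path]      # step 2
    rows = SAMPLE_DATA[definition['dataset']]                 # step 3
    since = request.parameters.get('FromMonth', '')           # step 4: apply parameters
    rows = [row for row in rows if row[0] >= since]
    return render_csv(definition['columns'], rows)            # steps 4-5

print(process(ReportRequest('/Sales/MonthlySales', {'FromMonth': '2008-06'}, 'CSV')))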

SSRS Integration with SharePoint 2007

Configuration:

  • SQL Server 2005 SP2 is installed on the report server in Native mode, along with the WSS object model (farm install)
  • The SSRS Configuration Tool creates a new Report Server database in "SharePoint Integration mode"
  • The SSRS Add-in is installed on WSS 2007
  • The WSS Central Administration pages register the SSRS web service and Windows service with the WSS farm

Database Integration Points

  • WSS Content Database stores the master copy of SSRS items
  • Schedules, caching, and subscriptions are stored only in the SSRS database

Microsoft Announces Visual Studio 2010

Microsoft is offering a first look at the next version of its Visual Studio integrated development environment (IDE) and platform, which will be named Visual Studio 2010 and will ship with the .NET Framework 4.0.

http://msdn.microsoft.com/en-us/vstudio/products/cc948977.aspx

There’s a lot promised in the new release (expected to ship, duh, in 2010), from improved software testing tools to software engineering modeling capabilities to integrated development and database functions for application lifecycle management (ALM).

Microsoft is putting its attention on improving Visual Studio for the benefit of every one of its users—from the CIO to the software architect to the enterprise developer to the software testing team.

A key goal in VSTS 2010, says Microsoft, is to help democratize ALM by bringing all members of a development organization into the application development lifecycle, and remove many of the existing barriers to integration.

One way that Visual Studio 2010 will do this is to break down the ALM roles, from the business decision maker (who needs a project overview but doesn’t want to be bogged down in details) to the lead developer or system architect (who enables the software infrastructure and draws the blueprint), to the developer who writes the code and the database administrator (DBA) who integrates it with the company database to the testers (who make sure the software is of high quality).

For the IT manager or CIO, says Mendlen, VSTS will give clarity and visibility into the state of the project throughout the lifecycle, using Team Foundation Server-enabled dashboards customized for her role. The dashboard can answer high level questions such as ongoing project cost or project status.

Agile Tools, Built-In

Visual Studio 2010 also will sport features to integrate Agile methodologies into the tech stack using Team Foundation Server. Skinner explains, “We’ll include in the [VSTS] box an Excel workbook for teams that are leveraging, say, the Scrum process so they can get burndown from their project.” These features, he says, will let Agile teams track daily progress, see projects broken down into iterations and use sprints.

Putting Quality Earlier in the Development Lifecycle

One sometimes-stressful interaction in the application development lifecycle is the tension between developers and testers. Developers have to do a better job of testing their code before they send it off to the software testers. Developers don’t always know which unit tests they have to run, and often they don’t have the time or inclination (your own cynicism-meter can determine which) to run the tests anyway.

Merging of Developer, DBA Roles

Most of these changes are a ways off, though you can expect to see some of this functionality demonstrated at the upcoming Microsoft Professional Developers’ Conference. One item, however, takes effect immediately.

As Microsoft sees it, the roles of the database-centric developer and “regular” developer are less distinct than they once were, so the company is merging its VSTS database and development products. As of October 1, members of the Microsoft Developer Network (MSDN) who currently own Visual Studio Team System 2008 Development Edition or Visual Studio Team System 2008 Database Edition will receive, for free, Visual Studio Team System 2008 Development Edition, Visual Studio Team System 2008 Database Edition, Visual Studio 2005 Team System for Software Developers, and Visual Studio 2005 Team System for Database Professionals.

Chips stack up in third dimension

Stacks of chips, one on top of the other, will power the next generation of superfast PCs, IBM has announced.

Laying chips vertically, instead of side by side, reduces the distance data has to travel by a factor of 1,000, making the chips faster and more efficient.

Big Blue has said that it will start producing the compact silicon sandwiches in 2008.

Chip manufacturer Intel has previously announced that it is also developing similar vertical chip technology.

Last year, the firm unveiled a chip with 80 processing cores, capable of more than a trillion calculations per second (a teraflop), that used vertical stacking technology.

Other firms, such as Tru-Si, have also developed techniques for creating 3D stacked chips.

High rise

Today most chips are laid out side-by-side, connected by wires.

The new technique involves placing chips directly on top of each other, connected by tungsten-filled pipes etched through the silicon.

These “through-silicon vias” (TSV), as they are known, eliminate the need for wires, increasing the speed at which information can flow between chips.

It has taken researchers at IBM a decade to refine the precise technique for mass producing the multi-storey chips.

“This allows us to move 3D chips from the ‘lab to the fab’ across a range of applications,” said Lisa Su, vice president, semiconductor research and development center at IBM.

The first application will be in wireless communications chips. Using TSV will increase the efficiency of the chips by up to 40%, the firm says.

Speed boost

IBM is also exploring use of the technique in its multi-core chips.

As more and more cores are added to chips it becomes increasingly difficult to deliver uniform power to each one. By stacking them vertically and reducing the length of the connections between them, IBM hopes to overcome this problem.

Using these high-rise multi-core chips should also increase processor speeds and reduce power consumption.

Advantages like these also make 3D chips attractive for use in supercomputers.

IBM says it is developing the technology for use in the current fastest supercomputer in the world, Blue Gene/L.

The ultra-powerful number cruncher, installed at the US Department of Energy’s Lawrence Livermore National Laboratory (LLNL), is already capable of 280.6 trillion calculations per second.

The 3D stacked chips would allow a “new generation of supercomputers”, IBM said.

The first chips will be available by the end of 2007, with full-scale production expected to begin in 2008.

Hitachi Maxell claims new Li-ion battery with 20x the power

Hitachi Maxell and a number of universities and firms in Japan have created a design for a Li-ion battery that will last 20x longer than current units. Think electric cars with a 2,000-mile range on a single charge, cell phones that need an outlet once a month, and laptops whose batteries last a week. Let's hope the technology finds its way into real life sooner rather than later.
