SQL Server 2012 is almost here! What's new, and why should you care?
SQL Server 2012, code-named "Denali," launches on March 7th with an elaborate launch event. The RC0 build has been available for the last few months and it looks just fantastic! (RC stands for "release candidate," Microsoft's term for a build that is ready to go to market.) The key features in this new incarnation address three major drivers of the data market today and for the foreseeable future. Let's take a quick look at them and see why SQL Server 2012 is important for your business.
1. Data Explosion:
Indeed, if you have been paying attention, the huge growth in mobile devices, the internationalization of business, and the proliferation of social networks have been rapidly growing the volume of information worldwide. According to Gartner, this growth is running at a minimum annual rate of 59% and data volumes are predicted to grow 44-fold over the next decade, while the number of IT professionals is growing only 1.4-fold. Decision makers face a far greater challenge in making sense of their organizational and customer data than ever before.
Containing the data explosion means supporting rapid, massive data processing while giving decision makers self-service analysis capabilities, and managing data reliability with data alerts and data quality services during integration and reporting, all to reduce the burden on IT.
SQL Server 2012's BI Semantic Model is a single model underlying all the ways users build business intelligence (BI) solutions and all end-user experiences – reporting, analytics, scorecards, dashboards, and custom applications. It offers developers flexible modeling experiences, the richness to build sophisticated business logic, and scalability for the most demanding enterprise needs. BI visualization with Power View offers fully self-service creation of powerful reports on live data. I'll write about it in detail in a later post.
For data credibility and consistency (a huge challenge in any cloud-based environment), Data Quality Services (DQS), a new service in SQL Server, provides knowledge-driven tools customers can use to create and maintain a Data Quality Knowledge Base, which helps improve data quality and ease data management by profiling, cleansing, and matching data. This is an essential capability: customers will be able to integrate third-party data to validate and cleanse data in a data quality project, using the Windows Azure Marketplace DataMarket as a source.
Enhancements to Master Data Services (MDS) also reduce the burden on IT. A Silverlight-based explorer simplifies integration management, and a new Excel add-in lets information workers manage data directly in Excel: it can load a filtered set of data from an MDS database, let users work on it in Excel, and publish the changes back to the MDS database. Admins can even create new entities and attributes through the add-in.
2. High availability at low TCO:
A more immediate market driver is today's tough economy, which pressures IT to deliver the highest application availability while at the same time reducing costs.
SQL Server 2012 consolidates its high availability, disaster recovery, and redundancy capabilities under AlwaysOn, enabling fast application failover during planned and unplanned downtime. AlwaysOn Availability Groups enhance the Database Mirroring capability with support for automatic and manual failover of a group of databases, up to four secondaries, fast application failover, and automatic page repair. AlwaysOn Failover Cluster Instances support multi-site clustering across subnets, enabling cross-datacenter failover of SQL Server instances. AlwaysOn Active Secondaries allow secondary instances to be used for report queries and backup operations, which helps eliminate idle hardware and improve resource utilization. With the AlwaysOn Connection Director's multi-subnet failover, client applications can now fail over across multiple subnets (up to 64) almost as fast as they can within a single subnet. For efficient use of readable secondaries, Read-Only Intent gives administrators a way to control which workloads run on their HA servers and manage their resources efficiently. AlwaysOn auto-stats automatically create and update the temporary statistics needed by queries running on a readable secondary. These temporary statistics are stored in tempdb, so no physical changes are required in the user database; the optimizer can generate query plans on the secondary replica as optimal as those on the primary, with no user intervention. This is a significant improvement in resource utilization and failover performance that was missing in earlier versions.
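To make this concrete, here is a minimal T-SQL sketch of creating an availability group with a readable secondary. The instance, database, and endpoint names are hypothetical, and it assumes two instances that already share a Windows Server Failover Cluster with the AlwaysOn feature enabled:

```sql
-- Sketch only: SQLNODE1/SQLNODE2, SalesDB, and the endpoint URLs are assumed names.
CREATE AVAILABILITY GROUP SalesAG
    FOR DATABASE SalesDB
    REPLICA ON
        'SQLNODE1' WITH (
            ENDPOINT_URL      = 'TCP://sqlnode1.contoso.com:5022',
            AVAILABILITY_MODE = SYNCHRONOUS_COMMIT,
            FAILOVER_MODE     = AUTOMATIC),
        'SQLNODE2' WITH (
            ENDPOINT_URL      = 'TCP://sqlnode2.contoso.com:5022',
            AVAILABILITY_MODE = SYNCHRONOUS_COMMIT,
            FAILOVER_MODE     = AUTOMATIC,
            -- Active Secondary: allow read-only workloads (reports, backups)
            SECONDARY_ROLE (ALLOW_CONNECTIONS = READ_ONLY));
```

On the client side, the new connection-string keywords `MultiSubnetFailover=True` (for fast cross-subnet failover via the availability group listener) and `ApplicationIntent=ReadOnly` (to route a connection to a readable secondary under Read-Only Intent) surface these capabilities to applications.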
Planned downtime has also been carefully analysed and reduced in SQL Server 2012. Windows Server Core support significantly reduces OS patching, cutting patch-related downtime by 40%–60%.
There have also been enhancements to the Database Recovery Advisor and to StreamInsight's user experience, checkpointing features, performance counters for monitoring, and manageability.
Huge performance improvements have been made in SQL Server 2012 across the board. With the In-Memory Column Store ("Apollo") in the database engine, SQL Server is the first of the major general-purpose database systems to ship a true column store. I am looking forward to seeing the order-of-magnitude performance improvements expected from reuse of execution elements in commonly occurring data-warehousing queries! Full-Text Search also gains improved query execution and concurrent index update mechanisms, and supports property-scoped searches without requiring developers to maintain file properties (such as author name, title, etc.) separately in the database. Developers can also benefit from the improved NEAR operator, which lets them specify the maximum distance between words as well as the order of the words.
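Both features surface as straightforward T-SQL. The sketch below uses hypothetical table and column names to show a nonclustered columnstore index on a fact table, and the new NEAR syntax with an explicit distance and word order:

```sql
-- Sketch only: dbo.FactSales, dbo.Documents, and their columns are assumed names.

-- Columnstore index for data-warehousing queries (note: in SQL Server 2012
-- the table becomes read-only while this index exists).
CREATE NONCLUSTERED COLUMNSTORE INDEX IX_FactSales_ColumnStore
    ON dbo.FactSales (OrderDateKey, ProductKey, SalesAmount);

-- Improved NEAR: 'SQL' within 5 words of 'Server', in that order (TRUE),
-- against a table with an existing full-text index on DocContent.
SELECT DocId
FROM dbo.Documents
WHERE CONTAINS(DocContent, 'NEAR((SQL, Server), 5, TRUE)');
```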
3. Cloud:
Isn't this where everything is going? The market needs businesses to scale quickly, on terms that make sense for those businesses. Here is how SQL Server 2012 enables building and managing databases on Azure while continuing to enhance these capabilities on-premises.
SQL Server 2012 offers a mature, better-integrated development experience with SQL Server Development Tools (SSDT), code-named "Juneau" (CTP3 included Juneau version 10; RC0 includes version 11), for development across database, BI, and web projects targeting both SQL Server and SQL Azure. This streamlines application development with a single environment for building DB, BI, and web solutions across on-premises and cloud environments, finally giving application developers what they need to get everything done without ever leaving Visual Studio – and it delivers many powerful new capabilities as well.

SSDT introduces a model-based approach: an in-memory representation of the database is maintained and kept consistent across features such as designers, validations, IntelliSense, and schema compare. Behind this model can be a live database (on-premises or Azure), an offline database project, or a snapshot taken of an offline database project at any point in time. SSDT is agnostic to which of these it is and works exclusively against the model itself. The improved experience for database developers includes declarative, model-based development; integrated tools with modern language services; the ability to work connected or offline with local testing; and edition-aware targeting of SQL Server, SQL Azure, and so on.

The Data-tier Application (DAC) framework, introduced in SQL Server 2008 R2 and supported across SQL Server and Visual Studio, helps IT and developers more easily define and contain the schemas and objects an application requires, then deploy, import, and export DACs across on-premises and public cloud environments.
DAC parity between SQL Server 2012 and SQL Azure supports moving a DAC database between SQL Server instances across servers, a private cloud, or SQL Azure. This build-once, deploy-and-manage-anywhere model gives unprecedented flexibility to both developers and IT. Import and export services in the DAC framework enable archiving and migration scenarios between on-premises and cloud database servers.
Interoperability is a key consideration for the success of any cloud-based computing platform, and SQL Server 2012 offers much greater interoperability. With Microsoft's drivers for PHP and Java, PHP and Java applications running on Windows can now connect to SQL Server and SQL Azure in a far more scalable and reliable manner.
For the data overload that is ahead for all businesses, there is a clear need for a data platform like SQL Server 2012 that keeps data highly reliable and available while growth remains scalable and manageable.