
Posts

Showing posts from November, 2010

Installing SQL Server Denali CTP 1 – A step by step guide

Please refer to this post for a step-by-step installation guide for SQL Server Denali (possibly SQL 2011) with screen prints. However, please note that this installation uses CTP1 as available at http://www.microsoft.com/sqlserver/en/us/product-info/future-editions.aspx as of now. The final release may not be exactly the same, but it should be very similar.
Step 1: Select the installation type. (I am selecting New SQL Server stand-alone installation.)
Step 2: Setup should pass all the setup support rules. Then click OK to proceed.
Step 3: Install the Setup Support Files by clicking the “Install” button.
Step 4: Setup should pass the setup support rules. You may ignore warnings and proceed on a case-by-case basis. Click Next to continue.
Step 5: Select the edition or specify the key. (For volume licensing, the key should be auto-populated in released versions.) Here I am selecting Enterprise Evaluation. Click Next to continue.
Step 6: Accept the license terms and click Continue.

SQL Server Denali should support SQL 2000 compatibility level (80)

This can be good news for many folks. SQL Server Denali still supports compatibility level 80 (SQL 2000) for databases hosted on it. This means you can go ahead and migrate to the future release of SQL Server (Denali, possibly SQL Server 2011) even if your databases are still running on compatibility level 80 (SQL 2000). However, please be advised that it is not a good idea to keep running your databases on an older compatibility level. Hence, if any of your databases are still using compatibility level 80 on a SQL 2005, SQL 2008 or SQL 2008 R2 instance, it is time to upgrade the compatibility level with high priority.
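As a minimal sketch (assuming a hypothetical database named MyDatabase), you can check a database's current compatibility level and raise it, for example to the SQL 2008 level (100), as follows:

-- Check the current compatibility level (hypothetical database name)
SELECT name, compatibility_level
FROM sys.databases
WHERE name = N'MyDatabase';

-- Raise it to the SQL 2008 level
ALTER DATABASE MyDatabase SET COMPATIBILITY_LEVEL = 100;

Remember to retest your application after changing the level, as some legacy query behaviors differ between compatibility levels.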

Rule “Windows PowerShell” failed while installing SQL Server Denali

If your system does not have Windows PowerShell 2.0 installed, the Denali setup will fail with the error below:

Windows PowerShell 2.0 is not installed or enabled. To continue, please install or enable Windows PowerShell 2.0.

To resolve this problem, you have to install Windows PowerShell 2.0: go to http://support.microsoft.com/kb/968929 and download the appropriate version of the Windows Management Framework Core for your server. Please note that this will require a restart of your server.

Denali setup fails reporting that rule “NoRebootPackageDownLevel” failed on Vista/Windows Server 2008

If you are running your server on .NET Framework 3.5 SP1 (this is the minimum framework version you will need to install Denali), then you may face the error below, which will prevent you from installing Denali (possibly SQL 2011):

NoRebootPackageDownLevel: This rule determines whether the computer has the required update package that ensures that the computer will not have to be rebooted because of the Microsoft .NET Framework 4 installation.
Failed: This computer does not have the update package that ensures that the computer will not have to be rebooted because of the Microsoft .NET Framework 4 installation. Before setup is allowed to proceed, download and install the required update package from: http://go.microsoft.com/fwlink/?LinkID=195093. After you install the update package, the computer will require a reboot.

The solution to this problem is to install a simple fix from http://go.microsoft.com/fwlink/?LinkID=195093, which must be applied after .NET Framework 3.5 SP1. Alter

Microsoft® SQL Server 2008 R2 Best Practices Analyzer

If you are running SQL Server 2008 R2, it is a wise idea to analyze your environment with the Best Practices Analyzer released by Microsoft. Basically, Microsoft SQL Server 2008 R2 BPA is a diagnostic tool that performs the following functions:
• Gathers information about a server and a Microsoft SQL Server 2008 or 2008 R2 instance installed on that server
• Determines if the configurations are set according to the recommended best practices
• Reports on all configurations, indicating settings that differ from recommendations
• Indicates potential problems in the installed instance of SQL Server
• Recommends solutions to potential problems
You may read more and download BPA for free from http://www.microsoft.com/downloads/en/details.aspx?displaylang=en&FamilyID=0fd439d7-4bff-4df7-a52f-9a1be8725591

Hosting Management Data Warehouse (MDW) reports on a Reporting Services instance

Management Data Warehouse generates excellent reports, useful for a database professional, with a click of the mouse. However, these reports can be accessed and navigated only from SQL Server Management Studio. Isn't this a problem? Well, I believe it is a shortcoming of MDW. Being able to host these reports on a SQL Server Reporting Services instance automatically (like having built-in real-time publishing capability) would really complete this great feature. However, I found a series of great posts from Microsoft PM Bill Ramos where he describes in detail how to host these reports on a Reporting Services portal. The steps can be found at http://blogs.msdn.com/b/billramo/archive/2010/06/06/mdw-overview-report-for-data-collector-mdw-report-series-part-1.aspx This series of posts will enable you to host all MDW reports on an SSRS portal, which can be very helpful for complete automation of MDW reporting.

When and why to use SQL Server’s backup compression feature?

One of the very cool features of SQL 2008 is backup compression. Moreover, from SQL 2008 R2 onwards it is no longer an Enterprise-only feature. Hence I highly recommend using this feature on every possible instance, unless you already have licensed third-party software for this purpose. The reasons I prefer to use backup compression are broadly these:
• Backup compression can reduce the backup file’s size a lot. This can be a huge cost saving and makes space management easy. (I get up to a 90% compression ratio for some simple databases which contain simple datatypes such as varchar, int etc.)
• Compressed backups are typically faster to execute too, because of less IO. This saves a lot of time for backup or restore operations, which can be a life saver.
• As you cannot restore a SQL 2008 backup to an older version anyway, you should not worry about compatibility of a compressed backup. (Special case like if you need to restore yo
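As a minimal sketch (hypothetical database name and backup path), a compressed backup looks like this, and compression can optionally be made the server-wide default:

-- Compressed backup of a hypothetical database to a hypothetical path
BACKUP DATABASE [MyDatabase]
TO DISK = N'D:\Backups\MyDatabase.bak'
WITH COMPRESSION, INIT, STATS = 10;

-- Optionally make compression the default for all backups on the instance
EXEC sp_configure 'backup compression default', 1;
RECONFIGURE;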

Service pack for SQL 2008 R2?

While most of us already know the answer, I thought to write a quick post on this, as I have been asked this question multiple times in the past: do we need to apply SQL 2008 service packs (e.g. SP2) to a SQL 2008 R2 instance? SQL 2008 and SQL 2008 R2 are technically two different versions of SQL Server. Hence you cannot apply SQL 2008 service packs to a SQL 2008 R2 instance. If you try to do so, the installation will abort, reporting that no upgradeable instance or shared feature is available on the computer. So, as of today, while you should apply SP2 on your SQL 2008 instances, there is no service pack required (or available) for SQL 2008 R2 instances.
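As a quick check of which version and service pack level an instance is actually running (10.0.x is SQL 2008, 10.50.x is SQL 2008 R2), the following query can be used:

SELECT SERVERPROPERTY('ProductVersion') AS ProductVersion, -- 10.0.x = SQL 2008, 10.50.x = SQL 2008 R2
       SERVERPROPERTY('ProductLevel')   AS ProductLevel,   -- RTM, SP1, SP2, ...
       SERVERPROPERTY('Edition')        AS Edition;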

Why sp_spaceused sometimes may not reflect the latest information

When you drop or rebuild large indexes, or drop or truncate large tables, the SQL Server Database Engine may defer the actual page deallocations and their associated locks. These deferred drop operations do not release the allocated space immediately. Therefore, the values returned by sp_spaceused immediately after dropping or truncating a large table, or after dropping or rebuilding a large index, may not reflect the actual disk space available. For more information about deferred allocations, you may refer to http://msdn.microsoft.com/en-us/library/ms177495.aspx Hence, to conclude: although we may sometimes get outdated information from sp_spaceused unless we use the updateusage parameter, this cannot be treated as a shortcoming of SQL Server. Note: Reference taken from http://msdn.microsoft.com/en-us/library/ms188776.aspx
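As a minimal sketch (run in the database in question), the allocation counts can be refreshed explicitly, either with DBCC UPDATEUSAGE or by asking sp_spaceused to do it:

-- Correct page and row counts for the current database (0 = current database)
DBCC UPDATEUSAGE (0);

-- Or let sp_spaceused update usage before reporting
EXEC sp_spaceused @updateusage = N'TRUE';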

Find space utilization by SQL Server objects (sp_spaceused)

You can use sp_spaceused to find out the space used by a database object (a table, indexed view, etc., or even a whole database). The syntax of sp_spaceused is:

sp_spaceused [[ @objname= ] 'objname' ] [,[ @updateusage= ] 'updateusage' ]

Example: Find the details of space used by a database:

USE [Databasename];
GO
EXEC sp_spaceused;
GO

Example: Find the details of space used by a table of a database:

USE [Databasename];
GO
EXEC sp_spaceused N'[Tablename]'; -- You can put any qualifying object name here instead of Tablename
GO

Please note that Microsoft does not guarantee that sp_spaceused will always return a correct space utilization report for a database object. Hence you can explicitly instruct SQL Server to ensure that it returns correct (latest) values by using @updateusage, as below:

USE [Databasename];
GO
EXEC sp_spaceused @updateusage = N'TRUE';
GO

For more, please refer to http://msdn.microsoft.com/en-us/l

Project Lucy – A new project from Quest

As we know, Quest is a premium company offering many tools and technologies for various database products, so it is always wise to follow them closely. Project Lucy is a new project from Quest which can be used for in-depth automated analysis of your trace files. As per Quest, Project Lucy analyzes performance data that you upload; it gives you deep analysis and leverages the collective intelligence of other Project Lucy users. As of now it uses trace files for workload-related insight and general breakdown, plan cache behavior, recompiles, and file autogrow/shrink behavior. However, in the future it will process SQL Server query plans, Windows perfmon data, data from SQL Server DMVs, and perhaps even data from other database platforms for deeper analysis. You can find more details and register as a user of Project Lucy at http://www.projectlucy.com

Finding exact compression ratio of a SQL 2008 backup

Using native backup compression in SQL 2008 can be very rewarding for many reasons. However, it is always nice to know exactly how much space you are saving by enabling backup compression over a traditional backup, and to have statistical proof of the same. Fortunately, SQL Server records the exact backup compression ratio for all backups that were compressed. Use the code below to find the exact compression ratio of a particular backup:

SELECT backup_set_id, name, description, backup_start_date, type,
    CAST((backup_size/1048576) * 100 AS real) / 100 AS BackupSizeMB,
    CAST((compressed_backup_size/1048576) * 100 AS real) / 100 AS CompressedSizeMB,
    CAST((((CAST(CAST(((compressed_backup_size/backup_size) * 10000) AS int)/100 AS real) - 100)) * -1) AS varchar) + '%' AS CompressedRatio
FROM msdb..backupset
WHERE name LIKE '%[Backupfilename]%'
ORDER BY backup_finish_date DESC

Note: For an uncompressed backup, the compression ratio will be reported as -0%.

How to fix (bring online) a snapshot if the original database becomes suspect or unavailable?

Recently I was asked this question multiple times by various people, and hence I thought to write a small post on it. If the original database becomes suspect or unavailable for any reason, any snapshot of that original database also becomes inaccessible (offline) immediately. Unfortunately, there is no way you can bring the snapshot online when the original database is not available. This behavior is by design of the snapshot technology. In brief, a snapshot is not an independent copy of the original database, and hence it depends entirely on the original database’s availability to remain functional. Please refer to http://blog.consultdba.com/2010/01/complete-guide-on-database-snapshot-sql.html if you want to read more about this.
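As a minimal sketch, you can list the snapshots on an instance along with the source database each one depends on; in sys.databases, source_database_id is non-NULL only for snapshots:

SELECT s.name AS snapshot_name,
       DB_NAME(s.source_database_id) AS source_database,
       s.state_desc
FROM sys.databases AS s
WHERE s.source_database_id IS NOT NULL;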

Collect all information for all databases from a single script using sys.databases

sys.databases can be a goldmine for collecting information about databases. The script below collects vital information for all databases hosted on a SQL Server instance in a very user-friendly format.

SELECT database_id,
    CONVERT(VARCHAR(25), DB.name) AS dbName,
    CONVERT(VARCHAR(10), DATABASEPROPERTYEX(name, 'status')) AS [Status],
    state_desc,
    (SELECT COUNT(1) FROM sys.master_files WHERE DB_NAME(database_id) = DB.name AND type_desc = 'rows') AS DataFiles,
    (SELECT SUM((size * 8) / 1024) FROM sys.master_files WHERE DB_NAME(database_id) = DB.name AND type_desc = 'rows') AS [Data MB],
    (SELECT COUNT(1) FROM sys.master_files WHERE DB_NAME(database_id) = DB.name AND type_desc = 'log') AS LogFiles,
    (SELECT SUM((size * 8) / 1024) FROM sys.master_files WHERE DB_NAME(database_id) = DB.name AND type_desc = 'log') AS [Log MB],
    user_acce

Microsoft Atlanta – Review of a new cloud-based Microsoft product to monitor SQL Server environments

Microsoft recently launched a new monitoring tool for SQL Server environments, code-named “Atlanta”. As per Microsoft’s own publication, Atlanta is a secure cloud service that proactively monitors your Microsoft SQL Server deployments. It helps you avoid configuration problems, reduce downtime, improve performance, and resolve issues faster. If you are interested in trying it for free, you can register at https://www.microsoftatlanta.com/ and get started from there. However, after reviewing the technical details, I personally do not feel this tool will be a great hit in the commercial market, for the reasons below:
• As of now Atlanta is not a real-time monitoring system. Hence existing real-time monitoring systems (which typically include an analysis part as well) cannot be replaced by it for critical environments.
• Atlanta supports only SQL 2008 or later on Windows 2008 or later. This means customers have to use some other tool to m

Find current location of all data and log files of all databases of the server

While we can use the GUI or sp_helpdb to find the data and log files of a specific database, the code below will list all data and log files of all the databases hosted on the SQL Server instance.

SELECT name, physical_name AS current_file_location
FROM sys.master_files

This can be a useful piece of code for creating and managing an inventory, or for some automation purpose.
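As a small variation (a sketch, not part of the original query), including the database name and file type makes the inventory easier to read:

SELECT DB_NAME(database_id) AS database_name,
       name                  AS logical_name,
       type_desc,
       physical_name         AS current_file_location
FROM sys.master_files
ORDER BY database_name, type_desc;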

Future of SQL Server as of 2010 – Denali

If you are wondering what comes after SQL 2008 R2, the answer is Denali. Denali is the code name for (possibly) SQL 2011, which comes with some exceptional improvements. Interesting information about this version can be reviewed at http://www.microsoft.com/sqlserver/en/us/product-info/future-editions.aspx Moreover, you can download the CTP version (CTP1) of Denali from http://www.microsoft.com/sqlserver/en/us/product-info/future-editions.aspx Books Online for Denali is available at http://msdn.microsoft.com/en-us/library/ms130214(SQL.110).aspx

Clearing entire procedure cache from SQL Server (DBCC FREEPROCCACHE)

You may use DBCC FREEPROCCACHE to clear all elements from the plan cache using the syntax below:

DBCC FREEPROCCACHE

You may also clear the cache of a particular plan using the syntax below:

DBCC FREEPROCCACHE [ ( { plan_handle | sql_handle | pool_name } ) ]

The plan handle for a specific query can be found using a DMV, as shown in the example below.

Example: Clearing the entire procedure cache:

DBCC FREEPROCCACHE

Example: Clearing the plan cache of a query such as SELECT * FROM employee where employee_id = 9

Step 1: Determine the plan_handle for the specific query using the code below:

SELECT plan_handle, st.text
FROM sys.dm_exec_cached_plans
CROSS APPLY sys.dm_exec_sql_text(plan_handle) AS st
WHERE text LIKE N'SELECT * FROM employee where employee_id %';
GO

You should get the plan_handle value for the specific query. Let us assume you got an output as below:

plan_handle                                         text
--------------------------------------------------  -----------------------------
0x0600
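As a minimal sketch of the remaining step, assuming a purely hypothetical plan_handle value copied from the output above (yours will differ), the plan is removed by passing that handle to DBCC FREEPROCCACHE:

-- Hypothetical plan_handle value; substitute the one returned by Step 1
DBCC FREEPROCCACHE (0x060006001ECA270EC0215D05000000000000000000000000);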