(This is part of the Study Guide series, 70-464)
Microsoft’s Measured Skill description: This objective may include but is not limited to: automate backup testing; shrink file; check index fragmentation; archive data; run an SQL Server Integration Services (SSIS) job; check disk space; automate backups
What I see:
· automate backup testing
· shrink file
· check index fragmentation
· archive data
· run an SSIS job
· check disk space
· automate backups
Automate Backup Testing
I wrote a script to do this. Basically, it takes a backup, restores it as a stand-alone database, and then runs DBCC CHECKDB against the restored copy. It reports the results of the DBCC CHECKDB and then drops the database. This script can be found here on my GitHub SQLSalt repository.
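The general shape of that process looks something like the following. This is a minimal sketch rather than the full SQLSalt script, and it assumes the MeasureMySkills database, a C:\Backups\ path, and the default logical file names:
-- 1) take a copy-only backup so the existing backup chain is untouched
declare @backup_file nvarchar(260) = N'C:\Backups\MeasureMySkills_test.bak';

backup database MeasureMySkills
    to disk = @backup_file
    with copy_only, checksum, init;

-- 2) restore it as a throwaway, stand-alone database
restore database MeasureMySkills_Verify
    from disk = @backup_file
    with
        move 'MeasureMySkills'     to 'C:\Backups\MeasureMySkills_Verify.mdf',
        move 'MeasureMySkills_log' to 'C:\Backups\MeasureMySkills_Verify_log.ldf';

-- 3) run integrity checks against the restored copy
dbcc checkdb('MeasureMySkills_Verify') with no_infomsgs, all_errormsgs;

-- 4) clean up
drop database MeasureMySkills_Verify;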
Shrink File
I’m actually a bit surprised that Microsoft put this in the objectives. It is not advisable to shrink database files on a regular basis, as the operation can cause severe index fragmentation. For my testing purposes, I altered the size of my MeasureMySkills database files to 10000 MB each to show the drastic differences in file shrinkage. If you want to get database file space usage statistics for your current database, you could execute the following query:
use MeasureMySkills;
go

-- compare allocated size vs. space actually used for each file in the current database
;with DatabaseFileInfo as
(
    select
        db_name(database_id) as database_name,
        name as db_file_name,
        size * 8 as total_size_kb,                            -- size is in 8 KB pages
        fileproperty(name, 'SpaceUsed') * 8 as space_used_kb  -- pages actually in use
    from sys.master_files
    where database_id = db_id()
)
select
    database_name,
    db_file_name,
    total_size_kb,
    space_used_kb,
    total_size_kb - space_used_kb as free_space_kb,
    cast(space_used_kb * 1.0 / total_size_kb * 100.0 as decimal(5, 2)) as percent_used
from DatabaseFileInfo;
This will let you know how much space is actually being used, and how much is allocated. If for some reason you want to shrink a database file, you could do the following (I’m using my MeasureMySkills database in the following example):
-- shrink my data file
dbcc shrinkfile('MeasureMySkills', 0);
go
-- shrink my log file
dbcc shrinkfile('MeasureMySkills_log', 0);
go
Then, re-running the database file space usage stats query from before, you should see the regained space (for the data file, close to 100% used; the log file is bound by VLFs, so your results may vary). You can automate this process by creating a Maintenance Plan and utilizing a Shrink Database Task. You could also schedule a job to check a database’s file space usage and, if it drops below a certain threshold, shrink the database files (also possible cumulatively through DBCC SHRINKDATABASE). Again, this is not recommended in production. It is better to make an educated calculation of data growth and create an appropriately sized database. There’s a good saying that if space was once used by a database (i.e. it auto-grew, or more space was allocated for a reason) then it will probably be needed again. Drive space is cheap; poor performance and unwieldy fragmentation can be expensive.
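For completeness, a minimal sketch of that threshold check (again, not something to run routinely in production) might look like the following. It assumes the MeasureMySkills data file and an arbitrary 25% usage threshold:
-- how full is the data file right now?
declare @percent_used decimal(5, 2);

select @percent_used =
    cast(fileproperty(name, 'SpaceUsed') * 100.0 / size as decimal(5, 2))
from sys.database_files
where name = 'MeasureMySkills';

-- only shrink if usage has dropped below the (assumed) threshold
if @percent_used < 25.0
    dbcc shrinkfile('MeasureMySkills', 0);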
Check Index Fragmentation
I wrote a script, which can be found on my GitHub SQLSalt repository, that gets a summary of index fragmentation for a particular database, filtered by a minimum page count threshold. It utilizes the sys.dm_db_index_physical_stats dynamic management function joined to the sys.indexes catalog view. Running it against AdventureWorks2012:
use AdventureWorks2012;
go

declare @page_count_min bigint = 1000;   -- ignore small indexes

select
    db_name(ps.database_id) as database_name,
    object_name(ps.object_id) as object_name,
    i.name as index_name,
    ps.index_id,
    ps.index_type_desc,
    ps.avg_fragmentation_in_percent,
    ps.page_count
from sys.dm_db_index_physical_stats
(
    db_id(),     -- current database
    default,     -- all objects
    default,     -- all indexes
    default,     -- all partitions
    'detailed'   -- scan mode ('detailed' reads every page; 'limited' is cheaper)
) ps
inner join sys.indexes i
    on ps.object_id = i.object_id
    and ps.index_id = i.index_id
where ps.page_count >= @page_count_min
order by ps.avg_fragmentation_in_percent desc;
Archive Data
Archiving data can be done a few ways. As far as automation goes, it could be something as simple as a SQL Server Agent job run daily (or weekly, or monthly) that copies data from an active table to an archive table. This can also be done through PowerShell. On the T-SQL side it can be as simple as a DELETE … OUTPUT … INTO statement.
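A minimal sketch of that pattern might look like the following; the table names (dbo.Orders and dbo.OrdersArchive), the column list, and the 90-day cutoff are just placeholder assumptions:
-- move rows older than 90 days from the active table into the archive table
delete from dbo.Orders
output
    deleted.OrderID,
    deleted.CustomerID,
    deleted.OrderDate
    into dbo.OrdersArchive (OrderID, CustomerID, OrderDate)
where OrderDate < dateadd(day, -90, getdate());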
Run an SSIS Job
Again, running an SSIS package on a schedule can be done through a SQL Server Agent job with an SSIS job step and a SQL Server Agent schedule.
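A minimal sketch of creating such a job with the msdb stored procedures might look like the following; the job name, package path, and schedule are placeholder assumptions, and the exact /SQL and /SERVER arguments depend on where the package is stored:
use msdb;
go

-- create the job shell
exec dbo.sp_add_job
    @job_name = N'Run Nightly SSIS Package';

-- add an SSIS job step (dtexec-style arguments for a package stored in msdb)
exec dbo.sp_add_jobstep
    @job_name  = N'Run Nightly SSIS Package',
    @step_name = N'Execute package',
    @subsystem = N'SSIS',
    @command   = N'/SQL "\NightlyLoad" /SERVER "(local)"';

-- schedule it to run daily at 2 AM
exec dbo.sp_add_jobschedule
    @job_name          = N'Run Nightly SSIS Package',
    @name              = N'Nightly at 2 AM',
    @freq_type         = 4,        -- daily
    @freq_interval     = 1,        -- every day
    @active_start_time = 020000;

-- target the local server
exec dbo.sp_add_jobserver
    @job_name = N'Run Nightly SSIS Package';
go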
Check Disk Space
Checking disk space can be done by scheduling a job that makes a call to the master.sys.xp_fixeddrives extended stored procedure. This returns a row per logical drive with the remaining space in MB. If that falls below a particular threshold, use Database Mail to send an email, log it to the event log, or take whatever action you find appropriate. Likewise, you can set up a WMI alert within SQL Server Agent with a WQL query to be notified of low disk space.
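A minimal sketch of that kind of check might look like the following; it assumes Database Mail is already configured with a profile named 'DBA Mail', and the recipient and 10 GB threshold are placeholders:
-- capture xp_fixeddrives output (drive letter, free space in MB)
declare @drives table (drive char(1), free_mb int);
declare @threshold_mb int = 10240;   -- alert when less than ~10 GB is free

insert into @drives (drive, free_mb)
exec master.sys.xp_fixeddrives;

if exists (select 1 from @drives where free_mb < @threshold_mb)
begin
    declare @body nvarchar(max) = N'Low disk space on: ';

    select @body = @body + drive + N' (' + cast(free_mb as nvarchar(20)) + N' MB free) '
    from @drives
    where free_mb < @threshold_mb;

    exec msdb.dbo.sp_send_dbmail
        @profile_name = N'DBA Mail',          -- assumed Database Mail profile
        @recipients   = N'dba@example.com',   -- assumed recipient
        @subject      = N'Low disk space warning',
        @body         = @body;
end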
Automate Backups
This is one of the basic steps of automation. It is typical for a DBA either to create a Maintenance Plan that takes full, differential, and transaction log backups regularly on a schedule, or to create individual SQL Server Agent jobs to accomplish the same tasks. These are simple BACKUP DATABASE/LOG statements, and it is also worth noting that these backups should be tested for integrity.
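A minimal example of the statements such a job would run might look like the following, assuming the MeasureMySkills database and a local C:\Backups\ path:
-- build a date-stamped backup file name
declare @backup_file nvarchar(260) =
    N'C:\Backups\MeasureMySkills_' + convert(nvarchar(8), getdate(), 112) + N'.bak';

-- full backup with a checksum so page errors are caught at backup time
backup database MeasureMySkills
    to disk = @backup_file
    with checksum, init;

-- verify that the backup file is complete and readable (no data is restored)
restore verifyonly
    from disk = @backup_file
    with checksum;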
If there are any comments, questions, issues, or suggestions, please feel free to leave a comment below or email me at sqlsalt@gmail.com.