Microsoft Azure Contest Started yesterday

Even if you don’t know Azure but know SQL Server, you can compete in this contest. There are daily quizzes and weekly simulation questions. Join the contest here:

https://www.microsoft.com/AzureSQLChampionship/Game/Daily

If you are a beginner to Azure and want to learn more, there are YouTube videos at

and Intro to Azure in text by Microsoft at

https://docs.microsoft.com/en-us/learn/modules/azure-sql-intro/3-deployment-options

Good Luck Competitors!

“Back at ground control, there is a problem.” – Major Tom. In my case, the problem was: “Failed to deploy the project. Try again later. (Microsoft SQL Server, Error: 27118)”

Major Tom by Peter Schilling

“Standing there alone, the ship is waiting” (my SSIS package)
“All systems are go, are you sure?” (I get an error on deploy/overwrite)
“Control is not convinced” (WTH)
“But the computer has the evidence” (check the views)
“No need to abort” (fix the permissions)
“The countdown starts” (now it works)

Thanks to Aleksandr Fedorenko for the answer on how to solve this error. https://dba.stackexchange.com/questions/105770/failed-to-deploy-the-project-try-again-later-microsoft-sql-server-error-271

Problem:

When I tried to deploy over the same project (overwrite), I received the error “Failed to deploy the project. Try again later. (Microsoft SQL Server, Error: 27118)”

Solution:

In the SSISDB database, add the user ALLSchemaOwner to the ssis_admin role. The user has the correct permissions out of the box, but if someone removes them, you will get this error.
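On SQL Server 2012 and later this can be done with a single statement; a minimal sketch, run inside the SSISDB database:

```sql
USE SSISDB;
GO
-- Restore the out-of-the-box role membership the catalog procedures rely on
ALTER ROLE ssis_admin ADD MEMBER AllSchemaOwner;
```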

How to troubleshoot:

Aleksandr used the following approach to troubleshoot the issue: To see the detailed error, click the “Show technical details” button. In the “Additional data” section we can find the object responsible for the error.

In my case it was the procedure prepare_deploy, an internal stored procedure of the SSISDB database. To catch the error I used the RAISERROR statement in different places, because this error is raised from several spots.

My problem turned out to be here:

IF EXISTS (SELECT [project_id]
           FROM [internal].[project]
           WHERE [folder_id] = @folder_id
             AND [name] = @project_name)
BEGIN
    RAISERROR(27118, 16, 1) WITH NOWAIT
    RETURN 1
END

But to reach this error, the following condition must be true: IF (@project_id IS NULL)

See the whole statement, which appears earlier in the stored procedure:

SET @project_id = (SELECT [project_id]
                   FROM [catalog].[projects]
                   WHERE [folder_id] = @folder_id
                     AND [name] = @project_name)
IF (@project_id IS NULL)
BEGIN
…

So we have two almost identical SELECT statements, differing only in schema, yet the results differ: the first statement returns nothing while the second returns a row. The [catalog] schema tells us we are dealing with a view.

Take a look at the view:

CREATE VIEW [catalog].[projects]
AS
SELECT …
FROM [internal].[object_versions] ver
INNER JOIN [internal].[projects] proj
    ON (ver.[object_id] = proj.[project_id] AND ver.[object_version_lsn] = proj.[object_version_lsn])
INNER JOIN [internal].[folders]
    ON proj.[folder_id] = [internal].[folders].[folder_id]
WHERE (ver.[object_status] = 'C')
  AND (ver.[object_type] = 20)
  AND (
        proj.[project_id] IN (SELECT [id] FROM [internal].[current_user_readable_projects])
        OR (IS_MEMBER('ssis_admin') = 1)
        OR (IS_SRVROLEMEMBER('sysadmin') = 1)
      )

Obviously, to get any result from this view, the user must be a member of either ssis_admin or sysadmin (or have the project marked readable). Let’s see who executes the procedure internal.prepare_deploy:

ALTER PROCEDURE [internal].[prepare_deploy]

WITH EXECUTE AS 'AllSchemaOwner'

Finally, check the permissions for the user AllSchemaOwner in the SSISDB database:
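One quick check is to impersonate the account and ask the same questions the view’s WHERE clause asks; a sketch, run inside SSISDB:

```sql
USE SSISDB;
GO
EXECUTE AS USER = 'AllSchemaOwner';
-- These mirror the checks in [catalog].[projects]; if both return 0,
-- the view returns nothing for this account
SELECT IS_MEMBER('ssis_admin')      AS is_ssis_admin,
       IS_SRVROLEMEMBER('sysadmin') AS is_sysadmin;
REVERT;
```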

Bingo: we do not have the permissions needed to see data through the view [catalog].[projects]. Just add the user AllSchemaOwner to the role ssis_admin and the problem is gone.

Another user commented:

I had the same issue, but my AllSchemaOwner hadn’t lost any privileges. I deleted the old project and then deployed a fresh version, and that fixed it too. Weird. – user79818

OOPS I did it again: Recover Deleted Data by LSN (when you don’t know the time of the delete)

OOPS I did it again – Britney Spears

Someone deleted data in Production! Unfortunately, this happens a lot in the DBA world. Someone mistakenly writes a delete statement and runs it in Prod, thinking they were in Development. Oh no!!!

What can you do to restore to a point in time when you don’t know the time the delete happened?

Erik Darling explains in detail how to Restore an Oops Delete.

If you don’t have time to view the video and need the answer quickly, the answer is in the transaction log.

  1. Find the delete in the transaction log.
  2. Get the LSN from the transaction log.
  3. Convert the LSN from hexadecimal to varchar(26).
  4. Restore by the varchar(26) LSN.

First find the delete

Use the undocumented function sys.fn_full_dblog to find the delete. The LSN in a log dump from fn_dblog(), DBCC LOG, or fn_dump_dblog() is a string consisting of three hexadecimal numbers delimited by colons.


Run this SELECT statement with ‘SUPERUSER’ replaced by the name of the database in which the delete occurred:

SELECT d.name, ffd.[Current LSN], ffd.*
FROM sys.databases AS d
CROSS APPLY sys.fn_full_dblog(NULL, NULL, d.database_id, NULL, NULL, NULL, NULL, NULL) AS ffd
WHERE d.name = 'SUPERUSER'
  AND operation = 'LOP_DELETE_ROWS';

Now convert the LSN from sys.fn_full_dblog into a usable LSN for the restore. The RESTORE command expects the LSN as a large decimal number (here we build it as a varchar(26)). The function from SQLSoldier (below) takes each hexadecimal segment, converts it to decimal, and then puts the segments back together with the correct number of leading zeros.

SQLSoldier function to turn fn_full_dblog lsn into a usable restore LSN

Now create a function to change the Hexadecimal LSN into varchar(26).

Create Function

CREATE FUNCTION dbo.SQLSoldier_Convert_LSN (@LSN varchar(22))
RETURNS varchar(26)
AS
BEGIN
    -- Convert an LSN from a hexadecimal string to a decimal string
    -- Example input: '0000001e:00000038:0001'
    DECLARE @LSN1 varchar(11),
            @LSN2 varchar(10),
            @LSN3 varchar(5);
    -- Split the LSN into segments at the colons
    SET @LSN1 = LEFT(@LSN, 8);
    SET @LSN2 = SUBSTRING(@LSN, 10, 8);
    SET @LSN3 = RIGHT(@LSN, 4);
    -- Convert each hex segment (binary style 1) to an integer
    SET @LSN1 = CAST(CONVERT(VARBINARY, '0x' + RIGHT(REPLICATE('0', 8) + @LSN1, 8), 1) AS int);
    SET @LSN2 = CAST(CONVERT(VARBINARY, '0x' + RIGHT(REPLICATE('0', 8) + @LSN2, 8), 1) AS int);
    SET @LSN3 = CAST(CONVERT(VARBINARY, '0x' + RIGHT(REPLICATE('0', 8) + @LSN3, 8), 1) AS int);
    -- Pad the 2nd and 3rd segments with leading zeros and reassemble
    RETURN CAST(@LSN1 AS varchar(8))
         + RIGHT(REPLICATE('0', 10) + @LSN2, 10)
         + RIGHT(REPLICATE('0', 5) + @LSN3, 5);
END;

Run select statement

Run this SELECT statement with ‘SUPERUSER’ replaced by the database name:

SELECT d.name, dbo.SQLSoldier_Convert_LSN(ffd.[Current LSN]) AS ConvertedLSN, ffd.[Current LSN], ffd.*
FROM sys.databases AS d
CROSS APPLY sys.fn_full_dblog(NULL, NULL, d.database_id, NULL, NULL, NULL, NULL, NULL) AS ffd
WHERE d.name = 'SUPERUSER'
  AND operation = 'LOP_DELETE_ROWS';
Get the ConvertedLSN of the first delete, then restore with STOPBEFOREMARK.

Restore with LSN

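The restore itself looks roughly like this; a sketch in which the database name, backup file paths, and LSN value are placeholders for your own:

```sql
-- Restore the full backup without recovery so log backups can be applied
RESTORE DATABASE [MyDB]
FROM DISK = N'C:\Backups\MyDB_full.bak'
WITH NORECOVERY, REPLACE;

-- Apply the log backup, stopping just before the delete (note the lsn: prefix)
RESTORE LOG [MyDB]
FROM DISK = N'C:\Backups\MyDB_log.trn'
WITH STOPBEFOREMARK = 'lsn:30000000123400001',
     NORECOVERY;
```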

Last, complete the restore sequence with recovery.

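A sketch of the final step (the database name is a placeholder):

```sql
-- Bring the database online once all log restores have been applied
RESTORE DATABASE [MyDB] WITH RECOVERY;
```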

 

PASS Summit 2018: First Timer Point of View

SQL Summit 2018 – Nov 5-9

Summit is an endurance test: sessions from 8 to 6 every day, plus at least three parties with free alcohol each night. I learned personal lessons, like how taking a Benadryl before you go to sleep after a night of drinking helps you wake up early without a hangover. I learned SQL lessons, like how to prove the VM is the cause of your database’s performance drop. The schedule was hectic and fun. I found my people.

11/6/2018 Tuesday (I arrived)

I arrived at SEA airport and took a taxi to the Sheraton Grand Downtown. I walked to lunch and rested until 4pm. At 4pm I checked into Summit and went to the First-Timers Orientation and Networking.

I met 12 first timers during the networking game and ended up going to the V.20 Celebration with 4 of them. Unfortunately, I never saw them again after the first night.

The V.20 Celebration was a party to remember. There were old school and new school video games, tons of food and beer until they ran out, and Star Wars characters. Grant Fritchey (PASS President) walked on stage as Darth Vader with two storm troopers. He kicked the summit off with a bang.

At the Celebration I circled the room 3 times looking for someone I knew, but they found me. Laurie Carr @LaurieLCarr from Redlands and Phil Helmer @philhelmer from San Diego were standing in line waiting for beer when I passed them. So many people came in and out of our circle all night. We talked until the lights came on. Then headed to the next party.

The PASS Volunteers Party at the Spin speakeasy was next, from 7:30-9:30pm. When I walked in I saw Grant Fritchey, surrounded by an entourage of at least 12; Andy Yan @SQLBek, holding court at the back of the room; and Ted Stathakis @fishheadted (Orange County PASS), talking to everyone. Laurie, Phil, Ted, and I lost each other in the many groups of people, but we found each other at the next party.

I jumped in a cab with Grant Fritchey and a Royal Princess (Andrea Allred) @RoyalSQL. Andrea wears the prettiest medieval dresses every day at SQL Summit. Everyone notices her and she always has a smile. Find her next year and you can get a ribbon “I hugged a Royal”.

We rode to Denny Cherry’s @mrDenny karaoke party at Amber. This is the BEST party of the Summit. The bar was packed. A band played on stage to help the karaoke singers sound great. Who knew SQL people could sing?? I found Laurie and Phil at the party, then AJ @SDSQL showed up. I met Johnny Nebraska, Mike South Dakota (with the best long hair), and Jason Horner @jasonHorner (the life of every party). We sang, we danced, and we drank the night away. I rolled into bed at 3am, not sure how I planned to make the 8:15am keynote.

11/7/2018 Wednesday

Tip One: Take Benadryl before going to bed to wake up with no hangover.

Tip Two: The keynote is streamed live.

Tip Three: Bring business cards.

Tip Four: Get phone numbers so you can find people after the sessions.

 

Wednesday morning, I woke up, grabbed breakfast, and went back to my room to watch the keynote via live stream while I put on my makeup.

10:45 – 12pm On-Prem SQL Servers, Interstellar Performance by David Klee – my favorite session. It turned out to be about understanding the VM, storage, and network for the DBA.

Lunch – bumped into Laurie, Ashley, and several North Carolina folks for lunch then went to the exhibit hall to see the vendors.

1:30 – 2:45pm Getting Better Query Plans by improving SQL’s Estimates by Brent Ozar – I went to this session because, well, Brent Ozar. Great speaker

2:45-3:15pm There were all kinds of soda and coffee to drink.

3:15 – 4:30pm Architecting for Scalability and Performance in Hybrid Workloads by Kimberly Tripp

4:45 – 6pm So tired I went back to my room for a nap. Thank God the Sheraton is only 1 block away.

6 – 8pm Exhibitor Reception – missed it because I was sleeping

7pm – woke up and found dinner on my own

8pm – Snuck into the SentryOne party. Looked for Kevin Kline. Found Andy Yan and Laurie Carr, and met Andy Mallon @AMtwo. One drink and off to bed.

 

11/8/2018 Thursday

Tip Five: If you can’t find anyone go to the Tap House.

Tip Six: Exhibit Hall closes at 2pm, go early

Tip Seven: Go to karaoke at Bush Garden and sit on the green couch.

 

Found out the next morning that everyone was at the Tap House on Wednesday.

8:15 – 9:45am watched the Keynote while getting ready

10:45 – 12pm Azure Basics for the DBA by John Morehouse

11:30am – 1:15pm Women in Tech Luncheon (Men and Women are welcome)

1:30pm – 2:45pm Top Tips for Deploying AGs and FCIs on Premises or in the Cloud by Allan Hirt

3:15pm – 4:30pm Everlasting Issues with Temporary Tables and Tempdb by Kalen Delaney

4:45 – 6pm Building Trust in Teams by Richard Campbell

6-7pm go to the room and rest

7pm – 9pm Crashed the Idera party at Tap House

9pm-11pm Crashed the BIML party at Tap House. Loved meeting Cathrine Wilhelmsen @cathrinew, Ben Bob (inside joke) @Bweissman, and Scott Currie @scottcurrie. Cathrine and Ben are two of the four authors of the BIML book, and Scott created BIML. I found out so much about BIML. I am going to start using this amazing scripting language!

11pm – 3am Karaoke at Bush Garden in Chinatown – Everyone was there!!

 

11/9/2018 Friday

8- 9:15am Behold the Power of Dynamic SQL by Cyndi Jonson

9:30 – 10:45am Columnstore Clustered vs Nonclustered, When to Choose What by Niko Neugebauer

11:15am – 12:30pm Using Extended Events for Wait Stats Analysis by David Maxwell

2 – 3:15pm TempDB: The Good, The Bad, and the Ugly by Pam Lahoud

3:30 – 4:45pm SQL Server Performance Monitoring Made Easy by Pinal Dave

7pm went out to eat with SQL friends: Phil (San Diego), Steve Rezhener @steveRezhener (LA), Peter Kral (San Diego), Dave (San Diego PASS leader), and AJ (San Diego).

Ready for SQL Summit next year: Next SQL Summit 2019 – Nov 5-8

 

99 Red Balloons – The query processor ran out of internal resources and could not produce a query plan

“99 Red Balloons” – Nena. Or in my case, 109,202 red balloons (values) in an IN statement caused an error.

Today a user received this message:

“Msg 8623, Level 16, State 1, Line 1

The query processor ran out of internal resources and could not produce a query plan. This is a rare event and only expected for extremely complex queries or queries that reference a very large number of tables or partitions. Please simplify the query. If you believe you have received this message in error, contact Customer Support Services for more information.”

WTF… I had never seen this message before. After searching the internet, I found that users receive this message when there are too many values in an IN statement list. Microsoft explains this in the documentation for the IN statement.

Remarks

Explicitly including an extremely large number of values (many thousands of values separated by commas) within the parentheses, in an IN clause can consume resources and return errors 8623 or 8632. To work around this problem, store the items in the IN list in a table, and use a SELECT subquery within an IN clause.

Error 8623:

The query processor ran out of internal resources and could not produce a query plan. This is a rare event and only expected for extremely complex queries or queries that reference a very large number of tables or partitions. Please simplify the query. If you believe you have received this message in error, contact Customer Support Services for more information.

Error 8632:

Internal error: An expression services limit has been reached. Please look for potentially complex expressions in your query, and try to simplify them.


How many values can I put into an in statement and not get the error?

The answer is a complicated calculation based on your CPU speed and network packet sizes. However, just for fun, I went through my list of 109,202 values (someone listed 109,202 loan applications in the IN statement) to see at what count the error starts to happen.

I found that with integer values in the IN statement I could use 30,694 rows of integers before the error occurs. I multiplied 30,694 x 4 bytes (int size) = 122,776 bytes, or about 123KB.

I figure the IN statement on my server can only handle 123KB of data.


How do I fix the error?

Change the query. Either remove all values over the limit or create several IN statements.

For example:

Original query:

SELECT * FROM table WHERE CUST_Number IN (1,2,3,4,…109202)

Change to:

SELECT * FROM table
WHERE CUST_Number IN (1,2,3,4,…30000)
   OR CUST_Number IN (30001,…,60000)
   OR CUST_Number IN (60001,…,90000)
   OR …
   OR CUST_Number IN (100000,…,109202)
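Microsoft’s recommended workaround is cleaner than chaining IN lists: put the values in a table and use a subquery. A sketch with made-up table and column names:

```sql
-- Load the values into a temp table instead of a giant literal list
CREATE TABLE #cust_numbers (CUST_Number int PRIMARY KEY);

INSERT INTO #cust_numbers (CUST_Number)
VALUES (1), (2), (3);   -- ...load the full list here (e.g. via BULK INSERT)

-- The optimizer handles a subquery far better than 109,202 literals
SELECT t.*
FROM dbo.MyTable AS t
WHERE t.CUST_Number IN (SELECT c.CUST_Number FROM #cust_numbers AS c);
```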



You Spin me Round like “Parser NULL (DTEParseMgd)”

I love the ’80s and that crazy music. This error drove me crazy (nothing on the web had a fix) and spun me around “like a record, baby, right round, round round.” Dead or Alive, 1984 – funny old video.

ERROR: Received “Parser NULL (DTEParseMgd)” error when adding an SSIS package to a SQL Server Agent job step.

JobParserNull


CAUSE: The Visual Studio used to build the package was at a higher version than the SQL Server Management Studio.


FIX: Upgrade SSMS, or find another server with a higher-version SSMS and connect from it to the instance where you want to create the SSIS job.


Details:

I have SQL Server 2016 running on Windows Server 2012 R2, but the real issue was the SSMS version.


First I tried to create the job in an SSMS at version 13.0.16000.28, and I received the error. The SSMS that did not work:


Then I went to another server with a newer SSMS, version 13.0.16106.4, connected from it to the SQL Server where I wanted to create the job (the original server), created the job, and it worked with no errors.

SSMS on another server that did work:


 

SSIS was compiled in this VS Version:


This is How We Do It: MSX Job Needs Access to Shared Directory

Obscure 90’s song reference “This is How We Do It” by Montell Jordan 1995

I have an MSX job that runs on three target servers and needs access to a shared directory. For the job to run successfully, I allow access for each target server’s SQL Agent service group. Since this MSX job runs on three different servers, I need to find the service group for each server’s SQL Agent and then give each service group access to the shared directory.

Although the code is deployed to Datahub (the master server), the SQL Server Agent job actually executes on the target server. The target servers send messages back to the master server and periodically download job and code changes. The master server is the central hub for all code and jobs.

So each server’s SQL Agent service account needs access to the directory the job writes to.

https://docs.microsoft.com/en-us/sql/ssms/agent/automated-administration-across-an-enterprise

“A master server distributes jobs to, and receives events from, target servers. A master server also stores the central copy of job definitions for jobs that are run on target servers. Target servers connect periodically to the master server to update their schedule of jobs. If a new job exists on the master server, the target server downloads the job. After the target server completes the job, it reconnects to the master server and reports the status of the job. Note that your job definition must be same when performing any database related activities.”

SQL Server Instance Configuration Settings

SQL Server Instance configurations can be changed in the properties. Right click on the SQL Server Instance and select properties.



General page: I look at the General page for easy access to the machine’s (or VM’s) amount of RAM, the processors, and whether the server is clustered. Also, if you are using Always On Availability Groups, HADR should be set to True.

To see more about the general page: Microsoft’s explanation of the General Page



Memory page: Use this page to set the Min and Max SQL Server memory usage. I never touch the index creation or min memory per query; I just leave those at default.

Max memory should (really, must) be changed. Calculating the correct Min and Max SQL memory settings is a delicate balance between the OS and SQL Server. If you have multiple instances of SQL Server on the same server, both instances try to regulate memory at the same time. Calculating memory is not an exact science.

Just for fun, click the radio buttons at the bottom and switch between Configured and Running. Configured Values are the values you typed into the input fields; Running Values are what is actually in use right now. If Configured and Running do not match, the SQL Server instance needs a restart for the changes to take effect.
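The same setting can be changed from T-SQL; a sketch in which the 8192 MB value is only an example, not a recommendation, so size it for your own server:

```sql
-- Memory settings are advanced options, so expose them first
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- Cap SQL Server at 8 GB, leaving the remainder for the OS (value is in MB)
EXEC sp_configure 'max server memory (MB)', 8192;
RECONFIGURE;
```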

See these blogs for more help:



Processors page: Usually there is nothing to configure here. However, I have used this page when RedGate was hammering processor 0, so I set SQL Server not to use that one processor. I also changed the CPU affinity when I had two instances of SQL Server running on the same server: I directed instance one to processors 0+1 and instance two to processors 2+3 so the two instances would not step on each other while processing. As with the Memory page, Configured and Running show what is selected and what is actually happening, respectively.

DO NOT SELECT boost SQL Server Priority.

Processor affinity changes

Max Worker threads

Lightweight Pooling



Security page: The server authentication mode you choose is based on preference. I usually use SQL Server and Windows Authentication mode, just in case Active Directory goes offline.

For Login Auditing, I normally audit failed logins and monitor weekly. However, at one company that stored encrypted credit card information, the auditor asked us to log both failed and successful logins. Be aware that logging successful logins on a busy system will produce VERY LARGE files in your log directory, so plan disk space accordingly. Logins are also recorded in the Windows application event log.

Set up the server proxy account to run xp_cmdshell from a SQL Server login that has fewer privileges than sysadmin. The last three checkboxes I have never used, but links to the documentation are included.



Connections page: Max connections should be set to 0 (unlimited) unless you want to limit the number of connections to the instance. I always make sure “Allow remote connections” is checked.



Database Settings page: These are the default settings for every database in the instance. I like to check Compress Backup.

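Backup compression can also be made the instance default from T-SQL; a sketch:

```sql
-- Make compression the default for all backups on this instance
EXEC sp_configure 'backup compression default', 1;
RECONFIGURE;
```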


Advanced page: My favorite Server Properties page; I have been here more times than I can count. Most of the time I am here to configure Cost Threshold for Parallelism or Max DOP.

Enable Contained Databases – Allows logins to be contained within the database. All metadata is stored in the database, and the contained database does not rely on instance-level properties. Microsoft Contained Databases
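Enabling contained databases from T-SQL looks like this; a sketch:

```sql
-- Allow contained databases on this instance
EXEC sp_configure 'contained database authentication', 1;
RECONFIGURE;
```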

Filestream – Used to store BLOB data (Word docs, images, PDFs, etc.) on disk, with a pointer stored in the FILESTREAM attribute column. Jacob Sebastian Intro to Filestream

Miscellaneous – I have never changed any of these options, but have a link to describe each one. Microsoft Advanced page properties

Network – I have never messed with these properties. SQL Soldier To fiddle with or not to fiddle with

Parallelism – Oh, here it is, my favorite part. Performance 101. There are many blogs on how to configure Cost Threshold and Max DOP, but the proper configuration for those fields is trial and error: testing, tons of testing, to figure out the right balance. Never set Max DOP to the number of processors on your server, or one query can take all the threads and leave none for other queries. I also do not let SQL Server decide Max DOP on its own, so I do not leave the value at 0. I set Max DOP in multiples of 2 or 4 (just my rule of thumb) and leave at least 2 cores for other processes.
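From T-SQL the two settings look like this; a sketch whose values (50 and 4) are illustrative starting points only, to be validated by testing on your own workload:

```sql
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- Raise the cost threshold so only genuinely expensive queries go parallel,
-- and cap parallelism below the server's core count
EXEC sp_configure 'cost threshold for parallelism', 50;
EXEC sp_configure 'max degree of parallelism', 4;
RECONFIGURE;
```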



Permissions page: This topic I will leave to Microsoft; it requires a three-day class. I will have to blog on it in the future.


 

What to do when asked to create a database. First Steps

Someone has asked you to create a database for _____. Now What? Where do you go?

Step 1: What is the database for?

  • What will the database be used for? Is this a Data Warehouse, Data Mart, Reporting, or OLTP database?
  • If this database will be attached to a 3rd party tool, what are the tool requirements for the database?
  • If this database will be attached to an in house tool, what are the requirements? Don’t worry if they do not have any requirements. Sometimes we DBAs have to create the requirements for the developers.

Step 2: What version and edition of SQL Server?

  • If a 3rd party tool, what is the version the tool must have?
  • If this is the first time this database is being created, or you are designing the database yourself, ask management the following questions. Their answers will help decide which edition and version of SQL Server to use.
    • What is the SLA (Service Level Agreement) for uptime of the instance? If there is no SLA, create one and have them sign it. CYA (Cover Your Ass‘ets’). The SLA will cover the topics you need in order to decide which version of SQL Server you need.
    • What type of High Availability do they want? Always On? Replication?
    • What type of failover and disaster recovery? Clustering? Replication? Always On?
    • Are there any special functions or SSIS/Visual Studio the developers would like to use? What versions of SQL Server work with that tool?
    • Is there a reporting tool the developers would like to use? What versions of SQL Server work with that tool?
    • Prepare a spreadsheet
      • create column headers for each edition: Standard, Enterprise, Developer, or Express (include all four editions for each version of SQL Server being evaluated) (I usually pick the most current version of SQL Server, unless directed otherwise by management.)
      • In the first row, list the license requirements and cost for each edition
      • List in each row the requirements management wants
      • Place an “X” where the edition includes the requirement.
      • Figure out your recommendation.
      • Now let management decide which edition to use.

Step 3: How much disk space is needed for the data file, index files, temp files, log files, and system files?

  • If this new database is for a 3rd party tool, find the tool database requirements.
  • If this is a new database for an internal application and the application is not designed yet, sizing will be decided during development. In this case, sizing is an art. There are tons of blogs on how to design a database. Time to Google.
  • If this is for reporting and the database has not been designed, again turn to Google to design the data mart or data warehouse.
  • If this is a current database in another edition/version of SQL Server, what is the current disk space? Is the current disk space sufficient to hold and grow for the next six months?
  • The System Admin will need you to email all the drives you would like and how much space on each one. In the old days, before SANs, placement of files mattered. Now just decide how you would like to separate the files for viewing purposes. If there are no directly attached hard drives, then all the files go into one big storage server.
  • I usually have C drive for the OS, D drive for SQL Server installation and any other application installations, L drive for logs, T drive for temp files, S drive for System files, I drive for index files, and P drive for data files. Backups I place in the P drive and then move to another network directory on a separate SAN. Separate out your files however it makes logical sense to you. Try to keep consistent between all the database servers. Maybe there is a company document on what drives to place the files. Be consistent, because consistency makes finding files for maintenance easier.

 

Step 4: How many processors (CPU)?

  • If this new database is for a 3rd party tool, find the tool database requirements.
  • If this is an existing database, what is the current core count? Are the cores sufficient to run the current databases on that server?
  • Do they expect a change in usage of this database?
  • If there are no changes in usage and the server runs fine with the current number of processors, then stay with the same number of processors.
  • Will there be an upgrade in the Windows OS? This could affect the number of cores needed.
  • Will there be any other applications running on this VM? I hope not. Fight to keep this server for SQL only. If you lose, make sure management knows that SQL Server does its own memory and core management; the database will be slower because the application will steal resources while competing with SQL Server.

 

Step 5: How much memory?

  • If this new database is for a 3rd party tool, find the tool database requirements.
  • If this is an existing database, what is the current memory? Are there any memory waits during peak times on the current database?
  • If there are memory waits during peak times, then increase the memory. This will have to be a guesstimate.
  • Will there be an upgrade in the Windows OS? This could affect the amount of memory needed. Always leave 10% of the memory for OS, the rest can be SQL Server.
  • Will there be any other applications running on this VM? I hope not, but if so the memory will need to be increased to hold the application and SQL Server.

 

In my next blog I will discuss the SQL Server Instance and Database internal settings. I hope these questions helped to decide what you need in order to install SQL Server successfully.

How to Answer DBA Interview Questions – Brent Ozar

In technology we must always stay current with our skills. We know our jobs, but we get nervous in job interviews: those questions on some obscure piece of SQL Server you have never used, or used five years ago. I swear I have been a DBA for 20 years, but when I am nervous I can mess up an interview. This is where Brent Ozar can help us.

Brent Ozar New Courses