Wednesday, October 26, 2016

Special Order Parts Business Alert Example

We are having a great time here at Synergy 2016!

Here is the example from the Workflow session we did yesterday.

This is a trigger-based business alert designed to email the Technician and a dispatch mail group whenever a part is received that is associated with a Service Call.


Here is the stored procedure that constructs and sends the email:
CREATE PROCEDURE dbo.cssp_SpecialOrderPartsReceipt
    (@ITEMNMBR VARCHAR(31), @ITEMDESC VARCHAR(101), @Service_Call_ID VARCHAR(17))
AS
/*
This is an example for educational purposes. You should test all code for its intended use before placing it in production. No warranty is implied. If you find any issues with the code, please post in comments and I will endeavor to correct it.
20161025 djedziniak initial version
*/
BEGIN
    SET NOCOUNT ON;

    DECLARE @Technician VARCHAR(11), @CUSTNMBR VARCHAR(15), @ADRSCODE VARCHAR(15),
        @TECHEMAIL VARCHAR(41), @emails VARCHAR(MAX), @THISbody VARCHAR(MAX);

    -- Look up the technician, customer, and address for the service call
    SELECT @Technician = Technician, @CUSTNMBR = CUSTNMBR, @ADRSCODE = ADRSCODE
    FROM dbo.SV00300
    WHERE Service_Call_ID = @Service_Call_ID;

    -- The technician's email address is stored in the pager PIN field
    SELECT @TECHEMAIL = RTRIM(SV_Pager_PIN)
    FROM dbo.SV00115
    WHERE Technician = @Technician;

    -- Build the recipient list and message body
    SELECT @emails = @TECHEMAIL + ';dispatch@key2act.com',
        @THISbody = 'Item: ' + RTRIM(@ITEMNMBR) + ' ' + RTRIM(@ITEMDESC) +
            ' Customer: ' + RTRIM(@CUSTNMBR) + ' Address: ' + RTRIM(@ADRSCODE);

    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'Business Alert',
        @recipients = @emails,
        @body = @THISbody,
        @subject = 'Special Order Part Received';
END


Here is the trigger that calls the proc.
CREATE TRIGGER dbo.cstr_POP10310_INSERT ON dbo.POP10310 AFTER INSERT
AS
/*
This is an example for educational purposes. You should test all code for its intended use before placing it in production. No warranty is implied. If you find any issues with the code, please post in comments and I will endeavor to correct it.
20161025 djedziniak initial version
*/
BEGIN
    SET NOCOUNT ON;

    DECLARE @JOBNUMBR VARCHAR(17), @ITEMNMBR VARCHAR(31), @ITEMDESC VARCHAR(101);

    -- inserted may contain many rows, so walk it with a cursor
    DECLARE C_INSERTED CURSOR LOCAL FAST_FORWARD FOR
        SELECT JOBNUMBR, ITEMNMBR, ITEMDESC
        FROM inserted
        WHERE RTRIM(JOBNUMBR) != ''  -- only receipts tied to a service call

    OPEN C_INSERTED
    FETCH NEXT FROM C_INSERTED INTO @JOBNUMBR, @ITEMNMBR, @ITEMDESC
    WHILE @@FETCH_STATUS = 0
    BEGIN
        EXEC dbo.cssp_SpecialOrderPartsReceipt @ITEMNMBR, @ITEMDESC, @JOBNUMBR
        FETCH NEXT FROM C_INSERTED INTO @JOBNUMBR, @ITEMNMBR, @ITEMDESC
    END
    CLOSE C_INSERTED
    DEALLOCATE C_INSERTED
END

You will have to configure Database Mail on your SQL Server for the send-mail step to work.  However, you could replace that call with some other action, such as writing a log record or calling an eConnect proc.
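As a sketch of the logging alternative, the `sp_send_dbmail` call in the proc could be replaced with a simple insert (the `cs_AlertLog` table name here is hypothetical):

```sql
-- Hypothetical log table for alert history
CREATE TABLE dbo.cs_AlertLog (
    LogDate DATETIME NOT NULL DEFAULT GETDATE(),
    ITEMNMBR VARCHAR(31) NOT NULL,
    Service_Call_ID VARCHAR(17) NOT NULL,
    Detail VARCHAR(MAX) NULL
);

-- Inside cssp_SpecialOrderPartsReceipt, in place of EXEC msdb.dbo.sp_send_dbmail:
INSERT INTO dbo.cs_AlertLog (ITEMNMBR, Service_Call_ID, Detail)
VALUES (@ITEMNMBR, @Service_Call_ID, @THISbody);
```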

Enjoy!


Friday, October 21, 2016

Script to auto-roll calls forward in Signature

DECLARE @ROLLDATE DATETIME,
    @STATUS INT

-- Strip the time portion so we roll to today's date
SELECT @ROLLDATE = CONVERT(VARCHAR, GETDATE(), 102)

EXEC SMS_Auto_Call_Roll_Forward @ROLLDATE, 'sa', @STATUS OUT
GO
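If you schedule this in a SQL Agent job, you may also want the job step to fail when the proc reports a problem. A minimal sketch, assuming a nonzero @STATUS indicates failure:

```sql
DECLARE @ROLLDATE DATETIME, @STATUS INT
SELECT @ROLLDATE = CONVERT(VARCHAR, GETDATE(), 102)

EXEC SMS_Auto_Call_Roll_Forward @ROLLDATE, 'sa', @STATUS OUT

-- Assumption: a nonzero status means the roll forward did not complete
IF @STATUS <> 0
    RAISERROR('SMS_Auto_Call_Roll_Forward returned status %d', 16, 1, @STATUS)
```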

Wednesday, September 14, 2016

Steps to change the name of a VM running SQL and SSRS

1. Open File Explorer

2. Right-click This PC and choose Properties, then Advanced System Settings

3. On the Computer Name tab, click Change

4. Change the name and restart now

5. Launch SSMS and login (You may need to use the new server name to log in) and run the following:
sp_dropserver 'old_physical_server_name\GP2015'
GO

sp_addserver 'new_name\GP2015', local
GO


6. Restart SQL Server

7. Close SSMS

8. Open SSRS configuration

9. Choose Database then change database and follow the wizard to locate the ReportServer database.

10. Restart SSRS
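After the rename, you can verify that SQL Server reports the new name. A quick sanity check, assuming the instance naming used above:

```sql
-- Both should return the new machine name (with the \GP2015 instance suffix)
SELECT @@SERVERNAME;
SELECT name FROM sys.servers WHERE server_id = 0;
```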




Get VS Tools for Dynamics to work with Visual Studio 2015

1. Install VS Tools for Dynamics 2013.

2. Copy the folder C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\ProjectTemplates\CSharp\Dynamics GP
and paste to
C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\ProjectTemplates\CSharp\

3. Open a command prompt as administrator

4. Run the following commands:
cd C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE
devenv /installvstemplates

Tuesday, September 13, 2016

Set up Auto Logon in windows 2012 R2

  1. Run regedit.exe
  2. Navigate to
    HKLM\Software\Microsoft\Windows NT\CurrentVersion\winlogon
    
  3. Set or create the following values
    (String) AutoAdminLogon = 1
    (String) DefaultUserName = Your user name
    (String) DefaultPassword = Your password
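The same values can be set from an elevated command prompt with reg.exe (a sketch; note that the password is stored in plain text either way):

```
reg add "HKLM\Software\Microsoft\Windows NT\CurrentVersion\Winlogon" /v AutoAdminLogon /t REG_SZ /d 1 /f
reg add "HKLM\Software\Microsoft\Windows NT\CurrentVersion\Winlogon" /v DefaultUserName /t REG_SZ /d YourUserName /f
reg add "HKLM\Software\Microsoft\Windows NT\CurrentVersion\Winlogon" /v DefaultPassword /t REG_SZ /d YourPassword /f
```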

Monday, June 20, 2016

New SQL Server 2016 features and enhancements

With the release of SQL Server 2016 comes a 215-page book of new features and enhancements, many of them way over the average SQL administrator's head.

This post is not intended to cover all of them, just the ones I see as most useful to know about.

Think of it as CliffsNotes on SQL Server 2016 features for mortals.


Always Encrypted

Description:
This new feature is useful for storing SQL data in an encrypted format.  This is done by generating a column master key and column encryption key.  The SQL server has no ability to decrypt the data.  Any application with access to the SQL Server can query the data, but only those applications with the encryption key will be able to make sense of it.  No word yet on when Dexterity will take advantage of this, but .NET Addins may be able to use it now.

Pros:
- SQL Server never sees the unencrypted version of the data.
- The data is not transferred unencrypted.  
- You can choose to encrypt only some of the columns in a table.
- If you encrypt a column using a deterministic encryption type, you can make it a key column or use it for compare operations such as joins and where clauses.

Cons:
- Only the .NET framework 4.6 and JDBC 6.0 drivers work with this so far.
- Simple ad-hoc tools like SSMS cannot be used to review or change data.
- Out of the box, SSRS will not be able to decrypt the data for reporting.
- LIKE and sort operations are not supported on encrypted columns.
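As a sketch of what the column definition looks like (the key names here are assumptions; the keys themselves are generated through SSMS or PowerShell beforehand):

```sql
-- Assumes a column master key and a column encryption key (CEK_Auto1)
-- have already been provisioned for the database.
CREATE TABLE dbo.Patients (
    PatientID INT IDENTITY PRIMARY KEY,
    SSN CHAR(11) COLLATE Latin1_General_BIN2
        ENCRYPTED WITH (
            COLUMN_ENCRYPTION_KEY = CEK_Auto1,
            ENCRYPTION_TYPE = DETERMINISTIC,  -- deterministic allows joins/equality
            ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256'
        ) NOT NULL,
    LastName NVARCHAR(50) NOT NULL
);
```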


Row level security

Description:
This new feature allows you to control which rows of data a particular user has access to select, insert, update, or delete.  This is done by adding a column for the user credentials and then creating inline table functions for the tables you wish to protect.  You then create a security policy on the table to perform the filter.  There does not appear to be any reason this will not work with Dexterity tables, but I have not yet tested it.

Pros:
- The database handles the data access, reducing repetitive coding across multiple applications.
- Eliminates security access code on the client side.
- Works out of the box with SSRS and even SSMS.
- You can silently block data modification operations or return an error.

Cons:
- Only intended for middle tier applications, since it is not immune to side-channel attacks.
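A minimal sketch of the pattern (table, predicate function, and policy names are invented for illustration):

```sql
-- Table carries the owning user's name in a column
CREATE TABLE dbo.Orders (
    OrderID INT PRIMARY KEY,
    SalesRep SYSNAME NOT NULL
);
GO

-- Inline table-valued function: a row is visible only to its sales rep
CREATE FUNCTION dbo.fn_OrderPredicate (@SalesRep AS SYSNAME)
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN SELECT 1 AS fn_result WHERE @SalesRep = USER_NAME();
GO

-- The security policy applies the predicate to the table
CREATE SECURITY POLICY dbo.OrderFilter
ADD FILTER PREDICATE dbo.fn_OrderPredicate(SalesRep) ON dbo.Orders
WITH (STATE = ON);
```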


Dynamic Data Masking

Description:
This feature allows you to mask all or part of a column's data.  This is different from encryption, where the value can be decrypted to get the actual data.  Masking replaces the data with zeros or X's.
This is done using the MASKED WITH option in the table definition. There does not appear to be any reason this will not work with Dexterity tables, but I have not yet tested it.

Pros:
- Only the returned data is changed.  There is no change to the underlying data in the table.
- You can grant the UNMASK permission to a role to allow certain users to see the actual values.
- Removes repetitive code from applications.
- Helps keep sensitive data from being reproduced (such as on reports)
- More secure than encryption, since the masked value cannot be used to get the real value in any way.

Cons:
- Granting UNMASK works only for the entire database, not a single table.
- Does not work with encrypted columns.
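A sketch of the syntax (the table and masking choices are invented for illustration):

```sql
CREATE TABLE dbo.Customers (
    CustomerID INT PRIMARY KEY,
    -- Show the first letter, mask the rest
    LastName VARCHAR(50) MASKED WITH (FUNCTION = 'partial(1, "XXXXX", 0)') NOT NULL,
    -- Built-in email mask: shows first letter and the .com suffix
    Email VARCHAR(100) MASKED WITH (FUNCTION = 'email()') NULL,
    -- Replace entirely with a default value (0 for numeric types)
    CreditLimit MONEY MASKED WITH (FUNCTION = 'default()') NULL
);

-- Allow a reporting role to see the real values
GRANT UNMASK TO ReportingRole;
```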



Autoconfigured data files for TempDB

Description:
This one is the best kind.  It is actually a performance fix of sorts that requires you to do nothing!
Prior to this version, during install, the TempDB was created using a single data file.  In 2011, Microsoft began recommending that in high-use scenarios there be one file per logical processor to prevent contention.  Unfortunately, this word did not get disseminated very well, and white papers on the subject likely flew over most mortals' heads.

So Microsoft changed the install to automatically detect the number of logical processors and create the recommended number of data files for TempDB.  This does not help if you later change hardware, but adding more data files is relatively simple.
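If you do add processors later, extra data files can be added by hand. A sketch (file name, path, and sizes are examples; match them to your existing files):

```sql
-- Add one more data file to TempDB
ALTER DATABASE tempdb
ADD FILE (
    NAME = tempdev5,
    FILENAME = 'C:\SQLData\tempdb5.ndf',
    SIZE = 8MB,
    FILEGROWTH = 64MB
);
```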


Query Store

Description:
This is a new feature that helps find and fix performance issues.  When turned on, it persists execution plans and makes several new dashboards available for query performance.  This can be used in conjunction with FORCE PLAN to force SQL to use the most efficient execution plan all the time.
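Turning it on is a one-liner per database (the database name is a placeholder):

```sql
ALTER DATABASE MyDatabase SET QUERY_STORE = ON;

-- Optionally tune the capture settings
ALTER DATABASE MyDatabase SET QUERY_STORE (
    OPERATION_MODE = READ_WRITE,
    MAX_STORAGE_SIZE_MB = 500
);
```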


Stretch Database

Description:
This feature allows you to store an effectively unlimited amount of "cold" data in the SQL Azure cloud while maintaining warm and hot data in an on-premises installation.  This requires no changes to application code to implement, and data transmitted through the cloud can be encrypted.  This feature is of limited use with Dynamics installations, however, due to the heavy use of check constraints in Dexterity tables.  Still, it is worth mentioning for disk hogs such as tables containing images and text columns.


Temporal Tables

Description:
This feature allows you to create tables that let you see the state of the data at any point in the past.  It is really two tables: current and history.  It is not compatible with FILESTREAM or INSTEAD OF triggers.  It is also not recommended for use with blobs due to storage space issues.   Queries on this type of table are identical except for the addition of the FOR SYSTEM_TIME clause when you are retrieving historical records.  You can combine this with Stretch Database to track history indefinitely.  The jury is still out on whether we can use this with Dexterity tables, since we must add hidden period columns (thus changing the table structure).  Sounds like it needs a test :)
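A sketch of the two-table setup and a point-in-time query (names are invented):

```sql
CREATE TABLE dbo.Equipment (
    EquipmentID INT PRIMARY KEY,
    Location VARCHAR(50) NOT NULL,
    -- Hidden period columns maintained by the engine
    ValidFrom DATETIME2 GENERATED ALWAYS AS ROW START HIDDEN NOT NULL,
    ValidTo DATETIME2 GENERATED ALWAYS AS ROW END HIDDEN NOT NULL,
    PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo)
)
WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.EquipmentHistory));

-- Where was everything at the start of the year?
SELECT EquipmentID, Location
FROM dbo.Equipment
FOR SYSTEM_TIME AS OF '2016-01-01';
```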


JSON support

Description:
The new FOR JSON clause makes exporting data to JSON a breeze, and other new functions such as OPENJSON, ISJSON, JSON_QUERY, and JSON_MODIFY help "bake in" JSON support. Using computed columns, you can even create table indexes off JSON properties inside the column data.
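A quick sketch of both directions (the sample data is invented):

```sql
-- Relational to JSON
SELECT CustomerID, Name
FROM (VALUES (1, 'Acme'), (2, 'Contoso')) AS c(CustomerID, Name)
FOR JSON PATH;

-- JSON to relational
SELECT *
FROM OPENJSON('[{"CustomerID":1,"Name":"Acme"}]')
WITH (CustomerID INT, Name VARCHAR(50));
```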


Reporting Enhancements

Mobile Report Publisher (https://www.microsoft.com/en-us/download/confirmation.aspx?id=50400): This is a new tool that works with SSRS to allow you to design and publish reports for mobile devices.  It has many dashboard-style controls to help conserve screen space.

KPI Development: A new KPI screen lets you create your own KPIs directly from the SSRS Report Manager web portal page.

Export Directly to PowerPoint from SSRS.


Wednesday, June 8, 2016

Well-behaved triggers

I frequently hear programmers lamenting the use of SQL triggers.  To some they seem to be an evil to be avoided at all costs, and while I certainly agree that their use should be minimized, there are still instances where a trigger is the best approach.

In my experience, much of the fear of triggers stems from poorly written triggers.  There is definitely a "right" way to do it and a bunch of wrong ways to do it.  Unfortunately, the standard trigger templates that ship with SSMS tend to lean toward the wrong way.

So let's start with the basic trigger script from SSMS.

CREATE TRIGGER dbo.MyTrigger
ON dbo.MyTable
AFTER UPDATE
AS
UPDATE dbo.MyTable
SET ModifiedDate = GETDATE()
FROM inserted i
WHERE i.CustomerID = MyTable.CustomerID
GO


This looks simple and innocuous enough.  Ponder the following considerations, however:
1. Who wrote this and why?
2. What happens when 1000 records are bulk updated in MyTable?
3. What if I only need to update MyTable when CustomerID is one of the fields that was updated?
4. How do I stop my update inside the trigger from triggering the update trigger again?
5. Since this trigger could slow down performance, how do I prevent this trigger from running from certain other procs?

Here is a much better script:

CREATE TRIGGER dbo.cstr_MyTable_UPDATE
ON dbo.MyTable
AFTER UPDATE
AS
/*
Company

Description:

History:
20160608 djedziniak initial version
*/
BEGIN
    SET NOCOUNT ON;
    IF (UPDATE({COLUMN}))
    BEGIN
        DECLARE C_INSERTED CURSOR FOR
            SELECT {columns}
            FROM inserted
            WHERE {WHERE CLAUSE}
        OPEN C_INSERTED
        FETCH NEXT FROM C_INSERTED INTO {VARIABLES}
        WHILE @@FETCH_STATUS = 0
        BEGIN

            FETCH NEXT FROM C_INSERTED INTO {VARIABLES}
        END
        CLOSE C_INSERTED
        DEALLOCATE C_INSERTED
    END
END
GO


The line IF(UPDATE({COLUMN})) checks to see if a column has been updated in the inserted table.  If it has, this function returns true.  This is how we can prevent our logic from running unless certain data was updated.

Notice that there is a cursor defined here.  That is because the inserted table could have many records in it.  A common mistake is assuming that it only contains a single record.  When possible, you should avoid using a cursor.  In our case, we can eliminate the cursor and just do a bulk update.

CREATE TRIGGER dbo.cstr_MyTable_UPDATE
ON dbo.MyTable
AFTER UPDATE
AS
/*
Company

Description:

History:
20160608 djedziniak initial version
*/
BEGIN
    SET NOCOUNT ON;
    IF (UPDATE(CustomerID))
    BEGIN
        UPDATE dbo.MyTable
        SET ModifiedDate = GETDATE()
        FROM inserted i
        WHERE i.CustomerID = MyTable.CustomerID
    END
END
GO


Now if I want to stop this trigger from firing itself recursively, I need to add context info.

CREATE TRIGGER dbo.cstr_MyTable_UPDATE
ON dbo.MyTable
AFTER UPDATE
AS
/*
Company

Description:

History:
20160608 djedziniak initial version
*/
BEGIN
    SET NOCOUNT ON;

    -- Bail out if this session already set the flag
    IF CONTEXT_INFO() = 0x55555 RETURN
    SET CONTEXT_INFO 0x55555

    IF (UPDATE(CustomerID))
    BEGIN
        UPDATE dbo.MyTable
        SET ModifiedDate = GETDATE()
        FROM inserted i
        WHERE i.CustomerID = MyTable.CustomerID
    END

    SET CONTEXT_INFO 0x00000

END
GO


The Context_Info() check reads a session-scoped binary value.  If that value is already set, the trigger returns without doing anything.  The same mechanism can be used to prevent the trigger from running when an update is made from some other object: that object would first set the context info, do the update, then clear it.
The two SET CONTEXT_INFO statements set the flag before the update and clear it afterward.
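Here is what that looks like from the other object's side (a sketch; the proc name and body are hypothetical):

```sql
-- A proc that updates MyTable without waking the trigger logic
CREATE PROCEDURE dbo.csp_BulkTouch
AS
BEGIN
    SET CONTEXT_INFO 0x55555  -- tell the trigger to stand down

    UPDATE dbo.MyTable
    SET ModifiedDate = GETDATE()

    SET CONTEXT_INFO 0x00000  -- clear the flag
END
```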

So now this trigger is much safer than the default basic trigger template.
I am handling bulk operations, restricting my operations based on certain fields being updated, and using context info to prevent the trigger from calling itself.

If you have any other tips for making triggers more well-behaved, post them here!

