Monday, January 30, 2023

SQL 2022 TSQL snapshot backups!

 

SQL 2022 now actually supports snapshot backups!  More specifically, T-SQL snapshot backups.

Of course this is hardware-dependent.

Here are my key take-aways:

It no longer needs the VSS service to get it done.

It writes out a metadata file that includes info helpful for restoration.  The file has the snapshot details that allow SQL Server to bring the database back up as if it were a full backup restore.

It means that if the appliance is handling the backup, backups will be orders of magnitude faster, and SQL Server will still acknowledge the backup.  SQL will clear the differential (incremental) flags on the chain.

Backup groups will allow us to back up several databases at once, or the entire server (all databases), snapshotting all the databases at a single point in time.

You should not use differential backups with this feature; take more frequent full backups instead.  The snapshots replace the differential backups.

It can be used with transaction log backups.  So, for instance: the snapshot runs, then 2 transaction log backups run, then corruption strikes.  You can restore the snapshot, then both log backups plus the tail log.
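As a rough sketch of that recovery sequence (database name, file paths, and the appliance-side snapshot revert step are hypothetical), the T-SQL side would look something like:

```sql
-- First, the storage appliance reverts/mounts the hardware snapshot.
-- Then tell SQL Server to treat it as a restored full backup, using
-- the .bkm metadata file, and leave it ready to accept log backups:
RESTORE DATABASE testdb FROM DISK = 'D:\backups\testdb.bkm'
    WITH METADATA_ONLY, NORECOVERY;

-- Apply the two log backups taken after the snapshot, then the tail log
RESTORE LOG testdb FROM DISK = 'D:\backups\testdb_log1.trn' WITH NORECOVERY;
RESTORE LOG testdb FROM DISK = 'D:\backups\testdb_log2.trn' WITH NORECOVERY;
RESTORE LOG testdb FROM DISK = 'D:\backups\testdb_tail.trn' WITH RECOVERY;
```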

Prior versions of SQL Server + snapshots would not really get you to point-in-time recovery from snapshots.  There were a variety of circumstances that could lead to non-recoverable databases.  The addition of the .bkm file in this version is the key.

The key new option is

SUSPEND_FOR_SNAPSHOT_BACKUP = ON

which freezes I/O for the database while the hardware snapshot is taken.  The backup that follows writes out a .bkm file with metadata only (backup metadata).

It is not writing a .bak file.  Compression and encryption are handled by the appliance.
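A minimal sketch of the full sequence (database name and path are hypothetical; the snapshot itself is taken by the appliance, outside of T-SQL):

```sql
-- Freeze write I/O so the appliance can take a consistent snapshot
ALTER DATABASE testdb SET SUSPEND_FOR_SNAPSHOT_BACKUP = ON;

-- ... the storage appliance takes its hardware snapshot here ...

-- Record the backup; this writes only the small .bkm metadata file
-- (no .bak) and also thaws (un-suspends) the database
BACKUP DATABASE testdb
    TO DISK = 'D:\backups\testdb.bkm'
    WITH METADATA_ONLY;
```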

It also works for snapshots on SQL on Linux.  Containerized SQL solutions get this too.

You have to have the right hardware to make this work, such as Dell's solution.



Monday, October 5, 2020

Which RAID should I use for my GP SQL server?

 I get this question from time to time.  Quite a bit of confusion seems to come from the way in which RAID is marketed.  There are also considerations on how a GP install uses SQL Server and the IO load it generates.

Typically, the people setting up RAID arrays are more concerned with capacity versus cost.  However, they like to mix in performance numbers, and this can be misleading.

First things first.  The RAID levels we should be concerned with for SQL Server are:

RAID 0 - No redundancy.  Striping only.

RAID 10 - Mirror + striping.

RAID 6 - Double parity

(RAID 2-5 have been deprecated, and RAID 7 is pricey and overkill for most GP installations)


When asking a vendor about the differences in RAID, the following is a typical table you will see:


As you can see, it shows a modest read penalty with a huge upside in capacity.  Normally they leave off the write gain for RAID 6 (not sure why).  They also typically leave out RAID 0, probably because capacity is 1-to-1 for RAID 0.

This is useful for showing the difference in capacity for the same number of disks.  It is misleading when including the read and write gain.  Those numbers are best compared at similar capacities because the different RAID levels leave a differing number of disks available for IO.

Here is a much better table for comparing performance at the same capacity.
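The same-capacity comparison can also be roughed out in T-SQL.  This is a sketch, not a benchmark: the per-disk figure of 200 IOPS and the standard write penalties (RAID 0 = 1, RAID 10 = 2, RAID 6 = 6) are assumptions, and the disk counts are what each level needs to provide four disks' worth of usable capacity.

```sql
-- Effective read/write IOPS at equal usable capacity (4 disks' worth),
-- assuming 200 IOPS per disk and standard RAID write penalties
SELECT raid_level,
       disks_needed,
       disks_needed * 200                 AS read_iops,
       disks_needed * 200 / write_penalty AS write_iops
FROM (VALUES ('RAID 0',  4, 1),   -- striping only, no redundancy
             ('RAID 10', 8, 2),   -- mirror + stripe: 2x disks, 2 writes each
             ('RAID 6',  6, 6))   -- double parity: n-2 usable, 6 IOs per write
     AS v(raid_level, disks_needed, write_penalty);
```

Note how RAID 10 matches RAID 0 on writes at the same capacity, while RAID 6 falls far behind despite using more disks than RAID 0.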


Now let's talk about how GP uses a SQL Server installation.
GP is an ERP.  Nearly everything the user does in the system requires a write operation to SQL.  There are plenty of reads, but the write speed seems to affect user experience more than read.  Since it is a multi-user system, resource intensive operations from a single user can cause a perceived slow-down to all the other users.  These large operations typically cause heavy use on tempdb and the log file of the company database.  There are also SQL jobs running throughout the work day related to GP that use resources in msdb as well as the DYNAMICS and company databases.  If SSRS is being used, there are additional read operations and heavy tempdb use (for ad-hoc table joins) that factor in.

In my experience, it is best to push the system RAM as high as possible, then move tempdb to a separate SSD, and then upgrade disks to bring write speed up as high as possible (read speed will follow) in that order.

RAID 6 is often pushed as the most economical way to add capacity, but in the GP world, write speed is king.  A GP database that is more than 100GB in size is starting to be crushed under its own weight anyway.  The table and index structure, and the way in which the Dexterity language uses the data, begin causing performance degradation that is increasingly difficult and expensive to overcome.  The organization begins experiencing diminishing returns on further hardware investment.

In this scenario, the organization should stop investing in hardware until they develop and are ready to implement a strategy to reduce the size of the company database, or at least significantly slow its growth.  This can be done by either purging data or archiving it.  If data is archived out of the company database and into another database, you now have 2 smaller databases.  While the overall capacity does not decrease, the options for storing and accessing the data multiply.

GP will perform better when reading and writing data on two 500,000 record tables than on one 1,000,000 record table.  It is even better if the archive database can be moved to a different drive from the production database, since that way the two are not competing over read IO.

In a time when laptop hard drives are measured in terabytes, it often does not make sense to build a huge RAID 6 array to hold GP SQL data.  This is data that needs to be separated from the rest of the organization's data, and its maintenance and backup procedures need to be handled differently.

So my answer to the question of which RAID to use for a GP server: all of them.

Best practice is to have at least 5 disks in a GP SQL Server setup.
Disk 1 - OS and SQL binaries
Disk 2 - tempdb
Disk 3 - system and user database data files
Disk 4 - system and user database log files
Disk 5 - backup files

Depending on your disk sizes and fault tolerance needs:
Disk 1 - RAID 10
Disk 2 - no RAID - single SSD drive
Disk 3 - RAID 10
Disk 4 - RAID 10
Disk 5 - RAID 6 (or RAID 0 with frequent cloud backups)

Let's talk about these choices:

Disk 1
The OS and SQL binaries do not take enough space to warrant the capacity advantages of RAID 6, and do not need fast write speed.

Disk 2
There is no reason to need redundancy on the tempdb drive.  tempdb rebuilds itself every time SQL is restarted.  It is volatile, "in-flight" data, such as large data aggregations for reports or month-end routines.  It does not need to be backed up, but it does need the fastest disk IO possible.  That is why I recommend it be placed on a separate SSD.

Disk 3
The data files need fast read and write speed.  If there is a large amount of data, it may seem worth the lower cost to sacrifice performance and go with RAID 6.  However, if you find yourself considering that, first find out why there is so much data. (More on that in a later post)

Disk 4
The log files need fast write speed.  The log file disk does not need to be as large as the data file disk.

Disk 5
The backup drive needs lots of capacity but not necessarily fast disk IO.  RAID 6 is a good fit for this drive.  Depending on the procedures in place for copying and retaining off-site backups, you may be able to forgo fault tolerance altogether and cut cost by using RAID 0.

Disk 6?
If you have an archive database, it may be beneficial to add Disk 6 to remove the capacity and IO from the production disks.  In this scenario, I would recommend placing both the data and log files for that database on Disk 6 and using RAID 0 or no RAID.  This database will only periodically experience writes and should be backed up immediately after each archiving session, eliminating the need for fault tolerance.



Here are some useful links to help you learn how to calculate IO and capacity for RAID.







Wednesday, September 16, 2020

How should I configure my disks for my new GP/SQL installation?

 It has been a while since I posted anything on disk setup for SQL Server and since technology marches on, below is my recommendation for a Great Plains/SQL installation.

Important considerations here are:

  • GP databases are categorized as high-IO databases.
  • GP databases are mission critical and depend on high reliability storage.

SQL Setup Footprint

  • SQL Server has .mdf files, which are the main data files, and .ldf files, which are the transaction log files. These files should be sized with at least 10% empty space to prevent auto-growth.
  • SQL Server has 4 system databases that are required for any SQL instance to work
    • Master
      • Holds the information that is global to the SQL instance.
      • This is not a high use database and should not grow much over time.
      • Typical size is less than 100MB.
      • Best practice is to place its files in the same location as the user database files, but it installs on the OS drive by default and would need to be moved immediately after the SQL installation completes.
      • These files should be backed up monthly or any time major changes are made, such as adding or removing databases.
    • Model
      • This is a template that is used for creating user databases.
      • Configuring it to properly reflect your most common desired settings can save much configuration time of databases later.
      • Typical size is less than 10MB.
      • Best practice is to place its files in the same location as the user database files, but it installs on the OS drive by default and would need to be moved immediately after the SQL installation completes.
      • These files should be backed up monthly.
    • Msdb
      • Holds the objects used by maintenance plans and SQL Agent objects (such as jobs and dbmail requests).
      • Typical size is under 2GB.
      • It will grow progressively larger due to job history records if maintenance is not regularly performed to clear those records. If this database gets too large, the SQL agent jobs may start failing due to poor performance.
      • Best practice is to place its files in the same location as the user database files, but it installs on the OS drive by default and would need to be moved immediately after the SQL installation completes.
    • Tempdb
      • This is used by the SQL instance to process data reads and writes.
      • Depending on the size of the databases being queried and the size of the data sets being returned, this database can grow quite large. The total size of all files for tempdb in a typical GP instance is around 10GB, but month end processes can cause it to temporarily grow up to 100GB on a large system.
      • The files for tempdb are deleted and rebuilt each time the SQL Services are restarted, so there is no need to back them up.
      • It should be configured with the same number of files as there are active processors available for the SQL service.
      • Because of its high IO load and its large effect on overall system performance, it is highly recommended that the tempdb files (data and log) be stored on a dedicated SSD drive.
      • To discourage storage of other files on the drive and to prevent the need for file growth, best practice is to pre-size tempdb so that it takes up 80% of the drive space by default.
  • A GP/Signature install will contain the following user databases 
    • DYNAMICS
      • This is the main database for the GP instance.
      • Typical size is around 2GB on a large system and it does not typically grow very fast.
      • Should be stored on user database and user log drives.
    • Company database(s)
      • There will be one database for each GP company. It will have a maximum of 5 characters in the name.
      • Initial size with no testing data is around 2GB. Once the size hits 100GB, it is considered a large company database and steps should be taken to put procedures in place to archive or purge data regularly for performance.
      • Some of these databases will have a high IO load during peak usage hours and would benefit from being stored on a dedicated drive, otherwise they should be stored on user database and user log drives.
    • RESCOXRM_CONFIG
      • Holds configuration data for Mobiletech.
      • Not a high use database.
      • Typical size is less than 10MB
      • Should be stored on user database and user log drive
    • {company}_RESCOXRM
      • There will be one database for each company that MobileTech is installed on.
      • Initial size is less than 100MB. Size will vary based on the amount of transactional data being exchanged with the mobile devices, but should not exceed the size of the company database.
      • Should be stored on user database and user log drive
    • ReportServer and ReportServerTempDB
      • Holds SSRS reports and cached report data
      • Typical size is less than 1GB.
      • This database is high use for read operations.
      • If possible, Report Server should be installed on a completely separate SQL instance with separate storage. Otherwise, the files should be stored on the user database and user log drive.
    • Other user databases
      • Should be stored on the user data and user log drives.
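As one hedged example of the tempdb guidance above (the T: drive letter, the file sizes, and a four-scheduler instance are all assumptions for illustration), the pre-sizing might look like:

```sql
-- Move and pre-size tempdb on a dedicated SSD (T:), one data file per
-- scheduler, sized up front so the files never need to auto-grow
ALTER DATABASE tempdb MODIFY FILE
    (NAME = tempdev,  FILENAME = 'T:\tempdb\tempdb.mdf',  SIZE = 20GB);
ALTER DATABASE tempdb ADD FILE
    (NAME = tempdev2, FILENAME = 'T:\tempdb\tempdb2.ndf', SIZE = 20GB);
ALTER DATABASE tempdb ADD FILE
    (NAME = tempdev3, FILENAME = 'T:\tempdb\tempdb3.ndf', SIZE = 20GB);
ALTER DATABASE tempdb ADD FILE
    (NAME = tempdev4, FILENAME = 'T:\tempdb\tempdb4.ndf', SIZE = 20GB);
ALTER DATABASE tempdb MODIFY FILE
    (NAME = templog,  FILENAME = 'T:\tempdb\templog.ldf', SIZE = 8GB);
-- Restart the SQL Server service for the new file locations to take effect
```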
SQL Backups

  • Whether Maintenance plans or SQL Jobs are used, a SQL instance will not run for very long without regular maintenance.
  • Backups are a part of that maintenance. There are routines that are run that allow the log file to write transactional data to the data file and clear out old transactional data that will only run when a backup is taken of the database.
  • SQL needs a drive to write the backup files to.
  • Best practice is to store these on a separate drive and to regularly make a copy of the backups and move off-site.
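A minimal sketch of such a backup routine, writing to the dedicated backup drive (the company database name TWO, the X: drive, and the file names are hypothetical). The log backup is what allows SQL to reuse log space, per the point above:

```sql
-- Nightly full backup of a company database to the backup drive
BACKUP DATABASE TWO
    TO DISK = 'X:\SQLBackups\TWO_full.bak'
    WITH INIT, COMPRESSION, CHECKSUM;

-- Periodic log backup (requires the full recovery model); taking it
-- lets the log truncate and keeps the .ldf from growing unchecked
BACKUP LOG TWO
    TO DISK = 'X:\SQLBackups\TWO_log.trn'
    WITH INIT, COMPRESSION, CHECKSUM;
```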
Drive setup

Drives should be sized to at least 25% larger than the initial size of the data. 
  • Drive 1
    • OS
    • SQL binaries
    • GP Test client
  • Drive 2
    • SSD if possible and physically located inside the server
    • Data and log files for Temp DB
  • Drive 3
    • User database data files
    • System database data files
  • Drive 4
    • User database log files
    • System database log files
  • Drive 5
    • Backup files

Types of disks

  • HDD
    • Spinning hard drive.
    • Today's 15k HDD drives can transfer data at around 130MB/s
    • A drive in the drive setup represents a separate physical HDD if that is what is being used.
  • SSD
    • Solid State Drive
    • A drive in the drive setup above represents a separate physical SSD if that is what is being used.
  • SAN
    • While the speed of the drives in the SAN can affect the performance of SQL Server, the bigger bottleneck is usually the Fibre Channel connection between the server and the SAN.
    • Specs aside, a 4Gb Fibre Channel will perform in a real production environment about the same as a high speed enterprise local HDD. Adding more HBAs may help, depending on your SAN brand.
    • SAN monitoring software can help diagnose pathing issues, but different SAN vendors have different definitions for active/active, so you need to make sure you do some research into how the arrays must be set up and where the SQL files must be stored to take advantage of it.
    • Most virtual environments cannot truly take advantage of multipathing, which can severely limit SQL performance.
    • A drive in the drive setup above represents a separate LUN, preferably each LUN representing a drive group or RAID array.
  • RAID
    • May use SSD or HDD
    • If RAID is used locally, RAID 10 is recommended with a separate array for each drive in the drive setup.
    • RAID recommendations when part of a SAN generally follow the SAN vendor's recommendations.


Monday, April 27, 2020

SQL Change Data Capture vs Change tracking

If you have ever installed Management Reporter on Dynamics GP, you know that it increases SQL resource usage and can create performance issues on your SQL server.  This is in large part due to its heavy use of SQL Change Tracking.

Change Tracking (CT) was introduced in SQL 2008.
It uses a synchronous process.  That means it runs in-line with your data operation, adding time to your save, update, insert, or delete operation.  This translates directly into reduced performance for the user in Dynamics GP.  It does not use much disk space, since it is only capturing which rows have changed and the type of change, not the data that changed.  Since there is no point in tracking changes if you don't act on them, this creates the same overall background hit to performance that CDC does, in addition to the front end in-line performance hit. 
For example, let's say I have an application that is interested in changes to the GL10000 records. 

    • I turn on CT for GL10000.  
    • A user creates a GL transaction.  Their save operation takes slightly longer while CT records the fact that they inserted a record.  
    • My application gets this notification and then separately queries the GL10000 table to get the data I am interested in from the new record, which uses overall resources from the SQL server, as well as hitting the GL10000 table directly.
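The CT setup described above can be sketched like this (the database name and the key column are assumptions for illustration; GL10000's real key may be composite):

```sql
-- Enable change tracking at the database level, then the table level
ALTER DATABASE TWO SET CHANGE_TRACKING = ON
    (CHANGE_RETENTION = 2 DAYS, AUTO_CLEANUP = ON);

ALTER TABLE dbo.GL10000 ENABLE CHANGE_TRACKING;

-- The consumer polls for rows changed since its last sync version,
-- then joins back to GL10000 itself for the data it wants
DECLARE @last_sync bigint = 0;
SELECT ct.SYS_CHANGE_OPERATION, gl.*
FROM CHANGETABLE(CHANGES dbo.GL10000, @last_sync) AS ct
JOIN dbo.GL10000 AS gl
    ON gl.JRNENTRY = ct.JRNENTRY;  -- assumed key column
```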

Change Data Capture (CDC) was also introduced in SQL 2008, but only in Developer and Enterprise editions.  It is now also supported in Standard edition starting with SQL 2016 SP1.
It uses an asynchronous process.  That means it does not directly slow down your data operations.  The asynchronous process does use resources such as processor, memory, and disk IO, so it has an indirect effect on performance for the user.  It also uses more disk space than CT because it is storing a default of 3 days of historical data for the table being tracked in order to capture the actual data that was changed.
For example, let's say I have the same application that is interested in changes to the GL10000 records. 

    • I turn on CDC for GL10000.  This creates a shadow table of GL10000 that will hold historical data.  
    • A user creates a GL transaction.  There is no performance hit to their save operation.
    • CDC's asynchronous process reads the database log file and sees the insert operation.  It saves the data from the record to the shadow table, which uses overall SQL resources but does not hit the GL10000 table directly.
    • My application reads the shadow table on a predefined interval to get the data I am interested in, which uses overall SQL resources but does not hit the GL10000 table directly.
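The equivalent CDC setup is a pair of documented system procedures (the database name is hypothetical; the change-table read at the end is the interval poll from the last bullet):

```sql
USE TWO;
-- Enable CDC for the database, then for the table; this creates the
-- shadow (change) table and the capture/cleanup agent jobs
EXEC sys.sp_cdc_enable_db;
EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'GL10000',
    @role_name     = NULL;

-- The application reads the shadow table on a predefined interval,
-- never touching GL10000 directly
DECLARE @from binary(10) = sys.fn_cdc_get_min_lsn('dbo_GL10000');
DECLARE @to   binary(10) = sys.fn_cdc_get_max_lsn();
SELECT *
FROM cdc.fn_cdc_get_all_changes_dbo_GL10000(@from, @to, N'all');
```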

Both of these solutions will perform similarly in a controlled test environment with a small data change rate.  The difference really becomes apparent as the system scales up.  My assumption is that this is why CDC was only included in Enterprise edition early on.  

In my opinion, it is much easier to address overall performance of the SQL instance by adjusting resources, than to try to address the performance degradation of the individual user operations.  So, where possible, I would recommend using CDC rather than CT for tracking changes in a Dynamics GP database.  Hopefully, Management Reporter will make this adjustment soon!


Friday, January 10, 2020

Cloning SQL Servers


It is common for our customers to clone virtual servers.  Sometimes they even clone SQL servers.  We even tell them to do this sometimes, such as when we want to create an identical test environment.

It is a huge time saver, but when cloning a SQL server, there are a few additional things we need to watch out for.  If the customer IT person is not a DBA, they will most likely have no clue that these things need to be watched for.

*** If the SQL server they are cloning has drive mappings or any type of alias to the storage location where SQL data, log, or backup files reside, these may need to be re-mapped on the clone.  Before using a cloned SQL server, verify the drives you see are not really the production drives. 

The easiest way to check is to launch the cloned server and save an empty text file to each storage location, then launch the production server and see if you can see the files.  If you can, STOP!  It means the new SQL server may try to modify production data.  The SQL services on the cloned server should be stopped immediately until the problem can be addressed.  Each server should have completely physically isolated storage.

*** If the SQL server has maintenance plans:  Maintenance plans create jobs and have a hardcoded connection string to the jobs they create.  On the cloned server, these connection strings will point to the production server.  This may cause SQL jobs to fire twice.  This will be noticeable for backup plans because there will be 2 nightly backup files generated, or job failures due to conflicts.  This will also degrade performance during the day on the production server. 

The correct way to prevent this starts BEFORE the cloning.  Record the steps in the maintenance plans on the production server, then disable all the plans and jobs on the production server before cloning it.  Re-enable them after cloning is complete.  Then on the clone server, delete the maintenance plans and re-create them.  Check all the other jobs to make sure their connection (bottom left of the job properties window) is correct before re-enabling them.
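One related sanity check on the clone: a cloned instance typically still reports the production server's name from @@SERVERNAME, which is worth fixing before trusting any jobs on it.  A hedged sketch (the old and new server names are placeholders):

```sql
-- On the clone: does the instance still think it is the production box?
SELECT @@SERVERNAME                  AS name_sql_believes,
       SERVERPROPERTY('MachineName') AS actual_machine_name;

-- If they differ, correct the instance's server name, then restart
-- the SQL Server service for the change to take effect
EXEC sp_dropserver 'OLDPRODNAME';
EXEC sp_addserver 'NEWCLONENAME', 'local';
```

This does not replace recreating the maintenance plans; their connection strings are stored inside the plans themselves and still need to be rebuilt as described above.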


Wednesday, October 9, 2019

Margins, who needs them?

Who needs margins?
The short answer... students and teachers.

They need them to have room for writing comments when grading essays and such.  But does that really mean the rest of the world must suffer with a default margin?



I personally don't think that page margins have any place in the modern business world.  

Sure, back in the '80s we needed them for dot matrix printers (didn't want to print over the track holes or page perforations!) and for leaving room for binding all that dusty paper together before stacking it in filing rooms to serve as a fire hazard, but never to be read again.  

However, in the modern world, there is little reason to actually print much on paper.  Most reports, for instance, are printed to a PDF file and emailed or otherwise stored electronically (never to be read again :) ).   Modern printers can print right to the edge of the page, so page margins have become more for aesthetics than function. 

So why does pretty much every application out there that designs things for printing (e.g. Office, SSRS, etc.) still assume we need a default margin?  Why not let the default be zero (or some tiny number like .01in) and let the person entering the content add a margin if desired?

If someone knows the answer, I would enjoy your comment.



Wednesday, June 5, 2019

dex ini switches

I am reposting this information from https://community.dynamics.com/gp/b/gplesliev/archive/2014/02/20/dex-ini-switches-my-complete-list
in order to keep from losing it.  All credit goes to Leslie Vail.


.ini setting / Description
ADCProcessor=TRUE
You will see this line in the Dex.ini file if you are using Manufacturing and have the checkbox marked in ADC Preferences.
AdvLookups=FALSE
New Users created will NOT be granted access to the Alternate lookup windows in the SmartList dictionary and will instead be assigned the old ‘green bar’ lookup windows.
AllowBCPTest=FALSE
Prevents utilities from running the BCP (bulk copy program) test.
AllowLongTableNames=TRUE
Will allow you to create long table names in Dexterity. Otherwise, you are limited to eight characters. This is set in the Dexterity options menu.
AllowWrongDex=TRUE
Will allow Dynamics GP to launch with mismatched versions of Dexterity. DANGER DANGER: this should only be used for troubleshooting, not as a way of getting around the error. Never use this in a production environment.
ApplicationName=name
Changes the name the runtime engine displays when it is launched. Without this setting, the name "Dexterity Runtime" is displayed. These days, the window doesn't stay visible very long, so its utility is questionable.
AutoDisplayUpdate=TRUE
Automatically redisplays the process monitor queue.
AutoInstallChunks=TRUE
Causes Dynamics to automatically include the *.cnk file without displaying the 'Add New Code' dialog during launch. This could save you some time if you need to roll out changes to several workstations.
BTInterface=NoLoad
This applied to the old Btrieve file handler (PSQL 2000) as to whether the interface would load when Dynamics was launched.
Buildphantom=TRUE
Allows creation of a Manufacturing Order for a Finished Good Phantom Item.
BuildSQLMessages=TRUE
This one will copy the Dexterity messages to a SQL table on next login and then set itself back to FALSE.  Once in a SQL table, the messages can be used in stored procedures. The table is DYNAMICS.dbo.MESSAGES.
C:\DPS1\DEX.INI DPSInstance=1 
C:\DPS2\DEX.INI DPSInstance=2 
C:\DPS3\DEX.INI DPSInstance=3
These allow multiple process servers on a single machine, with one line in each instance's Dex.ini.

CompilerWarningLevel=2
This setting will disable the warnings for all scripts when you are working in Dexterity. Although this option prevents warnings about literal strings, you shouldn't be using literal strings anyway, because it is not considered a best practice. Use pragma(disable warning LiteralStringUsed); instead of this setting.

DebugFonts=TRUE
This setting causes Dexterity to generate a trace file named "debuglog.txt". This file lists the fonts that it considered and why particular fonts were chosen or rejected.

DebugRW=XXX

This is a long one! Where XXX equals the sum of the values you want to trace from below. 
Value = Name Description 
1 = QueryOK Specifies if the report will use a single query or not 
2 = Sanscript Logs the run report statement as the Report Writer sees it 
4 = RW Query Logs all API calls from RW to the Data Manager 
8 = RW Setup If used with RW Run, logs all data returned by Data Manager 
16 = RW Steps Logs internal RW steps in processing the report 
32 = RW Run Logs all RW runtime calls to the Data Manager 
64 = DM SQL Logs internal Data Manager structures and SQL Generation 
256 = RW Frames Logs the beginning of each report frame 
512 = Tab Delimited Logs output as tab delimited output 
Output will appear in a log filed named DebugRW.txt next to the application dictionary. 
Example: If you want to log if a report is using a query and the SanScript and SQL code used, then add the following line to your Dex.ini file: 
DebugRW=67 ; 1 + 2 + 64 
Use 895 if you want to track everything. 

To help in troubleshooting problems related to the generation of reports, before printing a report you may choose to mark the Multi-Login Report option on the Report Definition window. This will force the system to perform individual table operations instead of creating a query. If you have the SQLLogSQLStmt=TRUE setting included in your DEX.INI, the individual select statements are then included in your DEXSQL.LOG file and can be analyzed to uncover any potential problems.
DebugUnknownFile=TRUE
Returns a Btrieve or Ctree error code to help track problems with table errors back in the day of Btrieve and Ctree. (version 7.5 and previous)

DevAssistHide=TRUE

Prevents the Developer’s Assistant window from opening when you go into Debug mode in Dexterity

DexHelpPath=pathname

Path to the Dexterity help files.

Dictionary Version=10.00.1061

The current version of the Dynamics.dic file.

DISALLOWSNLOTAOF=TRUE

Prevents users from being able to "create" new lot numbers or serial numbers while transferring an item from one site to another. Version 10 now has a setup option that disallows this, but the box is not automatically checked. GP is good about not changing the old functionality (really, I mean this) so be sure to check your setup screens after each update or service pack for new setup choices – watch for those new checkboxes!

DisplayTableErrors=TRUE, ALL or OPEN
When the Dexterity Database Management Subsystem encounters an unknown table error this will display the message "Unknown Error = number", where number indicates the error that occurred. The ID that's displayed can be used to determine the cause of the error. 
TRUE – Displays only unknown errors. 
ALL – Displays all table errors except the two most common: “duplicate” and “not found”. 
OPEN - Displays all table errors for an open operation.
DOT4_001winspool=500;750
Horizontal and vertical adjustments to printed documents. DOT4, in this case, is the port the printer is attached to: Printspool=horizontal;vertical.
DPSInstance=1
Tells the runtime engine which instance of the Dynamics Process Server to use.
duUseResourceDlls=FALSE 
duExtractSqlResources=TRUE
Extracts all SQL objects (tables, views, stored procedures, etc.) when an installation / upgrade of GP is done. These details can be found in the SQL->Install subfolder inside GP Application Folder. Open GP Utilities, login and you will notice the SQL Resource extraction occurring
DynHelpPath=pathname
Path to Dynamics help files
EmailStmtStatusPath=c:\documents and settings\user_name\desktop
The path for the delivery of e-mail statements

EmpLookup=2

Causes the Employee Lookup to default to Last Name

EmpLookup=3

Causes the Employee Lookup to default to First Name

EmpLookup=4

Causes the Employee Lookup to default to Social Security Number

EnablePerUserIni=TRUE
TRUE will enable the dex.ini per user function 
FALSE is the default because it is not included as a default
ExplorerFormatCurrency=FALSE
Exports foreign currency (i.e. £) to Excel from SmartLists as numbers instead of text.
ExportOneLineBody=TRUE
All fields and text entered in the body of the report will export to a single line when the report is printed to a file. Using this setting allows you to enter many fields vertically instead of having to enter them horizontally in the report layout. This was what we had to do back in the pre-SQL days. If you don't exactly have your arms around the SmartList Export Solutions, it's still a nifty way to 'export' Excel formulas. Be sure to change your Excel formula options to R1C1 reference style and you won't need to keep track of any Excel templates. My clients enjoy just being able to run a Report Writer report to get all of the formulas they need. That, and I already had it written so it was easier.
 
EXTPRICINGQTYFREEISADDER=TRUE
The EXTPRICINGQTYFREEISADDER option is included in the extended pricing component to let you type a negative value in the Maximum Quantity Free box. The extended pricing component uses a negative value in this box to let the Maximum Quantity Free field behave in an additive manner instead of as a multiplier. Refer to KB article 910125.
FAGroupPath=Pathname\filename 
FAPhysicalInventoryPath=Pathname\filename 
FAPhysInvInfoPath= Pathname\filename
FAAssetImportPath= Pathname\filename
FAAssetIDExportPath=Pathname\filename 
FAAssetLabelExportPath=Pathname\filename 
FAMacroPath= Pathname\filename 
FASampleDataPath= Pathname\filename
A group of settings that determine the import/export file names for user preferences in Fixed assets. Setup>>Fixed Assets>>User Preferences


dex.ini switch / description

FHCheckRanges =TRUE
This setting specifies whether to log the instances when Dexterity decides which type of range to use for tables that use the SQL database type. The log will list all instances when Dexterity detected a range that was not "well-behaved" and whether Dexterity used an exclusive or inclusive range. The FHRANGE.LOG text file will be generated in the same location as Dexterity or the runtime engine.

FileHandler=SQL

This used to be the DatabaseType and was used to determine in part what kind of tables to create when Ctree and Btrieve (PSQL 2000) were supported.

FLEXICODER_UPDATE_SEGMENTS=FALSE

If using the Flexicoder product from eOne Solutions, this determines whether the segment table is updated.

FormDictionaryPath=pathname 
ReportDictionaryPath=pathname

These are new for version 10 and welcomed with open arms! These lines tell dexterity where the forms/reports should default to when installing a new .cnk file. Switch it to a shared location for a shared forms/reports.dic and any new dictionary applications that are installed will default to that location instead of the client installation folder. The default location is to store Forms and Reports dictionaries in the Data folder of the GP folder.

HideInvisibleLayoutFields=FALSE

Remembers the setting for the Modifier Layout window as to whether invisible window fields will be displayed.

IdleLoopValue= number

A numeric value that represents a setting to control how background tasks interface with the CPU.

IdleLoopValue=0

Voluntarily give up the CPU when sitting idle and allow background processes to execute only as often as the OS issues a timer event. Under this setting, the runtime asks to be notified every 50 milliseconds or 20 times a second. Using this setting will cause some processes to be slower than in previous releases, but it will not allow the CPU to be pegged at 100% utilization.

IdleLoopValue=-1

Use as much time as possible to execute background processes. Under this scenario, the CPU would never voluntarily be given up to other applications when sitting idle. This is the default setting and would cause the runtime to behave the same way as it has in previous versions of the Dexterity runtime.

IdleLoopValue=some positive number [maximum 500]

This value is interpreted as the number of times to process a part of a background task before voluntarily giving up the CPU. This can somewhat control the amount of CPU utilization taken up by background tasks. A typical range for the IdleLoopValue when setting it to a positive number would be between 5 and 10. Generally, you would never go above 200, even though the maximum is 500.

Initial=FALSE

This line is associated with the automatic creation of palettes; it was used only in Release 1 of Dynamics. It defaults to FALSE, and you can delete it without ill effect.

IsWebClient=TRUE

The setting can be set to TRUE or FALSE to override the Runtime_GetClientType() function library command.

LastTaxCodeUpdate=02/28/2008

The date of the last payroll tax code update, informational only

LastYearEndUpdate=11/14/2007

The date of the last payroll year-end update, informational only.

LayoutLayout1=162,20,833,616 
LayoutToolbox=1,1,156,382 
LayoutPropertiesSheetTab=1 
LayoutPropertiesSheet=940,63,270,349 
LayoutPropertiesSheetVisible=FALSE 
LayoutLayout2=169,28,615,391
 

Used to save the adjusted window size of the Layout window within Report Writer.

Letters Directory=c:\Program Files\Microsoft Dynamics\GP\Data\Letters\

Sets the path to the Letters parent folder that contains the letter templates used with the Letter Writing Assistant.

ListsFastExcelExport=TRUE

An unsupported switch that speeds up exports for all of the navigation lists that show in the navigation pane. This switch is similar to the SmartlistEnhancedExcelExport=TRUE switch, except the latter applies only to SmartList objects.

MagnifyScreenOutput=100

When a report is printed to the screen, this is the magnification setting.

MainProduct=TRUE

Used if you are using Dexterity to develop a stand-alone application.

MaxSWScrollbarSize=XXX

To allow the scrollbars to get bigger than 25 lines.

MFGRollupPhantomLabor=TRUE

To roll costs on Phantom Bills of Material in MFG.

MinPMCheck=50.00

Used in conjunction with the free PSTL tool to set the minimum payables check amount. This applies to the Select Checks window; it doesn’t limit the check if you use Edit Check Batch or some other window to print your checks. No check under the amount specified will be created. In this example the minimum check would be $50.00.

NextEntryID=
NextGroupID=

When a call background statement is executed in sanScript, Dex creates an entry for that script in the background script queue and assigns it an EntryID. Similarly, when Dex executes the begingroup statement in sanScript, it creates a group item, assigns it a GroupID, and adds it to the background or remote queue. As Dex is running, those IDs are assigned sequentially. So the first call background will be assigned an EntryID of 1, the next would have an EntryID of 2, etc… The same is true for GroupID’s and begingroup. When Dynamics or Dexterity is closed, it writes the current value for those IDs to the dex.ini file. When Dynamics or Dexterity starts up, it reads those items and starts with those values for future background scripts or groups.

NoPrintDialogs=TRUE

Print Dialog boxes will not appear.

NoteWindow=37,37,450,299

Used to save the adjusted window size of the Record Notes Window.

OldRelationshipWindow=TRUE

Just kidding, this one doesn't work anymore. For the oldies among us, remember when Report Writer would let you create invalid relationships? Like you could build a relationship between a string field and an integer field. Hey, sometimes you needed to! This switch allowed the old Relationship Definition window to work again. This switch was used prior to version 8.

OLE_Application1=C:\RepairCBDChain.exe

If you run Microsoft Dynamics GP 10.0 in a Citrix environment, the copy (CTRL+C) and paste (CTRL+V) functionality does not work in Microsoft Dynamics GP. You can no longer copy or paste between a local application and a session or between different applications in a session. See KB 958404 for the complete explanation of using this switch.

OLEPath=pathname

Used to indicate where OLE objects that are attached to Dynamics record notes should be stored. Thanks to Matthew, I have a new solution to this one. Apparently, you CAN use the UNC path so long as you have the folders pre-created for each company. Read explanation below: 
You only need one line in the dex.ini to accommodate multiple GP companies. If you have three companies in GP (CMP1-CMP3) the line in the dex.ini would still be, for example, 
OLEPath=\\ServerName\GPShare\OLE\ 
For this path to work with the three companies, you would need to manually add the company directories under the OLE directory on the share. For example, the folders would look like this: 
GPShare\OLE\CMP1\OLENotes 
GPShare\OLE\CMP2\OLENotes 
GPShare\OLE\CMP3\OLENotes 
If you're using a local drive or a mapped drive instead of UNC for the OLEPath, this directory structure is automatically created for you by GP. 
If you don't have the folder structure set up first, UNC will not work. IP addresses won't work either. You will need to use a mapped drive for the system to create them on the fly.
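The pre-creation step above can be scripted. Here is a minimal Python sketch; the company IDs follow the CMP1 through CMP3 example, and a relative folder stands in for the share path, so substitute your own values:

```python
import os

# Pre-create the per-company OLENotes folders that GP expects under the
# OLEPath root. A relative folder is used here for illustration; in
# production this would be the UNC share path from the dex.ini OLEPath line.
ole_root = os.path.join("GPShare", "OLE")
companies = ["CMP1", "CMP2", "CMP3"]  # your GP company database IDs

for company in companies:
    # exist_ok=True makes the script safe to re-run
    os.makedirs(os.path.join(ole_root, company, "OLENotes"), exist_ok=True)
```

Run it once against the share before pointing OLEPath at the UNC location, since GP will not create these folders on the fly for a UNC path.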
 

OLEPathHR=pathname/

Used to indicate where OLE objects attached to notes in the HR module are stored. This must be a mapped drive, UNC will not work.

PAPRINTFEESBEFORECC=TRUE

Will cause fees to print before the cost categories on a Project Accounting invoice

Pathname=DYNAMICS/dbo/

Location of the Pathnames table (SY02100). When Ctree and Btrieve (PSQL 2000) were supported this line contained a path with either a drive mapping or UNC path to the Dynamics data directory. Today this setting points to the database containing the SY02100 table.

POReturnsMod_ReturnedQTYCheck=TRUE

Enables additional shipment matching functionality in the POret.cnk file. This additional functionality is on the Match Shipments to Invoice Window (opened from the Purchasing Invoice Entry window). The Match Shipments to Invoice Window checks the previously returned quantity to ensure that the quantity returned against a shipment line is not greater than the quantity originally shipped.

PrintReportDefinitionOptions=2

Indicates which of the six checkboxes are checked in Report Writer when the ‘Print Definitions’ button is selected. The value is the sum of each checkbox that is checked. The six checkboxes and their values include: 
1 = General Information 
2 = Calculated Fields 
4 = Table Relationships 
8 = Sort Segments 
16 = Restrictions 
32 = Layout Information 
If only General Information and Sort Segments should be printed the value here would be 9.
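Since the value is just a bit mask, you can compute it by summing the flags for the boxes you want checked. A quick sketch (the flag values come straight from the list above; the helper name is mine):

```python
# Checkbox values for PrintReportDefinitionOptions, from the list above.
FLAGS = {
    "General Information": 1,
    "Calculated Fields": 2,
    "Table Relationships": 4,
    "Sort Segments": 8,
    "Restrictions": 16,
    "Layout Information": 32,
}

def options_value(*checked):
    """Return the dex.ini value for the given checked checkboxes."""
    return sum(FLAGS[name] for name in checked)

print(options_value("General Information", "Sort Segments"))  # 9
print(options_value(*FLAGS))  # 63, all six boxes checked
```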
 

QueueMoreInfo=TRUE

Adds a More Info button to the Process Monitor window; selecting it displays a window with additional information about the queued process.

ReportDictionaryPath=pathname 
FormDictionaryPath=pathname

These lines tell Dexterity where the forms and reports dictionaries should default to when a new dictionary is installed (say I gave you a new customization). Switch them to a shared location for a shared forms/reports.dic and everything will default there in the future. The default location is to store Forms and Reports dictionaries in the Data folder of the GP folder.

ReportLayout=158,155,814,617

Used to save the adjusted window size of the Layout window within Report Writer.

ReportViewMode=1


REVALUEINDETAIL=TRUE

This gives you the ability to have cost adjustment entries post in detail to the general ledger. The default behavior is to create a single summarized journal entry. The April hotfix also includes the ability to get the Purchase Receipt Update Detail (PRUD) report to print in detail. If the ‘Print’ check box is marked for the Cost Variance Journal in the Posting Setup window for the Sales, Inventory and Purchasing series, then this report is also printed whenever there is a cost adjustment and there are general ledger transactions that need to be posted. By default, this report prints in summary. To force this report to print in detail, there are two modified reports available at the Great Plains Reports Library located at: https://mbs.microsoft.com/customersource/support/downloads/reportslibrary/gp_reportslibrary.htm that can be downloaded and added to your reports dictionary. After that is completed, you must also grant access to the modified reports. After those reports and the dex.ini entry are in place, these reports will provide additional detail supporting the cost changes that are occurring within the Inventory Control module. The reports will provide: 
• outflow (sales or inventory) document number 
• Item number affected 
• quantity of the item on the outflow transaction 
• old unit cost of the item on the outflow transaction 
• new unit cost of the item on the outflow transaction 
• calculated difference between the two costs 
• extended amount (quantity * cost difference) 
The PRUD report is a “one time” report that is produced at the time of posting and can’t be regenerated so it is essential that this report be printed to paper or file anytime it is produced.
 

RuntimeAsserts =TRUE

Forces the runtime engine to display a dialog box for any assertion that fails.

RwFontChangeSizing=False


RWLayoutPropertiesSheet=822,25,270,349

Used to save the coordinates of the Properties window within Report Writer.

RWLayoutPropertiesSheetTab=2

Used to save which tab is selected in the Properties window within Report Writer

RWLayoutPropertiesSheetVisible=TRUE

Used to indicate if the Properties window within Report Writer is visible.

SAMPLEDATEMMDDYYYY=00000000

To prevent the dialog box from being displayed and to use the current system date

SAMPLEDATEMMDDYYYY=MMDDYYYY

To prevent the dialog box from being displayed and to use a user-specified date

SAMPLEDATEMSG=FALSE

To prevent the dialog box from being displayed while leaving the date as April 12, 2017.

ScriptDebugger=TRUE

Turn Script Debugger Features on in Runtime Mode.

ScriptDebuggerProduct=

Specify Product ID of product to be debugged to allow Open Script.

ScriptEditorFontName = FontName

Entering a font name here will cause Dexterity to use that font in the Dexterity Script Editor or Script Debugger window. Without the switch you can only choose Courier, Helvetica, Times or System.

ScriptEditorFontStyle=Bold,Italic,Underline (use whichever ones you like)

This switch allows you to set the Bold, Underline, or Italic setting of the font used in the Script Editor.  
While the Script Editor window will respect the settings for the ScriptEditorFontName and use that font, the Dexterity Options window DOES NOT know about the change.  If it finds a font in the Dex.ini it doesn't know about, it will switch your Dex.ini to use the default Courier font.  So if you need to go into the Options window you'll probably have to change your Dex.ini again to reflect the font you'd like to use.  This is the price of using this particular undocumented feature. 

ScriptLogEnhanced=TRUE

Enabling this feature will add a Timestamp to the beginning of each line in the Script Log and will also flag scripts running in the background with a “B”. 
Note: This setting is only supported on version 10.00 Service Pack 4 or later.
A semicolon (;) placed in front of the ST_MachineID= setting

Turns off Named Printers for that workstation

ShowAdvancedMacroMenu=TRUE

Display additional menu under Tools >Macro. 
When you get that message that says your macro failed on line 65,423 you can use this to open your macro and jump right to the line of failure for analysis. Being a frequent user of the 'Mail Merge' macro, I use this frequently to find out what went wrong in the middle of my macro. Another cool thing about this is that it lets you start a macro at any line, no more macro surgery. Also, there's no separate .cnk to install. A simple .ini switch does the trick!

ShowAllMenuItems=TRUE

Shows all menu items even if the user does not have access to them. This doesn’t change the user’s access to the windows; it just will display the menu items even if they have been disabled.

ShowReportName=FALSE

Keeps the Screen Output window from displaying the current report's display name.

ShowResIDs=TRUE

This line will allow Dexterity to show the Resource ID of the current script in the script editor as well as showing the column ID & Res Type in the Dexterity Resource Explorer
SkipVersionChecks=TRUE
 

To disable version compatibility checking. DANGER

SmartlistEnhancedExcelExport=TRUE

Uses a different method to export data to Excel. It’s much faster, but certain things do not export correctly, some foreign currency symbols for one. Also it strips off leading zeros. Test it first before deploying.
SmartlistMatchCasePrefs=TRUE

This change to the Dex.ini file enables the following two options for the SmartList security settings: 
– Mark the ‘Match Case’ checkbox by default when performing searches. 
– Disable the ‘Match Case’ checkbox on the SmartList Search window. 

SQLCursorBlockSize=100 (or remove this setting)

To prevent a loop from stopping at 25 records when looping through a range of records with Dexterity.

SQLDataSource = data_source_name

This setting causes the specified data source name to appear in the Data Source field in Dexterity's SQL Login window. A user may choose another value from the drop-down list, should he or she want to connect to a different data source. The SQLDataSource setting won’t affect a login window you create unless you use the Defaults_Read() function and the set statement to make this setting’s value appear as the default value for the data source field.

SQLDropTableOnDelete=TRUE

Causes the table to be dropped (removing the data AND the table structure from the database) when the delete table statement is used with a SQL table. Unless Table_SetCreateMode() is set to true, the dropped tables won't be created automatically the next time a user tries to access them. DANGER

SQLDynamicScrollJump = number

number = The maximum number of rows a cursor will move when a user moves the scroll box, the default is 100.

SQLLastDataSource= data_source_name

The name of the most recently accessed data source. That value will then appear as the default in the predefined Login window the next time that window is used on the current computer.

SQLLastUser = user_ID

The user ID of the most recent user to connect to a data source using the current computer.

SQLLogAllODBCMessages=TRUE

This setting does nothing. The only setting you need for ODBC logging is SQLLogODBCMessages=TRUE (see below). This is a left over setting that isn’t used anymore.

SQLLoginCompatibilityMode=TRUE

In previous versions, SQL logins were tried with the new 9.0 encryption, the pre-9.0 encryption, plain-text login, and then the new encryption model. This would give the user 4 login attempts if they accidentally typed in the wrong password. With release 10, only the current encryption will be tried. If the user needs to use the “old” methods (for example, just logging in after an update from 8.0, or if the password was changed via SSMS to plain text), adding the dex.ini switch SQLLoginCompatibilityMode=TRUE will revert to the previous behavior.

SQLLoginTimeout = number_of_seconds

Limits the length of time (in seconds) your application will wait for a data source login attempt to be completed. A login doesn’t have to be successful to be completed; rather, an attempt to log in must be made and a status code defining the result of that attempt must be returned. 
The default is 15 seconds if this entry is not included in the dex.ini. To cause your application to wait indefinitely for a status code to be returned, set this value to 0 (zero).

SQLLogODBCMessages=TRUE

Log ODBC messages returned from SQL Server to DEX.SQL file.

SQLLogPath=path

This setting allows you to specify the location of the DEXSQL.LOG file created when either or both the SQLLogODBCMessages or the SQLLogSQLStmt defaults file settings are active.

SQLLogSQLStmt=TRUE

Log SQL statements being sent to SQL Server to DEX.SQL file

SQLNumLoginTries = number

The maximum number of login attempts allowed before GP exits. The default is three. Use negative one (-1) for an unlimited number of tries.

SQLPassword = password

This setting allows you to define the default password that will appear in the password field when logging into a data source using Dexterity’s predefined SQL Login window. This setting won't affect a login window you create for use with your application unless you use the Defaults_Read() function and the set statement to make this setting's value appear as the default value.

SQLProcsTimeout=0

Including this setting in your DEX.INI file will control the period of time Dexterity will wait for a SQL stored procedure to execute. When set to 0 it will wait indefinitely. The valid range of values for this setting is 0 to 9999. If this setting is not included, the default value is 300 seconds.

SQLQueryTimeout=0

Including this setting in your DEX.INI file will control the period of time Dexterity will wait for a SQL query to execute. When set to 0 it will wait indefinitely. The valid range of values for this setting is 0 to 9999. If this setting is not included, the default value is 300 seconds.

SQLRptsTimeout=0

Including this setting in your DEX.INI file will control the period of time Dexterity will wait for a report to generate. When set to 0 it will wait indefinitely. The valid range of values for this setting is 0 to 9999. If this setting is not included, the default value is 300 seconds.

SQLUser = user_ID

This setting allows you to define a specific user ID as the default value to use in the User ID field of Dexterity’s predefined SQL Login window. The SQLUser setting won't affect a login window you create for use with your application unless you use the Defaults_Read() function and the set statement to make this setting's value appear as the default value for your own user ID field.

Removing all of the ST_ lines that appear in the dex.ini file

Stops named printers from being used on a workstation.

ST_Debug=LOG

Creates a log file in the local code folder named ST_DEBUG.LOG that can be used to troubleshoot Named Printers when the program does not use the printer that you specified in the Named Printers Options dialog box. Microsoft Dynamics GP may use the wrong printer even though you specifically changed the printer in the Named Printers Options dialog box. In v 10 this log file is created in the \Data subfolder inside the local code folder.

ST_MachineID=machine_id

The machine ID for named printers.

ST_SetDefault = FALSE

Turns off the setting of default printers in named printers

ST_SetDefault=TRUE

Sets the default machine ID for named printers.

ST_SOP_Shortcut=4105,0,0,0

Stores settings for the SOP QuickPrint Feature (which is part of Named Printers)

StdInternationalInfo=TRUE

This setting allows the operating system settings for date, time, and currency formatting to be overridden. 
TRUE = Use Dexterity standard formats for date, time, and currency values mm/dd/yy. 
FALSE = Use control panel settings for date, time, and currency values

StdInternationalInfoDateCentury=

This setting allows control over the year portion of the date values used by StdInternationalInfo 
TRUE = Four digits will be used for the year portion of date values 
FALSE = Two digits will be used for the year portion of date values

StdInternationalInfoDateZeroPrefix

This setting allows control over the date format for StdInternationalInfo 
TRUE = Single-digit day and month values will be prefixed with a 0 (zero) 
FALSE = Single-digit day and month values will not be prefixed
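To make the combined effect of these two switches concrete, here is a small Python sketch that approximates the resulting mm/dd/yy formatting (the function is illustrative, not part of GP):

```python
from datetime import date

def gp_date(d, century=False, zero_prefix=True):
    """Approximate the StdInternationalInfoDateCentury /
    StdInternationalInfoDateZeroPrefix effect on an mm/dd/yy date."""
    # Century switch: four-digit vs two-digit year
    year = f"{d.year:04d}" if century else f"{d.year % 100:02d}"
    # Zero-prefix switch: pad single-digit month/day with a leading zero
    fmt = "{:02d}/{:02d}/{}" if zero_prefix else "{}/{}/{}"
    return fmt.format(d.month, d.day, year)

d = date(2017, 4, 12)
print(gp_date(d))                     # 04/12/17
print(gp_date(d, century=True))       # 04/12/2017
print(gp_date(d, zero_prefix=False))  # 4/12/17
```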

SuppressChangeDateDialog=True

Use this to prevent the dialog from opening asking you if you want to change the date when it hits midnight. The problem was that the dialog would open and crash any integrations you were running overnight. Be careful with this one, not only does the dialog not display, but the computer date doesn’t change either – so be sure to watch this if running overnight integrations.

SuppressSound=TRUE

Stop Dexterity and Dynamics from generating any sounds. This is popular if you don't want the whole office to hear the computer 'beeping' when you make mistakes. It was also popular back in the 'Welcome to Dynamics' .wav file days.

Synchronize=FALSE

Indicates whether Dynamics Utilities has synchronized the Dynamics.dic to the account framework that is defined in Dynamics Utilities. FALSE signifies that the Dynamics.dic has been synchronized. If a user needs to resynchronize the dictionary, this line should be modified to equal TRUE and then launch Dynamics Utilities.

taShipmentNotificationAllowVoids = true

If this switch exists in the dex.ini and is set to true, invoices will be able to be voided, even if they have Purchase Order Commitments.

taShipmentNotificationAlwaysPost = true

If this switch exists in the dex.ini and is set to true, invoices will always be allowed to post, even if the Shipped flag is not marked.

taShipmentNotificationAlwaysTransfer = true

If this switch exists in the dex.ini and is set to true, orders will always be allowed to be transferred to invoices, even if the Shipped flag in the taShipmentNotification table is not marked.
The following additional window is available if you do not want to use taShipmentNotificationAlwaysTransfer=true. If you want a user other than ‘sa’ to have access, you must set up security. The following example is one way to set up security:
Go to Shortcuts and click Add >> Add Window. Click Shipment Notification >> Purchasing >> Shipment Notification, click Add, and then click Done.
 
You can now use this window to specifically select which documents can transfer without doing a Purchase Invoice.



Tolerance=0

The Tolerance command will ignore the font defaults and force the system to use the font utilized in Great Plains.

Tolerance=-1

The Tolerance command will ignore the font defaults and force the system to use a true type font.

Tolerance=100

The Tolerance command will ignore the font defaults and force the system to use the default font set in Great Plains.

TPELogging=TRUE

This is related to the new Word Template AddIn 
Used to gather information when an exception has occurred; it points to the place in the combine process to go in and debug the TPE.
The log is created in the %temp% directory where the actual Word document is rendered and its content looks something like this: 
10 hours. 57 minutes.13 seconds. 901 milliseconds. :20050: Combine.Unhandled 
TPE was indicating that there was an issue with an exception not being handled when the XML data from Report Writer was being combined with the actual Word template.
This allowed Rob and his team to start tracking down the issue with my template from the source code and provide me with an alternate route to solve my problem. 
As a final note, it seems there is an issue as well with TPE not clearing some process after choosing to remove the exception from the Report Processing Status window. In turn, this prevents shutting down GP via the Exit option, having to use the Windows Task Manager to force the termination of the DYNAMICS.EXE process. Rob's team is on this too.

 

TXFileDate=31 Dec 2006

Indicates the date of the last payroll tax table update, informational only. Only available on the machine on which the tax update was performed.

UpdateLogin=https://mbsupdates.microsoft.com/taxupdate/login.aspx

The entry that should be in the .ini file so the payroll tax updates will connect correctly.

UseUniqueHelp=FALSE

This setting allows you to override the default value for the UseUniqueHelp property for fields when implementing context-sensitive online help. FALSE indicates that the UseUniqueHelp property will default to False for new fields.

WDC_DEBUG=LOG
Debug messages to Log file
WDC_DEBUG=TRUE
Debug messages to Screen

This will create a WDC_SDBG.log file in your local code folder to use when debugging Advanced Security issues.

WDC_SecurityDisableWarning=TRUE

This keeps GP from asking if you want to open advanced security instead of regular security when you select regular security from System setup

WindowGridHeight = height in pixels

This setting allows you to define the height in pixels of a grid block in the Layout window.

WindowGridWidth = width in pixels

This setting allows you to define the width in pixels of a grid block in the Layout window.

WindowHeight=n

The height of the Main Window in Dynamics if WindowMax=FALSE

WindowMax=TRUE

This parameter records the state of the Main Window in Dynamics. If the setting is TRUE, the window is full screen. If the setting is FALSE, adjustments have been made and additional Window parameters will be listed below the WindowMax setting.

WindowPosX=n

The X position of the Main Window in Dynamics if WindowMax=FALSE

WindowPosY=n

The Y position of the Main Window in Dynamics if WindowMax=FALSE

WindowWidth=n

The width of the Main Window in Dynamics if WindowMax=FALSE

Word Macro File=c:\Program Files\Microsoft Dynamics\GP\Data\Letters\WordIntegration.dot

Sets the path to the WordIntegration.dot document template that is used with the Letter Writing Assistant.

Workstation=location_translation_ID

Location translation ID that is found in the Location Translation table. This line would be modified if a user has created an additional location translation in the Location Translation Setup window. This line will default as equal to WINDOWS.

Workstation2 = dictionary_location_ID

The dictionary_location_ID indicates a set of dictionaries in the Dynamics.set file that you want a given workstation to use. Each set determines the location of the main dictionary and its forms and reports dictionaries.

WysiwygLayout=TRUE

Saves the setting in Modifier as to whether the field names are displayed. TRUE means the field names are not displayed. FALSE means the field names are displayed.

ZoomFontColor=Green

Saves settings in the Display area of the User Preference settings.

ZoomFontUnderline=True

Saves settings in the Display area of the User Preference settings.
