I'm running into more clients with tables that contain more than 2B records. The problem is that Datagarage.dbo.dgTargetSourceTableRowCount stores the record count for a table in an INT (limited to 2,147,483,647), so the column needs to change to a BIGINT. The stored procedure [DataGarage].[dbo].[webService_CompletedTableDownload] also needs to use COUNT_BIG(*) instead of COUNT(*), along with any other place a table count is stored as an INT rather than a BIGINT...and this only fixes 2/3 of the problem.
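A minimal sketch of both fixes, assuming the count column is named [RowCount] (the actual column name isn't stated above, so treat it as a placeholder):

-- Illustrative only; [RowCount] and dbo.SomeTargetTable are made-up names.
-- Widen the stored count from INT to BIGINT:
ALTER TABLE DataGarage.dbo.dgTargetSourceTableRowCount
    ALTER COLUMN [RowCount] BIGINT NOT NULL;

-- Inside [DataGarage].[dbo].[webService_CompletedTableDownload], swap
-- COUNT(*) (returns INT, overflows past 2,147,483,647) for COUNT_BIG(*)
-- (returns BIGINT):
SELECT COUNT_BIG(*) AS TotalRows
FROM dbo.SomeTargetTable;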
(Related ask: while we are updating these tables, can we also move the datetime columns to datetime2? The conversions are implicit, so the impact should be minimal, but the tables will then be on Microsoft's recommended data type and will support finer granularity when it's needed.)
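A rough illustration of that change (the column name here is hypothetical):

-- Illustrative only; CompletedDate is a made-up column name.
-- datetime converts implicitly to datetime2, and datetime2(7) supports
-- 100ns precision versus datetime's ~3.33ms.
ALTER TABLE DataGarage.dbo.dgTargetSourceTableRowCount
    ALTER COLUMN CompletedDate DATETIME2(7) NOT NULL;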
The bigger issue is that the record count is unqualified (SELECT COUNT(*)), so it has to do a full scan of every record in the table...which sucks for performance. On a high-performing SQL Server I cancelled a count of 3B records after it had been running for 9 minutes...I don't have a good suggestion on how to fix that one...
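For reference, and not something proposed above: one commonly used workaround is to read an approximate row count from SQL Server's partition metadata instead of scanning the table. It returns almost instantly even on multi-billion-row tables, at the cost of not being transactionally exact:

-- Commonly cited workaround (not from the original ask): sum row_count
-- from sys.dm_db_partition_stats rather than scanning the table. The
-- count can lag in-flight transactions slightly.
SELECT SUM(ps.row_count) AS ApproxRows
FROM sys.dm_db_partition_stats AS ps
WHERE ps.object_id = OBJECT_ID('dbo.SomeTargetTable')  -- placeholder name
  AND ps.index_id IN (0, 1);  -- heap (0) or clustered index (1) only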
Jake - thanks for raising this idea. As we deal with larger datasets, we need to be able to effectively handle record counts in collect. I'll add this into the DSP backlog and align to a future DSP release.