Remove Leading Zeros in a Varchar Column but Keep Internal Spaces


 create table test (col varchar(100))
insert into test values('00876H873 '),('00876 876500'),('0000HUJ8 9IU877 8UJH')

select * ,   LTRIM(Replace(REPLACE(LTRIM(REPLACE(Replace(col, ' ','*'),'0', ' ')), ' ', '0'),'*',' ')) AS NumSerie from test

drop table test
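The nested REPLACE calls swap the real spaces for a placeholder ('*'), turn the zeros into spaces so the inner LTRIM can strip the ones that used to be leading zeros, convert the surviving spaces back into zeros, and finally restore the original spaces from the placeholder. For the sample rows the query returns:

/*
col                     NumSerie
00876H873               876H873
00876 876500            876 876500
0000HUJ8 9IU877 8UJH    HUJ8 9IU877 8UJH
*/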

--https://social.msdn.microsoft.com/Forums/en-US/7ece2a11-efac-45b6-97af-af1873788bb0/tsql-remove-leading-zeroes-in-a-string?forum=transactsql

 

Get File Information and Save to a Table with PowerShell


The function below reads the header row of a comma-delimited file, builds a matching table on the server (every column as VARCHAR(100)), and then calls a stored procedure (stp_CommaBulkInsert, defined further down) to bulk insert the file into that table.

Function AutoImportCommaFlatFiles($location, $file, $extension, $server, $database)
{
    $full = $location + $file + $extension
    ## Read the header row and turn it into a column list for CREATE TABLE (every column VARCHAR(100))
    $all = Get-Content $full
    $columns = $all[0]
    $columns = $columns.Replace(" ","")
    $columns = $columns.Replace(",","] VARCHAR(100), [")
    $table = "if object_id('" + $file + "','U') is not null drop table " + $file + " CREATE TABLE " + $file + "([" + $columns + "] VARCHAR(100))"
    $connection = New-Object System.Data.SqlClient.SqlConnection
    $buildTable = New-Object System.Data.SqlClient.SqlCommand
    $insertData = New-Object System.Data.SqlClient.SqlCommand
    $connection.ConnectionString = "Data Source=" + $server + ";Database=" + $database + ";integrated security=true"
    $buildTable.CommandText = $table
    $buildTable.Connection = $connection
    ## Added to function: load the file through the stp_CommaBulkInsert stored procedure
    $x = 0
    $insertData.CommandText = "EXECUTE stp_CommaBulkInsert @1,@2"
    $insertData.Parameters.Add("@1", $full)
    $insertData.Parameters.Add("@2", $file)
    $insertData.Connection = $connection
    ## Build (or rebuild) the destination table first
    $connection.Open()
    $buildTable.ExecuteNonQuery()
    $connection.Close()
    ## Added to function
    $x = 1
    if ($x -eq 1)   ## -eq, not "=": "=" would assign rather than compare
    {
        $connection.Open()
        $insertData.ExecuteNonQuery()
        $connection.Close()
    }
}




## Export file info to a csv file
$myexport = "c:\temp\mypdffiles.csv"
$mypath   = "\\myshare\ftp.mycompany.net\users\me"

## Copy the files from the share to a local folder and list them
$mytarget = "C:\temp\test"
Copy-Item $mypath $mytarget -Recurse
Get-ChildItem -Path $mytarget

## Collect the PDF file information and write it out as an unquoted CSV
Get-ChildItem -Path $mypath -Filter *.pdf -Recurse -File -EA 0 |
    Select-Object DirectoryName, Name, LastWriteTime, Length |
    ConvertTo-Csv -NoTypeInformation |
    ForEach-Object { $_ -replace '"', '' } |
    Out-File $myexport

### Bulk insert the csv file into a table
AutoImportCommaFlatFiles -location "c:\temp\" -file "mypdffiles" -extension ".csv" -server "MC\MSSQL2014" -database "mydb1" | Out-Null
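
The exported mypdffiles.csv that feeds AutoImportCommaFlatFiles looks roughly like the sketch below. The header row is what the function turns into the column list; the data rows here are placeholders, not real files:

DirectoryName,Name,LastWriteTime,Length
\\myshare\ftp.mycompany.net\users\me\folder1,invoice1.pdf,6/30/2016 10:15:00 AM,204800
\\myshare\ftp.mycompany.net\users\me\folder2,invoice2.pdf,7/1/2016 8:02:00 AM,51200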



 
  
 
  CREATE PROCEDURE stp_CommaBulkInsert
@file NVARCHAR(250), @table NVARCHAR(250)
AS
BEGIN
 DECLARE @f NVARCHAR(250), @t NVARCHAR(250), @s NVARCHAR(MAX)
 SET @f = @file
 SET @t = @table
 
 SET @s = N'BULK INSERT ' + @t + '
  FROM ''' + @f + '''
  WITH (
   FIELDTERMINATOR = '',''
   ,ROWTERMINATOR = ''0x0a''
   ,FIRSTROW=2
  )'
 
 EXEC sp_executesql @s
END
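
The stored procedure can also be tested on its own with the same file and table names used above. Note that BULK INSERT resolves the path on the SQL Server machine, so c:\temp\mypdffiles.csv must be readable from the server's point of view:

EXEC stp_CommaBulkInsert @file = N'c:\temp\mypdffiles.csv', @table = N'mypdffiles';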
 
--Check data from the target table
Select * from mypdffiles


 

Code from https://www.mssqltips.com/sqlservertip/3208/automating-flat-file-sql-server-imports-with-powershell/


Format Date Column in Excel From SQL Server Export File

When I export data from SSMS to an Excel file, a datetime column comes across as text in the format yyyy-mm-dd 00:00:00, for example “2016-06-30 00:00:00”.
Highlighting the column and formatting the cells as Date does not work.

A workaround:
Step 1: Highlight the column in question and click the Data tab >> Text to Columns. With Delimited selected (the default), click Next; keep the default delimiter (Tab) and click Next; under Column data format choose Date with the format YMD, then click Finish. The values in the column now appear in m/d/yyyy 0.00 format, for example “6/30/2016 0.00”.

Step 2: At this point Excel's normal cell formatting works. Highlight the column, right-click >> Format Cells, and on the Number tab choose Date and pick a format from the Type list (I chose m/dd/yyyy for my date column). Click OK and save the Excel file.


Dense_Rank with Data Island

The first ROW_NUMBER() difference (grp) identifies each consecutive run (island) of the same Amount; the second difference (grp2) renumbers those islands so they sort in StartOfWeek order, and DENSE_RANK() over grp2 then assigns one rank per island.

 CREATE TABLE test (StartOfWeek DATETIME,Amount INT)
INSERT  INTO test(StartOfWeek, Amount) VALUES  ('1/17/2016',8),('1/24/2016',8),('1/31/2016',10),('2/7/2016',10),('2/14/2016',14),('2/21/2016',10),('2/28/2016',10)

;with mycte as (
select StartOfWeek, Amount  
, ROW_NUMBER()OVER (ORDER BY StartOfWeek) - ROW_NUMBER()OVER (Partition by Amount ORDER BY StartOfWeek) grp from test)

,mycte2 as (
Select * 
 ,row_number() Over(Order by StartOfWeek) - row_number() Over(Partition by grp Order by StartOfWeek) grp2 FROM mycte
)
SELECT StartOfWeek, Amount , DENSE_RANK() OVER(ORDER BY grp2) AS Ranking  FROM  mycte2
 
 

 drop table test

/*
2016-01-17 00:00:00.000	8	1
2016-01-24 00:00:00.000	8	1
2016-01-31 00:00:00.000	10	2
2016-02-07 00:00:00.000	10	2
2016-02-14 00:00:00.000	14	3
2016-02-21 00:00:00.000	10	4
2016-02-28 00:00:00.000	10	4

*/
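To see why the second ROW_NUMBER() difference is needed, here are the intermediate values behind that output (grp from mycte, grp2 from mycte2). The last island of 10s gets grp = 3, which is smaller than the grp = 4 of the single 14 row, so ranking directly on grp would put the islands out of order; grp2 restores the StartOfWeek order:

/*
StartOfWeek               Amount  grp  grp2  Ranking
2016-01-17 00:00:00.000   8       0    0     1
2016-01-24 00:00:00.000   8       0    0     1
2016-01-31 00:00:00.000   10      2    2     2
2016-02-07 00:00:00.000   10      2    2     2
2016-02-14 00:00:00.000   14      4    4     3
2016-02-21 00:00:00.000   10      3    5     4
2016-02-28 00:00:00.000   10      3    5     4
*/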
-----https://social.msdn.microsoft.com/Forums/en-US/8c24f1a9-f267-4627-969c-5a30d78846a4/row-number?forum=transactsql
 

Default Value of a Parameter on an SSRS Report Will Not Be Updated After Deployment

 

When you modify the default value of a parameter for a report that has already been deployed, the value will not be updated on the report server. This applies to all SSRS versions and is by design.

To pick up the change, delete the report through Report Manager on the server and redeploy it; the default value of the parameter will then be updated.


Check Agent Job Status From Different Servers with PowerShell

 

Rob Sewell has a handy script to retrieve agent job running status from multiple servers.

You can find the script on his blog:

How I Check Hundreds of SQL Agent Jobs in 60 Seconds with Powershell

 

You just need to update the file location and the server list (a text file with one server per line) and run the script from a PowerShell window. An Excel file will open and fill in to show you the progress as the checks run.
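
For context, what the script collects from each server boils down to a job-status query against msdb. A minimal T-SQL sketch of that per-server check (my own illustration, not Rob Sewell's code) could look like this:

-- Latest outcome of every agent job on one server (run against each server in the list)
SELECT  j.name      AS JobName,
        j.enabled   AS IsEnabled,
        h.run_status,        -- 0 = Failed, 1 = Succeeded, 2 = Retry, 3 = Canceled
        h.run_date,
        h.run_time
FROM    msdb.dbo.sysjobs AS j
OUTER APPLY (SELECT TOP (1) run_status, run_date, run_time
             FROM  msdb.dbo.sysjobhistory
             WHERE job_id = j.job_id
               AND step_id = 0               -- the job outcome row
             ORDER BY run_date DESC, run_time DESC) AS h
ORDER BY j.name;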

If you want to run this script as an agent job, you may need to set the proper permissions so that the account PowerShell runs under from the agent job can access the Excel COM object.

Here is how to change the Identity to The interactive user:
Administrative Tools >> Component Services > Computers > My Computer > DCOM Config > Microsoft Excel Application (right-click) > Properties > Identity > select The interactive user (the default is The launching user) > click OK.

 


Dynamic Change Matrix (Unpivot and Pivot)

  

  CREATE TABLE inputs ([SampleNo] VARCHAR (50) NULL,[Col1] VARCHAR (50) NULL,[Col2] VARCHAR (50) NULL,[Col3] VARCHAR (50) NULL,[Col4] VARCHAR (50) NULL,[Col6] VARCHAR (50) NULL,[Col7] VARCHAR (50) NULL); 

INSERT INTO inputs ([SampleNo], [Col1], [Col2], [Col3], [Col4], [Col6], [Col7]) VALUES
('Sample 1', 'x.xx', 'xx.xx', 'x.xx', 'xx.xx', 'x.xx', 'xx.xx')
,('Sample 2', 'x.xx', 'xx.xx', 'x.xx', 'xx.xx', 'x.xx', 'xx.xx')

--select  * from inputs

Declare @sqlUnpivot as NVarchar(4000)
Declare @ColsUnpivot as NVarchar(4000)

Declare @sqlPivot as NVarchar(4000)
Declare @ColsPivot as NVarchar(4000)

Set @ColsUnpivot=null

Select @ColsUnpivot =  COALESCE(@ColsUnpivot + ', ', '') + '('+QUOTENAME(COLUMN_NAME,'''')+',' + QUOTENAME(COLUMN_NAME)+')'
FROM   [INFORMATION_SCHEMA].[COLUMNS]
WHERE TABLE_NAME='inputs' and COLUMN_NAME Like 'Col%'
   --print @ColsUnpivot

Set @ColsPivot=null
Set @ColsPivot = STUFF((Select DISTINCT ', ' + quotename(SampleNo ,']') FROM inputs FOR XML PATH (''),type).value('.','varchar(max)'), 1, 2, '')
   --print @ColsPivot
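/* For the sample table, the two generated lists come out as follows
   (column order may vary without an ORDER BY on INFORMATION_SCHEMA.COLUMNS):
   @ColsUnpivot = ('Col1',[Col1]), ('Col2',[Col2]), ('Col3',[Col3]), ('Col4',[Col4]), ('Col6',[Col6]), ('Col7',[Col7])
   @ColsPivot   = [Sample 1], [Sample 2]
*/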

--- drop temp table if it exists
 if object_id('tempMatrix','U') is not null
 drop table tempMatrix

 ----Unpivot query and load the result into a temp table

Set @sqlUnpivot='Select [SampleNo],Cols,Vals

into tempMatrix
FROM inputs t
CROSS APPLY (Values ' + @ColsUnpivot + ' )  d(Cols,Vals) '

--Print @sqlUnpivot;
exec (@sqlUnpivot)

Set @sqlPivot = 'Select * from tempMatrix Pivot (max(vals) for SampleNo in ('+ @ColsPivot + ')) pvt'

--Print @sqlPivot;
exec (@sqlPivot)

drop table inputs
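
For the two sample rows, the pivoted result turns the Col* columns into rows and the SampleNo values into columns, roughly (row order may vary):

/*
Cols   Sample 1   Sample 2
Col1   x.xx       x.xx
Col2   xx.xx      xx.xx
Col3   x.xx       x.xx
Col4   xx.xx      xx.xx
Col6   x.xx       x.xx
Col7   xx.xx      xx.xx
*/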

 
