FTP task and duplicate files


I'm trying to figure out the last piece of a long puzzle and have run into a wall. I have an FTP task that copies files from a source location to a local server. The files are then processed and entered into a database. I need a script or some method of preventing the same files from being copied again and causing duplicate entries in the database. I have the ability to move the local files into another directory, but I cannot move the files on the source machine. Is there a way to configure it so that the local files are moved to a directory once they are entered into the database, and then, every time the project runs, the files in the source directory are compared against the local files so that files which already exist are not copied?

For the first part, you should move the processed files to an archive folder.
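
A minimal sketch of that step, assuming C# in a Script Task and placeholder folders C:\Incoming (where the FTP task drops files) and C:\Archive; a File System Task inside a Foreach Loop container would work just as well:

using System.IO;

// Move every processed file out of the working folder so it is not picked up again.
string localDir   = @"C:\Incoming";
string archiveDir = @"C:\Archive";

Directory.CreateDirectory(archiveDir);                 // no-op if it already exists
foreach (string file in Directory.GetFiles(localDir))
{
    string target = Path.Combine(archiveDir, Path.GetFileName(file));
    if (File.Exists(target))
        File.Delete(target);                           // File.Move throws if the target exists
    File.Move(file, target);
}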

Next, you need to use a Script Task to call the FTP service.
Before FTPing the files, check both the source location and the archive location to verify that a file is not a duplicate, as shown in the first sketch below.
Another option you have in the Script Task is to check the last-modified time of the files; if it is greater than the last execution time, you can pick up those files with no need for a name-by-name comparison (see the second sketch below).
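
Here is a rough sketch of that Script Task in C#. The FTP connection manager name ("FTP"), the remote folder "/source", and the local folders are placeholders you would need to adapt:

// C# Script Task; the using directives go at the top of ScriptMain.cs.
using System;
using System.IO;
using System.Linq;
using Microsoft.SqlServer.Dts.Runtime;

public void Main()
{
    string localDir   = @"C:\Incoming";    // files still waiting to be processed
    string archiveDir = @"C:\Archive";     // files already processed and loaded

    // Wrap the package's FTP connection manager.
    FtpClientConnection ftp = new FtpClientConnection(
        Dts.Connections["FTP"].AcquireConnection(null));

    try
    {
        ftp.Connect();
        ftp.SetWorkingDirectory("/source");

        // List everything currently sitting on the source server.
        string[] folderNames, fileNames;
        ftp.GetListing(out folderNames, out fileNames);

        // Keep only files that are neither waiting locally nor already archived.
        string[] newFiles = fileNames
            .Where(f =>
            {
                string name = Path.GetFileName(f);
                return !File.Exists(Path.Combine(localDir, name)) &&
                       !File.Exists(Path.Combine(archiveDir, name));
            })
            .ToArray();

        // Download only the new files (overwrite = true, binary transfer).
        if (newFiles.Length > 0)
            ftp.ReceiveFiles(newFiles, localDir, true, false);

        Dts.TaskResult = (int)ScriptResults.Success;
    }
    catch (Exception ex)
    {
        Dts.Events.FireError(0, "FTP duplicate check", ex.Message, string.Empty, 0);
        Dts.TaskResult = (int)ScriptResults.Failure;
    }
    finally
    {
        ftp.Close();
    }
}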
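
For the modified-time option, the FtpClientConnection listing does not return timestamps as far as I know, so one way (an assumption on my part, not the only approach) is to query each file with System.Net.FtpWebRequest and compare against a DateTime package variable such as User::LastRunTime; the server address and credentials below are placeholders:

using System;
using System.Net;

// Returns true when the remote file was modified after the last package execution.
private bool IsNewFile(string fileName, DateTime lastRunTime)
{
    FtpWebRequest request = (FtpWebRequest)WebRequest.Create(
        "ftp://ftp.example.com/source/" + fileName);
    request.Method = WebRequestMethods.Ftp.GetDateTimestamp;
    request.Credentials = new NetworkCredential("user", "password");

    using (FtpWebResponse response = (FtpWebResponse)request.GetResponse())
    {
        return response.LastModified > lastRunTime;    // remote modification time
    }
}

// In Main(), filter the remote listing with it:
//   DateTime lastRun = (DateTime)Dts.Variables["User::LastRunTime"].Value;
//   string[] newFiles = fileNames.Where(f => IsNewFile(f, lastRun)).ToArray();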

For FTPing via a script, check this link:
http://social.msdn.microsoft.com/forums/en-us/sqlintegrationservices/thread/2b7cf6b5-d16d-44cd-954d-68e9ba98ee91/

The SSIS FTP Task cannot achieve this directly; it first copies the files locally, and you would then need to check the local folder, which is time consuming.
Hope this helps!! - Sudeep | Please mark the post(s) as “Answered” if they answer your query.

