Hello -
I am working with SSIS in VS 2005, trying to convert all my DTS packages... Basically, every package extracts some information from a database and loads the results into a spreadsheet.
To start, I am trying a TOP 1 query that returns a string from the db... The first row of the spreadsheet has the column names, but mysteriously the package writes the expected result in the third row instead of the second one. The second row remains blank, and if I preview the destination spreadsheet within the package I see a NULL value in the second row and then the string I was expecting in the third row.
Tried the following with no success:
In Regedit.exe, under HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Jet\4.0\Engines\Excel, I set TypeGuessRows=8, ImportMixedTypes=Text, AppendBlankRows=0, FirstRowHasNames=Yes.
Any help would be really appreciated.
I noticed some people are viewing this topic but there have been no answers so far... Should I add more information so that maybe you can help?
Thank you.
|||I just tried it and it works for me. Header in row 1 and data starts and ends in row 2.
Please post more info.
|||I have not seen that behavior either. Perhaps it is something in the source table that is causing it. Have you tried deleting the Excel file and the connection manager and recreating them?|||I am still having issues... I've created a new package using table pubs.dbo.jobs. I added a "data flow task" to the "control flow" pane. After that, I went to the "data flow" pane and added an "OLE DB source" with the SQL query "select top 1 job_id from pubs.dbo.jobs", which returns 1.
I also added an "Excel Destination" which points to a spreadsheet with a column header called "ID". All cells in this xls are defined as numbers... After that, I connected both components in the "data flow" pane and ran the package (in the connection managers I have the OLE DB and the Excel connections).
SSIS succeeded and loaded 1 into the destination spreadsheet, in the third row instead of the second one. When I go to the "Excel Destination", hit Edit and do a preview, I see ID in the first row (which is the column name), NULL in the second row (unexpected) and 1 in the third row (the result of the SQL query).
Please, advise. Thanks in advance.
|||I followed the same steps you described and everything looks right. So, how are you creating the Excel file? I created a connection manager to a non-existent Excel file; then in the Excel destination component I used the New button for the 'name of the excel sheet' to create it as suggested by SSIS. Perhaps you are pointing to an existing file that has something in it that causes the behavior you see.|||
Rafael -
This does solve the problem. I do use an existing spreadsheet, so I believe something was going wrong when matching the data types (SQL query vs. existing destination sheet).
However, I really need to use this existing spreadsheet... This is because when my package is executed, the destination file is overwritten by a copy of the empty template file, which has the column headers only. So the template file should be used... This is the way I always did it with DTS packages, and I'm wondering if you have another way to make the destination file always start appending at the second row (otherwise the file will keep growing).
BTW, these data type conversion "issues" in SSIS are a pain... I just don't understand why Microsoft eliminated implicit conversions (we have to convert explicitly in the SQL query or add a Data Conversion transform task, mostly when we deal with strings - more to be done in addition to all that we have on our plate already). I miss DTS packages...
Please reply with any comment on how to deal with the problem I have using an existing Excel sheet. When I create a new one as suggested by SSIS, it does solve the NULL issue, but I need to use a template file to ensure I always start in the second row.
Thanks for your help.
|||
Gabriel Souza wrote:
BTW, these data type conversion "issues" in SSIS are a pain... I just don't understand why Microsoft eliminated implicit conversions (we have to convert explicitly in the SQL query or add a Data Conversion transform task, mostly when we deal with strings - more to be done in addition to all that we have on our plate already). I miss DTS packages...
They may create a little more work upfront, but I'll readily trade that for not having to deal with implicit conversions. I've seen too many tools that use implicit conversions guess wrong about the datatypes, and you get no errors, no warnings. I've seen input files get columns re-ordered, but due to implicit conversions, the import process didn't throw any errors and the wrong data was imported for two weeks before a user questioned the values they were getting.
An explicit conversion is a lot safer for a production quality application.
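If it helps, the explicit conversion doesn't have to be a separate transform; it can live in the source query itself. Here's a rough sketch against the pubs table used earlier in this thread (the nvarchar(255) length is only an assumption, chosen to line up with the Unicode string columns an Excel destination expects):

    -- Cast in the source query so the OLE DB source already delivers a
    -- Unicode string (DT_WSTR) column; no Data Conversion transform needed.
    -- nvarchar(255) is just an assumed length; size it to your own data.
    SELECT TOP 1
        CAST(job_id AS nvarchar(255)) AS job_id
    FROM pubs.dbo.jobs;

With the cast in the query, the column reaches the Excel destination already as a Unicode string, which is what the Excel provider wants for text columns.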
|||If you use an existing Excel file, the Excel destination will APPEND to the file. So if it detects that the original file has 3 rows, it will write from the 4th row.
Even if rows 2 and 3 look empty to you in Excel, they may have been used previously and may have formatting, comments, etc. Excel will not see these rows as empty.
One good way to check is to open the file and press Ctrl+Home and then Ctrl+Shift+End. If rows 2 and 3 are selected, that means Excel detected three rows.
To solve this problem, select from A2 with Ctrl+Shift+End and delete the cells (shift cells up), then save the file. Then check again to see whether rows 2 and 3 are still selected.
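If you prefer to check from the database side what the Jet provider actually sees in the sheet (the SSIS preview uses the same provider), you can query it directly from Management Studio. This is only a sketch: the file path and sheet name are placeholders, and 'Ad Hoc Distributed Queries' must be enabled on the server.

    -- Returns every row the Jet/Excel driver considers "used" in the sheet.
    -- Rows that look blank in Excel but still carry formatting should come
    -- back as rows of NULLs here.
    -- The path and sheet name below are placeholders; point them at your file.
    SELECT *
    FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                    'Excel 8.0;HDR=YES;Database=C:\Templates\Destination.xls',
                    'SELECT * FROM [Sheet1$]');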
I had this problem, and it was solved once I edited the Excel file properly.
|||Karfast -
Thanks for the reply, but this does not help... The template file is in the correct format and we can't have manual work here...
The packages are supposed to run daily and they can't keep appending, because the files would grow indefinitely. This is why we use the template file and copy it over the destination file. The file is then empty and the package starts in the second row again (at least this is the way we always did it with DTS).
The problem I have is... When this template file is copied, the NULL issue happens again, and it looks like SSIS gets lost with the data types because I did not create the Excel sheet by clicking the "new" button in the "excel destination editor" as Rafael suggested (since I am dealing with the fresh template file by the time the package runs).
Rafael's suggestion does solve the problem with the unexpected NULL (because the destination sheet is created the way SSIS suggests when we click the "new" button), but when the template file is copied the issue comes back... This process of always appending to the file starting at the second row should be automatic; we can't rely on manual deletions. As I said, the template file is formatted correctly.
Let me know your thoughts.
Thanks a lot.
|||
Gabriel, on a recent project we ran into a nearly identical issue. At the end of updating one of our dimensions we needed to output the results to reflect any changes, so our Excel file and database table were completely in sync. We solved the issue using a template Excel file, but the template followed these standards:
All cells formatted as text
No formulas
Header row only
All rows below header deleted initially to avoid any previous entries
Once the template was created it was left untouched. Within our SSIS package we used a File System Task to copy the template over the existing file; then in our data flow we took our source, ran it through a Data Conversion transform (Unicode string 255 or double-precision floats), and output it to our new Excel file. We used an Excel destination mapped to the overwritten file, and for the File System Task we set the template as the source and the output file as the destination.
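To give a concrete picture of the other half of this: when you let the Excel destination create the sheet (the New button Rafael mentioned), it just proposes a bit of Jet DDL in a Create Table dialog, roughly along these lines. The column names below are only illustrative, and the exact types SSIS suggests depend on the metadata reaching the destination:

    CREATE TABLE `Sheet1` (
        `ID` Double,
        `Description` LongText
    )

The Double column lines up with the double-precision float conversion and LongText with the Unicode string conversion, which is why a sheet created this way takes the converted data without type surprises.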
I can send you the package and template if you are still having difficulty but the lessons we learned were the formatting of the Excel template and matching the data types from the database to the Excel destination.
Hope this helps! Good luck!
|||Hi Gabriel,
I meant that the template file should be correct, and that is the only manual operation.
Now, if the template has column headers in row 1 and data in its second and third rows, all destination files will also have them, and hence your first real row will be in the 4th row.
Please check your template file.
I had exactly the same scenario, and same problem, and it got solved.
HTH
Kar
|||ADMariner -
Your suggestions are pretty good and I see you truly understood the issue I was having. I tried to apply these steps, but unfortunately I was still having the issue... Even after making sure all 60,000 rows after the first row were properly deleted, the NULL value would still show up.
The only thing that solved the problem was to create the "destination" sheet by clicking the "New" button in the "Excel Destination Editor", as Rafael suggested. As soon as I applied the same trick to the template file and pointed the SSIS package back to the destination spreadsheet, the issue went away.
I really appreciate all the help from everybody. Thanks a lot.
|||
ADMariner wrote:
We solved the issue using a template Excel file, but the template followed these standards:
All cells formatted as text
No formulas
Header row only
All rows below header deleted initially to avoid any previous entries
Once the template was created it was left untouched. Within our SSIS package we used a File System Task to copy the template over the existing file; then in our data flow we took our source, ran it through a Data Conversion transform (Unicode string 255 or double-precision floats), and output it to our new Excel file. We used an Excel destination mapped to the overwritten file, and for the File System Task we set the template as the source and the output file as the destination.
After you wrote to the template and then opened it in Excel, were your floats represented as numbers or text? I haven't been able to get it to write floats to an existing file (header row only) that remain formatted as numbers in Excel. Instead, they're text with the related side effects (left justified, a green triangle in the upper left warning about a possible number in a text field, conditional formatting doesn't work correctly, etc.). I tried having the Excel destination create the spreadsheet, and then it DID write the floats as numbers, and it would continue appending numbers on subsequent runs, but when I removed the data from that file (leaving the header) and tried to write to it again, the floats were written as text.