I am loading data from an external source into SQL Server via ASP.NET/C#. The problem is that I do not necessarily know the data types of each column coming in, perhaps until a user tells the application, which might not occur until after the data is loaded. Also, I cannot anticipate the number of columns coming in. What would the table design look like?
Would you use a large table with enough columns (e.g. Column1, Column2, etc.) to reasonably accommodate all the columns the source might have (32?), and use nchar as the datatype with the plan to convert/cast when I use the data? Isn't the cast kind of expensive?
Does this make sense? Surely other folks have run into this...
My thanks!
If you're doing what I think you're doing, you might want to use an EAV (Entity/Attribute/Value) table to store your data. The attribute table stores the column definitions. The entity table stores the row IDs. The value table stores the actual values and is of the form
Entity_id int
Attribute_id int
Value (string/whatever)
You can have multiple value tables for different data types, but I find that it's more trouble than it's worth. I've used this on several projects where the data being captured is set by the users at run time.
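As a minimal T-SQL sketch of the layout above (all table and column names here are illustrative, not from the original answer):

```sql
-- Attribute table: one row per incoming column; the type can be
-- filled in later, once the user tells the application what it is.
CREATE TABLE Attribute (
    Attribute_id  int IDENTITY PRIMARY KEY,
    Name          nvarchar(128) NOT NULL,
    Declared_type nvarchar(32)  NULL
);

-- Entity table: one row per incoming data row ("row id").
CREATE TABLE Entity (
    Entity_id int IDENTITY PRIMARY KEY
);

-- Value table: one row per (row, column) cell. Everything is stored
-- as a string and cast on read once the real type is known.
CREATE TABLE AttributeValue (
    Entity_id    int NOT NULL REFERENCES Entity(Entity_id),
    Attribute_id int NOT NULL REFERENCES Attribute(Attribute_id),
    Value        nvarchar(max) NULL,
    PRIMARY KEY (Entity_id, Attribute_id)
);
```

Each incoming row becomes one Entity row plus one AttributeValue row per column; to get the data back into row shape you can use PIVOT or conditional aggregation, casting Value with TRY_CAST once the column's type is known.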