We try to avoid SQL Server user-defined functions unless we are certain there is no other way to achieve what we need. They aren't particularly performant, because they are evaluated on a row-by-row basis rather than using the set-based operations our esteemed RDBMS is renowned for.
Having said this, I am going to post a table-valued function I created recently for use in an SSIS import package that transforms an incoming single-row CSV into roughly 100 records each time it runs.
The content team in our organisation faces the soul-destroying task of entering records, along with comma-delimited strings, onto an Excel spreadsheet, which in turn is imported into two SQL tables.
There is no straightforward way to split the CSV string into a SELECT list, but here is what I have done…
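The original function body isn't reproduced here, but a multi-statement table-valued function along the following lines would match the call shown below. This is a sketch rather than the original implementation: the parameter names, the `VARCHAR(255)` token width, and the decision to trim whitespace and skip empty tokens are my assumptions.

```sql
-- Sketch of a tokenizer TVF consistent with the example call below;
-- the original implementation may differ in details.
CREATE FUNCTION dbo.fn_tokenizeString
(
    @ID        INT,           -- identifier echoed back with every token (assumed)
    @String    VARCHAR(MAX),  -- the comma-delimited input
    @Delimiter CHAR(1)        -- the delimiter character
)
RETURNS @Tokens TABLE (ID INT, Word VARCHAR(255))
AS
BEGIN
    DECLARE @Pos   INT;
    DECLARE @Token VARCHAR(255);

    -- Ensure the string ends with the delimiter so the loop
    -- also picks up the final token.
    IF RIGHT(@String, 1) <> @Delimiter
        SET @String = @String + @Delimiter;

    SET @Pos = CHARINDEX(@Delimiter, @String);
    WHILE @Pos > 0
    BEGIN
        SET @Token = LTRIM(RTRIM(LEFT(@String, @Pos - 1)));

        -- Skip the empty tokens produced by ", ," sequences (assumed behaviour).
        IF LEN(@Token) > 0
            INSERT INTO @Tokens (ID, Word) VALUES (@ID, @Token);

        SET @String = SUBSTRING(@String, @Pos + 1, LEN(@String));
        SET @Pos = CHARINDEX(@Delimiter, @String);
    END

    RETURN;
END
```

On SQL Server 2016 and later, the built-in `STRING_SPLIT` function can do much of this work without a user-defined function, though it does not trim whitespace or carry an ID column for you.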
Having created the function, you can call it like this:
SELECT ID, Word FROM dbo.fn_tokenizeString(1,'Here, are,some ,test ,words, , , Frank,',',')