Finally... glad to see someone try to normalize a CSV column.
There are a lot of methods to do this using WHILE loops, cursors, and all manner of other SQL "sins" and, despite the very long code they produce (long both to write and to run), they may still not do what you want them to. Functions can be just as bad in both areas.
So, what to do? First, you need to make a Numbers or "Tally" table, which contains nothing more than a well-indexed single column of sequential numbers. When you join to it (usually using a cross join), it acts very much like the ROW_NUMBER() function of SQL Server 2005. I call it a "Tally" table because I use it to count or "Tally" and it sounds cooler than just a "Numbers" table. It has many, many uses, and normalizing a CSV column for a whole table is just one of them. Here's how to build a Tally table... don't blink...
--===== Create and populate the Tally table on the fly
 SELECT TOP 11000 --equates to more than 30 years of dates
        IDENTITY(INT,1,1) AS N
   INTO dbo.Tally
   FROM Master.dbo.SysColumns sc1,
        Master.dbo.SysColumns sc2
--===== Add a Primary Key to maximize performance
ALTER TABLE dbo.Tally
ADD CONSTRAINT PK_Tally_N
PRIMARY KEY CLUSTERED (N) WITH FILLFACTOR = 100
--===== Allow the general public to use it
GRANT SELECT ON dbo.Tally TO PUBLIC
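Once it exists, the Tally table replaces loops for all sorts of set-based work. Just as a quick demo of what that "30 years of dates" comment in the SELECT TOP 11000 is hinting at (this is my example, not part of the solution), here's one way you might spin up a run of sequential dates from it:

```sql
--===== Example only: one row per day for a year starting 2000-01-01
 SELECT DATEADD(dd, t.N - 1, '20000101') AS TheDate
   FROM dbo.Tally t
  WHERE t.N <= 365
```

No loop, no cursor... just a join against the numbers.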
After that, life becomes ever so simple... first, let's pretend this is your table...
--===== Create a test table... THIS IS NOT PART OF THE SOLUTION! IT'S JUST FOR DEMO!
CREATE TABLE #yourtable (User_ID INT, Profile INT, Profile_Value VARCHAR(200))
INSERT INTO #yourtable
(User_ID ,Profile, Profile_Value)
SELECT 1,1,',' UNION ALL
SELECT 1,2,',7,' UNION ALL
SELECT 2,1,',6,9,11,'

SELECT * FROM #yourtable
All we need to do to split the table as you've requested, is the following simple code... and, I do mean simple...
 SELECT y.User_ID,
        y.Profile,
        Profile_Value = CASE
                            WHEN y.Profile_Value <> ','
                            THEN SUBSTRING(y.Profile_Value, t.N+1, CHARINDEX(',', y.Profile_Value, t.N+1) - t.N - 1)
                            ELSE NULL
                        END
   FROM #yourtable y,
        dbo.Tally t
  WHERE (t.N < LEN(y.Profile_Value) OR y.Profile_Value = ',')
    AND SUBSTRING(y.Profile_Value, t.N, 1) = ','
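One thing to keep an eye on: the split leans on each Profile_Value starting and ending with a comma, just like the test data does. If your real table happens to store the values without those bracketing commas (say, '6,9,11' instead of ',6,9,11,'), you can bolt them on in a derived table first... this is just a sketch under that assumption, so test it against your own data:

```sql
--===== Example only: add the bracketing commas on the fly for data
--      stored WITHOUT leading/trailing delimiters
 SELECT y.User_ID,
        y.Profile,
        SUBSTRING(y.Profile_Value, t.N+1,
                  CHARINDEX(',', y.Profile_Value, t.N+1) - t.N - 1) AS Element
   FROM (SELECT User_ID, Profile, ',' + Profile_Value + ',' AS Profile_Value
           FROM #yourtable) y,
        dbo.Tally t
  WHERE t.N < LEN(y.Profile_Value)
    AND SUBSTRING(y.Profile_Value, t.N, 1) = ','
```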
Try it... lemme know if it does the trick for ya...