I think the main problem is doing it in a way that zips* across the rows rather than producing the Cartesian product, and that is both sufficiently generic and performant. Using the split function from the thread Mladen pointed to, it's something like this for your example:

CREATE FUNCTION dbo.Split (@sep char(1), @s varchar(512))
RETURNS table
AS
RETURN (
    WITH Pieces(pn, start, stop) AS (
        SELECT 1, 1, CHARINDEX(@sep, @s)
        UNION ALL
        SELECT pn + 1, stop + 1, CHARINDEX(@sep, @s, stop + 1)
        FROM Pieces
        WHERE stop > 0
    )
    SELECT pn,
           SUBSTRING(@s, start, CASE WHEN stop > 0 THEN stop - start ELSE 512 END) AS s
    FROM Pieces
)
GO
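As a quick sanity check (the names here are made up for illustration), splitting a single backslash-delimited string returns one numbered row per piece:

```sql
-- Hypothetical data: three backslash-separated author names.
SELECT pn, s
FROM dbo.Split('\', 'Smith\Jones\Lee');
-- pn  s
-- 1   Smith
-- 2   Jones
-- 3   Lee
```

The pn column is what makes the zip possible: it lets you line up piece 1 of one column with piece 1 of another.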
SELECT ID, A.s, B.s, D.s
FROM TableName
CROSS APPLY dbo.Split('\', AUTHORS) AS A
CROSS APPLY dbo.Split('\', BirthYR) AS B
CROSS APPLY dbo.Split('\', DeathYR) AS D
WHERE A.pn = B.pn AND A.pn = D.pn
This looks pretty tidy to me, and it has understandable behaviour where the piece counts for AUTHORS, BirthYR and DeathYR differ: stop when the shortest one runs out! It's probably not the quickest way, mainly because it's likely to have to produce (partial) products before it can apply the WHERE conditions. But perhaps that doesn't matter if you're only doing it once.

An alternative approach is to count the pieces in one of the multi-valued columns (or even take the minimum across them, but that gets messy), then use a tally table/function and a Piece function. That way you know you're getting the same piece from each column without having to filter the result of a join. Like this:

SELECT ID,
       dbo.Piece(AUTHORS, '\', N.n),
       dbo.Piece(BirthYR, '\', N.n),
       dbo.Piece(DeathYR, '\', N.n)
FROM dbo.IkeySensitivityRaw
INNER JOIN Numbers AS N
    ON N.n BETWEEN 1 AND LEN(AUTHORS) - LEN(REPLACE(AUTHORS, '\', '')) + 1
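Neither Numbers nor dbo.Piece is defined above, so here's a minimal sketch of both, assuming pieces are numbered from 1 and strings fit in varchar(512) to match the Split function earlier (the Piece signature is my guess from how it's called, and this version simply reuses dbo.Split rather than being a hand-optimised implementation):

```sql
-- A small tally table of integers 1..N, assumed by the join above.
-- Populate it once, e.g. from a ROW_NUMBER() over a system table.
CREATE TABLE Numbers (n int PRIMARY KEY);

-- Hypothetical Piece: returns the @pn-th @sep-delimited piece of @s.
CREATE FUNCTION dbo.Piece (@s varchar(512), @sep char(1), @pn int)
RETURNS varchar(512)
AS
BEGIN
    RETURN (SELECT s FROM dbo.Split(@sep, @s) WHERE pn = @pn);
END
GO
```

A dedicated Piece function would normally avoid re-splitting the whole string for every piece, but this version shows the contract the tally-table query relies on.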
* Sorry, it's a functional programming term!