SSIS Package to drop/create partitions based on partition list in the SQL Server table
April 18th, 2007 by Vidas Matelis
In my past blogs I showed how to use SSIS packages to process dimensions and cubes or build aggregates. I am slowly rebuilding my existing packages thanks to pointers from Darren Gosbell and Jess Orosz. I found that using the CaptureXml method is more convenient for me, and it is definitely faster.
To explain this method again: I can change Analysis Services objects (create, drop, process) in a script using AMO commands. Normally each AMO command is executed in a separate batch, but AMO has an option to capture the XMLA script from multiple AMO commands and then submit them in one batch. This is done by setting the Analysis Services server property CaptureXml to True (to start capturing XMLA) or False (to stop capturing). The captured XMLA log can then be executed with the ExecuteCaptureLog method, whose first parameter specifies whether the statements should be run as a single batch.
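Roughly, the pattern looks like this (the server name below is a placeholder; the scripts later in this post read the server and database names from the SSIS connection manager):

Dim oServer As New Microsoft.AnalysisServices.Server
oServer.Connect("MySSASServer")  ' placeholder server name
oServer.CaptureXml = True        ' start capturing XMLA instead of executing AMO commands
' ... issue any number of AMO commands here (Drop, Add/Update, Process, ...) ...
oServer.CaptureXml = False       ' stop capturing
' First parameter True = run the captured statements as one transactional batch,
' second parameter False = do not run them in parallel
Dim oResults As Microsoft.AnalysisServices.XmlaResultCollection = oServer.ExecuteCaptureLog(True, False)
oServer.Disconnect()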
This past weekend I had some time to rewrite my partition synchronization package to use the CaptureXml method. I maintain the list of partitions that should exist in the SSAS database in the SQL Server table iwSSASPartitions. The SSIS script drops partitions that do not exist in that table and creates new partitions that do not yet exist in the SSAS database. Here is the script to create the table:
CREATE TABLE [dbo].[iwSSASPartitions](
[Partition] [varchar](100) NOT NULL,
[CubeName] [varchar](100) NOT NULL,
[MeasureGroup] [varchar](100) NOT NULL,
[SQL] [varchar](4000) NOT NULL,
[NeedsReProcessing] [bit] NOT NULL,
CONSTRAINT [PK_iwSSASPartitions] PRIMARY KEY CLUSTERED ( [Partition], [CubeName], [MeasureGroup])
)
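For illustration, a row in this table might look like the following (the cube, measure group, partition name and WHERE clause are made-up values; substitute your own):

INSERT INTO dbo.iwSSASPartitions (CubeName, MeasureGroup, [Partition], [SQL], NeedsReProcessing)
VALUES ('Sales', 'Internet Sales', 'Internet Sales 200704',
        'SELECT * FROM dbo.FactInternetSales WHERE DateKey BETWEEN 20070401 AND 20070430', 1)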
Normally I deploy the SSAS database (without processing) with just one partition per measure group and then run my SSIS package to synchronize partitions between the SQL Server table and the SSAS database. For the initial deployment that means dropping about 35 partitions and creating about 250 new ones; during a normal monthly load I need to create about 30 partitions.
My old package ran XMLA scripts to create/drop partitions one by one. With that method the initial drop of 35 partitions took about 5 minutes, and the step creating 250 new partitions took about 15 minutes. As this is not done every day, 20 minutes seemed like a reasonable time. But after I rewrote the package to take advantage of batch execution, the step that drops partitions ran for 15 seconds and the step that creates partitions ran for 30 seconds. At first I actually thought my code was not executing! Such is the power of executing XMLA in a batch.
Steps to create the SSIS package that drops/creates partitions based on the partition list in the SQL Server table:
- Create a new connection to the SQL Server database that contains the table with the list of partitions. It is important that the connection type is ADO.NET and not OLE DB: I had to use an ADO.NET connection because I wanted to pass the query result as a parameter into the Script Task.
- Create a new connection to the SSAS database. Name that connection DM.
- Create a new variable with package scope, name “PartitionListInDW” and type “Object”. This variable will hold the list of partitions from the table.
- Add a new “Execute SQL Task” component in the Control Flow area. Change the following properties for this task:
- ResultSet: “Full Result Set”
- ConnectionType: “ADO.NET”
- Connection: from the drop-down list choose the connection you created in the first step
- SQLStatement: SELECT CubeName,MeasureGroup, Partition, SQL FROM dbo.iwSSASPartitions ORDER BY 1,2,3
- On the Result Set tab click “Add”, enter 0 for “Result Name” and for “Variable Name” choose “User::PartitionListInDW”
- Change the step name and description to your liking and click OK.
- Add a new “Script Task” component and connect it to the previously created task. On the “Script” tab set the property “ReadOnlyVariables” to “PartitionListInDW” and click the “Design Script” button. In “Project Explorer” add references to “Analysis Management Objects” and “System.XML.dll”. Rename this component to “Drop Partitions”. Add the following code:
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Imports Microsoft.AnalysisServices

Public Class ScriptMain
    Public Sub Main()
        Dim ExecutionSuccessfull As Boolean = True ' If true, package executed without errors
        Dim dataBytes(0) As Byte
        Dim PartListInDM(999, 2) As String ' expecting no more than 1000 partitions
        Dim DropPartCount As Integer = 0

        ' Following objects are used to capture execution results
        Dim oResults As Microsoft.AnalysisServices.XmlaResultCollection
        Dim oResult As Microsoft.AnalysisServices.XmlaResult
        Dim oMessage As Microsoft.AnalysisServices.XmlaMessage

        Dim oPartitionList As Variable = Dts.Variables("PartitionListInDW") ' From variable
        Dim oDS As DataSet = CType(oPartitionList.Value, DataSet)
        Dim oTbl As DataTable = oDS.Tables(0)
        Dim oRow As DataRow

        Dim oConnection As ConnectionManager
        oConnection = Dts.Connections("DM")
        Dim sServer As String = CStr(oConnection.Properties("ServerName").GetValue(oConnection))
        Dim sDatabase As String = CStr(oConnection.Properties("InitialCatalog").GetValue(oConnection))

        Dim oServer As New Microsoft.AnalysisServices.Server
        oServer.Connect(sServer) ' connect to the server and start scanning down the object hierarchy
        Dim oDB As Microsoft.AnalysisServices.Database = oServer.Databases.FindByName(sDatabase)
        If oDB Is Nothing Then
            Dts.Log("Did not find expected database: " & sDatabase, 0, dataBytes) ' You need to set up proper package logging to see this!
            ExecutionSuccessfull = False
            GoTo Done
        End If

        Dim oCube As Microsoft.AnalysisServices.Cube
        Dim oMeasureGroup As Microsoft.AnalysisServices.MeasureGroup
        Dim oPartition As Microsoft.AnalysisServices.Partition
        Dim bNeedToDropPartition As Boolean

        For Each oCube In oDB.Cubes
            For Each oMeasureGroup In oCube.MeasureGroups
                For Each oPartition In oMeasureGroup.Partitions
                    ' Check if this partition exists in our partition table
                    bNeedToDropPartition = True
                    ' oTbl.Reset()
                    For Each oRow In oTbl.Rows
                        If oRow("CubeName").ToString = oCube.Name.ToString And oRow("MeasureGroup").ToString = oMeasureGroup.Name.ToString _
                           And oRow("Partition").ToString = oPartition.Name.ToString Then
                            bNeedToDropPartition = False ' Found partition in DW list, no need to drop it
                            Exit For
                        End If
                    Next
                    If bNeedToDropPartition Then
                        PartListInDM(DropPartCount, 0) = oCube.Name
                        PartListInDM(DropPartCount, 1) = oMeasureGroup.Name
                        PartListInDM(DropPartCount, 2) = oPartition.Name
                        DropPartCount = DropPartCount + 1
                        Dts.Log("Found partition that needs to be dropped: " & oPartition.Name, 0, dataBytes) ' You need to set up proper package logging to see this!
                        ' oPartition.Drop()
                    End If
                Next
            Next
        Next

        ' MsgBox(DropPartCount)
        oServer.CaptureXml = True ' start capturing XMLA instead of executing AMO commands immediately
        Dim i As Integer = 0
        For i = 0 To DropPartCount - 1
            oCube = oDB.Cubes.FindByName(PartListInDM(i, 0))
            oMeasureGroup = oCube.MeasureGroups.FindByName(PartListInDM(i, 1))
            oPartition = oMeasureGroup.Partitions.FindByName(PartListInDM(i, 2))
            oPartition.Drop()
        Next i
        oServer.CaptureXml = False

        ' Executing log that contains all XMLA commands
        oResults = oServer.ExecuteCaptureLog(True, False)

        ' Log the errors and warnings
        For Each oResult In oResults
            For Each oMessage In oResult.Messages
                If oMessage.GetType.Name = "XmlaError" Then
                    ' The processing failed
                    Dts.Log(oMessage.Description, 0, dataBytes) ' You need to set up proper package logging to see this!
                    ExecutionSuccessfull = False
                Else
                    ' It is just a warning
                    ExecutionSuccessfull = True ' if you want to fail on warnings, change this to False
                End If
            Next oMessage
        Next oResult
        ' Finished code that drops partitions
        ' ---------------------------------------------------------------------------------------
Done:
        oServer.Disconnect()
        If ExecutionSuccessfull Then
            Dts.TaskResult = Dts.Results.Success
        Else
            Dts.TaskResult = Dts.Results.Failure
        End If
    End Sub
End Class
- A few notes about the last script:
- Trying to run the partition drops in a single parallel batch (ExecuteCaptureLog(True, True)) generated the error: “The Delete element at line 9, column 94 (namespace http://schemas.microsoft.com/analysisservices/2003/engine) cannot appear under Envelope/Body/Execute/Command/Batch/Parallel”
- You cannot drop a partition inside the For Each partition loop, because dropping it would change the collection being enumerated. That is why I save all partitions that have to be dropped into an array and drop them later.
- You need to enable proper package and (!) step logging to see the errors/warnings in the log
- Add another new “Script Task” component and connect it to the previously created task. On the “Script” tab set the property “ReadOnlyVariables” to “PartitionListInDW” and click the “Design Script” button. In “Project Explorer” add references to “Analysis Management Objects” and “System.XML.dll”. Rename this component to “Create New Partitions”. Add the following code:
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Imports Microsoft.AnalysisServices

Public Class ScriptMain
    Public Sub Main()
        Dim ExecutionSuccessfull As Boolean = True ' If true, package executed without errors
        Dim dataBytes(0) As Byte

        ' Following objects are used to capture execution results
        Dim oResults As Microsoft.AnalysisServices.XmlaResultCollection
        Dim oResult As Microsoft.AnalysisServices.XmlaResult
        Dim oMessage As Microsoft.AnalysisServices.XmlaMessage

        Dim oPartitionList As Variable = Dts.Variables("PartitionListInDW") ' From variable
        Dim oDS As DataSet = CType(oPartitionList.Value, DataSet)
        Dim oTbl As DataTable = oDS.Tables(0)
        Dim oRow As DataRow

        Dim oConnection As ConnectionManager
        oConnection = Dts.Connections("DM")
        Dim sServer As String = CStr(oConnection.Properties("ServerName").GetValue(oConnection))
        Dim sDatabase As String = CStr(oConnection.Properties("InitialCatalog").GetValue(oConnection))

        Dim oServer As New Microsoft.AnalysisServices.Server
        oServer.Connect(sServer) ' connect to the server and start scanning down the object hierarchy
        Dim oDB As Microsoft.AnalysisServices.Database = oServer.Databases.FindByName(sDatabase)
        If oDB Is Nothing Then
            Dts.Log("Did not find expected database: " & sDatabase, 0, dataBytes) ' You need to set up proper package logging to see this!
            ExecutionSuccessfull = False
            GoTo Done
        End If

        Dim oCube As Microsoft.AnalysisServices.Cube
        Dim oMeasureGroup As Microsoft.AnalysisServices.MeasureGroup
        Dim oPartition As Microsoft.AnalysisServices.Partition

        oServer.CaptureXml = True ' start capturing XMLA instead of executing AMO commands immediately
        For Each oRow In oTbl.Rows
            oCube = oDB.Cubes.FindByName(oRow("CubeName").ToString)
            If oCube Is Nothing Then
                Dts.Log("Did not find cube: " & oRow("CubeName").ToString, 0, dataBytes) ' You need to set up proper package logging to see this!
                Continue For
            End If
            oMeasureGroup = oCube.MeasureGroups.FindByName(oRow("MeasureGroup").ToString)
            If oMeasureGroup Is Nothing Then
                Dts.Log("Did not find measure group: " & oRow("MeasureGroup").ToString, 0, dataBytes) ' You need to set up proper package logging to see this!
                Continue For
            End If
            oPartition = oMeasureGroup.Partitions.FindByName(oRow("Partition").ToString)
            If (oPartition Is Nothing) Then
                Dts.Log("Need to create partition: " & oRow("Partition").ToString, 0, dataBytes) ' You need to set up proper package logging to see this!
                oPartition = oMeasureGroup.Partitions.Add(oRow("Partition").ToString)
                oPartition.StorageMode = StorageMode.Molap
                oPartition.Source = New QueryBinding(oDB.DataSources(0).ID, oRow("SQL").ToString)
                If oMeasureGroup.AggregationDesigns.Count > 0 Then
                    oPartition.AggregationDesignID = oMeasureGroup.AggregationDesigns(0).ID ' Taking first available aggregation design!
                End If
                oPartition.Update(UpdateOptions.ExpandFull)
            End If
        Next
        oServer.CaptureXml = False

        ' Executing log that contains all XMLA commands
        oResults = oServer.ExecuteCaptureLog(True, False)

        ' Log the errors and warnings
        For Each oResult In oResults
            For Each oMessage In oResult.Messages
                If oMessage.GetType.Name = "XmlaError" Then
                    ' The processing failed
                    Dts.Log(oMessage.Description, 0, dataBytes) ' You need to set up proper package logging to see this!
                    ExecutionSuccessfull = False
                Else
                    ' It is just a warning
                    ExecutionSuccessfull = True ' if you want to fail on warnings, change this to False
                End If
            Next oMessage
        Next oResult
        ' Finished code that creates new partitions
        ' ---------------------------------------------------------------------------------------
Done:
        oServer.Disconnect()
        If ExecutionSuccessfull Then
            Dts.TaskResult = Dts.Results.Success
        Else
            Dts.TaskResult = Dts.Results.Failure
        End If
    End Sub
End Class
- A few notes about the last script:
- Trying to run the partition creation in a single parallel batch (ExecuteCaptureLog(True, True)) generated the error: “The Alter element at line 9, column 141 (namespace http://schemas.microsoft.com/analysisservices/2003/engine) cannot appear under Envelope/Body/Execute/Command/Batch/Parallel.”
- You need to enable proper SSIS package and (!) step logging to see the errors/warnings in the log
- Populate the table dbo.iwSSASPartitions with the list of partitions you want
- Test-run the package. Be careful: if you run it with an empty table, all partitions will be dropped.
Vidas Matelis
Note 2008-July-03: it appears to me that this code might be used by many people. I had multiple requests for the actual package and finally I posted a zipped version of the package here.
And thank you very much to everybody who left me (or will leave) comments here about this post. I am very happy that you found my code useful!
Note 2009-July-05: Added code that explains how I create the SQL Server table and populate the partition list here.
Posted in SSAS, SSIS
May 15th, 2007 at 5:09 am
Good, this helped me a lot
February 27th, 2008 at 3:23 pm
You are the bomb!!!! – AWESOME – helps me a lot
Thank you !!!!
February 27th, 2008 at 3:49 pm
Just one question – is this going to work in a 64-bit environment?
February 27th, 2008 at 3:51 pm
Yes, I am running this script in 64-bit environments. Just make sure that for a 64-bit environment you set the Script Task property “PrecompileScriptIntoBinaryCode” to True.
February 28th, 2008 at 9:17 am
Thank u – again great script!
March 29th, 2008 at 7:31 pm
Hi Vidas,
This script synchronizes the partitions from SQL Server, right? How about if we are using OLE DB to connect to a database which is not SQL Server? For example, we have an Oracle instance and the cube uses an OLE DB connection to fetch data from Oracle. So can I still use this script to create a partition, or can I just create an XMLA script from an existing partition and make the appropriate changes to it in order to create a new partition?
Thanks
Brian
March 29th, 2008 at 7:34 pm
Brian, I have not tested this, but all I need from SQL Server is just a table with a list of partitions and partition parameters. So I do believe this should work with Oracle DB. But you will have to test and maybe do some adjustments (like type of connection) yourself. Please post here if you were able to make it work.
April 2nd, 2008 at 12:01 pm
Vidas:
In the comments to each script you say:
Yet in both cases the code says:
This is confusing – does the ExecuteCaptureLog(True, False) run correctly, or generate an error? Or does it run correctly even though it generated an error message?
April 2nd, 2008 at 12:09 pm
John Hardin, thanks for pointing out my mistake. I fixed it now. ExecuteCaptureLog(True,True) generated the error.
Thanks!
April 2nd, 2008 at 12:53 pm
Great, thanks Vidas!
A suggestion for the Drop Partition script: rather than capturing the object hierarchy name strings associated with the partition you want to drop, and in the later loop doing a bunch of FindByName()s, why not just define a collection object and add to that collection the partition objects to drop, then iterate over the collection and drop the objects?
Roughly:
Dim oDrops as Collection
... loop through existing partitions and inspect
... if you want to drop it:
oDrops.Add(oPartition)
...finished looping through partitions, now drop them:
For Each oP as Microsoft.AnalysisServices.Partition In oDrops
oP.Drop()
Next
April 2nd, 2008 at 12:53 pm
You need a “preview” button. :(
April 9th, 2008 at 3:55 am
Vidas,
Just wanted to let you know that the creation of partitions for an Oracle database via OLE DB works. Thanks a lot!!
Just one question: I’ve created 60 partitions by executing the script task and I also see them in Management Studio, however my project still has only the partitions I created in BI Studio. Does it mean that when I make some changes and deploy them, the partitions will be overwritten?
Sorry if the question is too basic.
thanks
April 9th, 2008 at 6:13 am
Brian, the script creates partitions online, so you can see them through Management Studio when connected to the cube. But this script does not touch your offline project. So yes, after you change your project and re-deploy it, you have to re-run the partition generation script.
April 10th, 2008 at 12:01 pm
Vidas, I guess I was not very specific in my question, sorry about that.
What I meant was: how to keep the offline project (in BI Studio) and the online database (seen in Management Studio when connected to the cube) in sync.
For instance, I developed a new cube with just two partitions, however the cube is supposed to have more than 60 partitions, so I created them in Management Studio. Now, if I have to apply some change to the cube, I have to do the changes in BI Studio. At that point, if I would like to have all the partitions displayed in BI Studio, do I have to ‘create a new project by importing from the AS database’, or can I make it sync by some other means?
Thanks
April 10th, 2008 at 7:05 pm
Brian, if you need to keep the partitions in the offline project in sync, then importing the project from the AS database is probably the right option. Is there any reason why you want to keep them in sync?
In my case the list of partitions changes over time – most (but not all) of the partitions are based on DateKey. During development (offline in BIDS) I do not care about the actual partitions. But as soon as I deploy a new structure, I run this synchronization script and have all partitions created.
In any case, this script builds partitions just for the online database. I don’t know how I could do this for an offline project.
April 13th, 2008 at 2:08 pm
You are great !!
April 13th, 2008 at 4:54 pm
Thanks again Vidas. The only reason I’m looking to have them in sync is that from time to time we get production queries from our users. In AS2000 it’s a lot easier to browse the cube in Management Studio, however I couldn’t do that in SSAS 2005. Also, we want to resolve issues / answer queries ASAP; that’s the foremost reason we want to keep partitions in sync in BIDS.
May 13th, 2008 at 4:11 am
Hello Vidas,
I just want to know if it’s possible to parameterise the ‘AS database’.
We have several AS databases, each of them containing at least one cube. Also, we create one partition every fortnight. Of course, I can create one script per database, but it would be great if it could be achieved in one script rather than one per AS database.
Please advise.
Thanks
May 13th, 2008 at 4:15 am
Sorry, ignore my question. It can be done with variables.
thnx
July 7th, 2008 at 5:56 am
Hi,
Can we pass a package variable as a parameter to the Execute SQL Task?
Ideally I would like to override the where clause of the SQL in the Execute SQL Task.
At the moment, I’ve the SQL as direct input.
If I manage to pass a variable to the SQL, this would help immensely in reducing the number of jobs we have.
Thanks in advance.
Rajesh
July 8th, 2008 at 7:28 am
Brian,
Yes, you can pass a package variable as a parameter into the Execute SQL Task. This package uses the ADO.NET connection type, and passing parameters is different than with the OLE DB connection type. An example is
here
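For illustration only: with an ADO.NET connection the Execute SQL Task expects named parameters in the SQL text, so a query like the one below (where @CubeName is a hypothetical parameter mapped to a package variable on the Parameter Mapping tab) would work:

SELECT CubeName, MeasureGroup, [Partition], [SQL]
FROM dbo.iwSSASPartitions
WHERE CubeName = @CubeName
ORDER BY 1, 2, 3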
August 7th, 2008 at 6:07 pm
Hi Vidas, thank you for posting this solution. I am working with it right now. I wanted to follow up on an earlier question… how to keep the online cube partitions in sync with the VS project for that cube.
The issue I see is that you now have to fully process all partitions over again if you make a change to your cube – as simple as the change may be. I guess that many cube changes would require full reprocessing, but how about a new calculation or changing an attribute description?
Would these require a full reprocess of all partitions? If not (which I suspect), it would be nice to have a way to get the SSIS/SQL-generated partitions into the VS project, otherwise they will be clobbered when you re-deploy and you will be forced to update the NeedsReProcessing field of the SQL table, run the SSIS package to recreate all the partitions and do a full cube reprocess – which can take time for large cubes.
Any thoughts?
August 7th, 2008 at 9:21 pm
Tim,
You are right, there is a challenge if you want to change the cube structure. My script does all changes to the online database, and I do not know how one would script such changes and apply them to the BIDS solution instead.
So if you want to make a small change to the cube, your options depend on the change. Here is what I do depending on the specific change:
If I want to change the cube calculation script (for example: add new calculated measures, fix existing ones), then I use BIDS Helper. This utility allows you to deploy just the MDX cube script, and it does that very fast.
For clients where I cannot install BIDS Helper, I make my changes to the online database and then copy them to the offline BIDS solution. For those who don’t know how to edit a database online: start BIDS, select File->Open->Analysis Services Database and choose your database. Then edit the cube script and click Save. After that don’t forget to copy the script to the offline solution.
If I want to apply a small change/fix to the structure, I again usually do it to the online database and then copy the change to the offline BIDS solution. You have to be very careful with this: even if you think the cube will not be invalidated, BIDS might decide otherwise. For critical systems I always test such changes first on a QA machine.
Of course you also have the option to create a BIDS solution from the existing online database and then edit it. To do this, start BIDS and select the menu item “New->Import Analysis Services 2005 Database”.
I hope this will help you,
Vidas Matelis
August 11th, 2008 at 10:00 am
implementing the following code:
Dim oDrops as Collection
… loop through existing partitions and inspect
… if you want to drop it:
oDrops.Add(oPartition)
, the script fails at the .Add(oPartition) call saying “Object reference not set to an instance of an object”. Has anyone tried this?
I am trying to use the .Merge method on a partition, and the first step is to generate a collection of source partitions to merge. The script fails while trying to generate that collection of partitions.
thanks,
C
August 11th, 2008 at 11:03 am
solved the collection problem by
setting:
oDrops = measuregroup.partitions()
oDrops.clear
oDrops.Add(oPartition)
I get an error at:
oTarget.merge(oDrops)
saying The operation requires a connected parent Server at Microsoft.AnalysisServices.Partition.Merge(IEnumerable sources)
August 19th, 2008 at 10:27 am
Can you use the same table for multiple cubes? I have multiple cubes in the environment and I would like to use one single table to manage the partition lists for all of them.
Can that be done without changing any code?
August 19th, 2008 at 10:42 am
C,
Just encountered the same problem and found the following solution:
– oMG is the measure group object within which two partitions need to be merged
– oPart is the Partition object to be merged into the “History” partition in oMG
Dim oColl As Partition() = New Partition(0) {oPart}
oMG.Partitions.GetByName("History").Merge(oColl)
Hope this works for you.
November 28th, 2008 at 9:37 pm
Excuse me to write in Spanish, but my English is not good
Quiero felicitarte por el paquete ssis , es muy breve , performante y muy util…
Lo ajuste a mis necesidades y me vino al dedillo
Thank you!!
saludos desde el sur
November 28th, 2008 at 9:47 pm
Sergio,
Thank you for your comments. I translated them in “Google Translate”:
“I want to congratulate you for the SSIS package, is very brief and very useful performante …
What fits my needs and I came to dedillo”
January 12th, 2009 at 8:38 am
Hi,
My requirement is to process the cube twice a day with approx. 75 million rows of data. It is partitioned on a monthly basis. How do I load the data from the SSIS package directly into the cube in SSAS and then process just the partition itself, so that I need not process the whole cube? There are SCDs as well.
Please guide me on it, if possible with code.
Thanks in advance.
May 5th, 2009 at 3:17 pm
Anyone have a VS2008 version of this package? I am having a difficult time getting this to work in 2008.
May 5th, 2009 at 8:32 pm
Frank,
I have not tested this package on 2008 yet. But I would expect it to work.
Have you copied file
C:\Program Files\Microsoft SQL Server\100\SDK\Assemblies\Microsoft.AnalysisServices.DLL to folder:
C:\WINDOWS\Microsoft.NET\Framework\v3.5
What specific error message are you getting?
May 6th, 2009 at 2:56 pm
Vidas – I got it working. Seems I was selecting the wrong .dll in my references. Awesome code! Is there a processing piece of it that uses the NeedsReProcessing column of the table?
May 6th, 2009 at 9:05 pm
Frank,
That code is currently not available to the public, but based on my other processing scripts you could create it quite easily. The logic is simple: if ALL partitions require processing, I do full dimension processing and then full cube processing. If at least one partition does not require processing, I do incremental dimension processing and full processing just for the partitions that require it.
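Roughly, and only as a sketch of the logic described above (oDB is an already-connected AMO Database object; partitionsToProcess and totalPartitionCount are assumed helper values built from the NeedsReProcessing column):

If partitionsToProcess.Count = totalPartitionCount Then
    ' Every partition requires processing: full dimension processing, then full cube processing
    For Each oDim As Microsoft.AnalysisServices.Dimension In oDB.Dimensions
        oDim.Process(ProcessType.ProcessFull)
    Next
    For Each oCube As Microsoft.AnalysisServices.Cube In oDB.Cubes
        oCube.Process(ProcessType.ProcessFull)
    Next
Else
    ' Some partitions are already processed: incremental dimension processing,
    ' then full processing only for the partitions flagged in the table
    For Each oDim As Microsoft.AnalysisServices.Dimension In oDB.Dimensions
        oDim.Process(ProcessType.ProcessUpdate)
    Next
    For Each oCube As Microsoft.AnalysisServices.Cube In oDB.Cubes
        For Each oMG As Microsoft.AnalysisServices.MeasureGroup In oCube.MeasureGroups
            For Each oPart As Microsoft.AnalysisServices.Partition In oMG.Partitions
                If partitionsToProcess.Contains(oPart.Name) Then
                    oPart.Process(ProcessType.ProcessFull)
                End If
            Next
        Next
    Next
End If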
May 7th, 2009 at 10:40 am
Well, I will give it a shot. What other processing scripts were you referring to?
Thanks btw Vidas!
June 25th, 2009 at 11:53 pm
The code is very useful.
July 5th, 2009 at 8:13 pm
[…] SSIS Package to drop/create partitions based on partition list in the SQL Server table […]
October 6th, 2009 at 11:22 pm
Vidas
I also have a proactive caching setting on my partitions. I am new to ADO.NET. Is there any way you can guide us on dealing with cubes that have proactive caching set up?
thanks
October 14th, 2010 at 1:06 pm
Hi
This is very useful code. Can we use it even if we do not have to sync SQL Server partitions and SSAS partitions, by populating the required partitions in iwSSASPartitions with a stored proc?
I don’t have to drop partitions based on the SQL Server data warehouse, so I will have to disable that task.
Thanks for your help
October 14th, 2010 at 1:28 pm
Dave,
This code specifically looks for the list of partitions and filter conditions in the table. If you will not use an SP to populate that table, you need to add records there somehow, as SSIS creates new partitions based on the data in that table. Of course, you can disable the step that drops partitions.
April 26th, 2011 at 2:50 am
I have an issue regarding dynamic partition processing using an SSIS Script Task. While processing a selected partition, other partitions are also getting processed. Why?
September 13th, 2011 at 10:00 am
I have just implemented this change, along with some logic to process newly added partitions.
I have noticed that when a partition is deleted, the data is still available in the cube. Do I need to “Unprocess” this partition?
October 2nd, 2011 at 3:09 am
Can you please pass on the insert scripts for the partition table in the backend which is used for processing?