
Nintex 2010 Updating Database ServerName Connection String

May 23, 2017

I recently restored a Nintex 2010 Workflow content database from my production environment to my Disaster Recovery (DR) SharePoint farm. I wanted to purge the Nintex workflow history and old workflows to reduce the number of WorkflowProgress records in the database.

After restoring a backup of the Nintex 2010 Workflow content database from production to DR on SQL Server 2012, I tried to run the Nintex PurgeHistoryListData command:

nwadmin.exe -o PurgeHistoryListData -siteURL https://portal/site/workflows -day 180

When doing so I ran into the following error:

"Command line execution error: A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections."

This error happens because the Nintex 2010 Workflow Content Database that was restored to the DR environment had the wrong ServerName connection string set in the database.

If you go to Central Administration -> Nintex Workflow Management -> Database Setup, you will see the Data Source is still pointing to the production SQL Server connection.

To update this, you will need to change the connection string within SQL Server Management Studio (SSMS):

1. Remote/Log into SQL Server 2012
2. Launch SSMS
3. Find the Nintex 2010 database (NW2010DB)
4. Expand the database (+)
5. Expand Tables (+)
6. Find the dbo.Databases table
7. Right-click the dbo.Databases table and select "Select Top 1000 Rows"
8. The query will return all databases associated with the Nintex Configuration database (NW2010DB)
9. Within the results you will see the DatabaseName and ServerName values. You will notice the ServerName is still pointing to the production SQL Server for the database that was just restored.
10. This value will need to be updated to point to the SQL Server the database was restored to.
11. Run the following T-SQL command. (Make sure you back up the NW2010DB database first, just in case; a quick example backup is shown below.)
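
If you want a quick backup of NW2010DB before running the update, something like the following works (the backup path is only an example; adjust it for your environment):

BACKUP DATABASE NW2010DB TO DISK = 'D:\Backups\NW2010DB.bak' -- example path only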

UPDATE NW2010DB.dbo.Databases SET ServerName = 'NEW SQL SERVER\INSTANCE' WHERE ServerName = 'OLD SQL SERVER\INSTANCE'

Replace 'NEW SQL SERVER\INSTANCE' and 'OLD SQL SERVER\INSTANCE' with your own SQL Server names and instances.

Once the query runs it will update the connection string with the new ServerName.
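
To confirm the change, a quick SELECT against the same dbo.Databases table (using the default Nintex configuration database name NW2010DB) should now return the new ServerName:

SELECT DatabaseName, ServerName FROM NW2010DB.dbo.Databases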

12. Go back and run the Nintex purge command again; it should now execute without error.

If you run into a similar issue, make sure the ServerName connection string is pointing to the correct SQL Server\Instance within SQL.

Build SCOM *Like* SharePoint Dashboards with PowerShell (Utilization, HTTP Response, App Pool Status)

March 2, 2017

I recently blogged about how to build a SCOM Like dashboard for SharePoint by pulling statistics from our network load balancer application (F5 – Big IP). You can find the blog here – https://shipoint.com/2017/03/01/scom-like-reports-for-sharepoint-using-powershell-f5-big-ip/

Now I'm going to show you how to build SCOM *like* dashboards for SharePoint that display server utilization (memory, CPU, storage), HTTP responses, and application pool status for all servers in the farm. Since SCOM is not fully integrated into our systems yet, I had to turn to PowerShell to get the results I want. This gives a snapshot of current statistics from the SharePoint servers without having to visit each server individually. The reports are generated as HTM pages that can be viewed in a browser.
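
The script reads its server names and URLs from three plain-text files, one entry per line. The entries below are placeholders to illustrate the expected layout; substitute your own servers and URLs:

# Placeholder server names and URLs - replace with your own
Set-Content -Path c:\temp\ServerList.txt -Value "SPWFE01", "SPWFE02", "SPAPP01"
Set-Content -Path c:\temp\ServerList_Apps.txt -Value "SPWFE01", "SPWFE02"
Set-Content -Path c:\temp\URLList.txt -Value "https://portal", "https://mysites"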

Below is the script I generated to build the dashboards and display the results.

Remove-Item c:\scripts\SharePoint-Server-Health.htm

[void][Reflection.Assembly]::LoadWithPartialName('Microsoft.Web.Administration')

$ServerListFile = "c:\temp\ServerList.txt"
$ServerList = Get-Content $ServerListFile -ErrorAction SilentlyContinue
$ServerListAppsFile = "c:\temp\ServerList_Apps.txt"
$ServerListApps = Get-Content $ServerListAppsFile -ErrorAction SilentlyContinue
$URLListFile = "c:\temp\URLList.txt"
$URLList = Get-Content $URLListFile -ErrorAction SilentlyContinue

$Result = @()
$Result_HTTP = @()
#$Result_AppPools = @()
$Date = Get-Date -format g

ForEach($computername in $ServerList)
{
	$AVGProc = Get-WmiObject -computername $computername win32_processor | Measure-Object -Property LoadPercentage -Average | Select Average
	$OS = gwmi -Class win32_operatingsystem -computername $computername |
	select-object @{Name = "MemoryUsage"; Expression = {"{0:N2}" -f
	((($_.TotalVisibleMemorySize - $_.FreePhysicalMemory)*100)/$_.TotalVisibleMemorySize)}}
	$vol = Get-WmiObject -Class win32_Volume -ComputerName $computername -Filter "DriveLetter = 'C:'" |
	select-object @{Name = "C PercentFree"; Expression = {"{0:N2}" -f (($_.FreeSpace / $_.Capacity)*100) }}

	$result += [PSCustomObject] @{
	ServerName = "$computername"
	CPULoad = "$($AVGProc.Average)%"
	MemLoad = "$($OS.MemoryUsage)%"
	CDrive = "$($vol.'C PercentFree')%"
	}

	$Outputreport = "<HTML><HEAD><META HTTP-EQUIV=""refresh"" content=""240""><TITLE> SharePoint Server Health Reports </TITLE></HEAD>
		<BODY background-color:peachpuff vlink=""black"" link=""black"" alink=""black"">
		<font color =""#99000"" face=""Microsoft Tai le"">
		<center><H2> SharePoint Server Health Report </H2></font>
		*Statistics refresh every 5 mins.  Page automatically refreshes every 90 seconds.
		</br>
		<U>Last Update:</U> $Date (Server Time)</center></br>
		<font color =""#99000"" face=""Microsoft Tai le"">
		<H2>SharePoint Utilization Status</H2></font>
		<Table border=1 cellpadding=0 cellspacing=0>
		<TR bgcolor=gray align=center>
		<TD><B>Server Name</B></TD>
		<TD><B>CPU Utilization</B></TD>
		<TD><B>Memory Utilization</B></TD>
		<TD><B>C Drive Utilization</B></TD></TR>"

	Foreach($Entry in $Result)
	{
		# Flag the CPU/memory cells in red when either reaches 80% utilization
		if((($Entry.CPULoad -replace '%') -as [double]) -ge 80 -or (($Entry.MemLoad -replace '%') -as [double]) -ge 80)
		{
			$Outputreport += "<TR><TD>$($Entry.ServerName)</TD><TD bgcolor=red align=center>$($Entry.CPULoad)</TD><TD bgcolor=red align=center>$($Entry.MemLoad)</TD><TD align=center>$($Entry.CDrive)</TD></TR>"
		}
		else
		{
			$Outputreport += "<TR><TD>$($Entry.ServerName)</TD><TD align=center>$($Entry.CPULoad)</TD><TD align=center>$($Entry.MemLoad)</TD><TD align=center>$($Entry.CDrive)</TD></TR>"
		}
	}
	$Outputreport += "</Table>"
}


#Build Table for HTTP Status#

ForEach ($url in $URLList)
{
	$xHTTP = New-Object -com msxml2.xmlhttp;
	$xHTTP.open("GET", $url, $false)
	$xHTTP.send();
	write-host $url
	$HTTPStatus = $xHTTP.status

	$HTTPStatusText = $xHTTP.statusText

	if ($HTTPStatus -eq 400)
	{
		$HTTPStatusText = "Bad Request"
	}
	if ($HTTPStatus -eq 403)
	{
		$HTTPStatusText = "Forbidden"
	}
	if ($HTTPStatus -eq 401)
	{
		$HTTPStatusText = "Unauthorized"
	}
	if ($HTTPStatus -eq 404)
	{
		$HTTPStatusText = "Not Found"
	}
	if ($HTTPStatus -eq 500)
	{
		$HTTPStatusText = "Internal Server Error"
	}

	$Request = New-Object System.Net.WebClient
	$Request.UseDefaultCredentials = $true
	$Start = Get-Date
	$PageRequest = $Request.DownloadString($url)
	$TimeTaken = ((Get-Date) - $Start).TotalSeconds
	$Request.Dispose()

	
	$result_http += [PSCustomObject] @{
		URL = "$url"
		Status = "$HTTPStatus"
		StatusText = "$HTTPStatusText"
		LoadTime = "$TimeTaken"
		}

		$Outputreport2 = "<font color =""#99000"" face=""Microsoft Tai le"">
		<H2> SharePoint HTTP Status </H2></font>
		<Table border=1 cellpadding=0 cellspacing=0>
		<TR bgcolor=gray align=center>
		<TD><B>URL</B></TD>
		<TD><B>StatusCode</B></TD>
		<TD><B>Status</B></TD>
		<TD><B>Est. Load Times</B></TD></TR>"

		Foreach($Entry in $Result_HTTP)
		{
			if($Entry.Status -eq 200)
			{
				$Outputreport2 += "<TR><TD><a href=$($Entry.URL) target=""_blank"">$($Entry.URL)</a></TD><TD bgcolor=green align=center>$($Entry.Status)</TD><TD align=center>$($Entry.StatusText)</TD><TD align=center>$($Entry.LoadTime)(s)</TD></TR>"
			}
			else
			{
				$Outputreport2 += "<TR><TD>$($Entry.URL)</TD><TD align=center>$($Entry.Status)</TD><TD align=center>$($Entry.StatusText)</TD><TD align=center>$($Entry.LoadTime)(s)</TD></TR>"
			}
	
		}
		$Outputreport2 += "</Table>"	

}
ForEach ($s in $ServerListApps)
{
	$ServerMgr = [Microsoft.Web.Administration.ServerManager]::OpenRemote($s)
	$AppPoolColl = $ServerMgr.ApplicationPools

	$Result_AppPools = @()

	foreach($AppPool in $AppPoolColl)
	{
		$appPoolState = $AppPool.State
		$appPoolName = $AppPool.Name
		#write-host $AppPool.Name
		#Write-Host $appPoolState
	
		$result_apppools += [PSCustomObject] @{
		ServerName = "$s"
		AppPoolName = "$appPoolName"
		AppPoolState = "$appPoolState"
		}

		$Outputreport_apppools = "<font color =""#99000"" face=""Microsoft Tai le"">
		<H2> SharePoint Application Pool Status $s </H2></font>
		<Table border=1 cellpadding=0 cellspacing=0>
		<TR bgcolor=gray align=center>
		<TD><B>Server Name</B></TD>
		<TD><B>AppPoolName</B></TD>
		<TD><B>AppPoolState</B></TD>
		<TD><B>Notes</B></TD></TR>"

		Foreach($Entry in $Result_AppPools)
		{
			if($Entry.AppPoolState -eq "Stopped")
			{
				if ($Entry.AppPoolName -eq "DefaultAppPool" -or $Entry.AppPoolName -eq "SharePoint Web Services Root")
				{
					$notes = "Please Ignore"
				}
				else
				{
					$notes = "Address Immediately"
				}
				$Outputreport_apppools += "<TR><TD>$($Entry.ServerName)</TD><TD align=center>$($Entry.AppPoolName)</TD><TD bgcolor=red align=center>$($Entry.AppPoolState)</TD><TD align=center>$notes</TD></TR>"
				
			}
			else
			{
				$notes = ""
				$Outputreport_apppools += "<TR><TD>$($Entry.ServerName)</TD><TD align=center>$($Entry.AppPoolName)</TD><TD align=center>$($Entry.AppPoolState)</TD><TD align=center>$notes</TD></TR>"
				
			}
						
	
		}
		
		$Outputreport_apppools += "</TABLE>"
	}
	$Outputreport_apppools += "</BODY></HTML>"
	$Outputreport_AppPools_Final += $Outputreport_apppools

}

$final = $Outputreport + $Outputreport2 + $Outputreport_AppPools_Final
$final | out-file c:\scripts\SharePoint-Server-Health.htm

Build SCOM *like* Reports for SharePoint using PowerShell (F5 – BIG IP)

March 1, 2017

In my SharePoint 2010 environment we are leveraging two F5 BIG-IP appliances as our network load balancers, sitting between our TMG server and the SharePoint web front-end servers. I needed to generate some SCOM *like* dashboards that pulled statistics from the F5 to display to the end user, specifically the helpdesk, to give them direct, up-to-the-minute insight into our servers. Since there is "separation of duties" where I work, the helpdesk has no access to our F5 BIG-IP servers or configurations. This is where I turned to PowerShell to pull statistics from the F5 and display them to the users.

To get started you first need to register the F5 iControl snap-in assemblies on the server where you will be executing the PowerShell scripts.

To download the iControl cmdlets, visit: https://devcentral.f5.com/d/microsoft-powershell-with-icontrol

Once the iControl cmdlets are downloaded, move the files over to the server where you will be executing the PowerShell script and extract the zip file.

Inside the extracted "f5-icontrol-powershell-snapin-12_1_0" folder you will find the setupSnapin.ps1 file. Open a PowerShell console with admin rights and browse to the location of setupSnapin.ps1.

Execute the setupSnapin.ps1 script: .\setupSnapin.ps1
Once the script finishes, the iControl .dll assemblies are registered.

Now you are free to execute F5 BIG-IP iControl cmdlets in PowerShell on that server.
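
To confirm the registration worked, you can check that the snap-in is available (iControlSnapIn is the same snap-in name the script below loads):

Get-PSSnapin -Registered | Where-Object { $_.Name -eq "iControlSnapIn" }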

Below is the PowerShell script I used to achieve my goal. I won't take full credit for this script; the credit goes to Patrik on DevCentral for the original code snippet - https://devcentral.f5.com/questions/obtain-node-name-by-node-description-using-powershell#comment43426 - which I took and built around to achieve my goal.

—————————————————–

#Initialize Snapin
if ( (Get-PSSnapin | Where-Object { $_.Name -eq "iControlSnapIn"}) -eq $null ){
    Add-PSSnapIn iControlSnapIn
}

#Setup credentials
$User = "user"
$Password = "password"
$BigIP = "xx.xx.xx.xx"
$Partition = "PARTITION NAME"

#Add a type/class to store the node data in
Add-Type @'
public class Member
{
    public string Pool;
    public string Address;
    public string Port;
    public string Name;
    public string Availability;
    public string Enabled;
    public string Status;
    public string Connections;
}
'@

#Create an empty array to store the node objects in
$Nodes = @()
$NodeDict = @{}

#Connect to the BigIP and get an iControl Handle
$Success = Initialize-F5.iControl -HostName $BigIP -Username $User -Password $Password
$F5 = Get-F5.iControl

#Get the partition list
$f5partitions = $F5.ManagementPartition
$f5partitions.set_active_partition($Partition)

#Get Pool Lists, Status, Nodes

$PoolList = $F5.LocalLBPool.get_list()
$PoolMembers = $F5.LocalLBPool.get_member_V2($PoolList)
$PoolMemberStatus = $F5.LocalLBPool.get_member_object_status($PoolList, $PoolMembers)
$NodeList = $F5.LocalLBNodeAddressV2.get_list()
$NodeIPList = $F5.LocalLBNodeAddressV2.get_address($NodeList)

#Save the nodes in a dictionary for convenient access

for ($i=0; $i -lt ($NodeList.Count); $i++)
{
    $NodeDict.Add($NodeList[$i], $NodeIPList[$i])
}
for ($i=0; $i -lt ($PoolList.Count); $i++)
{
    for ($x=0; $x -lt ($PoolMembers[$i].Count); $x++)
    {
         $objTempMember = New-Object Member

         #Populate the object
         $objTempMember.Pool = $PoolList[$i]
         $objTempMember.Name = $PoolMembers[$i][$x].address
         $objTempMember.Address = $NodeDict[$objTempMember.Name]
         $objTempMember.Port = $PoolMembers[$i][$x].port
         $objTempMember.Availability = $PoolMemberStatus[$i][$x].availability_status
         $objTempMember.Enabled = $PoolMemberStatus[$i][$x].enabled_status
         
         #Get Current Connections
         $MemberStatisticsA = $F5.LocalLBNodeAddress.get_statistics($objTempMember.Address);
         $MemberStatisticEntry = $MemberStatisticsA.Statistics;
         $Statistics = $MemberStatisticEntry[0].statistics;

         foreach ($Statistic in $Statistics)
         {
                 $type = $Statistic.type;
                 $value = $Statistic.value;

                 if ($type -eq "STATISTIC_SERVER_SIDE_CURRENT_CONNECTIONS" -and $objTempMember.Port -eq "443")
                 {
                       $cur_connections = $value.low
                       $objTempMember.Connections = $cur_connections
                 }

         }
         $Nodes += $objTempMember
         
         $Outputreport = "<HTML><HEAD><META HTTP-EQUIV=""refresh"" content=""60""><TITLE>SharePoint F5 Node Status</TITLE></HEAD>
         <BODY background-color:peachpuff>
         <Font color=""#99000"" face=""Microsoft Tai le"">
         <center><H2> SharePoint F5 Node Status </H2></font></center>
         <Table border=1 cellpadding=0 cellspacing=0>
         <TR bgcolor=gray align=center>
         <TD><B>Name</B></TD>
         <TD><B>Address</B></TD>
         <TD><B>Port</B></TD>
         <TD><B>Enabled</B></TD>
         <TD><B>Status</B></TD>
         <TD><B>Connections</B></TD></TR>"

         foreach ($Entry in $Nodes)
         {
                $Outputreport += "<TR><TD>$($Entry.Name)</TD><TD align=center>$($Entry.Address)</TD><TD align=center>$($Entry.Port)</TD><TD align=center>$($Entry.Enabled)</TD><TD align=center>$($Entry.Availability)</TD><TD align=center>$($Entry.Connections)</TD></TR>"
         }
         $Outputreport += "</Table></BODY></HTML>"

    }
}
$Outputreport | Out-File "c:\scripts\SharePoint-F5-Node-Status.htm"  # example output path - adjust for your environment

Once the script was created, I set up a Task Scheduler job to execute it every 5 minutes (an example command for creating such a task is shown below). The report itself has a 60-second refresh built in, so the end users do not have to interact with the dashboard at all.
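
A scheduled task like that can be created from an elevated command prompt with something along these lines (the task name and script path are placeholders):

REM Task name and script path are examples only
schtasks /Create /SC MINUTE /MO 5 /TN "SharePoint F5 Node Status" /TR "powershell.exe -ExecutionPolicy Bypass -File C:\scripts\F5NodeStatus.ps1"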

SharePoint 2010: Enable (Page) Output Cache Settings using PowerShell

January 25, 2017

To optimize performance in SharePoint 2010, there are a few caching settings you can configure to help with page load times.

The three main ways to do this are:

1. BLOB Cache
2. Object Cache
3. Page Output Cache

In my environment we are not leveraging the BLOB cache, and the object cache is already enabled by default (though you can tune it for your SharePoint environment).

However, the Page Output Cache is not enabled by default.

Below is a PowerShell script I created that enables the Page Output Cache at the Site Collection Level for a Web Application.

$webapp = Get-SPWebApplication [URL OF WEB APPLICATION]

foreach ($web in $webapp.Sites)
{
    $cacheSettings = New-Object Microsoft.SharePoint.Publishing.SiteCacheSettingsWriter($web.Url)
    $cacheSettings.EnableCache = $true
    $cacheSettings.Update()
    Write-Host "Updated Page Output Cache Settings..." $web.Url
}

Options:

#Default Page Output Cache Profile. The Cache Profile Id is built off the Cache Profile List (Site Actions – Site Settings – Site Collection Cache Profiles (under Site Collection Administration))

$cacheSettings.SetAnonymousPageCacheProfileId($web.Url, 1)
$cacheSettings.SetAuthenticatedPageCacheProfileId($web.Url, 2);

#Page Output Cache Policy
$cacheSettings.AllowPublishingWebPageOverrides = $true;
$cacheSettings.AllowLayoutPageOverrides = $true;
$cacheSettings.EnableDebuggingOutput = $true;

Project Server 2010: Fixing “Execution of user code in the .NET Framework is disabled. Enable “clr enabled” configuration option”.

January 12, 2017

I ran into some problems deploying Project Server projects; they were failing in the queue jobs. The error I was receiving was:

"Execution of user code in the .NET Framework is disabled. Enable "clr enabled" configuration option."

To fix the problem, you need to enable the Common Language Runtime (CLR) in SQL Server 2012.

1. Log into SQL Server
2. Launch SSMS (SQL Server Management Studio)
3. Open up a new Query
4. Type the following:

sp_configure 'clr enabled', 1
go
reconfigure
go
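
To double-check that the option took effect, you can query the server configuration; value_in_use should now be 1:

SELECT name, value_in_use FROM sys.configurations WHERE name = 'clr enabled'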

After executing the query, go back to Project Server and retry your job. This time the job should complete successfully.

reference: https://technet.microsoft.com/en-us/library/ee662108(v=office.14).aspx

Project Server 2010: Fixing “Waiting To Be Processed”

January 6, 2017

I recently ran into a problem with Project Server 2010 where all my jobs on the Manage Queue Jobs page were stuck in the "Waiting to be processed" status. Since there could be many reasons for this issue, I'll just touch on the specific issue I was experiencing. I have not seen anything blogged about my specific problem, so here it is.

In a nutshell, we recently migrated our SQL Server 2012 instance from Windows Server 2008 R2 to Windows Server 2012 R2 using a database detach/reattach method. We did this by scripting out the process to move the databases over along with all the necessary database permissions. In the end this was the root of the problem, but I'll get back to that later.

The troubleshooting steps I took to resolve the problem are below.

The first thing I checked was that the "Project Application Service" was started and working. This is checked through Central Administration -> Services on Server. This was good.

Next I checked the Project Server and made sure the "Microsoft Project Server Queue Service 2010" in Windows Services was started. It was.

Even though the service was started, I wanted to make sure it was functional, so I completely stopped the service and restarted it. This had no effect on the queued jobs.

I also recycled the SharePoint 2010 Timer service. This did not help either.

Since all the necessary services were started and working and the jobs were still queued, I went back to Central Administration to check out the Project Web App service application (CA -> Application Management -> Manage Service Applications -> Project Server).

Clicking the drop-down option for the Project Web App site and selecting "Edit", I made sure the primary database settings were correct, since we did migrate to a new SQL Server instance. The primary database server was still pointing to our old SQL Server instance, but I had set up a SQL alias, so this was not the problem. Everything checked out and I hit the "Edit" button. The process for provisioning the PWA site started, ran for about 5 minutes, and then errored out with "Failed - Please see application server logs".

So I couldn’t even provision a PWA site (existing or new one).

Since I know that Project Server initially creates four databases when you provision a new Project Server site (Published, Draft, Archive, and Reporting), I went over to SQL to see if the databases for the new PWA site I tried to create were there. They were not.

So, permissions... Looking at the permissions on my SQL instance, I saw that the right permissions were granted on the four databases with dbo rights. However, that was not enough.

I had to actually grant the "dbcreator" and "sysadmin" server roles to the service accounts (an example is shown below).
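
For reference, granting those server roles in SQL Server 2012 looks like this (the account name is a placeholder for your own service account):

-- DOMAIN\svc_ProjectServer is a placeholder account
ALTER SERVER ROLE [dbcreator] ADD MEMBER [DOMAIN\svc_ProjectServer]
ALTER SERVER ROLE [sysadmin] ADD MEMBER [DOMAIN\svc_ProjectServer]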

Once I did that, I went back to the Project Web App service application and attempted to reprovision the PWA site again. This time the site provisioned correctly, and doing so kicked off the queued jobs, which all completed successfully.

For your information, all the queued jobs live in the "Draft" database. If you ever have permission problems with the queued jobs, make sure this database has adequate permissions to do what it needs to do.

Hopefully this  helps others with similar problems.

 

SharePoint 2010: The field with Id {GUID} defined in feature {GUID} was found in the current site collection or in a subsite (Case #2)

July 14, 2016

I recently tried to activate the "PerformancePoint Services Site Collection Features" feature on a specific site collection and ran into this error:

The field with Id {9cdc5c49-b00d-4224-925f-b5443e27555e} defined in feature {f45834c7-54f6-48db-b7e4-a35fa470fc9b} was found in the current site collection or in a subsite

I’ve also experienced this same error when trying to activate a Site Collection Feature on a site collection that had orphaned content types from Nintex. I blogged about how to resolve that issue here: https://shipoint.com/2012/08/15/fixing-sharepoint-2010-site-collection-feature-activation-error-the-field-with-id-guid-defined-in-feature-guid-was-found-in-the-current-site-collection-or-in-a-sub-site/

However, this issue is a little different from the Nintex error, even though both are theoretically caused by orphaned content types. The difference is that Nintex developed a handy tool to help remove those content types, whereas PerformancePoint did not, so it requires more work to clean up.

I'm not sure how the PerformancePoint content types became orphaned, but my assumption is that data was migrated from one location to another where the source had the PerformancePoint Services Site Collection Features activated and the destination site did not.

I've run into orphaned content types in my environment quite a bit and blogged about those cases here:

https://shipoint.com/2013/03/21/remove-orphaned-content-types-in-sharepoint-2010-that-are-associated-with-a-feature/
https://shipoint.com/2013/03/22/fixing-unable-to-locate-xml-definition-for-ctype-with-spcontenttypeid-in-sharepoint-2010/
https://shipoint.com/2012/06/07/removing-a-corrupted-site-column-in-sharepoint-2010/

This case is no different from the others, but I wanted to share the steps I took to resolve this particular problem.

As in my previous cases, I opened a support ticket with Microsoft to allow me to make direct database modifications to remove the orphaned content types. Please open a ticket with Microsoft as well so you do not lose your Microsoft Premier Support.

Upon my initial investigation of the error, I wanted to check the site collection's site content types and site columns.

When I browsed to "Site Actions -> Site Settings -> Site Content Types", I noticed a PerformancePoint content type listed there, "PerformancePoint Data Source", which was associated with the parent content type "PerformancePoint Monitoring Document Type".

I then attempted to delete the "PerformancePoint Data Source" content type and got this message: "The content type 'PerformancePoint Data Source' is part of an application feature."

When I attempted to delete the parent content type “PerformancePoint Monitoring Document Type” I received the same error.

I then moved over to SQL and opened SQL Server Management Studio 2012.

Below are the steps I took to successfully remove the orphaned content types.
Please proceed with caution. Again, I am not responsible for your farm.

1. Open up a new Query Window against the database that has the orphaned content type.
2. Type the following to see if any PerformancePoint content types are present:

SELECT * FROM CONTENTDB.DBO.CONTENTTYPES WHERE ResourceDir like '%PerformancePoint%'

Upon running the query I saw two content types, "PerformancePoint Data Source" and "PerformancePoint Monitoring Document Type", still in the database and still tied to a feature. The "isFromFeature" flag was set to "1".

When the "PerformancePoint Services Site Collection Features" feature is activated, it creates nine root-level content types:

PerformancePoint Base
PerformancePoint KPI
PerformancePoint Scorecard
PerformancePoint Indicator
PerformancePoint Report
PerformancePoint Filter
PerformancePoint Dashboard
PerformancePoint Monitoring Document Type
PerformancePoint Data Source

So with only two of the nine content types showing in the database table, I knew those content types were somehow orphaned, and for whatever reason the "PerformancePoint Monitoring Document Type" and "PerformancePoint Data Source" content types were never properly disposed of or removed.

My first step was to update the "isFromFeature" flag from "1" to "0" and then delete those content types from the site collection content types by following this post (a T-SQL sketch of that flag update is shown below the link):

https://shipoint.com/2013/03/21/remove-orphaned-content-types-in-sharepoint-2010-that-are-associated-with-a-feature/
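
Based on the columns returned by the query in step 2, that flag update looks roughly like this (CONTENTDB is a placeholder for your content database name, and you should back up the database first):

UPDATE CONTENTDB.DBO.CONTENTTYPES SET IsFromFeature = 0 WHERE ResourceDir like '%PerformancePoint%'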

However, after doing this I was still unable to delete the content types; the result was the same error as earlier.

I then turned back to SQL to delete the content types directly out of the database.

3. DELETE FROM CONTENTDB.DBO.CONTENTTYPES WHERE ResourceDir like '%PerformancePoint%'

After deleting the content types I tried to activate the feature again, and it errored out again.

Going back to SQL I ran the following query:

4. SELECT * FROM CONTENTDB.DBO.CONTENTTYPES WHERE Definition like '%PerformancePoint%'

This returned about 128 different results where PerformancePoint content types were being used in a sub-site within the main Site Collection.

These PerformancePoint content types were:

PPSMA_PersonResonsible
PPSMA_objectXML
PPSMA_FCOImage
PPSMA_FCOSubType

So now I also had to delete those:

5. DELETE FROM CONTENTDB.DBO.CONTENTTYPES WHERE Definition like '%PerformancePoint%'

After deleting the content types, I again tried to activate the feature and it still failed.

Going back to the original error message:

“The field with Id {9cdc5c49-b00d-4224-925f-b5443e27555e} defined in feature {f45834c7-54f6-48db-b7e4-a35fa470fc9b} was found in the current site collection or in a subsite”

I wanted to search for that specific field ID {9cdc5c49-b00d-4224-925f-b5443e27555e} to see if I could find a reference to it in the database.

6. SELECT * FROM CONTENTDB.DBO.CONTENTTYPES WHERE Definition like '%27555e%'

It returned 32 results, all referencing the "DocumentSetDescription" content type.

Since this is the field ID SharePoint was complaining about, I deleted those rows from the database:

7. DELETE FROM CONTENTDB.DBO.CONTENTTYPES WHERE Definition like '%27555e%'

Upon deleting these content types I went back to the site and attempted to activate the feature again.

This time the feature activated successfully and all the correct content types for PerformancePoint were added to the Site Content Types and the database.

Hopefully this will help others who run into a similar problem in their environment.
