SharePoint: Rendering inside iframes


October 31, 2014 - 16:35, by Steven Van de Craen - 8 Comments

This post is a revision of an older blog post on rendering Excel Services in an iframe on a different domain. That is prohibited because an HTTP response header X-FRAME-OPTIONS: SAMEORIGIN is added to the response. The issue isn’t limited to Excel Services but applies to any SharePoint-hosted page that you want to display in an iframe.
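A quick way to confirm the behaviour is to request a SharePoint page and inspect the response headers. A minimal PowerShell sketch (the URL is an example):

$response = Invoke-WebRequest -Uri "http://sitename/SitePages/Home.aspx" -UseDefaultCredentials
$response.Headers["X-FRAME-OPTIONS"]   # "SAMEORIGIN" means browsers refuse to frame the page from another domain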

Consider the following:

  • SharePoint 2013 will always render the X-FRAME-OPTIONS header, even for regular pages. Adding an AllowFraming control to the page fixes that, but doesn’t cover all situations
  • You can’t add the AllowFraming control to Office Web Apps or InfoPath Forms Server (“FormServer.aspx”)
  • Clicking (PDF) documents in a Document Library inside the iframe fails to load them, because the document is served by a separate request
  • You have a basic “integration” between different systems (like Dynamics CRM) and SharePoint content that uses iframes

This content cannot be displayed in a frame

PermissiveXFrameHeader

This is an HttpModule that can be activated per Web Application through a Web Application-scoped Feature, and it ensures that all pages render inside an iframe. While the initial version actually removed the offending response header, the module now sets values that prevent SharePoint from injecting the header in the first place.

Please visit the Codeplex repository to read more about this add-on and for installation instructions: https://ventigrate.codeplex.com/wikipage?title=Permissive%20XFrame%20Header


Importing a Summary Links Web Part: List does not exist


October 23, 2014 - 15:23, by Steven Van de Craen - 0 Comments

Issue

Consider the scenario where you have a Summary Links Web Part (part of the SharePoint Publishing functionality) configured on a page and you want to import the preconfigured Web Part on a different page in a different site. If you try this you’ll get a “List does not exist” error.


Note that importing the Web Part in the same site (same or different pages) works just fine.

Cause

This is because the Summary Links Web Part references the list that contains the page on which the Web Part resides. If you open the .webpart file in a text editor you’ll see ListName and ListId containing the GUID of that list. This can be the “Site Pages” library, the “Pages” library, or any Document Library that holds Web Part Pages.


You can verify this by navigating to the following URL (replacing GUID with the actual list GUID): http://sitename/_layouts/listedit.aspx?List=GUID

Bonus question: what is the value when the Summary Links Web Part is on the “default.aspx” of a site?


Solution

So what’s the solution? Just remove the ListName and ListId elements (or their values) from the exported .webpart file and you’ll have no issues importing it to other sites.
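If you need to clean up several exported files, clearing those values can be scripted. A hedged PowerShell sketch, assuming the standard v3 .webpart format (the file path is an example):

$path = "C:\temp\SummaryLinks.webpart"
$xml = [xml](Get-Content $path)

# Blank out the ListName and ListId property values so the import no longer references the original list
@($xml.webParts.webPart.data.properties.property) |
    Where-Object { "ListName", "ListId" -contains $_.GetAttribute("name") } |
    ForEach-Object { $_.InnerText = "" }

$xml.Save($path)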

HTH


SharePoint 2013: Bulk Content Approval of list items fails if user has read permissions on the web


October 17, 2014 - 16:51, by Steven Van de Craen - 0 Comments

Issue

Last week I was notified of an issue where bulk content approval failed for specific users. The list was configured with the default Content Approval.


They would select two or more items to approve and click the “Approve” button in the Ribbon, but it just kept “Working on it”.


Note that single item approval works just fine for them!

Cause

Watching with Fiddler and checking the ULS logs made it clear that the bulk approval page threw an Access Denied error.


The user was configured with Read permissions on the site and Approve/Contribute permissions on the list (but it failed even with Full Control on the list).

Workaround

After some playing around with the permission levels and permissions on the web level, it turns out that it works if the user has the “Approve” permission on the site level!

Obviously this may not be possible to grant to your users.

Solution

None so far. This was tested on Service Pack 1 (15.0.4605.1000) and September 2014 CU (15.0.4649.1001) individually.

For now either use single item content approval or give the user the “Approve Items” permission on the site level as well (workaround above).
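If you want to script that workaround rather than click through the UI, here is a hedged PowerShell sketch using the server object model (URL, account and permission level name are examples; it assumes the web manages its own role definitions, e.g. the root web):

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$web = Get-SPWeb "http://sitename"
$user = $web.EnsureUser("DOMAIN\someuser")

# Create a minimal permission level that adds Approve Items on top of read-style rights
$roleDef = New-Object Microsoft.SharePoint.SPRoleDefinition
$roleDef.Name = "Approve Items (web level)"
$roleDef.BasePermissions = "ViewListItems, ViewPages, Open, ApproveItems"
$web.RoleDefinitions.Add($roleDef)

# Assign the new permission level to the user at the web level
$assignment = New-Object Microsoft.SharePoint.SPRoleAssignment($user)
$assignment.RoleDefinitionBindings.Add($web.RoleDefinitions[$roleDef.Name])
$web.RoleAssignments.Add($assignment)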

HTH


SharePoint 2013 Web Application requests not logged to ULS logs


October 16, 2014 - 22:04, by Steven Van de Craen - 0 Comments

Issue

Ever since I built my dev VM with SharePoint 2013 and a least privileged installation, it had this issue where no user request entries (Info, Error, Unexpected, …) would get logged to the SharePoint ULS logs. At first I blamed it on SharePoint of course, but CU after SP after PU did not fix my issue, so it couldn’t be that. And none of the other environments I set up were affected by this issue.

What _did_ get logged:

  • Every other process (Distributed Cache, OWSTimer, NodeRunner, …)
  • Central Administration requests
  • Service Application requests

My least privileged installation is pretty basic stuff:

  • DOMAIN\sp_farmadmin: Farm Administrator; used for the Timer Service and the Central Administration AppPool identity.
  • DOMAIN\sp_setup: Installation account. Requires local admin rights on the SharePoint servers and the dbcreator and securityadmin roles on SQL Server.
  • DOMAIN\sp_contentapps: AppPool identity for the SharePoint sites.
  • DOMAIN\sp_serviceapps: AppPool identity for the SharePoint Service Applications.
  • DOMAIN\sp_search: Search Content Access account for indexing data.
  • DOMAIN\sp_superuser: Super User account for object caching.
  • DOMAIN\sp_superreader: Super Reader account for object caching.
  • DOMAIN\sp_ups_sync: User Profile synchronization account to Active Directory. Requires "Replicate Directory Changes" on the domain. http://technet.microsoft.com/en-us/library/ff182925.aspx#permission

 

Since this is a dev VM, you can imagine this was a real pain when troubleshooting bugs and flow. Until today, that is, because on one attempt I changed the Service Account for the Content Application Pool (DOMAIN\sp_contentapps) to the Farm Administrator and POOF! my logging returned.


Cause

So what was wrong with this DOMAIN\sp_contentapps account? I decided to fire up the awesome ProcMon (filter on User Name, exclude all “Success” entries) to see what exactly was going on.


It was trying to load that user’s Windows profile from disk but somehow ended up loading a temporary profile.


Look for Event ID 1511 in the Application Event Log and you’ll find a corresponding entry.


Solution

Microsoft has provided a few options to resolve this in the following KB article: http://support2.microsoft.com/kb/947215/en. In short (a PowerShell sketch of the registry steps follows the list):

  • Open the Registry Editor
  • Navigate to HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\ProfileList
  • Find the key starting with S-1-5 and ending with .bak
  • Remove the .bak suffix (if you can’t rename it because a key with that name already exists, rename that other key in two steps so that it ends up with the .bak suffix)
  • Click on the renamed key (without suffix) 
    • Set the value for “RefCount” to 0
    • Set the value for “State” to 0
  • Reboot
  • Log in & out with the account (you may need to temporarily grant it access to do so) so the profile folder gets created
  • Ensure that the correct profile folder is present
  • Ensure that the application pool is started (likely it failed after the reboot because of the missing profile folder)
  • (Optional) Clean up the “.bak” registry key and corresponding temporary profile folder
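For completeness, the same registry steps as a hedged PowerShell sketch (the SID is an example; use the one from your ProfileList key and still reboot afterwards as described above):

$profileList = "HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\ProfileList"
$sid = "S-1-5-21-1111111111-2222222222-3333333333-1001"   # example SID, take it from the .bak key name

# Remove the .bak suffix from the broken profile key
Rename-Item -Path "$profileList\$sid.bak" -NewName $sid

# Reset the profile state so Windows loads it normally again
Set-ItemProperty -Path "$profileList\$sid" -Name RefCount -Value 0
Set-ItemProperty -Path "$profileList\$sid" -Name State -Value 0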

In conclusion: at some point I probably deleted that profile folder for who knows what reason, and it had some unforeseen consequences…

HTH


Windows 10 Technical Preview and Cisco AnyConnect


October 3, 2014 - 21:26, by Steven Van de Craen - 1 Comments

Today I decided to look into the Windows 10 Technical Preview without a safety net and run it on my main work machine. No real issues so far, except for connecting to our corporate network via Cisco AnyConnect (version 3.1.04059).

Failed to initialize connection subsystem


This can easily be resolved by running the VPN client in Windows 8 compatibility mode. Just edit the shortcut properties and set the compatibility mode, then restart the client and try again.


If you still run into issues, try the following registry trick that you might have had to apply on Windows 8 as well:

  • Open Registry Editor (Start > Run > Regedit)
  • Navigate to HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\vpnva
  • Change the “DisplayName” value by removing the garbled characters before the name “Cisco…” (see the sketch after these steps)


  • Restart the client
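
The DisplayName fix can also be scripted. A hedged PowerShell sketch (run from an elevated prompt):

$key = "HKLM:\SYSTEM\CurrentControlSet\Services\vpnva"
$displayName = (Get-ItemProperty -Path $key -Name DisplayName).DisplayName

# Strip everything before "Cisco" to remove the garbled prefix
Set-ItemProperty -Path $key -Name DisplayName -Value $displayName.Substring($displayName.IndexOf("Cisco"))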

 

Hope this helps!


Windows Server: allow multiple RDP sessions per user


October 2, 2014 - 11:56, by Steven Van de Craen - 0 Comments

I’ve often worked on SharePoint environments where I accidentally got kicked off, or kicked off others, because we were working with the same account on the same server via Remote Desktop. By default each user is restricted to a single session, but there’s a group policy to change this.

In Windows Server 2008 there was a UI for this, but since Windows Server 2012 you have to do it via gpedit.msc. The policy basically enforces a setting in the Windows Registry, which you can also change directly:

  • Open Registry Editor (Start > Run > regedit)
  • Navigate to HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Terminal Server
  • Ensure the DWORD fSingleSessionPerUser exists and is set to 0

 

Or -even easier- create and run a .reg file with the following content:

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Terminal Server]
"fSingleSessionPerUser"=dword:00000000

 

Here’s a quick link for you: ALLOW_MULTI_RDP_PER_USER.reg
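
If you prefer PowerShell over a .reg file, this sketch (run elevated) makes the same change:

Set-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Control\Terminal Server" -Name fSingleSessionPerUser -Value 0 -Type DWord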

HTH


Calling web services in Nintex Workflow and different authentication mechanisms


September 12, 2014 - 20:40, by Steven Van de Craen - 0 Comments

With the rise of claims-based authentication in SharePoint we’ve faced new challenges in how to interact with web services hosted on those environments. Claims-based authentication allows for many different scenarios with a mixture of Windows, Forms and SAML authentication.


When you’re working with Nintex Workflow you’re faced with authentication in Actions such as “Call Web Service” or “Web Request”.

If you’re just using Windows Authentication (NTLM, Kerberos, Basic) on your site then Nintex will handle that authentication just fine for you and use the credentials you specified (manually entered or stored credentials).


However, you might have to deal with different or multiple authentication mechanisms such as Forms Based Authentication, ADFS or a combination. In such cases you’ll get a 403 FORBIDDEN regardless of the credentials you enter.


Overcoming this hurdle can be challenging.

  1. Use a different URL zone (with windows authentication) to make the call
  2. Pass an authentication cookie along with the request

Use a different URL zone (with windows authentication) to make the call

Nintex Actions execute on the server, not on your -already authenticated- client. The connection information you’ve entered (URL, username, password) is used to construct a connection and execute the operation. Since the Action executes locally on the server, it can use a different URL to make the call. It is a best practice (and a requirement for things like Search to work properly) to have the Default Zone of your Web Application configured with just Windows Authentication. Why not make use of this and use that URL in your Actions?


Define a set of credentials that can be used in “Call web service” or “Web Request” Actions and have them execute against the URL that has Windows Authentication. If this option is available to you, it is probably the preferred way of working.

Pass an authentication cookie along with the request

If the above is not an option for you, things get trickier and more “specific”, meaning it applies to a certain scenario but might not work for yours.

In MY case I have a SharePoint 2013 on-prem environment with “mixed” authentication (Windows and Forms Based). SharePoint issues a FedAuth cookie when the user successfully authenticates. If you send this cookie along with the web request it will work just fine. Note that the “Call web service” action does NOT allow you to specify additional headers, so the “Web Request” Action becomes your new best friend here.

Using the “Web Request” Action allows for much more flexibility, but you’ll have to build the request message yourself. In our case that means the SOAP message.
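To illustrate what the “Web Request” Action effectively sends, here is a hedged PowerShell sketch of the same kind of call: a SOAP request to the out-of-the-box Lists.asmx web service with the FedAuth cookie passed as a header (URL and cookie value are examples):

$fedAuth = "77u/PD94bWwgdmVyc2lvbj0iMS4w..."   # example value, taken from an authenticated session
$headers = @{
    "Cookie"     = "FedAuth=$fedAuth"
    "SOAPAction" = "http://schemas.microsoft.com/sharepoint/soap/GetListCollection"
}
$body = @"
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetListCollection xmlns="http://schemas.microsoft.com/sharepoint/soap/" />
  </soap:Body>
</soap:Envelope>
"@

Invoke-WebRequest -Uri "http://sitename/_vti_bin/Lists.asmx" -Method Post -ContentType "text/xml; charset=utf-8" -Headers $headers -Body $body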


Once you have all of that in place the “Web Request” will happily call out to the web service; it worked fine with a FedAuth cookie I “borrowed” from an authenticated browser session.


Getting the FedAuth cookie

The basic premise is that you need to ‘replay’ the authentication mechanism in code to get the FedAuth cookie. Once you have it you can send it along with future requests from Nintex Workflow. Again, this is really specific to my case and may not be possible for you because of additional security or complex authentication schemes.

For my SharePoint 2013 on-prem environment with “mixed” authentication (Windows and Forms Based) I force the call to use Windows Authentication:

using System;
using System.Net;

public static class AuthHelper
{
    public static Cookie GetFedAuthCookie(Uri uri, ICredentials credentials)
    {
        Cookie result = null;

        // Emulate the authentication via a request to the /_windows/default.aspx page using the provided credentials
        HttpWebRequest request = WebRequest.Create(uri.GetLeftPart(UriPartial.Authority) + "/_windows/default.aspx?ReturnUrl=%2f_layouts%2fAuthenticate.aspx%3fSource%3d%252FDefault%252Easpx&Source=%2FDefault.aspx") as HttpWebRequest;
        request.Credentials = credentials ?? CredentialCache.DefaultNetworkCredentials;
        request.Method = "GET";
        request.CookieContainer = new CookieContainer();    // required, otherwise response.Cookies stays empty
        request.AllowAutoRedirect = false;                   // the FedAuth cookie is issued on the redirect response

        // Execute the HTTP request and read the FedAuth cookie from the response
        using (HttpWebResponse response = request.GetResponse() as HttpWebResponse)
        {
            if (null != response)
            {
                result = response.Cookies["FedAuth"];
            }
        }

        return result;
    }
}

I actually made this available as a web service so that it can be called from within a Nintex Workflow.

public class AuthService : IAuthService
{
    public string GetFedAuthCookie(string requestUrl, string userName, string password)
    {
        string result = null;

        try
        {
            // Use the supplied credentials if present, otherwise fall back to the default network credentials
            NetworkCredential credential = !String.IsNullOrEmpty(userName) ? new NetworkCredential(userName, password) : null;
            Cookie cookie = AuthHelper.GetFedAuthCookie(new Uri(requestUrl), credential);

            if (cookie != null)
            {
                result = cookie.Value;
            }
        }
        catch (Exception)
        {
            // Swallow the exception and return null so the calling workflow can handle the failure
            result = null;
        }

        return result;
    }
}

And now I can call my Authentication service prior to the other services.


Door #3

It feels like it must be possible to use access tokens that can be passed along similar to the FedAuth cookie. Considering this is how the App model works in SharePoint 2013, there has to be a way to leverage this for what we’re trying to accomplish. But that’s for another post.


Restore-SPSite and Content Databases


August 13, 2014 - 11:05, by Steven Van de Craen - 0 Comments

Today I found a gotcha with the Restore-SPSite command when restoring “over” an existing Site Collection. The issue occurs when all Content Databases have reached their maximum Site Collection count.

Content Databases - limited max no of site collections

The error you’ll receive is that there is basically no room for the new Site Collection:

PS C:\temp> Restore-SPSite http://intranet -Path C:\temp\sc1.bak -Force -Confirm:0
Restore-SPSite : The operation that you are attempting to perform cannot be completed successfully.  No content databases in the web application were available to store your site collection.  The existing content databases may have reached the maximum number of site collections, or be set to read-only, or be offline, or may already contain a copy of this site collection.  Create another content database for the Web application and then try the operation again.
At line:1 char:1
+ Restore-SPSite http://intranet -Path C:\temp\sc1.bak -Force -Confirm:0
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidData: (Microsoft.Share...dletRestoreSite:SPCmdletRestoreSite) [Restore-SPSite], InvalidOperationException
    + FullyQualifiedErrorId : Microsoft.SharePoint.PowerShell.SPCmdletRestoreSite

This might seem unexpected at first, but the restore operation works in two stages: first add the ‘to restore’ site and then, when that succeeds, remove the original site. Because of this two-stage approach you need to allow for one more Site Collection in that Content Database. Note that the Database Status also needs to be “Started” or you’ll receive the same error. Afterwards you can restore the original settings if you want.
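In practice that means temporarily raising the limit on the target Content Database before restoring. A hedged PowerShell sketch (database name and URL are examples):

$db = Get-SPContentDatabase "WSS_Content_Intranet"
$originalMax = $db.MaximumSiteCount

# Allow one extra site collection so the restore has room for the two-stage add/remove
Set-SPContentDatabase -Identity $db -MaxSiteCount ($originalMax + 1)

Restore-SPSite http://intranet -Path C:\temp\sc1.bak -Force -Confirm:$false

# Put the original limit back afterwards
Set-SPContentDatabase -Identity $db -MaxSiteCount $originalMax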

HTH


Issue creating subsites when a built-in field is modified


June 30, 2014 - 14:26, by Steven Van de Craen - 1 Comments

One of our site collections in a migration to SharePoint 2013 experienced an issue with creating subsites:

Sorry, something went wrong
The URL 'SitePages/Home.aspx' is invalid.  It may refer to a nonexistent file or folder, or refer to a valid file or folder that is not in the current Web.

Drilling down in the ULS logs we noticed these:

System.Runtime.InteropServices.COMException: <nativehr>0x81020030</nativehr><nativestack></nativestack>The URL 'SitePages/Home.aspx' is invalid.  It may refer to a nonexistent file or folder, or refer to a valid file or folder that is not in the current Web., StackTrace:  
at Microsoft.SharePoint.SPListItem.AddOrUpdateItem(Boolean bAdd, Boolean bSystem, Boolean bPreserveItemVersion, Boolean bNoVersion, Boolean bMigration, Boolean bPublish, Boolean bCheckOut, Boolean bCheckin, Guid newGuidOnAdd, Int32& ulID, Object& objAttachmentNames, Object& objAttachmentContents, Boolean suppressAfterEvents, String filename, Boolean bPreserveItemUIVersion)   
at Microsoft.SharePoint.SPListItem.UpdateInternal(Boolean bSystem, Boolean bPreserveItemVersion, Guid newGuidOnAdd, Boolean bMigration, Boolean bPublish, Boolean bNoVersion, Boolean bCheckOut, Boolean bCheckin, Boolean suppressAfterEvents, String filename, Boolean bPreserveItemUIVersion)   
at Microsoft.SharePoint.Utilities.SPUtility.ProvisionWikiPageHomePage(SPFile wikiPage)   
at Microsoft.SharePoint.Utilities.SPUtility.EnsureWikiPageHomePage(SPWeb web, ProvisionWikiPage provisionWikiPage)   
at Microsoft.SharePoint.SPWikiPageHomePageFeatureReceiver.FeatureActivated(SPFeatureReceiverProperties properties)   
at Microsoft.SharePoint.SPFeature.DoActivationCallout(Boolean fActivate, Boolean fForce)   
at Microsoft.SharePoint.SPFeature.Activate(SPSite siteParent, SPWeb webParent, SPFeaturePropertyCollection props, SPFeatureActivateFlags activateFlags, Boolean fForce)

System.Data.SqlClient.SqlException (0x80131904): Parameter '@tp_Author' was supplied multiple times.   
at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)   
at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)   
at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)   
at System.Data.SqlClient.SqlDataReader.TryHasMoreRows(Boolean& moreRows)   
at System.Data.SqlClient.SqlDataReader.TryReadInternal(Boolean setTimeout, Boolean& more)   
at System.Data.SqlClient.SqlDataReader.TryNextResult(Boolean& more)   
at System.Data.SqlClient.SqlDataReader.NextResult()   
at Microsoft.SharePoint.SPSqlClient.ExecuteQueryInternal(Boolean retryfordeadlock)   
at Microsoft.SharePoint.SPSqlClient.ExecuteQuery(Boolean retryfordeadlock)  ClientConnectionId:1237a76d-2050-4ad5-82cd-cc9610f95061

The tp_Author column gave an entry point for troubleshooting this issue. A quick loop through the fields on the web, looking for ColName="tp_Author", revealed that only the out-of-the-box “Author” field was present. But it had been modified at some point in history, because the Group was different and a Version attribute was present.
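
That loop is a quick one-liner. A hedged PowerShell sketch (example URL):

$w = Get-SPWeb http://intranet

# List all fields whose database column is tp_Author, together with their Group
$w.Fields | Where-Object { $_.SchemaXml -match 'ColName="tp_Author"' } |
    Select-Object InternalName, Title, Group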

You can quickly reproduce this behaviour on a clean new site collection by updating the field with PowerShell:

$w = Get-SPWeb http://intranet
$f = $w.Fields.GetFieldByInternalName("Author")
$f.Update()

The schema XML will then look as follows:

<Field ID="{1df5e554-ec7e-46a6-901d-d85a3881cb18}" Name="Author" SourceID="http://schemas.microsoft.com/sharepoint/v3" StaticName="Author" Group="_Hidden" ColName="tp_Author" RowOrdinal="0" Type="User" List="UserInfo" DisplayName="Created By" Sealed="FALSE" ReadOnly="TRUE" Version="1" />

It will also now be impossible to create any sites in the site collection (the exception is the ‘Blank’ site template, but it runs into the same issue once you activate the “Wiki page home page” feature and edit/save a page).

Solution

Luckily the product team has provided a method named Microsoft.SharePoint.SPField.RevertCustomizations() that will undo the changes and restore the site creation functionality.

$f.RevertCustomizations()

HTH


SharePoint: How to troubleshoot issues with Save as template


May 23, 2014 - 14:32, by Steven Van de Craen - 0 Comments

On an upgrade project to SharePoint 2013 we ran into an issue where a specific site couldn’t be saved as a template (with or without content). You get the non-descriptive “Sorry, something went wrong” and “An unexpected error has occurred” messages. Funnily enough, the logged Correlation Id is completely absent from the ULS logs, so no help there.

What you can do next is turn on advanced debugging mode by configuring the following entries in the web.config of the SharePoint site:

  • Turn on the call stack (CallStack="true")
  • Disable custom errors (<customErrors mode="Off" />)
  • Enable compilation debugging (<compilation debug="true">)

http://msdn.microsoft.com/en-us/library/ee231550.aspx
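
If you prefer to script these web.config changes instead of editing the file by hand, here is a hedged PowerShell sketch (the path is an example; a backup is taken first):

$path = "C:\inetpub\wwwroot\wss\VirtualDirectories\80\web.config"
Copy-Item $path "$path.bak"

# Flip the three debug-related settings described above
$config = [xml](Get-Content $path)
$config.configuration.SharePoint.SafeMode.CallStack = "true"
$config.configuration.'system.web'.customErrors.mode = "Off"
$config.configuration.'system.web'.compilation.debug = "true"
$config.Save($path)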

If you then retry your action you’ll find additional information in the Event Log or on the page.

Exception information:
    Exception type: InvalidOperationException
    Exception message: Error generating solution files in temporary directory.
   at Microsoft.SharePoint.SPSolutionExporter.ExportWebAsSolution()
   at Microsoft.SharePoint.SPSolutionExporter.ExportWebToGallery(SPWeb web, String solutionFileName, String title, String description, ExportMode exportMode, Boolean includeContent, String workflowTemplateName, String destinationListUrl, Action`1 solutionPostProcessor, Boolean activateSolution)
   at Microsoft.SharePoint.ApplicationPages.SaveAsTemplatePage.BtnSaveAsTemplate_Click(Object sender, EventArgs e)
   at System.Web.UI.WebControls.Button.RaisePostBackEvent(String eventArgument)
   at System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint)

What this means is that the export operation ran into an issue. We can find the partial export in the temporary directory.

Temporary Export Location

If you investigate the contents of the “SPSolutionExporter” folder, you’ll eventually find the issue. In our case the XML generation aborted on the Expiration Policy of one of the Content Types.


So while your issue might be different, this method should give you more insight into the problem so you can take appropriate action.

Addendum

The Content Type/Policy issue in our setup was caused by a corrupt XmlDocument that describes the changes made to Information Policies. By removing this invalid XmlDocument we were able to save the site as a template:

using (SPSite site = new SPSite(url))
{
    using (SPWeb web = site.OpenWeb())
    {
        foreach (SPContentType ct in web.ContentTypes)
        {
            // Strip the corrupt information policy XmlDocument from the content type schema
            ct.SchemaXmlWithResourceTokens = Regex.Replace(ct.SchemaXmlWithResourceTokens, @"<XmlDocument NamespaceURI=""microsoft.office.server.policy.changes"">.+?</XmlDocument>", "");
            ct.Update();
        }
    }
}

