SharePoint 2019 and the new OneDrive Sync Client

November 20, 2018 - 14:34, by Steven Van de Craen - 0 Comments

I’ve been running various tests and scenarios with the new SharePoint 2019 that was released about a month ago, and I ran into an issue with the new OneDrive Sync Client (aka NGSC or Next Generation Sync Client).

OneDrive sync error

Sorry, we couldn’t sync this folder. Contact your IT administrator to configure OneDrive to sync SharePoint on-premise folders

Aside from the obvious grammar error (“on-premise” vs “on-premises”) I didn’t expect this issue, as I had Windows 10, the latest OneDrive client and SharePoint 2019 as outlined here. Except I ignored the part where it says you need to configure Group Policy objects for it to work.

To set up OneDrive with SharePoint Server 2019, configure the following Group Policy objects:

  1. SharePoint on-premises server URL and tenant folder name. The URL helps the sync client locate the SharePoint Server and allows the sync client to authenticate and set up sync. The tenant folder name lets you specify the name of the root folder that will be created in File Explorer. If you don’t supply a tenant name, the sync client will use the first segment of the URL as the name; for example, a server URL starting with “office” would become “Office.”
  2. SharePoint prioritization setting for hybrid customers that use SharePoint Online (SPO) and SharePoint on-premises server. This setting lets you specify whether the sync client should authenticate against SharePoint Online or the SharePoint on-premises server if the identity exists in both identity providers. Learn how to manage OneDrive using Group Policy

Now, a quick way to configure these values is by using the Registry Editor.



  • Name: SharePointOnPremPrioritization
  • Type: REG_DWORD
  • Value:
    • 0: prioritizes for SharePoint Online (“PrioritizeSPO”)
    • 1: prioritizes for SharePoint 2019 (“PrioritizeSharePointOnPrem”)
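If you prefer scripting it, here is a minimal PowerShell sketch of the same configuration. Note the assumptions: the policy key location and the SharePointOnPremFrontDoorUrl value name come from Microsoft’s OneDrive Group Policy documentation rather than from this post, and the URL is a placeholder for your own farm. Run elevated.

```powershell
# Hedged sketch: writes the two GPO-backed OneDrive values to the registry.
# Key path and "SharePointOnPremFrontDoorUrl" name are assumptions based on
# Microsoft's documentation; the URL is a placeholder.
$key = "HKLM:\SOFTWARE\Policies\Microsoft\OneDrive"
New-Item -Path $key -Force | Out-Null

# The on-premises server URL the sync client should use
Set-ItemProperty -Path $key -Name "SharePointOnPremFrontDoorUrl" -Value "https://sharepoint.contoso.com"

# 0 = PrioritizeSPO, 1 = PrioritizeSharePointOnPrem
Set-ItemProperty -Path $key -Name "SharePointOnPremPrioritization" -Value 1 -Type DWord
```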




Once you configure these values you can sync a SharePoint 2019 library with the new sync client.

Issue with setting managed metadata fields in a SharePoint workflow

November 9, 2018 - 11:41, by Steven Van de Craen - 0 Comments


One of my on-prem customers is running Nintex Workflow on SharePoint 2016 and escalated an issue with setting a Taxonomy/Managed Metadata field value via a workflow. They experienced that the workflow would not change the value, or in some cases it would, but only once.

I managed to scope the issue down to the following:

  1. It works in SharePoint 2010 but doesn’t in SharePoint 2013, 2016 and 2019 (I had to spin up a VM of each for proper testing)
  2. It only affects Workflow Foundation/SharePoint 2010 Workflow Engine workflows (SharePoint Designer can create them, and Nintex Workflow on-prem is based on this)
  3. For files that haven’t been given a value for the managed metadata field, the workflow can set the value once. If the field has been set or cleared previously (by UI, code or workflow), it will no longer accept new values via the workflow. For Office documents even that first update may not work, since background property promotion also triggers the issue

So that makes it a pretty narrow scope and explains why there aren’t that many reports on this. But since not everyone is moving to the cloud at the same speed, this is still a relevant issue for some.


I’m going with SharePoint 2016, but I managed to reproduce this in 2013 and 2019 as well.

Create a document library, add a Managed Metadata field with some values, and upload a document

Create a new workflow using the “SharePoint 2010 Workflow Platform” (SharePoint Designer or Nintex Workflow)

Create workflow 

Configure the workflow to update the Managed Metadata field. This field expects the format TERMLABEL|TERMGUID (I’m using a hardcoded value)
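For illustration, a hardcoded value in that format would look like the following (the term label and GUID here are made up; use the label and GUID of a term from your own term set):

```
Finance|1a2b3c4d-0000-1111-2222-333344445555
```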

Workflow step

Publish the workflow and run it on new files. For non-Office documents this should work once.

Result #1

Next, change or clear the value through the UI (or by any other means) and run the same workflow again. You should see that even though the workflow has run, it could not change the value.

Workaround or fix?

I’ve been digging into the internals and it seems that once the managed metadata field is set or cleared, it keeps properties in the SPFile.Properties property bag that interfere with the update process. If you delete these properties and update the SPFile the workflow can update the value (again only once since the properties are added again).

Removing the properties requires an SPFile.Update(), which in turn creates a new version and is less than ideal.

Escalating this to PSS and hoping for a fix would probably be the right way, but given the narrow scope of the issue and the fact that it involves older technology, I have low hopes of this getting fixed soon.

In my case I wrote a Web Service that allows updating the Managed Metadata field value through the SharePoint Object Model, and this Web Method can be called from the workflow. It even allows the workflow designer to specify the type of update (Update, UpdateOverwriteVersion or SystemUpdate). So, although not very intuitive, the workflow designer can now update Managed Metadata fields through the Web Method, and all other fields through the regular workflow action.
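This is not the actual Web Service code, but a minimal sketch of the same idea in PowerShell against the server Object Model; the site URL, list, field and term GUID are placeholders, and TaxonomyField.SetFieldValue plus the SPListItem update methods are the standard APIs involved.

```powershell
asnp Microsoft.SharePoint.PowerShell -ea 0 | Out-Null

# Sketch only: site, list, field and term GUID below are placeholders.
$web = Get-SPWeb "http://intranet"
$list = $web.Lists["Documents"]
$item = $list.Items[0]

$field = [Microsoft.SharePoint.Taxonomy.TaxonomyField]$list.Fields["Department"]

# Resolve the term in the term store the field is bound to
$session = Get-SPTaxonomySession -Site $web.Site
$termStore = $session.TermStores[$field.SspId]
$termSet = $termStore.GetTermSet($field.TermSetId)
$term = $termSet.GetTerm([Guid]"<your term guid>")

# Set the value, then pick the update type:
# $item.Update(), $item.UpdateOverwriteVersion() or $item.SystemUpdate($false)
$field.SetFieldValue($item, $term)
$item.SystemUpdate($false)

$web.Dispose()
```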

Web service input

OneDrive sync client and green locks (read-only sync)

December 7, 2017 - 15:31, by Steven Van de Craen - 0 Comments

The issue

I recently ran into an Office 365 environment where all users were seeing green locks on files and folders synced with the OneDrive for Business client.

OneDrive read-only sync

If this isn’t expected, in cases where the user should definitely have write access to these files, you might want to check the following Library Settings:

Require content approval for submitted items?

Go to Library Settings > Versioning:

 Content Approval

If your library has this enabled then OneDrive will only sync in read-only mode. If you disable this setting, your sync client should inform you that you now have read-write access and the green lock icon should be gone.

OneDrive read-write message 

If it doesn’t then please read on.

Require documents to be checked out before they can be edited?

Go to Library Settings > Versioning:

Require Check Out

Another feature that’s not compatible with the full OneDrive sync experience. Again, as before, you will have to disable this and the issue should resolve immediately. If not, continue reading…

Required Columns

Go to the Library Settings and check the columns:

Library Columns

Check the columns on the library: if any of them are “Required” this will cause a read-only sync. In the screenshot this is the case for my ‘Department’ field.

Wait a minute! Does this mean we lose the ability to have required metadata?

Well, luckily NO. You just have to use a different approach and use Content Types.

The solution

Go to the Library Settings > Advanced Settings:

Allow management of Content Types


Content Types are a great way to have different types of content, each with a different set of metadata, workflows, template, and more. You should familiarize yourself with the concept of Site Columns, Site Content Types, List Columns, List Content Types, Content Type Inheritance, etc. It really depends on the use case but in my example I just modified the List-level Column and Content Type, in other cases this may not be the best approach. If you’re looking for more information on Content Types you can easily find this on the web.


To make your field required for a Content Type, go into the Content Type (you can click on its name) and configure each field as optional, required or hidden.

List Content Type Information


The result is that the field/column itself is not configured as Required, but it is configured Required for the Content Type. And when uploading documents of that Content Type the user will still have to provide a value for the field.

Edit Item


Almost there…

If you followed the above steps you’ll still have the original issue. This is because in the background the field itself is still “Required”. And if you have Content Types enabled you won’t be able to change this setting, because the UI hides it from the List Column Settings!

Here are a few options to resolve this situation:

  1. If you are using Site Columns then you can change the “Required” setting and push the change down to lists and libraries. This is quite easy to do but will impact all lists and libraries where the column is used. You’ll have to inspect those lists to see whether they have Content Types enabled (and whether the Content Type requires the field); otherwise users will no longer have to specify a value (a functionality change)
  2. You can disable Content Types on the Library, change the List Column setting and re-enable Content Types. This is also easy to do from the UI. All Content Types will be preserved during the operation but some users might be impacted during the operation. Afterwards, verify per Content Type which fields should be required.
  3. Use CSOM or PowerShell to directly manipulate the List Column settings.


Options (2) and (3) are rather similar, but if you prefer the latter, here’s a PnP PowerShell script that should assist you. I like PowerShell because it is transparent and can easily be viewed and modified. And I like PnP and its cmdlets because they really abstract complex operations. Note that you will have to install the PnP cmdlets first.


cls

## Variables
$userName = "yourusername"
$siteUrl = "yoursiteurl"
$listName = "yourlistname"

## Script start
if (!$cred) { $cred = Get-Credential $userName }

Connect-PnPOnline -Url $siteUrl -Credential $cred

###
$web = Get-PnPWeb
$list = Get-PnPList $listName

### QUERY OR CHANGE
$bChangeField = $false

Get-PnPField -List $list | % {
    $f = $_

    # List all required fields, except the built-in FileLeafRef field which is required by design
    if ($f.Required -and $f.StaticName -ne 'FileLeafRef') {
        Write-Host ($f.StaticName) -ForegroundColor Red

        if ($bChangeField) {
            Set-PnPField -Identity $f -Values @{Required=$false}
            Write-Host (" -> updated") -ForegroundColor DarkYellow
        } else {
            Write-Host (" -> reporting only") -ForegroundColor DarkYellow
        }
    }
}

The script has a flag to control the actual field update vs reporting only.

After running (with actual update) it should resolve the issue.

OneDrive read-write sync

If it doesn’t, I’d be interested to hear about it in the comments!


Some list settings are not compatible with the OneDrive sync experience and make it a read-only sync. You can disable these via the UI or via code/script.

Windows 10 Creators Update: Slow wireless connection

May 8, 2017 - 11:07, by Steven Van de Craen - 0 Comments

A quick blog for archiving purposes. Since my upgrade to Windows 10 Creators Update last week I had been experiencing very slow wireless performance at work. At home or while tethering everything was as normal.

For me the solution was to disable “Receive Segment Coalescing” (RSC), described as ‘workaround #1’ in this post:

Disable Rsc


  • Run an elevated PowerShell prompt
  • Get-NetAdapterRsc to show the status per adapter
  • Disable-NetAdapterRsc –Name "your_wifi_adapter_name" to disable
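Should you want to revert the workaround later (for example after a driver update fixes the underlying problem), the matching Enable cmdlet exists alongside the Disable variant:

```powershell
# Re-enable RSC on the adapter (run from an elevated PowerShell prompt;
# the adapter name is a placeholder -- use Get-NetAdapterRsc to find yours)
Enable-NetAdapterRsc -Name "your_wifi_adapter_name"
```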

Immediate fix for me!

Thanks Tripp Parks [MSFT] for the workaround!

TaxonomyService GetTermSets Failed to compare two elements in the array

December 23, 2016 - 16:36, by Steven Van de Craen - 0 Comments

This issue happened on a SharePoint 2013 environment that was upgraded from SharePoint 2010; the farm includes Language Packs (Dutch in our case). When navigating to the Term Store Management Tool it is impossible to expand the Term Group “Search Dictionaries” for the non-English language. This system group is created for the Search Service Application and contains dictionaries for Company Inclusions, Company Exclusions, Query Spelling Inclusions and Query Spelling Exclusions, but the Term Sets are only provisioned for LCID 1033 (English), and that’s causing the issue.

Search Dictionaries Term Group


When Dutch is selected we get ‘Deze bewerking kan niet worden voltooid. Het termenarchief is mogelijk niet beschikbaar’ (‘This operation cannot be completed. The term store may not be available’).

Message from webpage – Deze bewerking kan niet worden voltooid. Het termenarchief is mogelijk niet beschikbaar

Failed to compare two elements in the array. 
at System.Collections.Generic.ArraySortHelper`1.Sort(T[] keys, Int32 index, Int32 length, IComparer`1 comparer)   
at System.Collections.Generic.List`1.Sort(Int32 index, Int32 count, IComparer`1 comparer)   
at Microsoft.SharePoint.Taxonomy.TermSetCollection.CreateTermSetCollection(List`1 sharedTermSets, TermStore termStore)   
at Microsoft.SharePoint.Taxonomy.WebServices.TaxonomyInternalService.GetTermSets(Guid sspId, Guid guid, Boolean includeNoneTaggableTermset, Guid webId, Guid listId, Int32 lcid)



There seems to be no easy way to rename or add a localized name for a system Term Set, but we can use reflection to call Microsoft.SharePoint.Taxonomy.TermSet.SetName(string value, int lcid).

In a PowerShell script that might look like this:

asnp Microsoft.SharePoint.PowerShell -ea 0 | Out-Null
cls

# Functions
function Get-InstanceFieldNonPublic($obj, $name) {
    $t = $obj.GetType()
    $result = $t.InvokeMember($name, 'GetField, Instance, NonPublic', $null, $obj, $null)
    return $result
}

function Invoke-InstanceMethodNonPublic($obj, $name, $params) {
    $t = $obj.GetType()
    $result = $t.InvokeMember($name, 'InvokeMethod, Instance, NonPublic', $null, $obj, $params)
    return $result
}

### INIT
$tSession = Get-SPTaxonomySession -Site "http://intranet"
$tStore = $tSession.TermStores[0]
$tGroup = $tStore.Groups["Search Dictionaries"]

$tGroup.TermSets | % {
    $tSet = $_

    # 'Add' another language label for the TermSet -- 1043 is Dutch
    Invoke-InstanceMethodNonPublic -obj $tSet -name "SetName" -params @($tSet.Name, 1043)

    # Query known language labels for the TermSet
    Get-InstanceFieldNonPublic -obj $tSet -name "names"
}

# Commit changes -- uncomment to apply
# $tStore.CommitAll()

As always: be careful with reflection, as it is not supported and allows you to harm your environment beyond repair. That said, it did nicely resolve the issue.

Revisited - Upgrading SharePoint - Some views not upgraded to XsltListViewWebPart

August 24, 2016 - 15:03, by Steven Van de Craen - 2 Comments

The old

Back in 2013, when I was a regular blogger, I posted about a tool/logic that would assist you in SharePoint migrations where some Views of Lists and Libraries would remain in the ‘old’ display mode. This issue occurs with all version upgrades (2007 > 2010 > 2013/2016) and is actually still present.


The tool I wrote back then would recreate the view with the exact same settings, but I found it still wouldn’t do a 100% conversion of views.

The new

With the release of SharePoint 2016 I’m doing a lot of upgrade projects from 2007/2010 to 2016, and I decided to have a second look at the mechanism behind all this. I inspected the (Xslt)ListViewWebPart with ILSpy/Reflector to see the logic involved. I then converted that logic into a set of PowerShell functions that can be called at various levels.

Note that I’m making calls to internal code using reflection. I’ve used this script successfully on all my migration projects, but the necessary disclaimers apply since we’re dealing with reflection here.

The details

Here’s the main function that checks an individual Web Part Page for “upgradeable” Web Parts and, if applicable, performs the upgrade of ListViewWebParts. It also handles any check-out/check-in that may be required.

# Note: parts of this listing were garbled in the archived post; the check-out,
# check-in/publish and SaveChanges handling below is an approximate reconstruction.
function Upgrade-ListViewWebPartsOnPage([Microsoft.SharePoint.SPFile]$page, [bool]$checkOnly) {
    if ($page.Exists) {
        $man = $page.GetLimitedWebPartManager('Shared')
        $lvWps = @($man.WebParts) | ? { ($_ -is [Microsoft.SharePoint.WebPartPages.ListViewWebPart]) -and (Check-ListViewWebPartNeedsUpgrade -lvwp $_) }

        if ($lvWps.Count -gt 0) {
            Write-Host ("Page " + $page.Web.Url + "/" + $page.Url + " contains upgradeable ListViewWebParts") -ForeGroundColor Yellow
        }

        if ($checkOnly -or ($lvWps.Count -eq 0)) { return }

        # Ensure we hold the page checked out before modifying it
        if ($page.RequiresCheckout) {
            if ($page.CheckOutStatus -ne 'None') {
                try { $page.UndoCheckOut() } catch { $page.CheckIn("SYSTEM") }
            }
            $page.CheckOut()
        }

        $man = $page.GetLimitedWebPartManager('Shared')

        if ($man -ne $null) {
            try {
                $lvWps = @($man.WebParts) | ? { ($_ -is [Microsoft.SharePoint.WebPartPages.ListViewWebPart]) -and (Check-ListViewWebPartNeedsUpgrade -lvwp $_) }

                if ($lvWps.Count -gt 0) {
                    Write-Host ("Processed " + $page.Web.Url + "/" + $page.Url + " for ListViewWebParts") -ForeGroundColor Green

                    $lvWps | % {
                        $wp = $_

                        $view = [Microsoft.SharePoint.WebPartPages.ListViewWebPart].InvokeMember("View", 'GetProperty, Instance, NonPublic', $null, $wp, $null)
                        $wpType = Get-WebPartTypeId -view $view

                        [Microsoft.SharePoint.WebPartPages.ListViewWebPart].InvokeMember("NewWebPartTypeId", 'SetProperty, Instance, NonPublic', $null, $wp, $wpType)
                        $man.SaveChanges($wp)
                    }

                    # Check the page back in (and publish when minor versions are enabled)
                    if ($page.RequiresCheckout) {
                        $page.CheckIn("Upgraded ListViewWebParts")
                        if ($page.Item.ParentList.EnableMinorVersions) {
                            $page.Publish("Upgraded ListViewWebParts")
                        }
                    }
                } else {
                    # Nothing to do after all: release the check-out
                    if ($page.RequiresCheckout) {
                        try { $page.UndoCheckOut() } catch { $page.CheckIn("SYSTEM") }
                    }
                }
            } catch {
                Write-Host ("Manually check " + $page.Web.Url + "/" + $page.Url + " for ListViewWebParts - Error: " + $_) -ForeGroundColor DarkYellow
            }
        }
    }
}
Then there are various functions that do the same thing but on View, List or Web scope. They essentially all make calls to the above function in a loop of that particular scope.

function Upgrade-ListViewWebPartsOnView([Microsoft.SharePoint.SPView]$view, [bool]$checkOnly) {
    $web = $view.ParentList.ParentWeb
    $page = $web.GetFile($web.Url + "/" + $view.Url)

    Upgrade-ListViewWebPartsOnPage -page $page -checkOnly $checkOnly
}

function Upgrade-ListViewWebPartsOnList([Microsoft.SharePoint.SPList]$list, [bool]$checkOnly) {
    @($list.Views) | % {
        $view = $_

        Upgrade-ListViewWebPartsOnView -view $view -checkOnly $checkOnly
    }
}

function Upgrade-ListViewWebPartsOnWeb([Microsoft.SharePoint.SPWeb]$web, [bool]$checkOnly) {
    @($web.Lists) | % {
        $list = $_

        Upgrade-ListViewWebPartsOnList -list $list -checkOnly $checkOnly
    }
}

Finally there are the base functions that handle the checking and upgrading of the Web Parts similar to how the API handles it. They use reflection for methods that are marked internal.

function Check-ListViewWebPartNeedsUpgrade([Microsoft.SharePoint.WebPartPages.ListViewWebPart]$lvwp) {
    $view = [Microsoft.SharePoint.WebPartPages.ListViewWebPart].InvokeMember("View", 'GetProperty, Instance, NonPublic', $null, $lvwp, $null)
    $rh = [Microsoft.SharePoint.SPFile].Assembly.GetType("Microsoft.SharePoint.WebPartPages.SPWebPartReflectionHelper")

    $g1 = $rh.InvokeMember("GetWebPartTypeID", 'InvokeMethod, Static, NonPublic', $null, $null, ([Microsoft.SharePoint.WebPartPages.XsltListViewWebPart]))
    $g2 = Get-WebPartTypeId -view $view

    # If the ids are identical, the web part needs to upgrade
    return ($g1 -eq $g2)
}

# Note: the branching below reconstructs the flattened original listing.
function Get-WebPartTypeId([Microsoft.SharePoint.SPView]$view) {
    ## Note: Logic taken from Microsoft.SharePoint.WebPartPages.BaseListViewToolPart.ApplyChanges()
    $list = $view.ParentList
    $rh = [Microsoft.SharePoint.SPFile].Assembly.GetType("Microsoft.SharePoint.WebPartPages.SPWebPartReflectionHelper")
    $result1 = $rh.InvokeMember("GetWebPartTypeID", 'InvokeMethod, Static, NonPublic', $null, $null, ([Microsoft.SharePoint.WebPartPages.ListViewWebPart]))
    $result2 = $rh.InvokeMember("GetWebPartTypeID", 'InvokeMethod, Static, NonPublic', $null, $null, ([Microsoft.SharePoint.WebPartPages.XsltListViewWebPart]))

    if (($view -ne $null) -and ($view.Type -ne "HTML")) {
        if (($view.Type -eq "GRID") -and ([Microsoft.SharePoint.Utilities.SPUtility]::ContextCompatibilityLevel -ge 15)) {
            $result = $result2
        } else {
            $result = $result1
        }
    } elseif ($view -eq $null) {
        $result = $result2
    } else {
        $useDV = [Microsoft.SharePoint.WebPartPages.SPWebPartManager].InvokeMember('UseDataView', 'InvokeMethod, Static, NonPublic', $null, $null, @($list, $view))

        if ($useDV) {
            $result = $result2
        } else {
            $result = $result1
        }
    }

    return $result
}

The goods

Here’s a nice little download that contains all functions in a single file, ready for you to use – if you dare….


If you have any feedback let me know in the comments!

SharePoint 2013: InfoPath client forms may open twice [Oct15CU bug]

December 16, 2015 - 20:47, by Steven Van de Craen - 2 Comments


After a recent Patch Night, one of my customers had pulled in SharePoint updates along with Windows Updates, and people started complaining about changed behavior:

  1. PDF files no longer immediately open in the browser. Instead the PDF client (Adobe Reader) opens up and provides rich integration with options to check-out and such
  2. InfoPath client forms would open twice; meaning the form opens when clicking the link, but also an extra dialog appears to open the form. Users click this and receive messages about the form already being locked (by themselves!)

Hello little bug

We traced it to core.js (and core.debug.js) having a modified date of “15/09/2015 14:45” where functionality to provide “Acrobat Pro X integration” was introduced.

The function OpenDocWithClient is called in two different locations, but the return value is ignored. This makes the page refresh that occurs when clicking the InfoPath form link execute more often than desired.

Here’s the original (bugged) and modified (fixed by me) versions:

Original (bugged) core.js image

As you can see, I added the “return” keyword for the function call, so the event cancelling can continue to bubble up.

For the minified version (core.js) you’ll have to do some digging, but if you look for static strings you can find the function call. I think it was “m()” in my case.

Here ya go

You can download this archive, which contains both my original and modified versions. If you modify your environment you have to update all web front ends, and users will have to clear their browser cache, but no iisreset is required.

Note that later patches may overwrite the system files again and undo your manual changes.

End credits

This bug was introduced with the October 2015 updates, including the full Cumulative Update package. We escalated the case to Microsoft and they confirmed the issue and our workaround. It will probably remain present in the upcoming Cumulative Updates (November, December, …) because it’s not widespread and the PG needs to fit the fix into their schedule.

SharePoint 2013: Programmatically set crawl cookies for forms authenticated web sites

August 29, 2015 - 07:54, by Steven Van de Craen - 0 Comments

Last week I was having difficulties crawling forms-authenticated web sites. When configuring a crawl rule to store the authentication cookies, the login page was returning multiple cookies with the same name but a different domain.

Add Crawl Rule

This gave issues at a later stage (during the crawl): because all cookies would be sent along with the request, the target system had trouble correctly identifying us due to these “duplicate” cookies.

You can easily check the request information that is sent during a crawl by starting Fiddler on the crawl server and configuring the proxy settings to http://localhost:8888 (the default Fiddler settings).

 Search Proxy Setting
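If you prefer scripting that proxy change instead of using the Search Administration UI, a sketch with the Set-SPEnterpriseSearchService proxy parameters follows; remember to revert it once you’re done inspecting traffic.

```powershell
asnp Microsoft.SharePoint.PowerShell -ea 0 | Out-Null

# Route crawler traffic through Fiddler's default endpoint (localhost:8888)
Set-SPEnterpriseSearchService -ProxyType Proxy -ProxyAddress "localhost" -ProxyPort 8888

# To revert afterwards:
# Set-SPEnterpriseSearchService -ProxyType Default
```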

In the end we chose an alternate method of configuring the cookies, namely through PowerShell. This gave us the ultimate flexibility to configure exactly the cookies we wanted to pass along with the crawl requests.

asnp microsoft.sharepoint.powershell -ea 0 | Out-Null

$ssa = Get-SPEnterpriseSearchServiceApplication
$crPath = 'http://authenticatedwebsite*'

# Get or create crawl rule
$cr = Get-SPEnterpriseSearchCrawlRule -SearchApplication $ssa | ? { $_.Path -eq $crPath }
if ($cr -eq $null) {
    $cr = New-SPEnterpriseSearchCrawlRule -Path $crPath -SearchApplication $ssa -Type InclusionRule -AuthenticationType CookieRuleAccess -FollowComplexUrls $true
}

# Set cookie credentials
$cr.SetCredentials('CookieRuleAccess', 'myUser=crawlUser; myPwd=crawlPassword', 'http://cookie-set-via-powershell')


SharePoint 2013: Some observations on Enterprise Search

August 13, 2015 - 16:44, by Steven Van de Craen - 0 Comments

I’m doing some testing with the Enterprise Search in SharePoint 2013 for a customer scenario and here are some observations…

Content Source as Crawled Property

The “Content Source” name is available out of the box as a Managed Property on all content in the search index

This makes it possible to create Result Sources that aggregate content from different Content Sources similar to Search Scopes back in the old days.

SharePoint 2013 Search Query Tool #1

Meta elements (HTML <meta> tags) as Crawled Properties

Information from meta elements in web pages is extracted into crawled properties

Consider the following example:

Simple website with meta tags

After crawling this website with SharePoint 2013 Search, it will create (if new) or use (if existing) a Crawled Property and store the content from the meta element. The Crawled Property can then be mapped to a Managed Property to return, filter or sort query results.

SharePoint 2013 Search Query Tool #2
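The mapping described above can also be scripted with the standard search metadata cmdlets. A sketch follows; the crawled and managed property names are placeholders, and the crawled property only exists after a crawl has picked up the meta tag.

```powershell
asnp Microsoft.SharePoint.PowerShell -ea 0 | Out-Null

$ssa = Get-SPEnterpriseSearchServiceApplication

# Find the crawled property created from the meta tag (placeholder name)
$cp = Get-SPEnterpriseSearchMetadataCrawledProperty -SearchApplication $ssa -Name "MyMetaTag"

# Create a Text managed property (Type 1 = Text) and map the crawled property to it
$mp = New-SPEnterpriseSearchMetadataManagedProperty -SearchApplication $ssa -Name "MyMetaTagMP" -Type 1
New-SPEnterpriseSearchMetadataMapping -SearchApplication $ssa -ManagedProperty $mp -CrawledProperty $cp
```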

Query string parameters as Crawled Properties

Query string parameters are not extracted into Crawled Properties

This was actually a request from the customer, in order to be able to provide additional information about documents (on their website) in the search index. As I suspected, it isn’t possible out of the box, but you can definitely do it using Content Enrichment.

The “OriginalPath” is available as an input property for Content Enrichment and contains the exact URL used for indexing the document:

Content Enrichment Input Properties

With Content Enrichment it is pretty straightforward to look for predefined query string parameters and then map them to output properties.

$ssa = Get-SPEnterpriseSearchServiceApplication
$config = New-SPEnterpriseSearchContentEnrichmentConfiguration
$config.Endpoint = 'http://cews:818/ContentEnrichmentService2.svc'
$config.InputProperties = 'OriginalPath', 'ContentSource'
$config.OutputProperties = 'MyParam1', 'MyParam2'
$config.DebugMode = $false
$config.SendRawData = $false
Set-SPEnterpriseSearchContentEnrichmentConfiguration -SearchApplication $ssa -ContentEnrichmentConfiguration $config

More information on Content Enrichment from my SharePoint Saturday presentation:


SharePoint: Portal navigation limited to 50 dynamic items

August 12, 2015 - 11:07, by Steven Van de Craen - 0 Comments

I was looking into an issue where the Navigation Settings page wouldn’t show all subsites in the treeview. When reproducing the issue, the treeview turned out to be limited to 50 dynamic items.

Navigation Settings

The treeview component is a Microsoft.SharePoint.Publishing.Internal.WebControls.HierarchicalListBox which connects to the active Microsoft.SharePoint.Publishing.Navigation.PortalSiteMapProvider. This has a property “DynamicChildLimit” that can be explicitly configured in the web.config.

// Microsoft.SharePoint.Publishing.Navigation.PortalSiteMapProvider
public int DynamicChildLimit
{
    get
    {
        int? num = this.dynamicChildLimit;
        if (num.HasValue)
        {
            return num.GetValueOrDefault();
        }
        if (this.Version < 14)
        {
            return 50;
        }
        return 0;
    }
    set
    {
        this.dynamicChildLimit = new int?(value);
    }
}

The active provider used is “GlobalNavSiteMapProvider”, defined in web.config as

<add name="GlobalNavSiteMapProvider" description="CMS provider for Global navigation" type="Microsoft.SharePoint.Publishing.Navigation.PortalSiteMapProvider, Microsoft.SharePoint.Publishing, Version=, Culture=neutral, PublicKeyToken=71e9bce111e9429c" NavigationType="Global" EncodeOutput="true" />

I tried specifying Version=”14”, but then it defaulted to 20 items, a path I didn’t investigate further. So I just explicitly specified DynamicChildLimit=”100” and that fixed the issue.

<add name="GlobalNavSiteMapProvider" description="CMS provider for Global navigation" type="Microsoft.SharePoint.Publishing.Navigation.PortalSiteMapProvider, Microsoft.SharePoint.Publishing, Version=, Culture=neutral, PublicKeyToken=71e9bce111e9429c" NavigationType="Global" EncodeOutput="true" DynamicChildLimit="100" />
