File Modified date misalignment in SharePoint, Explorer View and file shares/drives


Most of you have been using the Explorer View of a Document Library for ages now. It’s indeed one of the most controversial features SharePoint has offered since its early days (sometimes painful for IT Pros), but it is arguably one of the features that seduced “normal”, non-IT users thanks to the ease of move/copy drag and drop.

In a recent scenario we discovered a misalignment between a file’s “Modified” SharePoint metadata field and the actual “Date Modified” of the file as displayed in the Explorer View (if one uses the Explorer View of a Document Library, or if the Document Library is mapped as a network drive on the user’s computer – a very common scenario), or if one checks the date in the file properties in Office applications (e.g. Word, Excel, etc.).

The complete scenario is as follows: the “Modified” column in SharePoint was basically overwritten during a copy/move operation performed through the Document Library Explorer View, and it now reflects the date on which the move was done instead of the last modification date of the file itself.

The picture below displays the Modified date as it appears in SharePoint (2/08/2012), which is indeed the date when the file was moved from the source to this Document Library:

The picture below displays the Modified date as it appears if we use the Explorer View on the same Document Library, with the original date when the file was truly modified (27/7/2012):

When you upload a document to a document library, SharePoint will set the “Modified” date to the date on which the upload was done, while the Explorer View (Windows) will display the modified date stored as a property of the file itself!

This behavior can also be explained from another angle: the “Modified” date in SharePoint is actually the Content Type modification date, while the “Modified” date in Windows (classic file shares or drives) is the date of the file itself.

Well, in our case this behavior creates big problems, given that many users continue to use the Explorer View to access and find their work files in their department or team site (most of them simply remember that the file they are looking for was modified on a certain date, so they will look for it by sorting on the modified date rather than by performing a search).

We then looked for a way of mapping these inner file properties, carrying the last modification date of the file, onto a site column to be shown in a Document Library view. We know that this approach is very useful for search purposes: in Shared Services Administration -> Search Administration -> Metadata Property Mapping we can map the properties Basic:14 (Date and Time), Basic:16 (Date and Time) and ows_Modified (Date and Time) onto a new property labelled, for instance, LastModifiedTime (see pic below).

Nevertheless, this mapping seems to be useful for search purposes only and unfortunately Microsoft confirmed that there is no out-of-the-box configuration for our purpose.

It seems that the only out-of-the-box solution would be to use Backup and Restore, which preserves the original last modified dates… or custom development of course (event handlers, etc.).
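To see the two dates side by side on a given document, a quick sketch along these lines can help. This is an assumption-heavy illustration, not a fix: the site URL and library name are placeholders, it uses the SharePoint 2010 PowerShell snap-in (on SharePoint 2007 you would load the Microsoft.SharePoint assembly manually instead), and the exact keys present in the file property bag vary per environment.

```powershell
# Sketch only – site URL and library name are placeholders for your environment.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$web  = Get-SPWeb http://portal/sites/team
$item = $web.Lists["Shared Documents"].Items | Select-Object -First 1

# The SharePoint "Modified" column (reset by a copy/move through Explorer View):
"SharePoint Modified : " + $item["Modified"]

# Date-related entries kept in the file's own property bag (key names vary):
$item.File.Properties.Keys |
    Where-Object { $_ -match 'modified|time' } |
    ForEach-Object { "{0,-30}: {1}" -f $_, $item.File.Properties[$_] }

$web.Dispose()
```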


Managed Accounts vs unmanaged accounts in SharePoint 2010


A Managed Account is effectively an Active Directory user account whose credentials are managed by and contained within SharePoint. This is what enables farm administrators to join machines to the farm without specifying the credentials, as had to be done in previous versions of the product.

Attention: some SharePoint 2010 services will break, because they are configured with managed accounts by default (Search, User Profile Sync). Unfortunately, SharePoint 2010’s default behavior cross-pollinates managed accounts (Search Service application) with unmanaged accounts (default crawler account).
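For reference, registering a managed account is a one-liner with the SharePoint 2010 cmdlets (the account name below is a placeholder):

```powershell
# Register an AD account as a SharePoint managed account (placeholder credentials).
$cred = Get-Credential -Credential "DOMAIN\sp_services"
New-SPManagedAccount -Credential $cred

# List all accounts the farm now manages (and whose passwords it can rotate):
Get-SPManagedAccount
```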

Architecting and Managing Virtualized SharePoint 2010 Farms (MIT09) – my notes


SharePoint Connections 2010, Den Haag, 29 September 2010

Session: Architecting and Managing Virtualized SharePoint 2010 Farms (MIT09)

Speaker: MICHAEL NOEL

  • Dynamically expandable disks penalize performance, so for PROD try to define a fixed disk size
  • Recommendations for Database Roles
    • If possible try not to virtualize the database servers
    • Mirroring and clustering are now supported in virtualization (KB 956893)
    • Use best practices for tempDB (put it on fast disk, resize it – there is a guidance on how to configure tempDB for SharePoint)
  • Sample specifications presented for various farm types (check slides)
    • Cost effective Farm would be 1 Host with 2 quad core supporting:
      • 1 vm (10 GB, 4 proc) for SQL
      • 1 vm (10GB, 4 proc) for web applications
    • Highly available Farm with only two server hosts
    • Best Practice Virtual/Physical with High availability
      • High transaction servers are physical (DB). Multiple farm support with DBs for all farms on the SQL cluster
      • 2 quad-core server hosts, each supporting:
        • 4 vm: 2 vm for web applications for PROD environment, 1 vm for web applications for TEST environment & 1 vm for web applications for DEV environment
        • VMs are load balanced for PROD, TEST and DEV environments
    • Large virtual Farms:
      • 3 quad-core server hosts, each supporting:
        • 1 vm for DB
        • 1 vm for web applications
        • 1 vm for search server
        • 1 vm for central admin
        • 1 vm for service applications
    • NUMA (Non-Uniform Memory Access) memory limitations and guidelines
      • It exists at the hardware level
      • You can end up with swapping if you allocate more memory to sessions than the NUMA boundary -> instead of increasing performance you end up decreasing it
      • Don’t get cheap on memory if you bought a server with many CPUs
    • Monitoring:
      • Configure counters and thresholds on hosts & on guests – very interesting slide (check photo)
        • Monitoring processor on guests is useless…you have to measure this on the host
        • Memory…over 50% free is good
    • Support from Microsoft is conditioned by:
      • The hardware used for virtualization (Intel VT or AMD-v)
      • Hardware-enforced Data Execution Prevention (DEP) is available and enabled
      • Deployed on Microsoft Hyper-V (RTM or R2) or on a validated third party hypervisor (SVVP program –> ok for VMware ESX/ESXi)
    • Tooling: System Center Virtual Machine Manager (VMM 2008 R2)
      •  SCOM 2007 is aware of SharePoint features
      • Quick provisioning: Allows creation of SharePoint template servers which can be quickly provisioned on TEST or DEV environments
    • Licensing:
      • Very important: licensing rules for virtual guests apply to all SVVP program vendors. E.g. you can run VMware ESX/ESXi on a 1-processor host and need only one Windows Datacenter license for all the guests (the Windows Datacenter license is per host processor: 4 processors on the host = 4 Windows Datacenter licenses; it might nevertheless be more interesting to use the Windows Enterprise virtual licensing facilities)

Windows PowerShell Crash Course for SharePoint Administrators (MIT07) – my notes


SharePoint Connections 2010, Den Haag, 28 September 2010

Session: Windows PowerShell Crash Course for SharePoint Administrators  (MIT07)

Speaker: DON JONES

  • Most well-known DOS and UNIX commands work in PowerShell (PS), including the good old functionality of using TAB to complete partially typed commands
  • PowerShell drives – adapt all forms of storage into a PowerShell drive: Get-PSDrive
  • Extending the commands available in PowerShell (400 commands out of the box): PSSnapin (old way) vs. Modules (new way)
  • IMPORTANT: there is only one Powershell environment. The modules or the snapins are not different environments of Powershell but just predefined command set extensions.
  • Almost every Microsoft product will come with its own predefined PowerShell Module (or Snapin) (e.g. even Active Directory has its own PS module)
  • PSSnapin to add snap-ins in PS and use various commands of a particular environment (e.g. a powershell snapin for SQL server will let you type SQL commands in powershell)
  • PS is built around the idea of piping, e.g. DIR | more
  • Each time a command is run, there is an invisible table, which is generated into memory. Using an XML configuration file, PS knows how to choose what to show on the screen (obviously not all the information would fit on the screen and a choice has to be made)
  • ALL PS commands start with a verb: get, set, new, move, remove (-Service, -Process, -Command, etc.)
  • For SharePoint, PS command nouns start with ‘SP’. For SQL, they start with ‘SQL’; the exception is Exchange, as it was the first product out on PS…well done Exchange guys 🙂
  • HELP on Powershell
    • ‘-full’ provides full help including usage examples for all commands.
    • “help *event*” will list all PowerShell commands or help files containing “event”
    • If there is a space in a parameter value you can use either ‘ or “ to enclose the value. Both work.
    • The “-whatif” parameter simulates the command and shows you the result without actually doing anything
    • The “-confirm” parameter asks you for confirmation for each action needed for the command to complete
    • Unlike in UNIX, the PowerShell user does not have to parse the output text of a command. Instead the user can ‘tell’ PowerShell what the output should look like (e.g. for sorting you only have to know the name of a column and PS will display the results sorted on that column)
    • Pipeline input parameters (fastest way to make things happen): Get-Service -Name bits | Stop-Service (this returns the service, which is then fed as a parameter to the Stop-Service command)
      • Another example: Import-Csv ./users.csv | New-User (given the CSV has column names matching the command’s parameter names)
  • Remote Control
    • Requires PowerShell v2
    • Enter-PSSession -ComputerName server-r2 will get us onto the remote server-r2 (given we have access); Exit-PSSession to get out of it
    • You can import a remote set of commands not available in the local session (what happens is not a real import of the commands but rather a shortcut to them – if you use such an ‘imported’ command it actually runs on the remote computer)
    • 1:1 or 1:N remoting:
      • Enter-PSSession –computername X
      • Exit-PSSession
      • Invoke-Command -ScriptBlock {commands}
      • For SharePoint make sure you have granted shell administrator rights!
  • Tooling & resources:
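Put together, the remoting commands from the notes look like this (server names are placeholders; for SharePoint, remember the shell administrator rights mentioned above):

```powershell
# 1:1 remoting – an interactive session on one machine
Enter-PSSession -ComputerName server-r2
# ... work on the remote machine ...
Exit-PSSession

# 1:N remoting – run the same script block on several machines at once
Invoke-Command -ComputerName web01, web02 -ScriptBlock {
    Get-Service -Name SPTimerV4    # e.g. check the SharePoint timer service
}
```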

Designing Governance: How Information Management and Security Must Drive Your Design (MIT02) – my notes


SharePoint Connections 2010, Den Haag, 28 September 2010

Session: Designing Governance: How Information Management and Security Must Drive Your Design (MIT02)

Speaker: DAN HOLME

  • Always keep in mind the scope and the goal: what you are trying to achieve with the solution
  • Understand the business
  • Understand SharePoint and especially its limitations
  • Identify Information Management (IM) requirements:
    • Ownership of information
    • How long is the content online and who consumes it?
    • What kind of content do you have?
  • Identify SharePoint management controls and scopes
    • Requirements on authentication imply a certain choice of the web app.
    • Authentication providers: in SharePoint 2007, if you had two authentication providers you had to extend the web app and thus had two links for access to the same web app (the disadvantage is that users could not send links to each other, as the links were not the same) – in SharePoint 2010 you can have two authentication providers for the same web app
    • There is only one way to manage ownership and quotas: the site collection. The site collection is thus directly linked to the ownership of information (who needs to have full access to content)
    • Storage, quotas and locks also influence the site collection planning (one or more content databases, with regard to backup and restore time and efficiency)
    • Navigation and content types are also influencing the topology at the site collection level
    • Users are also ‘scoped’ at the site collection level: you might have to plan at this level if you need to give access to users who must not see each other; a possible solution: create a site collection for each client?
  • Align controls and scopes with requirements
    • Recommendation: have a web app for the Intranet, another one for collaboration where people can be empowered to have self-site provisioning (http://intranet vs http://team ) and another one or more for clients (http://clients )
    • Content Promotion
      • It is extremely important: e.g. take the content from the collaboration site and publish it to the intranet once it’s finished.
      • Consider document lifetime and expiration policies in order not to find yourself with another garbage dump like the old file shares;
      • Consider also exposing content from one site to another site, especially using RSS or third-party tools for roll-ups, or Content Query web parts (AvePoint has a tool to expose a file share as if it were a document library)
    • Farm level scope:
      • GEO performance: Farm geographical location is important – it should be as close as possible to the users – performance for collaboration might be an issue
      • Isolation – dedication for a specific service might be interesting (Application services specific servers)
    • Consider Premium farm(s) for custom applications vs. out-of-the-box SharePoint farms for standard feature use. Standard farms will be a lot easier to upgrade and maintain = mitigated risks & costs
  • Overlay information architecture and administration
    • Lay navigation (usage of top link bars and custom link lists is security trimmed), content roll-up & search on top of a manageable structure, and not vice-versa!
    • Use administration tools (third parties)
  • SharePoint 2010 has now built in Resource Throttling (max 5000 items returned in a list for example) and it is scoped at the web application level
  • What users can or cannot modify in their sites using SharePoint Designer is scoped at the web app level
  • Use PowerShell to create a site collection and specify a particular content database (which is not possible through the UI) – for SharePoint 2007 there is a similar stsadm command.
  • TIP: in SharePoint 2010 there is a Content Organizer (it allows pulling documents from a drop-off library and dispatching them elsewhere) and it can be activated as a site feature.
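A minimal sketch of that PowerShell tip – creating a site collection in a chosen content database, which the UI cannot do (URLs, database and owner names are placeholders):

```powershell
# Create a dedicated content database on the web application...
New-SPContentDatabase -Name WSS_Content_ClientA -WebApplication http://clients

# ...then create the client's site collection directly inside it.
New-SPSite -Url http://clients/sites/clienta `
           -OwnerAlias "DOMAIN\owner" `
           -ContentDatabase WSS_Content_ClientA `
           -Template "STS#0"
```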

MySite cleanup in SharePoint – Users deleted or disabled in Active Directory


Recently my managers have received notifications concerning the MySite of people who left the organization. The notifications are standard and cannot be changed:

Subject: The My Site of NOM Prénom is scheduled for deletion

The My Site of NOM Prénom is scheduled for deletion. As their manager you are now the temporary owner of their site. This temporary ownership gives you access to the site to copy any business-related information you might need. To access the site use this URL: http://portalMySiteURL/personal/<username>

Explanation: when a user is deleted or disabled in Active Directory, SharePoint automatically sends this notification to his manager (provided he has one in AD – fortunately we planned for this configuration from the start). The manager then has the possibility to retrieve whatever content he wants and possibly to delete the site (the manager automatically becomes the site collection’s secondary owner). This mechanism thus ensures that the information stored on the MySite is not lost when a user leaves our organization.

The problem:

If the user is deleted in AD, the home page of his MySite actually becomes inaccessible, while the site itself continues to exist. (The home page becomes inaccessible because it displays metadata from the user profile, a profile which no longer exists in SharePoint once the user profile crawl has run after the user’s deletion in AD.) So the manager, clicking on the link in the notification email, will receive an error of the type “User not found”. To access the MySite’s lists and any sub-sites, one has to know and type the exact URL (View All Site Content): http://portalMySiteURL/personal/<username>/_layouts/viewlsts.aspx. This requires very specific knowledge that is not a manager’s concern, and on top of that the site will still only be deleted through a manual operation by the manager or by the Infra department.

Possible solutions in the real life of an organization:

Alternative 1: check the option Confirmation and Automatic deletion settings for the MySite web application in Central Administration. Activating this option has the big disadvantage that all users will immediately receive a message asking them to confirm that they are using their site!! Then, when a user is deleted in AD, since the user no longer exists he cannot receive the email, so the system will automatically delete his MySite – but then all the (possibly important) data will be deleted and the manager informed too late.

Alternative 2 (the one I suggest, but to be validated with the Infra department): instead of deleting the user, only disable him in AD for X days. Following this operation, the manager will receive the automatic email, will be able to access the site as secondary site owner and can then decide what to do with it. X days later, delete the user in AD and delete the MySite at the same time.

Alternative 3: change nothing compared to today, but schedule a check by the Infra department X times a year for the cleanup of MySites, or write an automatic script to be run every X days.
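A sketch for Alternative 3’s automatic script could look like this. Assumptions: the SharePoint 2010 cmdlets and the ActiveDirectory module are available, and the MySite web application lives at http://portalMySiteURL; the script only reports candidates, deletion stays a manual decision.

```powershell
# Sketch only – reports MySites whose owner is gone or disabled in AD.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
Import-Module ActiveDirectory

Get-SPSite -WebApplication http://portalMySiteURL -Limit All | ForEach-Object {
    $sam = $_.Owner.LoginName.Split('\')[-1]   # strip the DOMAIN\ prefix
    $ad  = Get-ADUser -Filter "SamAccountName -eq '$sam'"
    if (-not $ad -or -not $ad.Enabled) {
        # Report only; the Infra department decides what to delete.
        Write-Output ("Cleanup candidate: {0} (owner: {1})" -f $_.Url, $sam)
    }
    $_.Dispose()
}
```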

These two articles describe all the possible scenarios perfectly:

How to display list data as XML directly in browser


Here is a very interesting and simple way of displaying List Data as XML in one click:

Different options are presented by David Wise:

  • simply display the elements of a list
  • use an existing view of a given list to show particular fields
  • filter data

That’s also a practical way of showing the XML list definition and the field names.
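For reference, the URL pattern commonly used for this is the owssvr.dll RPC endpoint (server, site and GUIDs are placeholders for your own list and view):

```powershell
# Raw XML of all list items:
$allItems = "http://server/site/_vti_bin/owssvr.dll?Cmd=Display&List={LIST-GUID}&XMLDATA=TRUE"

# Same, but restricted to the fields of an existing view:
$viewOnly = "http://server/site/_vti_bin/owssvr.dll?Cmd=Display&List={LIST-GUID}&View={VIEW-GUID}&XMLDATA=TRUE"
```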