Avantgarde Technologies

<a href="http://www.avantgardetechnologies.com.au">Avantgarde Technologies</a>
Perth's IT Experts

Wednesday, July 30, 2014

ManagedBy - You don't have sufficient permissions. This operation can only be performed by a manager of the group.

This is a gotcha when dealing with the "Managed By" attribute of mail-enabled security and distribution groups.  In my environment running Exchange 2013 SP1, I found I was unable to change group membership using either the Exchange Management Shell (EMS) or the Exchange Control Panel (ECP).  The account I was using was both a Domain Admin and a member of the Organization Management security group.

I attempted to add myself to the ManagedBy attribute of a group called "Avantgarde Users" using the Exchange Control Panel, however I received the following error message:

You don't have sufficient permissions. This operation can only be performed by a manager of the group.


The same problem occurred when using the Exchange Management Shell.


Using the exact same administrative account, I was able to perform this task using Active Directory Users and Computers.


Once I added myself through Active Directory users and computers, my account appeared under ownership in Exchange 2013 Exchange Control Panel as normal.

EDIT: If you add the -BypassSecurityGroupManagerCheck switch to the PowerShell command, it will work.  In my opinion it seems silly to check whether you're a manager of the group, especially when the group has no managers at all.
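As a sketch, the bypass switch can be added to the standard group-management cmdlets (the group and member names below are examples from this post and would be replaced with your own):

```powershell
# Add a member to the group, skipping the "manager of the group" check
Add-DistributionGroupMember -Identity "Avantgarde Users" `
    -Member clint.boessen -BypassSecurityGroupManagerCheck

# Change the ManagedBy attribute itself, again skipping the check
Set-DistributionGroup -Identity "Avantgarde Users" `
    -ManagedBy clint.boessen -BypassSecurityGroupManagerCheck
```

Both cmdlets accept the switch, so whichever change triggered the error can be retried with it.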

Tuesday, July 29, 2014

Could not initialize the capture device - EasyCAP DC60 Video Capture

I purchased a USB EasyCap DC60 Video Capture Adapter + Software for my parents to convert a load of home video tapes to digital for permanent storage.  This device is cheap - you can buy it online for around $10 USD and encodes great digital video in a variety of popular formats.



After the EasyCAP dongle arrived in the mail, I installed the Honestech HD DVR 2.5 software which came on the EasyCAP CD shipped with the device.  This software is used for recording video coming through the EasyCAP USB dongle and encoding it to a digital format you configure.  As for the driver for the EasyCAP DC60 Video Capture Adapter, the CD did not contain any driver files.

Windows 7 64bit automatically scanned its online driver repository for an appropriate driver and ended up detecting the EasyCAP dongle as a "Usbtv007" device.  The driver which Windows 7 64bit detected and installed was the incorrect driver and as a result, when attempting to open the Honestech HD DVR 2.5 software the following error was experienced:

Could not initialize the capture device


I spent over an hour on the Internet trawling through dodgy websites attempting to find a driver that works with Windows 7 x64, however none of the drivers I downloaded matched the vendor and hardware IDs of the EasyCAP DC60 Video Capture Adapter, which for my model are:

USB\VID_1B71&PID_3002&REV_0100
USB\VID_1B71&PID_3002


When I was about to give up, I stumbled across a forum thread which had a link to a driver download.  The forum thread wasn't in English so it was hard to make out, however I know a download link when I see one.  The driver I downloaded matched the VID_1B71&PID_3002 hardware ID of the device.  I installed it and voila - it worked!

To save someone the pain I went through to obtain a working driver for this device, I have uploaded the driver, which you can download from the link below:

https://sites.google.com/site/cbblogspotfiles/UVG-002_driver-EasyCAP DC60.zip

Note: This ZIP contains both the 32bit and 64bit drivers.

When you install this driver, the EasyCAP device will appear in Device Manager as "OEM Capture".  Before attempting to use this driver, make sure your device has the hardware IDs listed above, which can also be viewed in Device Manager.

Lastly, if you have a webcam that you cannot easily disconnect, I recommend disabling its driver in Device Manager, as the Honestech HD DVR 2.5 software can end up communicating with the webcam instead of the EasyCAP device - at least that is what happened to me!

Hope this blog post saves someone the pain I went through!

Monday, June 23, 2014

PowerShell - Nightly DFS-R Database Backups

Windows Server 2012 R2 provides a new feature allowing customers to export and import the DFS-R database located in "System Volume Information" on any volume with folders partaking in DFS-R replication.  For more information about this new feature, please see the following TechNet article, which I strongly recommend reading:

http://technet.microsoft.com/library/dn482443.aspx

The ability to export and import the DFS-R database provides the following advantages:
  • Significantly reduces the time the initial sync process takes when pre-staging new file servers for DFS-R replication, as the database can be exported from an existing server and imported into the new one.  This process also requires the data to be copied with robocopy or a backup product (such as ntbackup or wbadmin) so that it is exactly the same as on the source server.
  • Provides the ability to restore a corrupt DFS-R database, which can be caused by an incorrect shutdown of a Windows Server running DFS-R.  When a DFS-R database becomes corrupt, the server by default automatically kicks off self-recovery, which cross-checks the file hashes of every file on the replicated volume against the other DFS-R servers in order to repair the state of the database.  This can take a long time - sometimes as long as the initial sync process - backlogging all new replication traffic.
Some DFS-R environments consist of a single hub server and up to 100 spoke servers, with volumes sometimes exceeding 10TB and over 10 million files.  In such a scenario, a corrupt DFS-R database on the hub server would leave the entire DFS-R environment backlogged for weeks while the self-recovery process rechecks all the files!

I have a customer with a DFS-R environment similar to the example above, so I put measures in place to recover the DFS-R database in the event corruption occurred.  A PowerShell script was created to automatically back up the database on a nightly basis using the new Export-DfsrClone cmdlet introduced in Windows Server 2012 R2, which runs on the hub server.  In the event corruption occurs, we can simply import the database using the new Import-DfsrClone cmdlet.
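The restore side is, as a sketch, a matter of importing one of the previously exported backup folders (the path below is an example nightly backup folder name, not a real one from my environment):

```powershell
# Import a previously exported DFS-R database into the volume's
# "System Volume Information" - run on the server with the corrupt database
Import-DfsrClone -Volume E: -Path "C:\DfsrDatabaseBackups\2014-06-23 05-00"

# The import runs asynchronously; this shows its current progress
Get-DfsrCloneState
```

Once the import completes, DFS-R validates the database against the file data rather than re-hashing everything across the WAN.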

This PowerShell Script performs the following:
  • Creates backups under C:\DfsrDatabaseBackups
  • Each backup is placed in a folder labelled "yyyy-MM-dd HH-mm"
  • The script automatically deletes any database backups older than 14 days (by default) to ensure old backups are cleaned up.

#DFSR Database Backup Script - Created by Clint Boessen 15/04/2014
$basefolder = "C:\DfsrDatabaseBackups"
$datefolder = Get-Date -Format "yyyy-MM-dd HH-mm"
$backuplocation = Join-Path $basefolder $datefolder
New-Item -ItemType Directory -Path $backuplocation | Out-Null
Export-DfsrClone -Volume E: -Path $backuplocation -Force

#Remove database backups older than 14 days
$Days = 14
$LastWrite = (Get-Date).AddDays(-$Days)

$Folders = Get-ChildItem -Path $basefolder |
    Where-Object { $_.PSIsContainer -and $_.LastWriteTime -le $LastWrite }

foreach ($Folder in $Folders)
{
    Write-Host "Deleting $Folder" -ForegroundColor Red
    Remove-Item $Folder.FullName -Recurse -Confirm:$false
}

Save the above script to a PowerShell ".ps1" file and create a scheduled task on your DFS-R file server to run it according to the schedule on which you want DFS-R database backups to occur.  Once configured, you will see DFS-R database backups occurring regularly according to your schedule, with old backups automatically cleaned up.
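The scheduled task can itself be created from PowerShell on Windows Server 2012 R2.  The sketch below assumes the script was saved to C:\Scripts\Backup-DfsrDatabase.ps1 (a hypothetical path) and uses the 5am weekday schedule from my environment:

```powershell
# Run the backup script at 5am on weekdays under the SYSTEM account
$action  = New-ScheduledTaskAction -Execute "powershell.exe" `
    -Argument "-NoProfile -ExecutionPolicy Bypass -File C:\Scripts\Backup-DfsrDatabase.ps1"
$trigger = New-ScheduledTaskTrigger -Weekly `
    -DaysOfWeek Monday,Tuesday,Wednesday,Thursday,Friday -At 5am
Register-ScheduledTask -TaskName "DFSR Database Backup" `
    -Action $action -Trigger $trigger -User "SYSTEM" -RunLevel Highest
```

Running as SYSTEM matters here, as the export writes into "System Volume Information" which ordinary accounts cannot access.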



I scheduled my script to run at 5am on weekdays.  Please note the backup process can take hours - in my environment, due to the large number of files, the export takes a total of 3 hours, finishing around 9am, as you can see from the date modified timestamps.

It is important to note that DFS-R replication will not work while a database backup is occurring.  As a result, please ensure backups are scheduled at a time when replication can be paused.  Replication automatically resumes after the export process completes.

PowerShell - Locate Missing SYSTEM Permissions from Folder Structure

I am in the middle of a DFS-R project for a customer where I'm provisioning new Windows Server 2012 R2 file servers and migrating the data across to the new server.  To perform the migration I initially performed the pre-sync of the data with robocopy in backup mode "/b" then added the new servers to the DFS-R replication group/namespace.  Once the initial DFS-R sync had completed which took a few days, I enabled the namespace for the new servers and disabled the old servers.

Upon cutting the users across, many users complained the data was approximately 7 days old which is the approximate time I did the initial robocopy.  After further investigation it appeared DFS-R was not keeping the data in sync and many directories had not been replicated.  These files and folders which were not replicated also did not appear in the backlog count under DFS-R Health Reports which were run to verify replication status.

It turned out the cause of this issue was that the "SYSTEM" permissions were missing from many directories in the file server structure.  As the DFS-R service runs under the "SYSTEM" account, it must have access to the data in order to perform replication.  Robocopy, however, was able to copy this data because it was running in backup mode, which uses the backup privilege to bypass NTFS permission checks.

This directory structure had permission inheritance blocked numerous times throughout the folder tree, so finding directories where the SYSTEM permission was missing was a challenging task.  As a result I wrote a PowerShell script which audits a directory structure and returns all folders missing the "SYSTEM" permission, so that an administrator can manually add the missing permission at every folder level where inheritance is broken.

This is a handy script, and I have posted it online for everyone; I recommend running it against any directory structure on your file servers to ensure the SYSTEM account has full control over all data - a recommended Microsoft best practice.

$OutFile = "C:\Permissions.csv"
$RootPath = "E:\PATHTOBESCANNED"

$Folders = Get-ChildItem $RootPath -Recurse | Where-Object { $_.PSIsContainer }
foreach ($Folder in $Folders)
{
    $ACLs = (Get-Acl $Folder.FullName).Access
    $Found = $False
    foreach ($ACL in $ACLs)
    {
        if ($ACL.IdentityReference -eq "NT AUTHORITY\SYSTEM")
        {
            $Found = $True
        }
    }
    if ($Found -ne $True)
    {
        Add-Content -Value $Folder.FullName -Path $OutFile
    }
}
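Once the audit completes, one way to add the missing permission back is to feed the report into icacls.  This is a sketch - the output file path matches the script above, and the grant is applied with object and container inheritance so sub-folders are covered:

```powershell
# Grant SYSTEM full control (inheritable) on every folder listed in the report
Get-Content "C:\Permissions.csv" | ForEach-Object {
    icacls $_ /grant "NT AUTHORITY\SYSTEM:(OI)(CI)F"
}
```

Review the report before running this, as folders with deliberately broken inheritance may have been configured that way on purpose.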

I hope this PowerShell script helps other people out there!

Monday, June 9, 2014

V-79-57344-38260 - Unable to create a snapshot of the virtual machine

In this post I am going to shed some light on a very common error in Backup Exec 2010, 2012 and 2014 when performing VM backups from a VMware Hypervisor.  The error experienced is:

V-79-57344-38260 - Unable to create a snapshot of the virtual machine. The virtual machine no longer exists or may be too busy to quiesce to take the snapshot.

This error is extremely generic and can indicate any one of ten possible problems.  For a list of all possible causes, see the following knowledge base article from Symantec:

http://www.symantec.com/business/support/index?page=content&id=TECH146397

The most common cause, which is not documented heavily on the Internet, is the lack of the Storage APIs.  The Storage APIs are only available in VMware vSphere Standard Edition and higher.  If you're using the free version of ESXi, you will not have the Storage APIs and hence your backups will not work.  Check the licensing page on your ESXi host to see if your license key includes the Storage APIs.  If you don't have them, upgrade your license key and success will be yours :).

Sunday, May 18, 2014

How to Delete Files which exceed 255 Characters Without 3rd Party Tools

Windows Explorer and many Windows applications, including PowerShell, are limited to a maximum file path of around 255 characters.  Whilst this limitation exists at the application level, the NTFS file system itself does not impose it.  In fact, file paths exceeding the limit can be created remotely over the SMB protocol, which is how most file servers end up with folder paths administrators can no longer maintain using the native Windows Explorer application.

When attempting to delete folders using Windows Explorer the following errors may be experienced:

The source file name(s) are larger than is supported by the file system. Try moving to a location which has a shorter path name, or renaming to shorter name(s) before attempting this operation.

 
An unexpected error is keeping you from deleting the folder. If you continue to receive this error, you can use the error code to search for help with this problem.
 
Error: 0x80004005: Unspecified error
 
 
Even newer applications from Microsoft such as PowerShell do not support file paths longer than 255 characters, despite this being supported by NTFS.
 
Remove-Item: The specified path, file name, or both are too long.  The fully qualified file name must be less than 260 characters, and the directory name must be less than 248 characters.
 

I am going to show you a way to remove excessively long file paths without using third party tools such as Long Path Tool which come at a price or booting into different operating systems such as Linux to remove the unwanted file paths.

One Microsoft application which is not limited to the 255 character limit is robocopy.exe.  I know this as I often move large volumes of data with Robocopy between server infrastructure and have never been hit with a file path limitation.  As a result, this is the tool I chose to remove the data.

If you use robocopy with the /MIR switch, it makes the destination folder exactly match the source folder.  So if the source folder is empty, mirroring it empties the destination as well - in effect deleting all of its content, no matter how long the paths are.

I have a path here with 3 users whose folder structures exceed 255 characters.  Windows Explorer failed to remove these folders.


I created an empty folder on C:\ called test then used the mirror switch to copy the test folder to the HomeDrives folder.

robocopy /MIR c:\test E:\UserData\HomeDrives


After running the command, all the user folders under E:\UserData\HomeDrives were deleted.

This is a handy trick for dealing with file servers that have large numbers of folder structures exceeding the 255 character limit.

Hope this has been helpful, feel free to leave me a comment below.

Tuesday, May 6, 2014

HP Proliant MicroServer G7 N40L not working with 2012 R2

The HP Proliant MicroServer G7 N40L is a great lightweight server, perfect for small business and home use.  However, with the release of Windows 8.1 and Windows Server 2012 R2 you may find problems booting the operating system.  After completing the installation, on first boot it will hang on "Getting devices ready" forever.  This is due to the on-board Broadcom NIC, which has issues specifically with Windows Server 2012 R2 and Windows 8.1.

There are a few forum threads on the Internet about this issue including:

http://forum.wegotserved.com/index.php/topic/29031-hp-microserver-n40l-windows-81-installation-problems/

http://forums.whirlpool.net.au/archive/2179812

HP has released a BIOS firmware update which resolves this issue by updating the firmware to 2013.10.01 (A) (15 Nov 2013).  This firmware update is packaged in "SP64420.exe" and is available for download from the following HP website.

http://h20566.www2.hp.com/portal/site/hpsc/template.PAGE/public/psi/swdDetails/?sp4ts.oid=5336618&spf_p.tpst=swdMain&spf_p.prp_swdMain=wsrp-navigationalState%3Didx%253D%257CswItem%253DMTX_57720d956df94dfcbaa0e28256%257CswEnvOID%253D4064%257CitemLocale%253D%257CswLang%253D%257Cmode%253D%257Caction%253DdriverDocument&javax.portlet.begCacheTok=com.vignette.cachetoken&javax.portlet.endCacheTok=com.vignette.cachetoken

Unfortunately, to download this package from HP your server must be under warranty, or you must have a special account on the HP website.  I find this ridiculous, as it is clearly a bug in the current firmware, and if your server is out of warranty it means you can never upgrade the operating system!  What a joke!!

Luckily, I have uploaded this BIOS update to my Google hosting to ensure anyone who finds themselves in the same situation as me can get SP64420.exe and update the BIOS of their HP Proliant MicroServer G7 N40L, allowing them to run Windows 8.1 or Windows Server 2012 R2.  The download is available from the following link:

https://sites.google.com/site/cbblogspotfiles/SP64420.zip

Saturday, May 3, 2014

RBL Providers and Exchange 2013

In this post I want to address Real Time Block List Providers and Exchange 2013, as there are some differences you need to be aware of.  Please note RBLs are also often referred to as DNS Block Lists or DNSBLs.

As you may be aware by now, the Exchange transport stack has been separated into front end and back end components.  The Front End Transport component runs on the Client Access Server role and the Backend Transport component runs on the Mailbox Server role.  If you deploy a multi-role server, both components reside on the same server but still operate independently as separate services, called "Microsoft Exchange Frontend Transport" and "Microsoft Exchange Transport".

In previous versions of Exchange such as Exchange 2007/2010, when you enabled anti-spam filtering on an Exchange server using the install-AntispamAgents.ps1 script, it would install both the Connection Filtering and Content Filtering transport agents on the same Transport service, as there was only one Transport service running on the Hub Transport role.  Now in Exchange 2013 as there are separate transport services, the anti-spam functionality of these roles has been split across the transport services.

Connection Filtering such as IP Block Lists, IP Allow Lists and RBL Providers now run on the Front End Transport Service.  Content Filtering including the Exchange Intelligent Message Filter (IMF) runs on the Exchange Backend Transport service.  In addition to the Content Filtering agent a new agent has also been added to the backend called the Malware Agent which is responsible for detecting viruses in email messages.  The new architecture has been shown below:

 
When you have a multi-role deployment of Exchange 2013 with both the Client Access and Mailbox roles on the same server, the Install-AntispamAgents.ps1 script will only configure the backend transport service for anti-spam functionality, meaning only the Content Filtering agent will be installed.  This means that if you add any block list providers with the Add-IPBlockListProvider cmdlet, they will not function, as no Connection Filtering agent is available.
 
To ensure Connection Filtering is available, install the Connection Filtering agent with the Install-TransportAgent cmdlet.  This can be done with the following PowerShell command:
 
Install-TransportAgent -Name "Connection Filtering Agent" -TransportService FrontEnd -TransportAgentFactory "Microsoft.Exchange.Transport.Agent.ConnectionFiltering.ConnectionFilteringAgentFactory" -AssemblyPath "C:\Program Files\Microsoft\Exchange Server\V15\TransportRoles\agents\Hygiene\Microsoft.Exchange.Transport.Agent.Hygiene.dll"
 
 
After you have installed the agent you must then enable it and restart the Front End Transport Service with the following command:
 
Enable-TransportAgent -TransportService FrontEnd -Identity "Connection Filtering Agent"
 
Restart-Service MSExchangeFrontEndTransport


Next you can begin adding the RBL providers you wish to utilise such as the popular "Spam Haus" provider with the following PowerShell command:

Add-IPBlockListProvider -Name zen.spamhaus.org -LookupDomain zen.spamhaus.org -AnyMatch $true -Enabled $true

I added the following RBL providers in my environment:


You can verify that both the front end Connection Filter agent and back end Content Filter agents are installed and working by using the Get-TransportAgent commands as follows:

Get-TransportAgent


Get-TransportAgent -TransportService FrontEnd


The Connection Filtering agent logs are saved to the following location by default; after a few days of operation I can see log files accumulate.  This directory is created automatically as soon as the Connection Filtering agent attempts to write a log file, so it won't exist immediately upon agent installation.

C:\Program Files\Microsoft\Exchange Server\V15\TransportRoles\Logs\FrontEnd\AgentLog


To get a summary on your top RBL Providers from these log files run the following command:

.\get-AntispamTopRBLProviders.ps1 -location "C:\Program Files\Microsoft\Exchange Server\V15\TransportRoles\Logs\FrontEnd\AgentLog"

 

Dealing with Health Proxy Probe Messages in Exchange 2013 Managed Availability

Exchange 2013 introduces a new feature called Managed Availability which monitors various components of the Exchange infrastructure and provides the ability to detect and recover from problems as soon as they occur.  One of the tests Managed Availability performs is to send an "Inbound Proxy Probe" message once every 5 minutes from an Exchange 2013 front end server to the backend server.  If you have a multi-role server deployment, this proxy probe occurs locally on the Exchange 2013 server between the frontend mail service "Microsoft Exchange Frontend Transport" and the backend transport service "Microsoft Exchange Transport".  The purpose of this test is to ensure transport functionality is working as expected between these components.

This test, however, can be problematic, and two issues are often seen by Exchange administrators:
  • In the event Content Filtering (Intelligent Message Filter) is configured on the Exchange 2013 backend with the install-AntispamAgents.ps1 script, Inbound Proxy Probe messages will be quarantined, rejected or deleted by the spam filtering engine.
  • The Inbound Proxy Probe messages get delivered to the Exchange 2013 backend health mailboxes and stored - these can build up over time, causing clutter.
Both of these issues are addressed below.

Exchange Content Filtering Blocking Probe Messages

Exchange Content Filtering can block messages relayed between the Exchange 2013 front end and back end.  In my lab environment I have the Spam Confidence Level (SCL) quarantine threshold set to 5, which is resulting in a large volume of probe messages being quarantined in my spam mailbox, as shown below.


In my lab environment these "Subject: Inbound proxy probe" messages are being sent from postmaster@at.local to inboundproxy@contoso.com (the address used for the inbound proxy as of Exchange 2013 SP1).


To stop these messages from being caught by Intelligent Message Filter simply put in an exclusion by using the Set-ContentFilterConfig command.  In my environment the command I used was as follows:

Set-ContentFilterConfig -BypassedSenders postmaster@at.local -BypassedRecipients inboundproxy@contoso.com

The messages were coming from the sender postmaster@at.local and the recipient was inboundproxy@contoso.com, as a result this exclusion stops IMF from detecting these messages as potential spam.

Build Up of Health Probe Messages

As inbound proxy probe messages are sent every 5 minutes, a build-up of these messages can accumulate in the Exchange 2013 backend health mailboxes.  To view the health mailboxes in the Exchange Management Shell, use the Get-Mailbox cmdlet with the -Monitoring switch.  To view item counts for the health mailboxes, run the following command:

Get-Mailbox -Monitoring | Get-MailboxStatistics | ft DisplayName,ItemCount,LastLogonTime


Note: Every Mailbox Database contains two health mailboxes in Exchange 2013 by default.

As you can see in my environment, my health mailboxes have a large build-up of probe messages, shown by the item count.  To control the build-up of messages in the health mailboxes, you can simply leverage Exchange retention - something which has been around for a while in Exchange!  To do this you need to create both a retention policy and a retention tag, which can be done with the Exchange Management Shell (EMS) or the new Exchange Administration Centre (EAC).

First, create a retention tag.  I called mine "Delete items older than 2 days" and configured the tag as follows:

 
Then create a Retention Policy and link the Tag.  I called my Retention Policy "Health Mailbox Retention Policy".
 
 
Apply the retention policy only to your Health Mailboxes which can be done with the following command:
 
Get-Mailbox -Monitoring | Set-Mailbox -RetentionPolicy "Health Mailbox Retention Policy"
 
Check that it applied with the following command:
 
Get-Mailbox -Monitoring | fl *RetentionPolicy*
 
Now the Managed Folder Assistant will automatically delete emails older than two days.  The Managed Folder Assistant is always running and cleans mailboxes at times when the server is under low utilisation so as not to disrupt business; however, you can force the Managed Folder Assistant to do the first clean-up of your health mailboxes with the following command:
 
Get-Mailbox -Monitoring | Start-ManagedFolderAssistant
 
After it finishes its initial cleanup, you will notice the item count within these health mailboxes has significantly reduced.


Note: In the event you create additional mailbox databases, new health mailboxes will be created.  Ensure you link the retention policy to any new health mailboxes.

Monday, April 28, 2014

Problems Removing WSUS from SBS 2008

Once upon a time (approximately 4 years ago) I built an SBS 2008 server for a small business; everything worked, everything was automated and the customer was very happy.  I was engaged to build the server, deploy the workstations and set up the network infrastructure, but not to maintain it.  Four years later the customer tracked me down and called me back with a mountain of problems.  An integrator here in Perth, Western Australia (which will remain nameless) had turned my perfect SBS deployment into a nightmare of problems due to inexperienced engineers performing server work - something which is far too common here in Australia.

One of the problems I faced was with the Windows Software Update Services, the service which "deploys updates" to computers on the network.

I found the "Update Services" service in a disabled state, and the Windows Internal Database (WID) SQL instance was completely missing from the server.  As a result the Windows Server Update Services MMC console would not launch and simply crashed.  Starting the Update Services service did not help either, due to the missing WID database.

Running "wsusutil reset" simply failed due to the lack of a database!

Fatal Error: A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: Named Pipes Provider, error: 40 - Could not open a connection to SQL Server)

 
Attempting to uninstall WSUS resulted in the following error messages:
 
The Windows Server Update Services 3.0 SP2 was not removed successfully because an error occurred.
 
 
Attempt to un-install Windows Server Update Services failed with error code 0x80070643.  Fatal error during installation
 
 
These uninstall errors were caused by the SQL database no longer existing.  To remove a WSUS server which is missing its SQL database, I found you need to modify the following registry key:
 
 HKLM\Software\Microsoft\Update Services\Server\Setup
 
DWORD: wYukonInstalled
 
Set the decimal value to 0
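As a sketch, the same change can be made from an elevated PowerShell prompt (the key and value names are exactly those above):

```powershell
# Tell WSUS setup that the Windows Internal Database is no longer installed,
# so the uninstaller stops trying to connect to the missing SQL instance
Set-ItemProperty -Path "HKLM:\Software\Microsoft\Update Services\Server\Setup" `
    -Name "wYukonInstalled" -Value 0 -Type DWord
```

This only alters what the WSUS uninstaller believes about the database; it makes no change to SQL itself.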
 
 
Repeat the uninstall process (making sure you select not to remove the SQL database as part of the uninstall) and it will complete successfully.
 
 
After performing the uninstall, a simple server reboot was required before I re-installed WSUS using Windows Server 2008 Server Manager.

Changing Mailbox Permissions in Exchange Not Working

Over the years I have had numerous customers contact me and complain that when they change mailbox permissions such as "Manage Full Access Permissions" in the GUI or by changing permissions in Exchange Management Shell, the permissions do not apply to end users.  Other times administrators remove permissions to a mailbox and complain that the user still has access to the mailbox.

I always end up explaining to my customer "the change will work, just wait longer".  I want to take a few seconds to look into what is happening.

When you change permissions on a mailbox, the permission is not actually set on the mailbox but in Active Directory.  It takes time for the Exchange server to commit the change to the information store.  This is due to the MBI cache maintained by the Exchange information store.

The information store caches information contained in the directory and by default re-reads it every 120 minutes.  Any change made to Active Directory, such as a mailbox permission change, is therefore not picked up by the information store for up to 2 hours.  It is also important to note that if the information store is idle (not busy) for over 15 minutes, it can update its MBI cache from Active Directory sooner.

Is there a way to force the permission change onto the Exchange server?  Yes, there are two ways of achieving this:
  • Reboot the Exchange Server
  • Restart the Information Store service
Both of these methods will result in down time.

It is also possible to tweak how often the MBI cache is updated by the information store service by modifying the registry.  The MBI cache is controlled by two registry settings, "Mailbox Cache Age Limit" and "Mailbox Cache Idle Limit".  The defaults for these settings are as follows:

Mailbox Cache Age Limit = 120 minutes
Mailbox Cache Idle Limit = 15 minutes

These registry values are REG_DWORD and are located under:

HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\MSExchangeIS\ParametersSystem

NOTE: This registry entry is not a switch, it is a setting. If it is set to 1 the server rereads the cache every minute; if it is set to 2 the server rereads the cache every 2 minutes, and so forth.

If you change these registry keys you must restart the information store service for the change to take effect.
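As a sketch, reducing the cache age limit to 30 minutes (an example value, not a recommendation) and restarting the service would look like this:

```powershell
# Re-read the directory cache every 30 minutes instead of the default 120
Set-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Services\MSExchangeIS\ParametersSystem" `
    -Name "Mailbox Cache Age Limit" -Value 30 -Type DWord

# The change only takes effect after the Information Store service restarts
Restart-Service MSExchangeIS
```

Remember that restarting the Information Store dismounts databases, so do this in a maintenance window.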

For more information about these registry values please see:

http://technet.microsoft.com/en-us/library/aa996988.aspx

As a general rule of thumb, I would not recommend companies modify these default values; there is no need for it.  Just be patient and your permission change will take effect automatically.

In regard to Exchange 2013, I'm not sure whether it follows the same behaviour as previous versions of Exchange such as 2007/2010.  In Exchange 2013 the information store has been overhauled and divided into separate worker processes responsible for each database.  Due to these architecture changes, the MBI cache may work and update differently.  If you are reading this and have experience with the MBI cache and Exchange 2013, please feel free to comment below and share the knowledge!

Tuesday, April 22, 2014

Recursively Remove Mailbox Folder Permissions

You have been assigned a task to recursively remove mailbox permissions from a user mailbox in Exchange 2010/2013 using Exchange Server management tools.  How would you go about doing this?

There is a PowerShell cmdlet in Exchange called "Remove-MailboxFolderPermission", but it only allows you to remove permissions from one folder at a time; it has no -Recursive switch to propagate the change to all subfolders.  The Set-MailboxFolderPermission, Remove-MailboxFolderPermission and Get-MailboxFolderPermission cmdlets can be run by users who are a member of any of the following role groups:
  • Organization Management
  • Recipient Management
  • Help Desk
I can run these cmdlets as a member of the Organization Management role group against any user in my environment, as shown in the following screenshot:


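Before removing anything, it can be worth checking the current state of a folder. A quick sketch using this article's example mailbox:

```powershell
# List the permission entries on one folder of the bugs.bunny mailbox
Get-MailboxFolderPermission -Identity Bugs.Bunny:\Inbox
```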
Now, if we want to recursively remove mailbox folder permissions from all folders within a mailbox, we still need to use the Remove-MailboxFolderPermission cmdlet, but because it does not support recursion itself we must pipe folders into it from a command that can walk the mailbox hierarchy.  This can be done with the following command:

Get-MailboxFolder -Identity Bugs.Bunny:\FolderName -Recurse | Remove-MailboxFolderPermission -User username
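Before committing a bulk change like this, it can be worth previewing what the pipeline would touch. A sketch using the same example names (Bugs.Bunny, and "username" as the placeholder account to remove):

```powershell
# Preview only: -WhatIf reports which folders would be affected without changing anything
Get-MailboxFolder -Identity Bugs.Bunny:\FolderName -Recurse |
    Remove-MailboxFolderPermission -User username -WhatIf
```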

At this point, however, you may hit another problem.  The Get-MailboxFolder cmdlet must be run under the context of the user who owns the mailbox.  For example, I want to use the Get-MailboxFolder cmdlet, which is required by the above command, to view the Inbox folder of my bugs.bunny test user.  When I run the following PowerShell command under my Administrator account, which is a member of "Organization Management", I get the following error:

Get-MailboxFolder Bugs.Bunny:\Inbox

The specified mailbox "bugs.bunny" doesn't exist.
    + CategoryInfo          : NotSpecified: (0:Int32) [Get-MailboxFolder], ManagementObjectNotFoundException
    + FullyQualifiedErrorId : 2DCA0FEB,Microsoft.Exchange.Management.StoreTasks.GetMailboxFolder


If I run the same PowerShell command under the security context of bugs.bunny, however, it succeeds.  For example, in the following screenshot I have the Exchange management tools installed on a Windows 7 PC.  I run the Exchange Management Shell as the bugs.bunny user account and run the same command.

 
Now that my Exchange Management Shell is running as bugs.bunny, the command completes without error.
 
Get-MailboxFolder Bugs.Bunny:\Inbox
 
 
 
Important: The Get-MailboxFolder cmdlet must be run under the security context of the user who owns the mailbox.
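One way to obtain that security context without logging on as the user is to open a remote PowerShell session with the user's credentials. A minimal sketch, assuming a hypothetical server name exchange01.at.local and that the user supplies their password at the prompt:

```powershell
# Connect to Exchange remote PowerShell as the mailbox owner (bugs.bunny).
# The ConnectionUri hostname is a placeholder for your own Exchange server.
$cred = Get-Credential "AT\bugs.bunny"
$session = New-PSSession -ConfigurationName Microsoft.Exchange `
    -ConnectionUri http://exchange01.at.local/PowerShell/ -Credential $cred
Import-PSSession $session

# Get-MailboxFolder now runs in the owner's context and succeeds
Get-MailboxFolder -Identity Bugs.Bunny:\Inbox -Recurse
```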

Now that we understand this limitation of the Get-MailboxFolder cmdlet, it is clear that in order to recursively remove mailbox permissions we must run the following command under the security context of the mailbox owner.

Get-MailboxFolder -Identity Bugs.Bunny:\FolderName -Recurse | Remove-MailboxFolderPermission -User username

This method is not efficient, as it requires the administrator to obtain the user's password (or have the user run the command themselves).

There is, however, an alternative method for recursively removing permissions on a user mailbox: ExFolders, the successor to PFDavAdmin.  Download the ExFolders tool from the following TechNet URL:

http://gallery.technet.microsoft.com/Exchange-2010-SP1-ExFolders-e6bfd405

Once downloaded, follow the instructions in the readme.txt file to install the tool and copy it to the Exchange bin folder.  ExFolders lets you browse permissions on any of the subfolders of a user's mailbox by simply right-clicking the folder and navigating to Permissions.


If you want to recursively change permissions, simply right-click the folder tree you want to remove the permissions from and click "Propagate folder ACEs".

 
Select the user you want to remove from all subfolders and click Remove.
 
 
This will go through every subfolder in the hierarchy and remove the permissions.
 
 
Note: If the user you want to remove has been granted rights only to specific subfolders throughout the hierarchy, add the user to the root folder first so that you are able to select them.  For example, if my Hulk Hogan user account had permissions on the Alerts folder and one of my MVP folders, I would not be able to select the user from the root of the Inbox folder, as the ACE does not exist at that level.  By adding it to the root of the Inbox folder first, I am able to select the user for a recursive remove.
 
Important: Before you can use ExFolders you must have full mailbox access to the mailbox you are looking to modify.  You can grant this using the Exchange Management Console (EMC) by selecting "Manage Full Access Permission".  In the screenshot I granted rights to the AT\Administrator account, which is the account I used to run ExFolders.
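The same grant can also be made from the Exchange Management Shell. A sketch using this article's example accounts:

```powershell
# Grant the admin account full access to the target mailbox before running ExFolders
Add-MailboxPermission -Identity bugs.bunny -User "AT\Administrator" -AccessRights FullAccess
```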