
Why Slack is winning

Or: the importance of cross-platform apps.

I saw a Twitter post the other day about how Microsoft bought Yammer for a billion dollars, and how Slack came in and kinda took over what Microsoft thought Yammer could be.

It made me reflect a bit.

I’d thought of using Slack on a few projects I have, but I had never thought of using Yammer.

Why is that?

Slack has a Mac app.

Could something as simple as that be it?

Certainly Slack has done quite a few neat things, but at the end of the day, aren’t Slack, Facebook, Yammer, NewsGator, you-name-it all pretty much reimaginings of the “bulletin board”?

Let’s look at the importance of having ‘native’ apps. (I put native in quotes because in this case it’s not really important if the app is “truly” native – it might have been developed once using a cross-platform framework.) Note that everything in the list below is solved by having an “app for that.”

Let me share some things I don’t like about your web app (This is ANY web app)

  1. I don’t like not being able to start your web app easily.
  2. I don’t like having to sign on to your web app.
  3. I don’t like that when your web app is running, I can’t easily tell which of my running apps is yours and which is the Google page I opened.
  4. I don’t like that I can’t easily bring your web app to the front of my screen. (because of #3)
  5. I don’t like that I accidentally close your app when I close my browser, and have to go through steps 1 and 2 all over.

In other words, if I’m doing something every day, the 5 things above will annoy me as a user EVERY DAY and I will seek alternatives.

The original folks who created Yammer made a nice app using Adobe AIR – a cross-platform app that worked on Mac and Windows, and I used it when I was first introduced to Yammer.

Then Microsoft killed it.

The day the Yammer Mac app stopped working on my Mac was the last day I used Yammer. I had actually forgotten that Yammer existed until I saw the tweet that sparked this blog post.

I’m sure some will argue that there are other reasons Yammer failed to live up to its expectations, and some will argue there are other reasons Slack is so popular right now. I don’t mean to discount those reasons; I just felt like sharing that Yammer lost me when they dropped Mac support.

I use Slack now.

Is the acronym MVC even accurate?

MVC, short for Model-View-Controller, is a pretty popular acronym in the web development space, but I wonder if the acronym itself makes things more difficult for beginners.

I know it did for me.

I think a better acronym might have been RCMV.

Let me explain my thoughts…

What’s the first thing you learn in any MVC tutorial? Routing. Where’s the ‘R’outing in MVC? It’s missing.

Routing is not only the first thing you learn, it’s also the first thing a request touches on its way in and back out of your web server. Routing comes first, and it’s not even mentioned in MVC.
Let’s add the R to MVC. While MVCR rolls off the tongue, it still puts the letters in the wrong order. Routing comes first, so it should really be RMVC.

So far so good.

Now about the order of MVC…

Every tutorial I’ve ever read pretty much points routes to controllers.

So logically, our acronym should be RC _ _.
Controllers typically make use of Models, so in order that gives us RCM _, which leaves View: RCMV.

R oute
C ontroller
M odel
V iew

See what we did there? We now have an acronym that not only spells out each piece of how we do things, but does so in order.

I wonder, how much easier would this be for people brand new to Software Development? Would it make it 1% easier? 3%? 5%?

It seems to me that at some point in learning, our brains jump from having to look something up every time to “just knowing it”. How long that progression takes varies per person. But one thing is certain: up until the point where a developer “just knows it”, they don’t know it, and have to expend some thinking power and energy EVERY time they come across it.

If RCMV can lower the amount of thinking power needed during that critical early learning stage, isn’t that a good thing?

Here’s my ask:
If you find yourself helping a brand-new developer understand “MVC”, try presenting it to them as RCMV and see if that makes it easier.

Who knows, maybe it will catch on.

A review of several denoise Audio plugins

I was looking for a way to clean up my audio, so I downloaded a few demos of denoise plugins and made this YouTube video:

It’s 20 minutes long. If you don’t have time to view it, here’s what I covered:

Acon Digital Denoiser (part of the Audio Restoration Suite, $99)

iZotope RX4 Dialog Denoiser (part of the RX4 Standard suite, $349)

Waves NS1 Denoise plugin ($200)

Waves WNS Denoise plugin ($500)

The Denoiser that comes with Logic Pro X

In Summary:

There were two categories of results:

  1. Plugins that helped without making things worse.
    1. Waves NS1
    2. Waves WNS
    3. RX4 Dialog Denoiser
  2. Plugins that made the sound worse.
    1. Acon Digital Denoiser
    2. Logic Pro X’s Denoiser

Overall I felt that RX4 did the best job, working like magic for my voice and the noise I was trying to remove.

Waves NS1 did a pretty nice job also.

Waves WNS didn’t do a whole lot for me, but maybe on a different sound source and with different background noise it might have.

Acon’s and Logic’s denoisers were both troubling: with both, it was easy to introduce a ‘gurgling’ sound. I am always leery of plugins that add artifacts like this.

If you’re in the market for such plugins, put on a pair of headphones and listen in a quiet room, and you’ll pretty clearly hear the difference.

Home Made DIY Acoustic Panels

I’ve recorded a few training videos here and there. Being a bit of an audio nerd, I’m really concerned with sound quality. The room I did my recordings in was a small office with lots of hard surfaces, and there was a noticeable echo.

As my Audio Engineer friend Bob Demaa would say: “You can take the mic out of the room, but you can’t take the room out of the Mic”

So here’s what I set out to do:

  • Either buy or build panels
  • Some ceiling-mounted
  • Some wall-mounted
  • Ideally lower cost, yet still effective

Here’s what I did:


Parts were sourced from 3 suppliers:

I chose all suppliers based on location – I’m in the Chicago area, and the insulation boards, which are typically the hardest thing to source, are in stock at Fabrication Specialties, not too long a drive away.


DIY panels are both easy and cheap – I wanted these to look as clean as possible, while also being as effective and lightweight as possible.

I’d seen several articles and loosely based my work on this one:

In that blog post, Eric makes an inexpensive frame around the insulation using 1×2 furring strips from Home Depot.

It seemed like a great idea so I did the same thing, with a twist: I put my wooden frame behind the insulation:



In the picture above I have a 4″ thick piece of Rock Wool, with the frame boards laid out.

As I was walking through Home Depot, I wanted to find something inexpensive and easy to work with that I could use on the front side of the rock wool to maintain a sharp edge. It turns out plastic drywall edging was perfect! It’s not only lightweight and easy to cut and work with, it’s perforated, which means I’m not giving up much acoustic performance around the edges!



I used a pair of tin snips I had to cut the plastic corners, and they worked perfectly.

I wrapped the ceiling panels with white fabric from JoAnn, using a hand stapler:



The hand stapler got a little tiring, and I already owned an air stapler/compressor so I switched over to that after the first one.

I used eye bolts to hang them; here’s a close-up of how that worked – the piece you see with the threads was screwed into the ceiling. That was the worst part of this whole project: tolerances are pretty tight. I had in mind that I would screw the hook all the way to the ceiling for a near-flush-mount look. My goal was not to see the hooks, which if you think about it is kinda funny – after all, I’m hanging these ridiculous panels from the ceiling; it’s not like hiding the hooks is going to increase the “Wife Acceptance Factor” at that point…


Here are a few finished shots of the 4″x2’x2′ panels:



I learned a few things building these that I’d like to pass on:

  • White fabric is actually kinda hard to make opaque – as you can see in the photo, you can kinda see where the drywall edging is underneath. To get around this, JoAnn sells a very cheap white fabric called muslin; it might be worth a layer of that first, or better still, pick a fabric color that isn’t white.
  • The metal hooks I hung them with were nearly impossible to get perfectly lined up – if you can stomach seeing the hooks and the eyelets, it’s likely 10x easier to hang them.
  • The metal hooks would also rattle when the floor above was walked on. This eventually either stopped or I don’t notice it anymore, but it could have been easily prevented by using heat-shrink tubing on the hooks or by wrapping them in electrical tape.

Ok on to the wall panels!

For the wall panels, I used 2’×4’ pieces of 2-inch-thick rock wool.

I again placed the frame behind the material, and I apologize that I don’t have a bunch of pictures from this phase.



In the picture above you see a nearly completed frame, and you’ll notice I have a center support in this one.

In the next one I built, I switched the center supports to two spaced out at 1/3 intervals like this:


I did this so I could hang the frame either vertically or horizontally.

Speaking of hanging, I wanted something that would hang flush to the wall if possible, and I didn’t want to spend $15 per panel on elaborate hardware.

I thought about making a French cleat, but I didn’t have a table saw.

I found some trim that looked like it would work, so I nailed about a 1-foot piece to the inside top and side of every 2×4 panel I built.

Again, I don’t have a picture, hopefully this illustration will do:

[Illustration: frame with hangers]

The Trim piece works on the wall something like this:

[Illustration: hook on wall]


Now for the painful part: Did it work? How long did it take? Did I save any money? Was it worth it?

Yes, it worked. Echo/delay in that room is greatly reduced.

It took several weekends to assemble, plus a trip to the insulation supplier about 30 miles away, plus multiple trips to JoAnn Fabrics and Home Depot.

I built a total of 7 burgundy 2 inch thick 2×4 panels for the walls, plus 4 white 4 inch thick 2×2 panels for the ceiling.

That’s a total of 11 panels. I figure I spent $336, or about $31 a panel.

Was it worth it?

From an end results perspective, absolutely! Acoustically treating that room was the best thing I could have done!

From a cost/time/effort perspective, probably not. I was just talking with a friend today who had outfitted his home studio with panels from ATS Acoustics. Those panels are nearly identical to the ones I built above and only slightly more expensive, with the huge benefit that someone else has figured out all the variables and done all the work. There is something to be said for having something arrive at your doorstep ready to hang on your wall!

Update June 2015:

The DIY panels have held up well. My wife hijacked my office (shown above) and I needed to outfit a second space. I did a ton of research, settled on AudiMute panels, and I couldn’t be happier! There were a lot of things I liked about AudiMute versus the competitors, and I definitely recommend checking them out. DIY is still cheaper, but as you can see from the article above, DIY wasn’t super easy either! I can’t tell you how great it felt to just order panels, have them delivered, and hang them, with less than 30 total minutes of my time invested!

– Jack

Starter Lighting for at home webcasting

I did a few Skype interviews for SharePoint Saturday Chicago Suburbs and found that the room lighting wasn’t giving me the look I wanted.

I looked at some professional lights, but they were expensive and I really didn’t have room for them in my small home office.

Ikea to the Rescue

I purchased two clamp mounted desk lamps from Ikea for $9 each. Initially I tried aiming them directly at me, but it was too distracting.

I then purchased two foam core boards from Staples for about $5 each – this setup is pictured below:



When I am not “filming” I can put both Foam boards away and my room is largely back to normal.

The background behind me still felt a little dark in the resulting video so I grabbed one of those $5 clamp lights from home depot, and temporarily put it on top of my book case, pointed at the ceiling, which reflected back down and brightened up the back wall:



Total cost: $33 plus light bulbs

Come Say Hi at SharePoint Saturday Chicago Suburbs 2013

I’m part of a small team putting together a SharePoint Saturday event for the Chicago suburbs.

Official Info is at

We’re holding the event on Saturday June 1st at the DeVry campus in Addison IL – Right about where I-355 and I-290 intersect.

We’ve got a ton of great sessions in the works, and like any SharePoint event, it’ll be a great opportunity to network with other SharePoint professionals!

I hope to see you there!

– Jack

Sharing service apps between farms.

See Section 8

Run this PS on both farms and exchange certs:

$rootCert = (Get-SPCertificateAuthority).RootCertificate
$rootCert.Export("Cert") | Set-Content D:\Certs\ConsumingFarmRoot.cer -Encoding byte
$stsCert = (Get-SPSecurityTokenServiceConfig).LocalLoginProvider.SigningCertificate
$stsCert.Export("Cert") | Set-Content D:\Certs\ConsumingFarmSTS.cer -Encoding byte

Then you can use Central admin (Security->Manage Trusts) to enter these in.
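If you’d rather stay in PowerShell, the trusts can also be created with cmdlets. This is only a sketch, assuming the .cer files from above have already been copied to the other farm – the trust names here are my own placeholders, not from the original post:

```powershell
# Run on the farm receiving the certificates.
# "ConsumingFarmTrust" / "ConsumingFarmSTS" are placeholder names.
$trustCert = Get-PfxCertificate "D:\Certs\ConsumingFarmRoot.cer"
New-SPTrustedRootAuthority -Name "ConsumingFarmTrust" -Certificate $trustCert

$stsCert = Get-PfxCertificate "D:\Certs\ConsumingFarmSTS.cer"
New-SPTrustedServiceTokenIssuer -Name "ConsumingFarmSTS" -Certificate $stsCert
```

This should be equivalent to entering the certificates through Security → Manage Trusts.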

From the consuming farm, run Get-SPFarm | Select Id to get the ID of the consuming farm.

$farmID = <ID from your farm>
$security = Get-SPTopologyServiceApplication | Get-SPServiceApplicationSecurity
$claimProvider = (Get-SPClaimProvider System).ClaimProvider
$principal = New-SPClaimsPrincipal -ClaimType "" -ClaimProvider $claimProvider -ClaimValue $farmID
Grant-SPObjectSecurity -Identity $security -Principal $principal -Rights "Full Control"
Get-SPTopologyServiceApplication | Set-SPServiceApplicationSecurity -ObjectSecurity $security
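The commands above only grant the consuming farm access to the topology service; to actually share a service application you still need to publish it on the providing farm and connect to it from the consuming farm. A rough sketch (the service application name and URI below are placeholder examples):

```powershell
# On the providing farm: publish the service application you want to share.
# "Managed Metadata Service" is a placeholder display name.
$sa = Get-SPServiceApplication | Where-Object { $_.DisplayName -eq "Managed Metadata Service" }
Publish-SPServiceApplication -Identity $sa

# On the consuming farm: connect using the published URI from the providing farm, e.g.:
# Receive-SPServiceApplication -ConnectionType Remote -Uri <published service URI>
```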

Using MaintenanceWindows in SharePoint 2013

Screen shot of the Maintenance Window functionality in SharePoint 2013


At the SharePoint Conference this week, session SP210 on upgrading to SP2013 mentioned a brand-new feature that didn’t exist in the preview edition: MaintenanceWindows.

As you can see from the screenshot above, this feature puts up a simple banner alerting users of the upcoming work.

The message is localized in the user’s language, as long as language packs are installed.

The “More Information” link can point to any page you specify.

I was pretty excited about this, and couldn’t wait to try it out!

The PowerShell to do this wasn’t as easy as I expected.

I’ve pasted below what worked for me.

 # 1st, get a content database
 Get-SPContentDatabase   # this will list them all
                         # copy and paste a database ID and use it to assign a specific DB to a variable
 $ContentDB = Get-SPContentDatabase -Identity # ID goes here
 # Now we're going to add a maintenance window to the SPContentDatabase with $ContentDB.MaintenanceWindows.Add()
 # Before we can do that, we need to create a maintenance window object and populate it.
 $MaintWindow = New-Object Microsoft.SharePoint.Administration.SPMaintenanceWindow "MaintenancePlanned", "1/1/2013", "1/2/2013", "11/16/2012", "1/3/2013", "1:00:00", ""
    # Parameter list for the constructor above:
      # 1: MaintenanceWarning or MaintenancePlanned
      # 2: Maintenance start date
      # 3: Maintenance end date
      # 4: Notification start date
      # 5: Notification end date
      # 6: Duration in the format DD:HH:mm:ss - "1:00:00" = 1 hour, "1:00:00:00" = 1 day
      # 7: URL for the "More Information" link
      # Parameters 2-5 all take a date/time in this format: "1/20/2012" or "1/20/2012 5:00:00 PM"
  # We can see the properties of the window by just typing $MaintWindow and hitting enter.
  # For me this looked like this:
  # MaintenanceStartDate        : 1/1/2013 6:00:00 AM
  # MaintenanceEndDate          : 1/2/2013 6:00:00 AM
  # Duration                    : 01:00:00
  # NotificationStartDate       : 11/16/2012 6:00:00 AM
  # NotificationEndDate         : 1/3/2013 6:00:00 AM
  # MaintenanceType             : MaintenancePlanned
  # MaintenanceLink             :
  # UpgradedPersistedProperties :
  # OK, with that out of the way, we just need to add it to the content database:
  $ContentDB.MaintenanceWindows.Add($MaintWindow)
  $ContentDB.Update()
Ok, so that’s it – refresh your website and you should see the pink banner from the screenshot above!

Note: I originally tried to do this by setting up a blank object without parameters and then setting the properties one by one, but I found that MaintenanceStartDate and NotificationStartDate could not be changed after the object was created.
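One more thought on cleanup: I haven’t seen this documented, but since MaintenanceWindows is a collection on the content database, clearing it should take the banner back down. Something like this (an untested sketch, assuming the collection supports Clear() like other standard collections):

```powershell
# Sketch: remove all maintenance windows from a content database.
$ContentDB = Get-SPContentDatabase -Identity # ID goes here
$ContentDB.MaintenanceWindows.Clear()
$ContentDB.Update()
```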

– Jack

Using Powershell to get a list of user IDs from AD

One of my network admin friends needed an easy way to provide some users with a list of names alongside AD account names.

In many organizations this is easy to guess – for example, if my name is Jack Basement, my ID might be jbasement – but in this case it wasn’t that easy, so we needed to go to AD.

There are AD cmdlets, but they are not loaded in PowerShell by default.

If you have the Remote Server Administration Tools for Windows 7 installed, then you’ll find a link to “Active Directory Module for Windows PowerShell” in your administrator tools menu.
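If you’d rather stay in a plain PowerShell window, the module behind that shortcut can be loaded by hand (assuming RSAT is installed):

```powershell
Import-Module ActiveDirectory
Get-Module ActiveDirectory   # confirm the module loaded and the AD cmdlets are available
```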


Using that, we can easily get a list of the users needed and select just the columns we want.

For example:

Get-ADUser -Identity domain\user          #gets info on that user
Get-ADUser -Filter {name -like "jack*"}   #returns all the people named Jack

We can combine that with the select statement such as this:

Get-ADUser -Filter {name -like "jack*"} | Select Name, SamAccountName

Which gives us a nice list


Get-ADUser -Filter {name -like "jack*"} | Select Name, SamAccountName | ConvertTo-Csv

which will output it as comma-separated values (perfect for importing into Excel)


Get-ADUser -Filter {name -like "jack*"} | Select Name, SamAccountName | ConvertTo-Csv | Out-File userlist.txt

which outputs the same thing, but to a file.
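As a shortcut, Export-Csv combines the ConvertTo-Csv and Out-File steps; the -NoTypeInformation switch drops the #TYPE header line that would otherwise clutter the file:

```powershell
Get-ADUser -Filter {name -like "jack*"} | Select Name, SamAccountName |
    Export-Csv userlist.csv -NoTypeInformation
```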


Now one neat trick: often you want to output all the users of a group in AD (technically, a container called an Organizational Unit, or OU).

There is an LDAP filter type we can use for this

What’s cool here is that LDAP filters are sometimes a pain to get “just right”, so we can cheat:

We can use the distinguished name of a known user in that group and grab the group from that

so for example

Get-ADUser -identity domain\bJack

results in a bunch of output; the first field is the distinguished name, and we can copy and paste that for our next command:

Get-ADUser -Filter * -SearchBase "OU=mygroup,DC=basementjack,DC=com"

This outputs all the users in that OU.

again we can chain for flexibility

Get-ADUser -Filter * -SearchBase "OU=mygroup,DC=basementjack,DC=com" | Select Name, SamAccountName | Sort-Object Name


Lastly, don’t forget Get-Help:

Get-Help Get-ADUser -examples

shows a few good examples.


Cleaning up Newsgator controls from SharePoint

On our farm we have multiple URLs, multiple site collections, etc.

One of them has a social add-in called NewsGator Social Sites.

I kept seeing errors in the ULS logs of other sites saying things like:

Failed to create a custom control from assembly ‘NewsGator.NewsManager.Web’ .. The type is not registered as safe.

I know from experience that this means the control isn’t listed in the web.config for the given site, nor should it be – I don’t have, nor want newsgator to have anything to do with the site in question.

I also know that the errors aren’t really hurting anything, but if nothing else they are making the ULS logs a little bigger and honestly, I don’t want a farm that has known errors in it.

So I set out to understand where they were coming from and how to safely get rid of them.

Finding these in the ULS logs

They are all over our ULS logs, but it’s nice to have a quick way to validate whether they are still there, so I did a search with the Windows findstr command:

findstr /C:"is not allowed for web" *.log


The first thing I wanted to do was see if there was an obvious, easy fix – ie from site settings, site features, or site collection features, is there a newsgator feature that’s enabled that I can just turn off?

I tried this, and no, there wasn’t.

The solution turned out to be painfully simple.

In the ULS logs, there were entries like this:

Failed to create a custom control 'CustomMenu_NewsStreamAdmin', feature 'NewsGator.NewsManager_Actions' (id:16c89384-881d-44aa-a6f5-f66301596851) using attributes (ControlSrc='', ControlAssembly='NewsGator.NewsManager.Web, Version=, Culture=neutral, PublicKeyToken=a1b9791f4e4509c7', ControlClass='NewsGator.NewsManager.Web.NewsStreamAdminActions': System.ArgumentException: The control with assembly name 'NewsGator.NewsManager.Web, Version=, Culture=neutral, PublicKeyToken=a1b9791f4e4509c7' class name 'NewsGator.NewsManager.Web.NewsStreamAdminActions' is not allowed for web
at URL ''. The type is not registered as safe.

The error above is always paired with another less descriptive error – but the error above turns out to have all the information we need – the id. (In this case id 16c89384-881d-44aa-a6f5-f66301596851)

In PowerShell, Get-SPFeature will list all the features on the farm – in my case it showed the above ID.

Now, given that NewsGator is legitimately installed on our farm on ONE web application (URL), I didn’t want to remove it from the farm!

What was helpful was the command:

Get-SPFeature -WebApplication

This showed that the feature was associated with that web application and it also showed that it was webapplication scoped.

So next I used the command

Disable-SPFeature -Identity 16c89384-881d-44aa-a6f5-f66301596851 -Url

I had to do this for a few different features – pulling the ID from the ULS logs and running the disable command. While I’m sure it could be automated, I preferred handling it hands-on, doing them one at a time and confirming my SharePoint sites still worked as expected.
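For what it’s worth, if you did want to batch it, a loop over the IDs collected from the ULS logs might look like this (the URL is a placeholder, and I’d still verify the sites after each run):

```powershell
# Sketch only - I did these one at a time, but a batch version could look like:
$featureIds = @("16c89384-881d-44aa-a6f5-f66301596851")   # IDs gathered from the ULS logs
foreach ($id in $featureIds) {
    Disable-SPFeature -Identity $id -Url "http://yourwebapp" -Confirm:$false
}
```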

After that, the errors in the ULS log stopped for that site, and Get-SPFeature -WebApplication no longer showed that feature.

It was a great feeling to get these nagging recurring ULS entries to stop!



Remove Stuck Item from FAST search

I manage a FAST installation with a few million documents.

Something went wrong and FAST is returning results for files that no longer exist.

The “Normal” way to fix this is to do another crawl of the content source – In this case, it did not work.

The “best” way to fix this is to reset the index and re-crawl all the content.

Unfortunately, because of the size of our FAST install, this is not practical – it takes over a week to index everything.

In other words, fixing this problem the “right” way will also bring down FAST for at least a week for some content – not good.

On a support call, Microsoft told me of a quick way to remove individual results. It’s not quite as effective as a full index reset, but it has its place – for example, say a confidential document got crawled and the summary is showing up in search results. You’d want to get that out of the way right away, and this approach is good for that.

First, download the free tool FS4SP Query Logger by Mikael Svenson – I found version 3 on CodePlex with a quick internet search.
Run this on the FAST server and click the Start Logging button, then go do your search using whatever search page is returning the bad results.
Once you see your search term show up in the upper left, look at the result XML and find the result.
You’ll want to grab the value of the “contentid” field – it will look something like this:

 <FIELD NAME="contentid">ssic://SomeNumberHere</FIELD>

Be sure the area of XML you are looking at matches the search result you are trying to eliminate!

Now, also on the FAST server, open a FAST PowerShell command prompt and enter the command:

DOCPUSH -c sp -U -d ssic://YourNumberHere

Just like that, your search result should stop appearing in search results.


As a side note, while we were looking at some things, we used a clever PowerShell command to search multiple directories for some text:

 select-string <longIDnumberrepresentingwhatwewerelookingfor> [0-9]\*[0-9]\index_data\urlmap_sorted.txt

Select-String is like grep in Unix or findstr in Windows – it looks for strings.
What was neat here was the regular expression in the path: it limited the search to just a few key directories.

A few quick commands to tell if FAST search is working

This post applies to users of Microsoft FAST search for SharePoint 2010.

Here are a few commands you can run on the FAST server to see if it’s healthy, and to get a feel for whether the back end is working in cooperation with the front-end crawlers.

nctrl status will show whether all the FAST services are started and running.

indexerinfo status dumps some XML – the key piece here is the partitions. If you run indexerinfo status a few times, you should see some movement in these (the percentages will change).

Corrupt WMI fix

My good friend Jeremy shared this simple WMI fix.


Winmgmt /backup filename
Winmgmt /verifyrepository
if that fails, try
Winmgmt /salvagerepository
[if that fails, as a last resort:]
[Winmgmt /resetrepository]

SharePoint 2010 Session State Service

SharePoint has two state service commands that threw me for a loop.

First there is Get-SPSessionStateService;
then there is Get-SPStateServiceApplication.

Here’s an article from MSDN that talks about the differences

In Central Admin->Manage Service Applications these show up as follows:

Get-SPSessionStateService -> shows as type “SharePoint Server ASP.NET Session State Service”
Get-SPStateServiceApplication -> shows as type “State Service” and hopefully “State Service Proxy”

While you can easily delete both from Central Admin, you can create neither of them from the Service Applications page.

Creating a new SPStateServiceApplication (and proxy) is relatively easy – 3 lines of PowerShell:

# from
$serviceApp = New-SPStateServiceApplication -Name ""
New-SPStateServiceDatabase -Name "" -ServiceApplication $serviceApp
New-SPStateServiceApplicationProxy -Name "" -ServiceApplication $serviceApp -DefaultProxyGroup

Creating a new SPSessionStateService, on the other hand is a little more involved…

How do I know?

I’m glad you asked….

I ran into an issue where an Access report would not display because “session state is not turned on.” It didn’t say which one, and through some trial and error I now understand it was likely looking for the service returned by Get-SPSessionStateService.

For me, that returned a blank line with no database entry, so I thought I’d be best off deleting it and recreating it from scratch.

I was wrong.

While deleting and recreating the SPStateServiceApplication is easy, the SPSessionStateService was not easily recreated in SP2010 with the included PowerShell commands.

Fortunately I found this article, which had the steps to recreate the service manually:

I enabled the ASP.NET State Service (the Windows service), then followed the article above, stopping about halfway through, before the provisioning part.

To Provision it, I used Enable-SPSessionStateService -DefaultProvision

Get-SPSessionStateService now returns a complete row, with a database server, DB name, and ID, and best of all Enabled = True.

So to summarize my problem:
MS Access Services reports needed the SPSessionStateService, which in turn uses the Windows service “ASP.NET State Service”.

In troubleshooting, I wasn’t aware of the difference between the two state services, so I deleted the “wrong” one in an attempt to reset it.
A little digging and I now have a better understanding of the issue and of the two different state services.

I hope this helps!

Simple Powershell script connect to servers as a different user

A common practice in IT is to have a separate admin account to connect to servers.

Often in day to day administration of SharePoint servers, it’s necessary to connect to the c$ or d$ share to look at a log file, copy an installer, etc…

You can do this from windows, and it will usually prompt you for credentials, but that can be a pain if you regularly connect to a bunch of machines that need different credentials.

This script will prompt you for a password, then use that password along with a pre-defined user account and server list to connect you to each server in advance.

function mapdrives {
   #Update these variables for your environment:
   $account = "domain\useraccount"
   $serverlist = @("Server1", "Server2", "Server3", "Etc...")
   $SecurePwd = Read-Host -AsSecureString "Enter password for $account"
   $pwd = ConvertTo-PlainText $SecurePwd
   foreach ($Server in $serverlist) {
        net use \\$Server /d
        net use \\$Server /user:$account $pwd
   }
   Write-Host "Done mapping drives"
}

# This function came from Oisin Grehan, MVP 
# via:
Function ConvertTo-PlainText( [security.securestring]$secure ) {
   $marshal = [Runtime.InteropServices.Marshal]
   $marshal::PtrToStringAuto( $marshal::SecureStringToBSTR($secure) )
}

VMware Fusion 5 released – finally adds folders!

VMware Fusion 5 was released this week.
Here are a few quick thoughts…

They now sell a “regular” ($49) and “Professional” ($99) version

With the introduction of the Professional edition, VMware is only selling upgrades to Fusion 5 Professional, for $49.

At first this might seem unfair, but I checked my order history: I paid $49 for the upgrade from v3 to v4, so the upgrade price is the same as last time around, and you’re getting the higher-end version of the product.

Now for the good news…

Historically VMware Workstation on the PC has had more features and was better suited to running VM labs for trying out new stuff.

Fusion seemed more consumer-focused – i.e. it was a good fit for someone who needed to run one copy of Windows 7, but it wasn’t as good as Workstation for someone trying to manage, say, a dozen or two virtual machines for various labs.

Fusion 5 Pro introduced 2 new features that just made this a lot more viable on the mac:

– Folders – this is such a simple but necessary container. If you have more than a few VMs, it’s useful to be able to group them into folders.

– Networking – this isn’t as good as I remember it on the PC version – there are no settings to limit bandwidth (useful for simulating slower connections) – but it’s nice to see them add the feature. It does mean you can likely set up multiple, isolated private networks (useful for isolating VM labs from each other).

– Lastly, an annoying behavior added in VMware Fusion 4 has been resolved – it used to be that when you launched a VM, the Virtual Machine Library would disappear. That was a bit of a pain if you had to kick off 3 or 4 VMs – the window stays put now.

– The Virtual Machine Library also now features both a list and an icon view. The list view is very nice, with a tree view of your VMs (and folders) on the left and a preview window on the right. Under the preview window it shows a few lines of the notes field, so you can easily see what the selected VM is all about – great if you leave yourself notes on what each VM is for. Under that is a storage breakdown and the ability to see how much disk space you can reclaim.


In my environment, I upgraded from v4 – one of my VMs (Windows 7) was suspended, and that worked fine in 5, but it warned me that some things were different and recommended that I shut down the machine so that the VMware Tools could be upgraded and the compatibility setting could be updated.

Another machine had a snapshot, and that seemed to go OK as well – though as a precaution, after the VMware Tools were updated I deleted my snapshot and created a new, current one.




Simple PowerShell to Enable BLOB Cache on multiple SharePoint sites.

I needed to enable / configure BLOB caching on multiple SharePoint sites.

This is done by editing the web.config of each SharePoint site, on each SharePoint server.
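For context, the element being edited is the <BlobCache /> line inside each web.config. In SharePoint 2010 it looks something like the fragment below out of the box – note the path regex is trimmed here for readability, so check your own file for the full list of extensions:

```xml
<BlobCache location="C:\BlobCache\14" path="\.(gif|jpg|jpeg|png|css|js)$" maxSize="10" enabled="false" />
```

The script below flips enabled to true and points location at a dedicated drive.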

I wrote this quick-and-dirty script so I would not need to edit all the web.configs by hand (I had about 20 web.configs to touch).

Note that since it’s just editing the web.config, we don’t need to run this in a SharePoint shell – I ran it from an ordinary PowerShell command prompt on my workstation.

The script:

Echo "Run this script under an admin account with rights to the servers being hit"
$dir = "\\Server\c$\inetpub\wwwroot\wss\VirtualDirectories"
$currentDate = (get-date).tostring("MM_dd_yyyy-hh_mm_ss")
# loop through each subdirectory to find each SharePoint site.
foreach ($subdir in dir $dir)
{
   #  In my case, all my SharePoint sites had a common string as part of the folder names,
   #  so the Contains statement was an easy way to only touch the web.config's of actual SharePoint sites
   #  while leaving alone central admin and other non-SharePoint websites IIS had.
   #  ("YourSitePattern" is a placeholder - substitute whatever your site folders have in common.)
   if ($subdir.Name.Contains("YourSitePattern"))
   {
        $path = $dir + "\" + $subdir.Name + "\Web.config"
        echo $path
        # back up the web.config before touching it
        $backup = $path + "_$currentDate.bak"
        Copy-Item $path $backup
        $xml = New-Object XML
        $xml.Load($path)
        $element = $xml.configuration.SharePoint.BlobCache
        echo $element
        $element.location = "X:\BlobCache\14"
        $element.enabled = "true"
        $element.maxSize = "5"
        echo $element
        $xml.Save($path)
   }
}

Sharepoint Search: Fix for Office 2007 titles not showing up properly in search results

If search results from SharePoint (not FAST) search are not showing the right title, and instead are showing a few words from the content of the document, there’s a registry setting you can set to fix that.

The registry setting will be found on any machine running SharePoint search (Central Admin -> Manage servers in this farm will show you which boxes have this role).

These PowerShell commands will show what the values currently are (or an error if they’re not found – a good sign that you’re on the wrong machine!):

  $RegKey = "HKLM:\SOFTWARE\Microsoft\Office Server\14.0\Search\Global\Gathering Manager"
  cd $RegKey
  $key = "EnableLastModifiedOverride"
  Get-ItemProperty -Path . -Name $key | Select $key
  $key = "EnableOptimisticTitleOverride"
  Get-ItemProperty -Path . -Name $key | Select $key

(You can see the registry entries in the code – you can edit these manually if you’d like.)

This script changes the values to 0, fixing the Office 2007 issue in SP2010 search:

$RegKey ="HKLM:\SOFTWARE\Microsoft\Office Server\14.0\Search\Global\Gathering Manager\"
set-ItemProperty -path $RegKey -name EnableLastModifiedOverride  -value 0
set-ItemProperty -path $RegKey -name EnableOptimisticTitleOverride -value 0

After you’re done with the above, restart the SharePoint search service and do a full crawl – it is not necessary to reset the index.
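Those last two steps can be scripted as well – a sketch for SP2010, assuming the search windows service is named OSearch14 on your box and that you want a full crawl of every content source:

```powershell
# Restart the "SharePoint Server Search 14" windows service (run on the search server)
Restart-Service OSearch14

# Kick off a full crawl of each content source - no index reset required
$ssa = Get-SPEnterpriseSearchServiceApplication
foreach ($cs in Get-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa)
{
    $cs.StartFullCrawl()
}
```

Run this from a SharePoint Management Shell, since it uses the search cmdlets.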

Tips for installing the SharePoint 2013 preview

I recently ran into a few issues installing SharePoint 2013 that can easily be avoided by installing in a given order.

Firstly, the quick install – this seemed problem free:
1) Build a new Windows 2008 R2 SP1 Standard edition VM
(I gave mine 3GB of RAM)
1a) I joined it to a domain (I did not build the VM as a domain controller)
2) Be sure you have an internet connection
3) Install SharePoint 2013 Preview and choose the standalone version (the one that will install the free version of SQL 2008 R2)
This seemed to work fine.

The trouble came when I tried to install it with the latest version of SQL.
Here’s what I did:
(Warning: this fails!)
1) Build a new Windows 2008 R2 SP1 Standard edition VM
2) Install the RTM version of SQL 2012 – it needed .NET 4 and IIS
(I think this is where things went wrong: SQL 2012 configured IIS with .NET 4, not 4.5)
3) Tried to run the SharePoint install – it failed on the pre-reqs. It seems in my case it could not determine if IIS was set up with .NET properly. I even tried the old aspnet_regiis -i command to force it, but still the pre-req installer would stop at that point.

I must admit, I was in a hurry to see the new SharePoint, so I didn’t take additional troubleshooting steps (I could have downloaded the rest of the prereqs manually and tried the SharePoint install, but I did not).

Instead I figured I’d try a different approach – first I tried the install I listed at the top of this article. That worked like a champ, but I wanted the SQL 2012 based install.

Next I did this:
1) New VM with Windows Server 2008 R2 SP1
2) Run the pre-req installer from the SharePoint 2013 install ISO
(this configured IIS and .NET 4.5)
3) Do NOT run the SharePoint installer
4) Install SQL 2012 – since IIS and .NET 4.5 are already installed, it should leave them alone.
5) Come back and do the SharePoint install
It’s installing now – I’ll post the results when it’s done.
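For reference, step 2 just means running the installer from the root of the ISO – a sketch assuming the ISO is mounted as D: (and, if I remember right, the /unattended switch suppresses the wizard prompts):

```powershell
# Assumes the SharePoint 2013 Preview ISO is mounted as D: - adjust as needed
Set-Location D:\
.\prerequisiteinstaller.exe /unattended
```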

Setting the default 3 groups in SharePoint from PowerShell

I ran into this today –

Tacking /_layouts/permsetup.aspx onto a site’s URL brings up a list of the 3 standard groups for a site:

  • one for visitors
  • one for members
  • one for owners

Today when I tried to change one through the GUI, it threw an error.
The ULS logs were a mess, and not wanting to lose a day opening a case with MS, I tried PowerShell.

The solution is fairly easy.
Grab a pointer to the web in question with:

$web = Get-SPWeb http://yoursite   # substitute the URL of the web in question

You can see all the web’s properties with:

$web | fl

The 3 that we’re interested in are:

  • AssociatedVisitorsGroup
  • AssociatedMembersGroup
  • AssociatedOwnersGroup

We can set these with PowerShell easily:

$web.AssociatedVisitorsGroup = $web.SiteGroups["Nameofthegroup"]
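Putting it all together, here’s a sketch that assigns all three groups and persists the change. The site URL and group names are placeholders (the groups must already exist in the site collection), and note the Update() call is needed for the assignments to stick:

```powershell
# Placeholders: substitute your own site URL and existing group names
$web = Get-SPWeb http://yoursite/subsite
$web.AssociatedVisitorsGroup = $web.SiteGroups["Site Visitors"]
$web.AssociatedMembersGroup  = $web.SiteGroups["Site Members"]
$web.AssociatedOwnersGroup   = $web.SiteGroups["Site Owners"]
$web.Update()     # persist the three assignments
$web.Dispose()    # release the SPWeb object
```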

A list of SharePoint virtual file paths

This will start out to be an incomplete list, but should grow with time.
These are links that are sometimes handy to have when the UI doesn’t display them.




Path and explanation:

_layouts/viewlists.aspx – same as Site Actions -> View All Site Content
_catalogs/masterpage/Forms/AllItems.aspx – view master pages for this site collection
_layouts/changesitemasterpage.aspx – changes the master page for the site collection (must be called from the site collection URL, not a subweb URL)
_layouts/permsetup.aspx – assigns the 3 magic groups to a SharePoint web; called from a web or subweb URL
_catalogs/lt/Forms/AllItems.aspx – List Template Gallery
_catalogs/solutions/Forms/AllItems.aspx – Site Template (/Solutions) Gallery
_layouts/settings.aspx – Site Settings
_layouts/user.aspx – a list of all groups and users in a given site
_layouts/groups.aspx – a list of all groups in a given site collection
_layouts/AreaTemplateSettings.aspx – chooses which site templates are available when creating a new site in a given site collection
_layouts/quiklnch.aspx – an odd list view of the quick launch items – not the one you get to from Site Settings; this is linked to from the “Getting Started” web part
_layouts/qlreord.aspx – same as above, but this one lets you sort the quick launch items
_layouts/AdminRecycleBin.aspx – end user Recycle Bin
_layouts/AdminRecycleBin.aspx?View=2 – deleted from end user Recycle Bin
PageName.aspx?contents=1 – view the web parts on a page – good for times when a web part keeps a page from rendering in normal mode

Get all SharePoint users in the farm with PowerShell to a CSV file

This is a script that gets each web application in the farm, enumerates all the site collections and webs, and dumps every user to the screen as well as a CSV file.

The Current date and time is always appended to the file name so you don’t have to worry about wiping out previous results.

Add-PSSnapin microsoft.sharepoint.powershell -ErrorAction SilentlyContinue
$timestamp = get-date -format "yyyyMMdd_hhmmtt"
$filenameStart = "AllFARMUsers"
$logfile = ("{0}{1}.csv" -f $filenamestart, $timestamp)
$header = "type,user,group,weburl,webname"
$header | out-file -FilePath $logfile
$iissitelist = get-spwebapplication
foreach ($onesite in $iissitelist)
{
	foreach ($SiteCollection in $onesite.sites)
	{
		write-host $SiteCollection -foregroundcolor Blue
		foreach ($web in $SiteCollection.Allwebs)
		{
			write-host "    " $web.url "users:" -foregroundcolor yellow
			# users added directly to the web
			foreach ($userw in $web.users)
			{
				# uncomment to limit output to a single domain:
				# if ($userw -like "domain\*") {
				write-host "        " $userw -foregroundcolor white
				$msg = ("RootUser,{0},-,{1},{2}" -f $userw, $web.url, $web.name)
				$msg | out-file -FilePath $logfile -append
				# }
			}
			# users who belong to SharePoint groups on the web
			foreach ($group in $web.Groups)
			{
				Write-host "        " $web.url $group -foregroundcolor green
				foreach ($user in $group.users)
				{
					Write-host "            " $user -foregroundcolor white
					$msg = ("GroupUser,{0},{1},{2},{3}" -f $user, $group, $web.url, $web.name)
					$msg | out-file -FilePath $logfile -append
				}
			}
		}
	}
}

SharePoint PowerShell to add a user from a trusted domain to SharePoint

Our SharePoint farm was in Domain A and we wanted to grant rights to a group in Domain B.
It worked fine from the GUI, but PowerShell Add-SPUser or New-SPUser failed – both stating the user ID we were adding was no good.
Specifically this was for My Sites – we had thousands of them, so doing it by hand wasn’t an option.

# Substitute your own web application URL below
$app = Get-SPWebApplication -Identity http://yourwebapp
foreach ($site in $app.Sites)
{
    write-host "Updating $site"
    $web = $site.RootWeb
    $web.AllUsers.Add("DomainB\Domain Users", [System.String]::Empty, "Domain Users", [System.String]::Empty)
    Set-SPUser -Identity 'DomainB\Domain Users' -Web $web.Url -AddPermissionLevel 'Read'
}

Graphics card fix for 15″ “JackBook Pro”

I’ve had trouble with my MacBook Pro since upgrading to Lion (OS X 10.7).

After much digging the problem can be traced back to the driver for the graphics card power management.

It’s talked about in depth in this discussion, but it’s always so hard to find the page with the solution that I thought I would write it down here…

This file is located at /System/Library/Extensions/AppleGraphicsPowerManagement.kext

Right click it and choose “Show Package Contents”

Then edit the info.plist file with TextWrangler.

In TextWrangler, click the icon at the top – it has a pencil with a red line through it – this is a protected file and you need to authenticate to edit it.

Now search for MacBookPro6,2

Mine was around line 1421 or so

There are some nested settings, I had to change them as follows:

Changed to 250, was 100 ----------------------> <integer>250</integer>
This key is new in 10.7.x --------------------> <key>P3HistoryLength</key>
The key and value did not exist before ------->	<integer>2</integer>
Changed to 4, was 10 ------------------------->	<integer>4</integer>
Changed to 88, was 80 ------------------------>	<integer>88</integer>

Remove a stuck timer job in SharePoint using Powershell

I recently had a stuck timer job in our SharePoint farm.
It seemed like an easy thing for PowerShell, but it turned out to be one step more complicated. I’m not sure why, but here’s the solution I used – thanks to Todd from the vendor I was working with for providing the fix!

We can use the Get-SPTimerJob cmdlet to see all timer jobs in our SharePoint farm.

If we add a nice little where clause, we can limit the list to a single item:

Get-SPTimerJob | where {$_.Name -like "Name of your stuck job"}

Normally I’ve been able to assign the results to a variable

ie like this:

$badjob = Get-SPTimerJob | where {$_.Name -like "Name of your stuck job"}

Which works.
What didn’t work, however, was this:

$badjob.Delete()
For some reason, I got an error that there was no delete method.

So instead:

Get-SPTimerJob | where {$_.Name -like "Name of your stuck job"} | fl
# I then read the ID from the output of the above (note I added | fl at the end) 
# and I copied and pasted it into this command:
$badjobTake2 = Get-SPTimerJob -ID (pasted the ID here)
$badjobTake2.Delete()  #this worked

I’m not sure what the difference is – perhaps the where clause matched more than one job, so $badjob came back as an array (which has no Delete method), or maybe I just fat-fingered it the first time..
but that’s how it got resolved.
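If you hit the same error, another approach that sidesteps the intermediate variable entirely is to pipe straight into the delete – a sketch, with the name filter as a placeholder:

```powershell
# Delete each matching timer job directly off the pipeline
Get-SPTimerJob | Where-Object { $_.Name -like "Name of your stuck job" } | ForEach-Object { $_.Delete() }
```

ForEach-Object calls Delete() on each job individually, so it doesn’t matter whether the filter matched one job or several.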