
Why Slack is winning

Or: the importance of cross-platform apps.

I saw a Twitter post the other day about how Microsoft bought Yammer for a billion dollars, and how Slack came in and kinda took over what Microsoft thought Yammer could be.

It made me reflect a bit.

I’d thought of using Slack on a few projects I have, but I had never thought of using Yammer.

Why is that?

Slack has a Mac app.

Could something as simple as that be it?

Certainly Slack has done quite a few neat things, but really, at the end of the day, aren’t Slack, Facebook, Yammer, NewsGator, you-name-it all pretty much re-imaginings of the “bulletin board”?

Let’s look at the importance of having ‘native’ apps. (I put native in quotes because, in this case, it’s not really important whether the app is “truly” native – it might have been developed once using a cross-platform framework.) Note that everything in the list below is solved by having an “app for that.”

Let me share some things I don’t like about your web app (this is ANY web app):

  1. I don’t like not being able to start your web app easily.
  2. I don’t like having to sign on to your web app.
  3. I don’t like that when your web app is running, I can’t easily tell which of my running apps is yours and which is the Google page I opened.
  4. I don’t like that I can’t easily bring your web app to the front of my screen. (because of #3)
  5. I don’t like that I accidentally close your app when I close my browser, and have to go through steps 1 and 2 all over.

In other words, if I’m doing something every day, the 5 things above will annoy me as a user EVERY DAY, and I will seek alternatives.

The original folks that created Yammer made a nice app using Adobe AIR – this was a cross-platform app that worked on Mac and Windows, and I used it when I was first introduced to Yammer.

Then Microsoft killed it.

The day the Yammer Mac app stopped working on my Mac was the last day I used Yammer. I had actually forgotten that Yammer existed until I saw the tweet that sparked this blog post.

I’m sure some will argue that there are other reasons Yammer failed to live up to its expectations, and some will argue there are other reasons Slack is so popular right now. I don’t mean to discount those reasons; I just felt like sharing that Yammer lost me when they dropped Mac support.

I use Slack now.

A review of several denoise Audio plugins

I was looking for a way to clean up my audio, so I downloaded a few demos of denoise plugins and made this YouTube video:

It’s 20 minutes long. If you don’t have time to view it, here’s what I covered:

Acon Digital Denoiser (part of the Audio Restoration Suite, $99)

iZotope RX4 Dialog Denoiser (part of the RX4 standard suite, $349)

Waves NS1 Denoise plugin ($200)

Waves WNS Denoise plugin ($500)

The Denoiser that comes with Logic Pro X

In Summary:

There were two categories of results:

  1. Plugins that helped without making it worse.
    1. Waves NS1
    2. Waves WNS
    3. RX4 Dialog Denoiser
  2. Plugins that were able to make the sound worse.
    1. Acon Digital Denoiser
    2. Logic Pro X’s Denoiser

Overall I felt that RX4 did the best job, working like magic for my voice and the noise I was trying to remove.

Waves NS1 did a pretty nice job also.

Waves WNS didn’t do a whole lot for me, but maybe on a different sound source and with different background noise it might have.

Acon’s and Logic’s denoisers were both troubling: with both, it was easy to introduce a ‘gurgling’ sound. I am always leery of plugins that add artifacts like this.

If you’re in the market for such plugins, put on a pair of headphones and listen in a quiet room, and you’ll pretty clearly hear the difference.

Home Made DIY Acoustic Panels

I’ve recorded a few training videos here and there. Being a bit of an Audio Nerd, I’m really concerned with sound quality. The room I did my recordings in was a small office with lots of hard surfaces, and there was a noticeable echo.

As my Audio Engineer friend Bob Demaa would say: “You can take the mic out of the room, but you can’t take the room out of the Mic”

So here’s what I set out to do:

  • Either buy or build panels
  • Some ceiling mount
  • Some wall mount
  • Ideally, lower cost, yet still effective

Here’s what I did:

IMG_1181   IMG_1105  IMG_0996

Parts were sourced from three suppliers.

I chose all suppliers based on location – I’m in the Chicago Area, and the Insulation Boards, which are typically the hardest thing to source, are in stock at Fabrication Specialties and not too long of a drive.

Technique:

DIY panels are both easy and cheap – I wanted these to look as clean as possible, while also being as effective and lightweight as possible.

I’d seen several articles and loosely based my work on this one: http://acousticsfreq.com/blog/?p=62

In that blog post, Eric makes an inexpensive frame around the insulation using 1×2 furring strips from Home Depot.

It seemed like a great idea so I did the same thing, with a twist: I put my wooden frame behind the insulation:

IMG_1067

 

In the picture above I have a 4″ thick piece of Rock Wool, with the frame boards laid out.

As I was walking through Home Depot, I wanted to find something inexpensive and easy to work with that I could use on the front side of the rock wool to maintain a sharp edge. It turns out plastic drywall edging was perfect! It’s not only lightweight and easy to cut and work with, it’s perforated, which means I’m not giving up much in the way of acoustic performance around the edges!

IMG_1072     IMG_1074

 

I used a pair of tin snips to cut the plastic corners, and they worked perfectly.

I wrapped the ceiling panels with white fabric from JoAnn, using a hand stapler:

IMG_0987

 

The hand stapler got a little tiring, and I already owned an air stapler/compressor so I switched over to that after the first one.

I used eye bolts to hang them; here’s a close-up of how that worked. The piece you see with the threads was screwed into the ceiling. That was the worst part of this whole project: tolerances are pretty tight, and I had in mind that I would screw the hook all the way to the ceiling for a near-flush-mount look. My goal was not to see the hooks, which, if you think about it, is kinda funny. After all, I’m hanging these ridiculous panels from the ceiling; it’s not like hiding the hooks is going to increase the “Wife Acceptance Factor” at that point…

IMG_0995   IMG_1078

Here are a few finished shots of the 4″x2’x2′ panels:

IMG_0989 IMG_0990 IMG_0997

 

I learned a few things building these that I’d like to pass on:

  • White fabric is actually kinda hard to make non-transparent – as you can see in the photo, you can kinda see where the drywall edging is underneath. To get around this, JoAnn sells a very cheap white fabric called muslin; it might be worth a layer of that first, or better still, pick a fabric color that isn’t white.
  • The metal hooks I hung the panels with were nearly impossible to get perfectly lined up – if you can stomach seeing the hooks and the eyelets, it’s likely 10x easier to hang them.
  • The metal hooks would also rattle when the floor above was walked on. This eventually either stopped or I don’t notice it anymore, but it could have been easily prevented by putting heat shrink tubing on the hooks, or by wrapping them in electrical tape.

Ok on to the wall panels!

For the wall panels, I used 2′×4′ pieces of 2-inch-thick rock wool.

I again placed the frame behind the material, and I apologize that I don’t have a bunch of pictures from this phase.

IMG_1081

 

In the picture above you see a nearly completed frame, and you’ll notice I have a center support in this one.

In the next one I built, I switched to two center supports spaced at 1/3 intervals, like this:

newframe

I did this so I could hang the frame either vertically or horizontally.

Speaking of hanging, I wanted something that would hang flush to the wall if possible, and I didn’t want to spend $15 per panel on elaborate hardware.

I thought about making a french cleat, but I didn’t have a table saw.

I found some trim that looked like it would work so I nailed about a 1 foot piece to the inside top and side of every 2×4 panel I built.

Again, I don’t have a picture, hopefully this illustration will do:

frame with hangars  frame 4

The Trim piece works on the wall something like this:

hook on wall

 

Now for the painful part: Did it work? How long did it take? Did I save any money? Was it worth it?

Yes, it worked.  Echo / delay in that room is greatly reduced.

It took several weekends to assemble, plus a trip to the insulation supplier about 30 miles away, plus multiple trips to JoAnn Fabrics and Home Depot.

I built a total of 7 burgundy 2-inch-thick 2′×4′ panels for the walls, plus 4 white 4-inch-thick 2′×2′ panels for the ceiling.

That’s a total of 11 panels. I figure I spent $336, or about $31 a panel.

Was it worth it?

From an end results perspective, absolutely! Acoustically treating that room was the best thing I could have done!

From a cost/time/effort perspective, probably not. I was just talking with a friend today who had outfitted his home studio with panels from ATSAcoustics. Those panels are nearly identical to the ones I built above and only slightly more expensive, with the huge benefit that someone else has figured out all the variables and done all the work. There is something to be said for having something arrive at your doorstep ready to hang on your wall!

Update June 2015:

The DIY panels have held up well. My wife hijacked my office (shown above), and I needed to outfit a second space. I did a ton of research and settled on AudiMute panels, and I couldn’t be happier! There were a lot of things I liked about AudiMute versus the competitors, and I definitely recommend checking them out. DIY is still cheaper, but as you can see from the article above, DIY wasn’t super easy either! I can’t tell you how great it felt to just order panels, have them delivered, and hang them, with less than 30 total minutes of my time invested!

– Jack

Starter Lighting for at home webcasting

I did a few Skype interviews for SharePoint Saturday Chicago Suburbs and found that the room lighting wasn’t giving the look I wanted.

I looked at some professional lights, but they were expensive and I really didn’t have room for them in my small home office.

Ikea to the Rescue

I purchased two clamp mounted desk lamps from Ikea for $9 each. Initially I tried aiming them directly at me, but it was too distracting.

I then purchased two foam core boards from Staples for about $5 each – this setup is pictured below:

lights1

 

When I am not “filming” I can put both Foam boards away and my room is largely back to normal.

The background behind me still felt a little dark in the resulting video, so I grabbed one of those $5 clamp lights from Home Depot and temporarily put it on top of my bookcase, pointed at the ceiling, which reflected light back down and brightened up the back wall:

lights2

 

Total cost: $33 plus light bulbs

Come Say Hi at SharePoint Saturday Chicago Suburbs 2013

I’m part of a small team putting together a SharePoint Saturday event for the Chicago Suburbs.

Official Info is at http://www.spschicagosuburbs.com

We’re holding the event on Saturday, June 1st at the DeVry campus in Addison, IL – right about where I-355 and I-290 intersect.

We’ve got a ton of great sessions in the works, and like any SharePoint event, it’ll be a great opportunity to network with other SharePoint professionals!

I hope to see you there!

– Jack

Using MaintenanceWindows in SharePoint 2013

Screen shot of the Maintenance Window functionality in SharePoint 2013

 

At the SharePoint Conference this week, Session SP210 on upgrading to SP2013 mentioned a brand new feature that didn’t exist in the preview edition: MaintenanceWindows.

As you can see from the screenshot above, this feature puts up a simple banner alerting users of the upcoming work.

The message is localized in the user’s language, so long as language packs are installed.

The “More Information” link can point to any page you specify.

I was pretty excited about this, and couldn’t wait to try it out!

The PowerShell to do this wasn’t as easy as I expected.

I’ve pasted below what worked for me.
 

 #1st get a content database
 get-SPContentDatabase  #this will list them all
                        #copy and paste a database ID and use it to assign a specific DB to a variable
 $ContentDB = Get-SPContentDatabase -Identity #ID goes here
 
 #now we're going to add a maintenance window to the SPContentDatabase with $ContentDB.MaintenanceWindows.Add()
 #before we can do that we need to create a Maintenance window object and populate it.
 
 #                         Parameter List             "MaintenanceWarning or MaintenancePlanned",  Mt Start   ,  Mt End   , Notify Start, Notify End, duration , urlforinfo
 $MaintWindow = New-Object Microsoft.SharePoint.Administration.SPMaintenanceWindow "MaintenancePlanned", "1/1/2013", "1/2/2013", "11/16/2012" , "1/3/2013", "1:00:00", "http://www.mydomain.com/outageinfo.html"
    #Parameter List for above:
      #1: MaintenanceWarning or MaintenancePlanned,
      #2: Maintenance Start Date
      #3: Maintenance End Date
      #4: Notification Start Date
      #5: Notification End Date
      #6: Duration in the format of DD:HH:mm:ss - "1:00:00" = 1 hour, "1:00:00:00" = 1 day
      #7: URL for info
      # Parameters 2-5 all take a date time in this format: "1/20/2012" or "1/20/2012 5:00:00 PM"  
 
  #Now we can see the properties of a single MaintenanceWindow by just typing in $MaintWindow and hitting enter:
  $MaintWindow
 
  #for me this looked like this:
  # MaintenanceStartDate        : 1/1/2013 6:00:00 AM
  # MaintenanceEndDate          : 1/2/2013 6:00:00 AM
  # Duration                    : 01:00:00
  # NotificationStartDate       : 11/16/2012 6:00:00 AM
  # NotificationEndDate         : 1/3/2013 6:00:00 AM
  # MaintenanceType             : MaintenancePlanned
  # MaintenanceLink             : http://www.mydomain.com/outageinfo.html
  # UpgradedPersistedProperties :
 
  #ok with that out of the way, we just need to add it to the content database
  $ContentDB.MaintenanceWindows.add($MaintWindow)
  $ContentDB.Update()

Ok, so that’s it – refresh your website and you should see the pink banner from the screenshot above!

Note: I originally tried to do this by setting up a blank object without parameters and then setting the properties one by one, but I found that MaintenanceStartDate and NotificationStartDate could not be changed after the object was created.
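For completeness, the banner can be cleared early by removing the window from the same collection. I haven’t tested this path as thoroughly as the rest; the sketch below assumes MaintenanceWindows behaves like a standard list collection (and reuses the $ContentDB variable from above):

```powershell
#remove all maintenance windows from this content database, then persist the change
$ContentDB.MaintenanceWindows.Clear()
$ContentDB.Update()
```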

– Jack

Using Powershell to get a list of user IDs from AD

One of my network admin friends needed an easy way to provide some users with a list of names alongside their AD account names.

In many organizations this is easy to guess – for example, if my name is Jack Basement, my ID might be jbasement – but in this case it wasn’t that easy, so we needed to go to AD.

There are AD cmdlets, but they are not loaded in PowerShell by default.

If you have the Remote Server Administration Tools for Windows 7 installed, then you’ll find a link to “Active Directory Module for Windows PowerShell” in your administrator tools menu.

 

Using that we can easily get a list of the users needed and select just the columns we want

for example

Get-ADUser -identity jbasement #gets info on that user (-Identity takes a SamAccountName or distinguished name, not domain\user)
 
Get-ADUser -filter {name -like "jack*"} #returns all the people named Jack

We can combine that with Select-Object, like this:

Get-ADUser -filter {name -like "jack*"} | Select name, SamAccountname

Which gives us a nice list

and

Get-ADUser -filter {name -like "jack*"} | Select name, SamAccountname | convertto-csv

which will output it as comma-separated CSV (perfect for importing into Excel)

and

Get-ADUser -filter {name -like "jack*"} | Select name, SamAccountname | convertto-csv | out-file userlist.txt

which outputs the same thing, but to a file.
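As an aside, ConvertTo-Csv piped to Out-File works, but PowerShell also has Export-Csv, which writes the file in one step; the -NoTypeInformation switch suppresses the #TYPE header line that would otherwise appear at the top of the file:

```powershell
Get-ADUser -filter {name -like "jack*"} | Select name, SamAccountname | Export-Csv userlist.csv -NoTypeInformation
```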

 

Now one neat trick: often you want to output all the users in a particular container in AD (technically this is called an Organizational Unit, or OU)

There is a -SearchBase parameter we can use for this

What’s cool here is that distinguished names and LDAP filters are sometimes a pain to get “just right,” so we can cheat:

We can use the distinguished name of a known user in that group and grab the group from that

so for example

Get-ADUser -identity bJack

results in a bunch of output; the first field is the distinguished name, and we can copy and paste that for our next command

Get-ADUser -filter * -SearchBase "OU=mygroup,DC=basementjack,DC=com"

this outputs all the users in that OU

again we can chain for flexibility

Get-ADUser -filter * -SearchBase "OU=mygroup,DC=basementjack,DC=com" | select name, SamAccountName | sort-object name

 

Lastly, don’t forget get-help:

Get-Help Get-ADUser -examples

shows a few good examples.
 

 

VMware Fusion 5 released – finally adds folders!

VMware Fusion 5 was released this week.
Here are a few quick thoughts…

They now sell a “regular” ($49) and “Professional” ($99) version

Now with the introduction of the Professional edition, VMware is only selling upgrades to Fusion 5 Professional for $49

At first this might seem unfair, but I checked my order history and I paid $49 for the upgrade from v3 to v4 so the upgrade price is the same as the last go around, and you’re getting the higher end version of the product.

Now for the good news…

Historically VMware Workstation on the PC has had more features and was better suited to running VM labs for trying out new stuff.

Fusion seemed to be more consumer focused – i.e., it was a good fit for someone who needed to run one copy of Windows 7, but it wasn’t as good as Workstation for someone trying to manage, say, a dozen or two virtual machines for various labs.

Fusion 5 Pro introduced 2 new features that just made this a lot more viable on the mac:

– Folders – this is such a simple but necessary container. If you have more than a few VMs, it’s useful to be able to group them into folders.

– Networking – this isn’t as good as I remember it on the PC version – there are no settings to limit bandwidth (useful for simulating slower connections) – but it’s nice to see them add the feature. It does mean that you can likely set up multiple, isolated private networks (useful for isolating VM labs from each other).

– Lastly, an annoying behavior they added with VMware Fusion 4 has been resolved – it used to be that when you launched a VM, the “Virtual Machine Library” would disappear, which was a bit of a pain if you had to kick off 3 or 4 VMs. This window stays put now.

– The Virtual Machine Library also now features both a list and an icon view. The list view is very nice, with a tree view of your VMs (and folders) on the left and a preview window on the right. Under the preview window it displays a few lines of the notes field, so you can easily see what the selected VM is all about – this is great if you leave yourself notes on what each VM is for. Under that is a storage breakdown, with the ability to see how much disk space you can reclaim.

 

In my environment, I upgraded from v4. One of my VMs (Windows 7) was suspended, and that worked fine in 5, but it warned me that some things were different and recommended that I shut down the machine so that the VMware Tools could be upgraded and the compatibility setting updated.

Another machine had a snapshot, and that seemed to go OK as well – though as a precaution, after the VMware Tools were updated, I deleted my snapshot and created a new, current one.

 

 

 

Graphics card fix for 15″ “JackBook Pro”

I’ve had trouble with my MacBook Pro since upgrading to Lion (OS X 10.7).

After much digging, the problem can be traced back to the driver for the graphics card power management.

It’s talked about in depth in this discussion, but since it’s always so hard to find the page with the solution, I thought I would write it down here…

This file is located at /System/Library/Extensions/AppleGraphicsPowerManagement.kext

Right click it and choose “Show Package Contents”

Then edit the info.plist file with TextWrangler.

In TextWrangler, click the icon at the top – it has a pencil with a red line through it. This is a protected file, and you need to authenticate to edit it.

Now Search for MacBookPro6,2

Mine was around line 1421 or so

There are some nested settings, I had to change them as follows:

	<key>MacBookPro6,2</key>
			<dict>
				<key>LogControl</key>
				<integer>0</integer>
				<key>Vendor10deDevice0a29</key>
				<dict>
					<key>BoostPState</key>
					<array>
						<integer>0</integer>
						<integer>1</integer>
						<integer>2</integer>
						<integer>3</integer>
					</array>
					<key>BoostTime</key>
					<array>
						<integer>3</integer>
						<integer>3</integer>
						<integer>3</integer>
						<integer>3</integer>
					</array>
					<key>Heuristic</key>
					<dict>
						<key>ID</key>
						<integer>0</integer>
						<key>IdleInterval</key>
Changed to 250, was 100 ----------------------> <integer>250</integer>
This key is new in 10.7.x --------------------> <key>P3HistoryLength</key>
the key and value did not exist before  +----->	<integer>2</integer>
						<key>SensorOption</key>
						<integer>1</integer>
						<key>SensorSampleRate</key>
Changed to 4 was 10 -------------------------->	<integer>4</integer>
						<key>TargetCount</key>
						<integer>1</integer>
						<key>Threshold_High</key>
						<array>
							<integer>57</integer>
							<integer>70</integer>
Changed to 88 was 80 -------------------------------->	<integer>88</integer>
							<integer>100</integer>
						</array>
						<key>Threshold_High_v</key>
						<array>
							<integer>1</integer>
							<integer>3</integer>
							<integer>98</integer>
							<integer>100</integer>
						</array>
						<key>Threshold_Low</key>
						<array>
							<integer>0</integer>
							<integer>68</integer>
							<integer>75</integer>
							<integer>95</integer>
						</array>
						<key>Threshold_Low_v</key>
						<array>
							<integer>0</integer>
							<integer>2</integer>
							<integer>4</integer>
							<integer>99</integer>
						</array>
					</dict>
					<key>control-id</key>
					<integer>17</integer>
				</dict>

Powershell Script to disable Certificate Revocation List (CRL)

#the following statement goes on one line
set-ItemProperty -path "HKCU:\Software\Microsoft\Windows\CurrentVersion\WinTrust\Trust Providers\Software Publishing" -name State -value 146944
 
#the following statement goes on one line also
set-ItemProperty -path "REGISTRY::\HKEY_USERS\.Default\Software\Microsoft\Windows\CurrentVersion\WinTrust\Trust Providers\Software Publishing" -name State -value 146944
 
get-ChildItem REGISTRY::HKEY_USERS | foreach-object {set-ItemProperty -ErrorAction silentlycontinue -path ($_.Name + "\Software\Microsoft\Windows\CurrentVersion\WinTrust\Trust Providers\Software Publishing")  -name State -value 146944}
 
Write-Host -ForegroundColor White " - Disabling Certificate Revocation List (CRL) check..."
ForEach($bitsize in ("","64")) 
{			
  $xml = [xml](Get-Content $env:windir\Microsoft.NET\Framework$bitsize\v2.0.50727\CONFIG\Machine.config)
  If (!$xml.DocumentElement.SelectSingleNode("runtime")) { 
    $runtime = $xml.CreateElement("runtime")
    $xml.DocumentElement.AppendChild($runtime) | Out-Null
  }
  If (!$xml.DocumentElement.SelectSingleNode("runtime/generatePublisherEvidence")) {
    $gpe = $xml.CreateElement("generatePublisherEvidence")
    $xml.DocumentElement.SelectSingleNode("runtime").AppendChild($gpe)  | Out-Null
  }
  $xml.DocumentElement.SelectSingleNode("runtime/generatePublisherEvidence").SetAttribute("enabled","false")  | Out-Null
  $xml.Save("$env:windir\Microsoft.NET\Framework$bitsize\v2.0.50727\CONFIG\Machine.config")
}

Powershell to test a URL against UAG rulesets

Many companies use Microsoft Forefront Universal Access Gateway (UAG) to publish sharepoint sites to the public internet.

We recently had a problem where Office (Word, Excel, PowerPoint) documents would not open in the Office apps on end users’ home PCs when accessed through a SharePoint site published via UAG.

In UAG there are a bunch of rules that match the URL in question via a regex.
We needed a quick way to test our URL against the regex in each and every rule, so we knew which rules applied.

There currently isn’t a way to do that in UAG (there should be).
So, as an alternative to testing each rule manually, I used the “Export rules” feature, then wrote the following PowerShell script to parse the exported file, gather each rule’s regex, and test the URL in question against each one so you can see which rules are actually being applied.

# if you run from ISE and get a permissions error copy and paste everything to a new tab.
# or run get-executionpolicy (take note what it is) then run (as an administrator): set-executionpolicy unrestricted
#set these two variables -
$ExportedRuleSetFile = get-content 'c:\PS\PRD_URLset.txt'
$URLtoFind = "/get/content/from/sharepoint.aspx"
Function FindAMatch($Rulefile, $URL)
{
	$table = New-Object system.Data.DataTable "RegExList"
	$col1 = New-Object system.Data.DataColumn Section,([string])
	$col2 = New-Object system.Data.DataColumn Name,([string])
	$col3 = New-Object system.Data.DataColumn RegEx,([string])
	$table.columns.add($col1)
	$table.columns.add($col2)
	$table.columns.add($col3)
 
	$row = $table.NewRow()
	$row.Section = "Section"
	$row.name = "Name"
	$row.RegEx = "Regex"
 
	foreach ($line in $Rulefile )
	{
		if ($line[0] -eq "[")
		{
			$table.Rows.add($row)
			$row = $table.NewRow()
			$row.Section = $line
		}
		elseif ($line.substring(0,7) -eq "m_regex")
		{
			$row.RegEx = $line.substring(8)
		}
		elseif ($line.substring(0,7) -eq "m_name=")
		{
			$row.Name = $line.Substring(7)
		}
	}
	$table.Rows.add($row)
 
	write-host Note: this only searches SharePoint Rules
	# note the select statement is needed because many of the records don't have regex values
	foreach ($record in $table.select("name like 'SharePoint%'"))
	{
		if ($url -match $record.regex)
		{
			Write-host $url found in $record.name via regex matching $record.regex
		}
	}
} #end function
FindAMatch $ExportedRuleSetFile $URLtoFind

Sample output looks like this:

Note: this only searches SharePoint Rules
/get/content/from/sharepoint.aspx found in SharePoint14AAM_Rule1 via regex matching (/[^"#&*+:<>?\\{|}~]*)/?
/get/content/from/sharepoint.aspx found in SharePoint14AAM_Rule47 via regex matching (/[^"#&*+:<>?\\{|}~]*)*/[^"#&/:<>?\\{|}~]*\.aspx
/get/content/from/sharepoint.aspx found in SharePoint14AAM_Rule48 via regex matching /
/get/content/from/sharepoint.aspx found in SharePoint14AAM_Rule60 via regex matching (/[^"#&*+:<>?\\{|}~]*)/?

As a side note, we were able to pinpoint the rule that applied to our URL which made fixing it much easier!

How to tell what Version/Patch level SCOM is running at

Microsoft has recently released CU3 (Cumulative Update 3) for System Center Operations Manager 2007.
The install procedure varies depending on what version you have installed.

How can you tell what version you have?
Open the SCOM console.
Under ‘Monitoring’, select the item ‘Discovered Inventory’.
Change the scope of the items displayed to ‘Health Service’ (there is a Scope button on the toolbar on top). The Version column then shows what each agent is running.

Grep Equivalent for Windows

Every now and then I need a way to pull all of the lines from a log file that match a certain string.

An example of this would be SharePoint ULS logs – every error gets a unique ID number, and the errors can cover 20-30 lines at times.

If all these lines were together, that’d be easy, but often they are not, so a grep-like tool comes in handy.

Thanks to a Google search and the ScriptHacks website, it looks like there is a command in Windows for this already:

Findstr

http://scripthacks.wordpress.com/2008/09/16/grep-equivalent-for-windows-string-parsing/

Basically, if you open a command prompt and cd to the logs directory,

you can enter something like this:

Findstr /C:"Correlation ID to find" filename

(The /C: flag makes findstr treat the quoted text as one literal phrase; without it, the spaces act as OR separators and findstr matches any one of the words.)

or

findstr /C:"Correlation ID to find" *.log

 which will look through all the log files in that directory (this can take forever if there are lots of large files)

 Of course you can save the results by adding  > c:\temp\findresults.txt to the end of the line…

 

A handy feature for SharePoint admins with lots of log files might be the /F:file flag

basically, this reads the file list from a file.

Why would you want this?

Because you can easily dump a list of all files to a file with this command:

DIR /B *.log > filelist.txt

Once you have this list (mine was 435 lines long!), you can easily edit it in Notepad and reduce the list down to the date range you’re after – this could take you from, say, 435 lines down to 24.

you’d then do your search with Findstr like this:

Findstr /C:"Correlation ID to find" /F:filelist.txt
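For anyone reading this on a Mac or Linux box, the real grep does the same jobs; here’s a rough mapping of the commands above (the log file names here are hypothetical, and -F tells grep to treat the pattern as a literal string rather than a regex):

```shell
# search one file for a literal phrase
grep -F "Correlation ID to find" mylog.log

# search every .log file in the current directory
grep -F "Correlation ID to find" *.log

# read the list of files to search from a file, like findstr /F:filelist.txt
cat filelist.txt | xargs grep -F "Correlation ID to find"
```

As with findstr, you can append > results.txt to any of these to save the output.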

Using Visifire Charts in Sharepoint with SQL data – Part 1

Visifire.com has a nice Silverlight charting library that does animated charts and graphs.

My employer was looking to do up a dashboard in SharePoint, so I used Visifire, the SharePoint Content Editor Web Part, and a few back-end ASPX pages that grab data from various SQL Servers to present the graphs.

I’m doing a video series on the basics of how to do it.

This first video introduces visifire, downloads some samples and walks through getting the sample to work locally on your PC.

Part 2 will discuss the Visifire Chart Designer.
Part 3 will show how to copy the needed files to SharePoint and how to put the relevant HTML in a SharePoint Content Editor WebPart.
Part 4 talks about how to move Source XAML to a separate file.
Part 5 talks about one way to construct a back end ASPX page to produce the XAML and feed it back.
Part 6 talks about how to modify the ASPX page to query a database and produce live data.
Part 7 talks about how to move that page onto the SharePoint Server.

Remove stubborn/stuck computer objects from MS SCOM

I work with a network monitoring tool from Microsoft called System Center Operations Manager (SCOM for short).
The version of SCOM I use (2007 R2 with fix rollup 2) has an issue that seems to affect computers that are part of a cluster:
the computers could not be deleted, which is a bit of a pain.

These queries were provided by an MS support rep. The first query should return a GUID for the object(s) that match the computer name,
the second gives you a preview of what you’ll change,
and the third actually changes the objects in the database to deleted status.
Note that it doesn’t actually delete anything; it just updates the values.

Select TopLevelHostEntityId from BaseManagedEntity where Name like '%ComputerName%'

Select * from BaseManagedEntity where TopLevelHostEntityId = 'GUID'

Update BaseManagedEntity Set IsDeleted = 1 where TopLevelHostEntityId = 'GUID'

Powershell notes

I’ve wanted to learn PowerShell for a while, but never really had time to mess around with it.
This post will be a collection of key ideas and commands as I read through a book or two on PowerShell.

#1 Launch powershell:
powershell
#2 Launch powershell ise
powershell_ise
#3 get-childitem
in the powershell command window, hit F7 to bring up a list of recently used commands

commands to get info about commands:
get-command (info about a command)
get-help *-* (gets info about all commands)
get-help get-* (gets help about all get commands)
get-help set-* (gets help about all set commands)
invoke-command
measure-command (measure run time)
Trace-command (trace)

Get-hotfix (gets hotfix info)

display environment variables with $env:varname, ie $env:computername
See the execution Policy: get-ExecutionPolicy
Set the execution policy so it will run anything: Set-ExecutionPolicy Unrestricted

pipeable formatting commands:
format-wide -column 3 (ie Get-command | format-wide -column 3)

Manually download the Juniper VPN client for mac

The new Juniper 6.5 client seems to work with OSX 10.6.x better than the old ones did – it no longer requires creating a directory and setting permissions from the command line.

That said, my upgrade seemed to go ok, but a few days later, it stopped working – saying it was unable to download files from the URL of my webvpn.

I did an uninstall (I used AppDelete), then downloaded the VPN client directly from the VPN.
It turns out the client is easily obtained manually by visiting
(URL of your webvpn)/dana-cached/nc/NetworkConnect.dmg
This downloads the disk image file; you can then run the setup program from there. Once I did that, it worked fine without any further modifications.

(thanks to William (post #9) on http://forums.juniper.net/t5/SSL-VPN/Snow-Leopard-Network-Connect-Fix/m-p/29985)

Changing the server location of checked out Files with Subversion

I recently rebuilt my home domain controller and renamed my home domain that I use internally.

As a result, all the links to my subversion server are now broken.

In other words, my local working copy thinks it’s linked to a subversion server that it can no longer reach.

if only there were a way to update my local copy to let it know where to look…

first I opened terminal (on a mac here, open the cmd prompt on windows)

cd to the working directory

enter svn info – this will display where subversion thinks it’s pointing

enter svn switch --relocate http://oldpath http://newpath (old URL first, then the new one)

Video: CODA + Subversion (Beanstalk) How-To

Video: CODA & SUBVERSION HOW TO
I did a video tutorial recently on how to use CODA (A popular mac editor used by web designers) with the Source Control System Subversion (Using Beanstalkapp.com’s free account offering)

The video is just under 20 minutes long and covers:

Setting up a free account on BeanstalkApp.com
How to configure the free account.
How to configure the ‘sites’ tab in Coda to work with the free account.
How to push changes to the Subversion Server (in this case Beanstalkapp)

How to compare an old revision with the current one.

How to ‘roll back’ a code change to an older version that’s in Subversion.

Possible Grid Controls to use with PHP

I’m working on a project in PHP at the moment.

Nothing gets a project moving like dropping a good grid control on a page, spending 2 minutes setting a few properties, and having a nice editable grid.

For ASP.NET, I’ve used the obout.com grid before with pretty good results.

Since this is my first PHP project I thought I’d see what grids are out there.

Here’s what I found…

———————

PlatinumGrid – only works with Delphi – no go.
PHPEzyGrid – sucked: no inline editing.

KoolPHP.net $130 for the suite – nice flexibility – other tools too (combo box, calendar etc)

AppHP.com’s grid – looks nice – cheap $35

phpgrid.com – requires zend optimizer $99

Active Widgets –  too expensive ($500)

DHTMLX – I like that this is actually a JavaScript grid. The good part is that the knowledge gained (and the grid itself) could be used on other platforms (ie ASP.NET); the bad is that there’s some back-end server code that needs to be written. It’s also kind of pricey ($200-450). Typically with a pure JavaScript grid, the back end has to be written 100% by you; however, they have a connector product (still in beta, no price given). The nice thing here is that the connector works with PHP, ASP.NET and JSP.

eyesis – Does not seem to have the features of other grids (no inline editing)
On the plus side, it does seem very easy to use, so there are likely places where this could be a good fit (last update was Dec 2008 however…)

ExtJS is another javascript control set – they claim to have a bunch of big customers.

It’s not cheap ($330) but like DHTMLx you are buying a whole slew of tools.

One nice thing about ExtJS is that they have a designer tool for it.
They also offer phone support (paid not free)

jQuery Grid (jqGrid) – this is a popular javascript grid for use with the jQuery framework. -$300 per platform (PHP/ASP.net )

datatables.net – (jQuery based – donation based)

SlickGrid (http://github.com/mleibman/SlickGrid)
This is my favorite from an end user perspective for its inline editing – it’s just like a spreadsheet.

PHP getmonth function

PHP doesn’t offer getmonth, getday or getyear.

Instead use this:

$unixtime = strtotime($test);   // $test holds a date string, e.g. '2010-06-15'
echo date('m', $unixtime);
echo date('d', $unixtime);
echo date('y', $unixtime);

strtotime creates a Unix-style time (the number of seconds since midnight on January 1, 1970 – the Unix epoch)

Once you have that, you can use the date function as shown.
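For comparison, the same epoch math can be poked at from the shell with GNU date (the -d @seconds form is a GNU extension, so this assumes Linux coreutils rather than the stock mac date):

```shell
# 86400 seconds after the epoch = one day into 1970
date -u -d @86400 +%Y-%m-%d    # 1970-01-02
# pull out month / day / year separately, like PHP's date('m') etc.
date -u -d @86400 +%m          # 01
date -u -d @86400 +%d          # 02
date -u -d @86400 +%y          # 70
```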

OnLine Backup done safely and cheaply

A few posts back I wrote about a program called SuperFlexibleFileSynchronizer.

My latest project is to scan all the papers in my file cabinet. That’s a little scary. There’s information in there that I can’t afford to lose, yet most of it is private enough that I wouldn’t really trust it being stored online.

So today’s post talks about one possible solution for that, using Dropbox and SFFS.

DropBox:
Dropbox is a great online storage service. You put things in your local dropbox folder, and it copies them to the dropbox server. Dropbox has only one problem, and it’s a problem shared by any and all online backup systems: Security.

Call me an untrusting person, but I don’t trust that information put on Dropbox will stay private. It’s just too big a target for a hacker to avoid. At some point, it will be compromised. I feel this way about all online storage.

So what’s the solution?

Encryption.

Encrypt the data here, then put it on the Dropbox folder.

Now there are two ways to do this –
a) encrypt everything into a single file like a zip file or truecrypt volume
or
b) encrypt everything individually

The problem with a) is that you’d need to re-upload the entire file for every change, which isn’t really practical.

The problem with b) is that once the file is encrypted, it’s not as easy to compare the encrypted file, so incremental backups become an issue.

Enter SFFS. SFFS has an elegant solution to the problem. SFFS can make a backup where the folder structure is replicated, but each file is zipped with AES encryption. SFFS also keeps a signature for each file and puts it in the file name. In this way, SFFS is able to do incremental backups, comparing the unzipped originals to the zipped & encrypted copies.

Perfect.

For me.

To be clear: the names of the folders and files are visible, so if you’re dealing with super secret stuff, you’ll want to take that into account.

For my needs, I think it’s enough extra protection that I’m happy with it.

Now, if a hacker ever gets into Dropbox, they’ll need to do some extra work to get into my files. Could they do this? Absolutely. Would they? Probably not. There would be so many other unprotected files that mine wouldn’t be worth the hacker’s time.
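To make option b) concrete, here’s a rough shell sketch of the same idea using openssl and sha1sum – a hand-rolled stand-in I put together for illustration, not how SFFS actually does it (its zip format and signature scheme are its own):

```shell
#!/bin/sh
# Mirror src/ into enc/ with one AES-encrypted copy per file.
# A short signature of the plaintext goes into the filename, so
# unchanged files can be skipped on the next run (the incremental part).
SRC=src; DST=enc; PASS=changeme
mkdir -p "$SRC" "$DST"
echo "tax return 2009" > "$SRC/taxes.txt"

for f in "$SRC"/*; do
  base=$(basename "$f")
  sig=$(sha1sum "$f" | cut -c1-8)   # signature of the original, not the copy
  out="$DST/$base.$sig.enc"
  [ -e "$out" ] && continue         # signature matches: nothing to re-upload
  rm -f "$DST/$base".*.enc          # file changed: drop the stale copy
  openssl enc -aes-256-cbc -pbkdf2 -salt \
    -pass pass:"$PASS" -in "$f" -out "$out"
done
```

Decrypting is the same openssl command plus -d. Run the loop twice and the second pass copies nothing, which is exactly the property that makes scheme b) workable for online backup.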


Keeping multiple PC’s in sync

I’m a computer guy.

In the beginning, life was simple. You had one computer – you kept your files on it. If you backed up, you did so to a floppy disk.

Today, I have a few computers, and I use them for different things, but I need my data on all of them. If they were all desktops, Life would be easy – keep all the files on one, then share it out to the rest. Add a laptop and things get messy.

What I want is to have a local copy of all my files, and have those automagically copied to and from the server whenever changes take place.

For that, I’m using SuperFlexible File Synchronizer. I’ve tried over a half dozen sync apps on the Mac and this one is by far the best. It’s also available in a Windows version, which is awesome, because one of my computers is a Windows computer – that solves the problem of what to do there, and SF is available as a bundle that includes both versions.

What makes SuperFlexible better than the others? Control and options. SuperFlexible lives up to its name. Need scheduled sync? No problem. Exclude .DS_Store? Got it. Need to sign into a Windows server before you run the backup? Done. I could go on and on – it seems to do EVERYTHING.

Here are a few more things it can do: it can encrypt and zip each file so you can back up to internet-based storage like Amazon S3 or Google without worrying about your data being exposed; it can detect when files move so they aren’t simply re-synced; and it can add version numbers to your files when it copies them over.

The interface is well refined too. When you run a sync, you can do so in the foreground or background – background is kind of like autopilot, while foreground gives you a bunch of information: what files are going to copy, in what direction, and so on. Here’s an area that sets this one apart – you can right-click on any file before it’s synced and tell it which direction to sync, to ignore it, to delete it from one side but not the other, etc.

I also found it rewarding that during testing, I wiped out the target folder, and on the next run, SF came up and asked if the folder should be re-created. (other tools have given an error in this scenario)

My setup is as follows:
I have a folder I keep all my stuff in. I keep a copy of that folder on my Mac Laptop, and another copy on my home server. On the Mac, I have a Sync job setup with SuperFlexible File Synchronizer that keeps things in both places up to date.  This way, when I’m on my laptop, I’m always working with a local copy of my data. If I take the laptop on a trip, that data comes with me. When I get home, it copies it to the Server.

SuperFlexible File Synchronizer also has some other good qualities and I hope to use it for online backup soon as well. It supports backing up to Amazon S3, and Google storage support is in Beta as of this writing. It also supports FTP & Webdav folders (such as apple’s iDisk). And as mentioned above, it supports zipping and encrypting files which makes me far more comfortable using online storage as a backup target.

Epson GT-S50 Scanner thoughts

I recently picked up an Epson GT-S50 Scanner for a project I am working on for a client.

My Client’s office is using Windows, and at home I have a combination of windows and Mac.

I’ve read tons of great reviews on the Fujitsu ScanSnap scanners (like the S1500), but you have to buy a mac or windows version of those, and they only work with the built in software.  There were some issues when Mac OSX 10.6 Snow Leopard came out, and the scanners lost a lot of functionality for a few months while Fujitsu readied new software.

The big draw to the Fujitsu is how good the software is that comes with it. It does a bunch of things for you – and many people say it “just works”.

After reading a review on the Epson GT-S50 online, I knew the software wasn’t as polished, but wondered how far off it would be..

Impressions:

I connected the GT-S50 to my Macbook Pro and installed all the latest drivers.

The Epson has a nice 2-line display that tells you what’s going on, and on Windows you can see which preset you have selected, along with its description. On the Mac, the description is gone – you see Job 01, Job 02, etc…

One button scanning is HORRIBLE. Awful, the worst.  It takes FOREVER to even get the thing started. In fact, I thought I’d time one and it’s been 2 minutes and the software still hasn’t scanned (the panel says Scanning and the software launched on the mac so it’s doing something.) When I hover over the ‘Progress’ Window, I get the beachball of death.

Ok, 3 minutes in and I’ll ‘Force Quit’, but the scanner still thinks it’s scanning.

Click the red button on the scanner – power light stops flashing, display still says Scanning.

I thought I’d scan once manually with the Epson Scan software, but when I launch that, it shows the icon in the Dock, but no menu.

I tried launching Apple Image Capture, but the scanner doesn’t show up so it’s now time to power cycle the scanner.

At this point I’m thinking the scanner sucks, but in reality, the software they have for ‘one touch’ sucks.

The reason you’d buy an Epson over the Fujitsu is because it has a twain driver, and the Fujitsu does not. This means you can use other software with the scanner.

So now I’m out to find out if any of the other software is any good.

I have Adobe Acrobat Pro 9.3 installed, and it can scan using the twain driver.

Acrobat does a great job with the scanner, it’s just a little slow. The whole process is slow, but…
If you use Acrobat to scan, you’re pretty much done when you’re done. Since Acrobat Pro does the OCR, the deskew, the page rotation, the auto page sizing and the image compression, there’s no need to go into another program to clean things up. In fact, after you scan, your document is sitting in Acrobat, waiting for you to save it – so you don’t even have to go find it and rename it. When you’re done, you’re done…

Still I wondered if I was missing out on the ScanSnap – there are so many good reviews of that thing.

I thought about it for a bit, but decided that the twain interface was something I really liked the flexibility of having.

And the Epson scanner seems better built (11 lbs vs the 6 lb Fujitsu).

And I’d read more than once about the fujitsu misfeeding issues.

So at this point I feel the Epson has the best hardware, but not the best software.

Enter Image Capture.

The macs come with a program called image capture.

I opened it, and my scanner was there, so I used that – It was fast!

it doesn’t do everything that acrobat does, but if you need to whip up a quick letter sized pdf, it does the trick – quickly!

This has me on a hunt for more apps that support the image capture interface.

So far I’ve had little luck.

I tried 2 or 3 apps that support Image Capture, and in all cases the scan comes up as a gray box.

So I’m not sure what’s up, but something’s up.

I’m also not sure if the image capture driver came from epson or apple.

I need to hook it up to a second mac, one that hasn’t had the epson drivers installed, to see where the drivers are coming from, and then hopefully I can point to either apple or epson and get the problem fixed.

(for all I know, the Image Capture driver might be using the twain driver and not displaying the dialog)

Making a WinPE CD with HP SmartArray Raid Drivers

The other day I had to extend the C partition on a server.

Microsoft has a command for this, it’s part of DISKPART.EXE

Unfortunately, the needed subcommand is “EXTEND” – which fails on the boot partition (aka drive C)

Fortunately, there’s an easy way around this – boot from a WinPE CD, and the CD becomes the boot partition, freeing the C drive for the extend operation to work its magic.

Making a WinPE disk isn’t hard, but it’s not as easy as downloading and burning an ISO either. In total, there are 8 commands I needed to enter to make my custom WinPE ISO.

About WinPE

WinPE is a ‘PreInstallationEnvironment’ – whatever that means!  To me, it’s a bootable CD, that brings up a windows command prompt. I’ve used one in the past to get files off a retired server that I no longer had access to the password for – boot from winpe, copy the files to a usbkey, done.

Where do you get it?

If you have access to TechNet or MSDN, you can download something called the Windows Automated Installation Kit (Windows AIK). This is filed away with the different versions of Windows – it’s not under a menu like Tools or Applications; I found it mixed in with Windows 7 as well as 2008 R2. Both locations list the same ISO for download.

Unfortunately, the ISO you get from TechNet is the AIK ISO, not the ISO of the WinPE CD you want to burn. You have to first download the AIK, then follow the steps below to construct a WinPE ISO customized with the drivers you need for your RAID card.

Let’s begin.

Download the AIK

While you’re waiting for the AIK to download, you’ll also need drivers for your raid card.  You’ll need both the driver (usually a .sys file) and a .INF file. In my case, I looked at device manager to see what driver (.sys file) was in use by the raid card, then I went to HP’s website and downloaded a driver set, extracted it and confirmed that the .sys file I needed was there – there was a .inf with the same name, so I figured I was in business….

(back in a few, waiting for this to download…)

Burn the ISO of the AIK to a DVD (or extract the iso to your hard drive)

Run the Windows AIK Setup (there’s a link to it on the autorun that pops up)

Once thats done you’ll have a new folder in your start menu ‘Windows AIK’ open that folder.

Have a look at the Windows PE manual under documents.

The section I used was ‘Customizing Windows PE’->’Windows PE Walkthroughs’->’Walkthrough: Create a Bootable Windows PE RAM Disk on CD-ROM’

What this does is help you make a bootable CD that will load itself into RAM – you’ll know it’s running in RAM because the command prompt points to drive X:, which is the RAM disk.

Before you get started with the above walkthrough, have a look at one more section:

‘Customizing Windows PE’->’Windows PE Customizations How-To Topics’->’Add a Device Driver to an Offline Windows PE Image’

Ok, what does that mean? Well, there’s another help topic for ‘online’. What’s the difference? ‘Online’ means you’ve already booted from the CD; we’re still building our CD, so we want the advice for ‘offline’ – that way, once our CD is done, we won’t need to do anything else to use the driver.

Ok, now if you’re clever, you’ve read both sections and have noticed there’s something a bit off in trying to merge the two – the instructions in the walkthrough tell you to copy winpe.wim to \iso\sources\boot.wim,

but then in the driver section, they tell you to open the winpe.wim to change it. I did so, then when I was done, I just recopied the file to boot.wim…

Here’s the list of steps I followed:

  1. started deployment tools command prompt as administrator
  2. ran copype.cmd x86 c:\winpe_x86
  3. Dism /Mount-WIM /WimFile:c:\winpe_x86\winpe.wim /index:1 /MountDir:c:\winpe_x86\mount
  4. Dism /image:c:\winpe_x86\mount /Add-Driver /Driver:(Here I put the folder path to the folder with the .inf and .sys files) /recurse
    (the /recurse causes all the drivers in that folder to be added)
  5. dism /unmount-wim /Mountdir:c:\winpe_x86\mount /commit
    At this point, we’ve altered the winpe.wim (which is basically a fancy disk image file)
  6. copy c:\winpe_x86\winpe.wim c:\winpe_x86\ISO\sources\boot.wim
  7. oscdimg -n -bC:\winpe_x86\etfsboot.com C:\winpe_x86\ISO C:\winpe_x86\winpe_x86.iso
    (this created an actual .iso you can burn with your burning software)
  8. Burn the .iso to CD. I used the free & excellent ImgBurn

That’s it!

8 relatively simple steps and you’ll have a bootable Windows PE CD with the Raid drivers of your choosing!

Pearnote

Pearnote (http://www.usefulfruit.com/pearnote/) is a really clever idea – a sound recording app, along with a text editor.

As you type, some hidden data is stored with your text, linking it to that position in the audio recording.

I can’t imagine how awesome this would be if you were a student and wanted to take notes, and record a lecture on your laptop.

Great idea!