
2013-09-19

Powershell WebAdministration and case sensitivity for web applications


There is a caveat when working with IIS and the WebAdministration module in Powershell: be careful when you create physical directories under your site's physical root. If the casing of the web application name does not match the casing of the physical directory name, you will not be able to remove the web application again.

I've only seen this on Windows 7 with IIS 7; I'm not sure how it behaves on later versions of either product.

Here are the steps to reproduce the problem:

# The WebAdministration module must be loaded before the IIS: drive and cmdlets are available.
# Run from an elevated Powershell prompt.
Import-Module WebAdministration

# Create the physical folder in lower case, but the application with mixed case
mkdir C:\inetpub\wwwroot\foobar
New-WebApplication -Site "Default Web Site" -Name FooBar -PhysicalPath C:\inetpub\wwwroot\foobar
dir 'IIS:\Sites\Default Web Site'

# Because of the casing mismatch (FooBar vs foobar), the application cannot be removed
Remove-WebApplication -Name FooBar -Site "Default Web Site"
dir 'IIS:\Sites\Default Web Site'

# Clean up the physical folder and try removing the application again
rmdir C:\inetpub\wwwroot\foobar
Remove-WebApplication -Name FooBar -Site "Default Web Site"

# With the name matching the folder casing, create and remove work as expected
mkdir C:\inetpub\wwwroot\foobar
New-WebApplication -Site "Default Web Site" -Name foobar -PhysicalPath C:\inetpub\wwwroot\foobar
dir 'IIS:\Sites\Default Web Site'
Remove-WebApplication -Name foobar -Site "Default Web Site"
dir 'IIS:\Sites\Default Web Site'
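
One way to stay out of trouble is to check that the name you pass to New-WebApplication has exactly the same casing as the folder on disk. Here is a small sketch of such a check (the function name and parameters are made up for illustration, not part of any module):

# Hypothetical helper: warn if the intended application name differs in casing
# from the physical folder name on disk.
function Test-AppNameMatchesFolder {
    param(
        [string]$Name,          # intended web application name
        [string]$PhysicalPath   # physical path under the site root
    )
    $folderName = (Get-Item $PhysicalPath).Name
    if ($folderName -cne $Name) {   # -cne is case-sensitive not-equal
        Write-Warning "Casing mismatch: application '$Name' vs folder '$folderName'"
        return $false
    }
    return $true
}

# Test-AppNameMatchesFolder -Name FooBar -PhysicalPath C:\inetpub\wwwroot\foobar   # warns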



2012-04-27

Using encfs4win for encrypted storage on a cloud drive

1. Get and install the software

  1. Go to http://members.ferrara.linux.it/freddy77/encfs.html and download encfs.zip
  2. Download Dokan (there is a link on the first download page) and install it. Dokan is a user mode file system, like FUSE on Linux.
  3. Unzip encfs (there is no installer) and copy the .exe files to a folder of your choice, e.g. C:\usr\bin

2. Run the encfs windows application

Note: This is the simple way. You can do a lot more by using the command line options.

  1. Open Windows Explorer. 
  2. Go to where you put the binaries (C:\usr\bin).  
  3. Right-click on encfsw.exe. Choose "Run as administrator". 
  4. A key icon will appear in the system tray area.

3. Set encrypted storage folder

  1. Create a folder where you want to put your encrypted files. In this case, I call the folder for my new drive "Test", placed under "Vault\fs" inside my cloud drive area. Typical values for my scenario:
       C:\users\username\Google Drive\Vault\fs\Test
       C:\users\username\Dropbox\Vault\fs\Test
  2. Click on the key icon in the system tray. Click Open/Create. Select the Test folder in the encfsw dialog. Select a drive letter, e.g. "X:". Supply the password and confirm it.

4. Mount the disk

  1. Click on the key in the system tray. Select the folder you want to mount, e.g.
       C:\users\username\Google Drive\Vault\fs\Test
  2. Select the drive you want to mount it to, e.g. "X:". This is where your files will be in their decrypted form. Supply the password you set earlier. Your encrypted X:-drive should show up.
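
If you prefer the command line over the tray icon, the same mount can, as far as I can tell, be done directly with encfs.exe (one of the unzipped binaries). I haven't explored the options yet, so treat the exact syntax as an assumption:

# Sketch only, syntax not verified: encfs.exe <encrypted folder> <drive letter>
# Run from an elevated prompt, just like encfsw.exe.
C:\usr\bin\encfs.exe "C:\users\username\Google Drive\Vault\fs\Test" X: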

Issues

As of now, most of the documentation seems to be in a pre-alpha state, so you have to do some searching yourself to find good examples, especially on how to use the command line. I will update this post later.

You won't get much in the way of feedback when something goes wrong; try running encfsw.exe without admin privileges and see for yourself. If you want something more mature, you can also opt for the payware version of encfs from BoxCryptor in Germany.

Thanks


  • Frediano Ziglio for porting encfs, see http://gitorious.org/encfs4win
  • All contributors to the mother project and the port
  • Oliver Heller, who wrote the blog post that made it work for me

Warning


The software used in this guide is newly developed and perhaps not of production quality yet. I take no responsibility for whatever happens to your files. I'm still going to have other backups using more mature technology.

As with all crypto software, you have to trust those who wrote it, or make the effort to validate their code and algorithms.



2011-09-05

Extraction of embedded inline images in InfoPath 2010

At work, we needed to extract images from rich text boxes in an InfoPath 2010 form. The image files are stored to a temporary disk area and used to create an HTML preview of a document based on the XML content in the form, and also for uploading the images to a SharePoint image library, separated from their source document. In InfoPath 2010, all images are stored inline on the img elements, base64-encoded in the xd:inline attribute.

Our forms require the InfoPath Filler application, not browser forms hosted by SharePoint; I don't think this approach will work for browser forms. The code in the FormCode.cs example needs full trust, since it writes to the hard disk. It's not very useful as written, but it illustrates the extraction mechanism.

If you upload the images to a SharePoint library like we do, you also have to add a src-attribute in the img-element, so that it points to its new address.

I don't include the form in this post, but it's simple: a rich text box with embedded images enabled, connected to one field, and a button with the id BTN_EXTRACT_IMAGES.

If your images are more photographic in nature than ours, then perhaps JPG is a better storage format than PNG.

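The original FormCode.cs listing is not reproduced here. As a rough illustration of the same mechanism, here is a standalone Powershell sketch (not form code) that reads a saved form XML file, finds img elements carrying the xd:inline attribute, decodes the base64 content and writes the images to disk. The file paths are placeholders, and the XPath may need adjusting to your form's markup:

# Sketch: extract base64-encoded inline images from a saved InfoPath form XML file.
$formPath  = 'C:\temp\myform.xml'     # placeholder input
$outFolder = 'C:\temp\images'         # placeholder output folder
New-Item -ItemType Directory -Path $outFolder -Force | Out-Null

$xml = New-Object System.Xml.XmlDocument
$xml.Load($formPath)

# xd is the InfoPath namespace that carries the inline attribute
$xdNs = 'http://schemas.microsoft.com/office/infopath/2003'
$ns = New-Object System.Xml.XmlNamespaceManager($xml.NameTable)
$ns.AddNamespace('xd', $xdNs)

$i = 0
foreach ($img in $xml.SelectNodes("//*[local-name()='img'][@xd:inline]", $ns)) {
    $bytes = [Convert]::FromBase64String($img.GetAttribute('inline', $xdNs))
    $file  = Join-Path $outFolder ("image{0}.png" -f $i)   # we store the images as PNG
    [System.IO.File]::WriteAllBytes($file, $bytes)

    # If you upload the images to a SharePoint library, this is where you would
    # also set a src attribute pointing to the new address, e.g.:
    # $img.SetAttribute('src', $newUrl)
    $i++
}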

2011-05-11

Upgrading InfoPath forms and version on SharePoint

This is an off-topic log entry, but I guess most .NET developers are, or will be, familiar with SharePoint whether they want to or not. We are currently migrating a set of documents (InfoPath form XML files), edited on a SharePoint 2003 server with a locally installed InfoPath 2007 form template, to a SharePoint 2010 form library where the template is stored inside the library.

We tried a number of different approaches, but they failed when it came to telling the old documents that they should use the new form template in the library. An entry on the InfoPath blog gave a lot of valuable information. The re-linking alternative didn't work as expected, so we turned to the PIFix alternative. That was a success.

I'm not sure why re-linking failed; my guess is that our old documents were using an installed schema, so they weren't pointing to any template URL. Perhaps SharePoint should support that scenario? It might also be unrelated, but I didn't spend more time finding the cause.

To migrate the forms, we have done the following steps:
  1. Use InfoPath 2010 to open the 2007 .xsn file and convert it.
  2. Update the form's submit options so that when the user submits, the document ends up in the new library.
  3. Set versioning information the way we want it (we use a yyyy.m.d.n scheme to make it simple).
  4. Publish the form to the new forms library.
  5. Create a "dummy" test file with the "+ Add document" button in the forms library.
  6. Open that dummy file with a notepad application and note the version and product version info, as well as the URL to the template inside the forms library (it ends with /Forms/template.xsn).
  7. Map a drive on our computer to the form library.
  8. Install the InfoPath 2003 SDK; you can find the link in the blog article referenced above.
  9. Open a command prompt and go to the forms library.
  10. Run PIFix tool from InfoPath 2003 SDK as described below.
And voilà, the forms are now using the form template in the form library.

Here is a detailed example of how to run PIFix. /v is the template version and /prv is the product version, i.e. the lowest version of InfoPath you support for this form. For InfoPath 2010 this value is 14.0.0; if you set it, the document can no longer be opened in an older version of InfoPath. The /url parameter sets where the document will look for its template. All these values can be extracted from the dummy file we created in the steps above. I've used Windows Explorer to map \\sharepointserver\sites\mysite\my form library\ to Z: before doing anything at the command prompt.
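The exact command line isn't reproduced here, so treat the following as a sketch based on the parameters described above; the file specification and the version and URL values are placeholders derived from the dummy-file step:

# Sketch only: assumes pifix.exe from the InfoPath 2003 SDK is on the PATH,
# and that Z: is mapped to the form library as described above.
cd Z:\
pifix.exe *.xml /v 2011.5.11.1 /prv 14.0.0 /url "http://sharepointserver/sites/mysite/my form library/Forms/template.xsn"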

2011-05-08

Changing Windows hosts file with Powershell

I've barely started looking into Powershell, but in my mind every developer needs to know something about it, at least if you use cmd.exe for anything today. I've already run into some problems when I tried to script changes to the hosts file on my computer.

It seems that Windows 7 (at least my x64 installation with Norwegian regional settings) does not accept a hosts file created by Powershell. I've tried -encoding ASCII and other encodings, but Windows just ignores the file until I create a new one with Notepad.

The solution seems to be to use Clear-Content on the file and then append text to it, as the non-modifying code example below shows. You typically want to do some search and replace, or something else, to turn $content into $newcontent. You shouldn't run it without backing up the hosts file first.

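The original code listing is not reproduced here; a minimal sketch of the approach described above (variable names as in the post, the rest assumed) looks like this:

# Sketch: rewrite the hosts file in place instead of recreating it,
# so Windows keeps accepting the file. Run from an elevated prompt.
$hostsPath = "$env:windir\System32\drivers\etc\hosts"

$content = Get-Content $hostsPath
# ... do your search and replace here to produce $newcontent ...
$newcontent = $content        # placeholder: no modification, as in the original example

Clear-Content $hostsPath
foreach ($line in $newcontent) {
    Add-Content -Path $hostsPath -Value $line
}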

I found this way of doing things after a lengthy search on the net, in a piece of source code by Mark Embling on GitHub.

I'm still not sure why Windows doesn't like hosts files written directly by Powershell. I've looked at one working and one non-working file in a hex editor without seeing any differences (like byte order marks), but I'm sure there is an explanation somewhere. I guess this solution leaves the file almost as it was, without touching any attributes.

2010-04-10

Custom targets for solutions and projects in MSBuild

By stumbling upon a couple of lines in Microsoft.Common.targets and then reading a very good blog post about extending MSBuild, I learned a lot about how one can inject one's own behavior into MSBuild by using the conditional import mechanism found in Microsoft.Common.targets. You create two MSBuild files and put them into a folder named after the version of MSBuild (e.g. v3.5) under whatever $(MSBuildExtensionsPath) points to in your installation of MSBuild. One is called Custom.Before.Microsoft.Common.targets and is imported before all other definitions in Microsoft.Common.targets; the other is named Custom.After.Microsoft.Common.targets and is imported after all definitions. All this is pretty well documented already.

Here is the twist: by adding import statements that look for target files in specific directories and follow a naming pattern, every solution and project can have its own MSBuild files that are imported before and after the main definitions in Microsoft.Common.targets.

The result is that every project in a solution builds exactly the way you want it to, even when you build the solution inside Visual Studio! The solution targets files affect every project in the solution, while a project targets file only affects that specific project.

There are a number of things one can do with this mechanism. Here are some examples:

  1. Change the way version numbers are updated when building. How about leaving AssemblyInfo.cs alone and letting a build task update the AssemblyVersion and AssemblyFileVersion attributes automatically, e.g. by picking up $(CCNetLabel) set by CruiseControl.Net.
  2. Change the $(AssemblySearchPaths) property to get more restrictive resolving of references in a build. We don't want Visual Studio to go looking for assemblies to "help us".
  3. Suppress those annoying "missing XML comment" warnings that no-one cares about anyway.
  4. Insert targets into $(BuildDependsOn) that are called before or after the build, instead of using those nasty pre- and post-build steps that you specify in the project properties dialogue.

Custom.Before.Microsoft.Common.targets:

Custom.After.Microsoft.Common.targets:

Download the source here

So why bother implementing a practice like this instead of just putting all your custom behavior into the custom target files themselves? Well, by doing that, you will affect all kinds of projects. By importing solution and project level MSBuild files if they are there, the projects and solutions themselves can decide exactly how they want to be built.

A couple of caveats:

  1. If you mess up your solution and project level MSBuild files, you may render Visual Studio useless until you fix the problems. The only way out is to edit the files without loading the solution and to test with MSBuild at the VS command prompt.
  2. If you edit your solution and project build files from within your solution in VS, then you have to close the solution and reopen it to get them applied. I guess VS caches the build definitions to be able to resolve references, etc.

If you are not working as a solo developer, you may want to create a little MSBuild file that installs your source-controlled custom target files into $(MSBuildExtensionsPath)\v3.5\ (or whatever your version of MSBuild is) and make sure it is run regularly on build servers and developer laptops. This way, you can change your custom before and after files and be confident that all developer and build environments are up to date (or broken, if you make a mistake...).
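
A Powershell sketch of that install step (instead of the MSBuild file; the paths assume a default 32-bit MSBuild 3.5 layout) could look like this:

# Sketch: copy the source-controlled custom target files into the MSBuild
# extensions folder so Microsoft.Common.targets picks them up.
# $(MSBuildExtensionsPath) is typically %ProgramFiles%\MSBuild; adjust as needed.
$extensionsPath = Join-Path $env:ProgramFiles 'MSBuild\v3.5'

Copy-Item .\Custom.Before.Microsoft.Common.targets $extensionsPath -Force
Copy-Item .\Custom.After.Microsoft.Common.targets  $extensionsPath -Force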