Seattle .NET User Group - January 2010 Meeting : SharePoint development in Visual Studio 2010

Upcoming event for the Seattle .NET User Group. For more details about the location and the group, visit http://seattledotnet.org/

When

From: January 13, 2010 05:45 PM
To: January 13, 2010 08:00 PM

 

Location

Starbucks
Street: 2401 Utah Ave S
City: Seattle
State: Washington
Country: USA

What

Speaker: Boris Scholl
Boris is a Program Manager on the Visual Studio for BizApps team. Besides taking care of the Visual Studio community, he focuses on LOB integration with SharePoint and is working on the next generation of SharePoint tooling. Prior to joining Visual Studio he worked as a Technical Product Manager for Office Server, writing white papers on architectural guidance and LOB integration.

Boris started his Microsoft career in 1999 as an Application Development Consultant for portals. He was then called into the World Wide IW Centre of Excellence, working on large cross-border SharePoint and Project Server implementations and doing architectural design, reviews, and LOB integration.

Abstract: 
The talk focuses on the SharePoint tools available in Visual Studio 2010. We will take you on a tour of the new tools, take a closer look at how to develop SharePoint applications that integrate external data using the new Business Data Connectivity designer, and show how to take advantage of the Visual Studio extensibility features.

ASP.NET MVC Tutorials

Over the last week I have been playing around with the latest ASP.NET MVC 2 RC framework, available here: http://go.microsoft.com/fwlink/?LinkID=157071

So far I feel like I have gone backwards from the rich UI development environment of ASP.NET WebForms. However, as I write more and more code and look over more examples, I am finding the main benefits are total control over what is output to the browser and code that is much more testable. Testing is something I have avoided so far with WebForms because of the complexity of having to simulate the browser.
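
To give a flavor of what I mean by testable, here is a minimal sketch (assuming NUnit and the MVC 2 assemblies; the controller and names are made up for illustration): a controller action can be exercised directly as a plain method call, with no browser or web server involved.

using System.Web.Mvc;
using NUnit.Framework;

// A trivial controller action...
public class ProductController : Controller
{
    public ActionResult Details(int id)
    {
        ViewData["ProductId"] = id;
        return View();
    }
}

// ...and a unit test that calls it directly, no request simulation needed.
[TestFixture]
public class ProductControllerTests
{
    [Test]
    public void Details_PutsTheIdInViewData()
    {
        var controller = new ProductController();
        var result = (ViewResult)controller.Details(42);
        Assert.AreEqual(42, result.ViewData["ProductId"]);
    }
}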

Today I was about to write up some blog posts about my experience so far and some of the things I found helped me get started. However, I also found a great resource today that explains exactly what I was going to write about, and then some. Whoever wrote these articles thinks exactly like I do and did a fantastic job of explaining how to use MVC.

Take a look at this site http://www.asp.net/learn/mvc/

I plan on writing several applications using MVC to get more familiar with it, and will then be posting some articles showing the same application written in both WebForms and MVC so that I can compare the advantages of each.

Here is the list of applications I plan to build. Note that each of them is a pretty simple app.

Knowledgebase Application - Will allow creating and searching of knowledge base information, similar to the Microsoft Knowledge Base available at support.microsoft.com

Download Center - Will allow simple management for publishing documents and software that need to be made available for download. Will be similar to downloads.microsoft.com

 

HP MediaSmart Server is my favorite device for 2009

Automatically back up all your PCs and Macs. If backing up your machines is important to you but you have just not taken the step to implement a backup process, then I would highly recommend the HP MediaSmart Server. While it has many features, the most important and coolest one in my opinion is the backup feature.

[Images: HP MediaSmart Server drive bays and back view]

So what is so cool about it?

No matter how many computers you are backing up, the HP MediaSmart Server will only keep a single copy of a given file. That means if you have 3 machines in your network all running Windows 7, the space required to back up the OS will only be the size of one of them. Below is the description from a Microsoft document explaining how backup works; a small sketch of the single-instance idea follows the quoted steps.

The home computer backup solution in Windows Home Server has a single-instance store at the cluster level. Clusters are typically collections of data stored on the hard drive, 4 kilobytes (KB) in size. Every backup is a full backup, but the home server only stores each unique cluster once. This creates the restore-time convenience of full backups (you do not have to repeat history) with the backup time performance of incremental backups.

The home computer backup occurs as follows:

  • When a home computer is backed up to the home server, Windows Home Server software figures out what clusters have changed since the last backup.
  • The home computer software then calculates a hash for each of these clusters and sends the hashes to the home server. A hash is a number that uniquely identifies a cluster based on its contents.
  • The home server looks into its database of clusters to see if they are already stored on the home server.
  • If they are not stored on the home server already, then the home server asks the home computer to send them.
  • All file system information is preserved such that a hard disk volume (from any home computer) at any backup point (time) can be reconstituted from the database.
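
Just to make the single-instance idea concrete, here is a tiny sketch of cluster-level de-duplication in principle. This is my own illustration, not Windows Home Server code; the class, the in-memory dictionary, and SHA-256 as the hash are all just assumptions for the example.

using System;
using System.Collections.Generic;
using System.Security.Cryptography;

// Toy cluster store: each unique cluster is kept once, keyed by a hash
// of its contents. A second machine backing up identical clusters adds
// nothing new to the store.
class ClusterStore
{
    private readonly Dictionary<string, byte[]> clusters =
        new Dictionary<string, byte[]>();

    // Returns true if the cluster was new and actually had to be stored.
    public bool Store(byte[] cluster)
    {
        string key;
        using (SHA256 sha = SHA256.Create())
        {
            key = Convert.ToBase64String(sha.ComputeHash(cluster));
        }

        if (clusters.ContainsKey(key))
        {
            return false; // an identical cluster is already on the server
        }

        clusters[key] = cluster;
        return true;
    }
}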

So imagine this scenario: You bring home your new HP MediaSmart Server and set it up by plugging in power and an ethernet cable. You install the client application on one of your machines. You run your first backup of your machine, which for me took 17 minutes and 21 seconds for a base install of Windows 7 Ultimate Edition with all the updates applied.


By default it ignores things like user temporary files, the system page file (mine was 2.29GB), the recycle bin, the hibernation file (mine was 2GB), and shadow volume implementation folders (mine was 19.54GB). That saved a ton of space, and I would never really want to back up those anyway.

OK, so you are probably thinking: is that all? No, it only gets better. Now you go to your second machine, the one that one of your kids uses, and install the software because you want to have it backed up too. The backup software reads all of the data, but instead of having to back up everything it only needs to transmit the contents of those clusters that the home server does not already have. So if this machine has a lot of the same files or the same OS, you won't be backing up a second copy of the same data. Think about it: how many machines do you have with all the same music, photos, software, etc.? Only one copy of any of that will have to be stored.

OK, getting cooler? Now you have all the machines in your network set up and backing up. Keep in mind that they are also backing up to a network device that is dedicated to your backups.

OK, so hopefully you have made it down this far, as I saved the best part for last: BARE METAL RESTORE!! For those not familiar with the term, it means I could have a machine suffer a complete hard drive failure, pull the drive out and install a new blank one, grab the restore CD provided by HP, boot the machine from the CD, and it will then search the network, find my HP MediaSmart Server, and present me with a list of backups I can choose to restore from. Come back in a bit and my machine is back up and running as if nothing happened.

Now I don't know about you, but that is worth a lot of money. In fact I got to try this process out 2 weeks after I purchased it. One of the things I did when purchasing the unit was to buy a new, larger hard drive. Two weeks later the new drive stopped working and I had to take it back to the store and get a replacement. I brought the replacement home, restored from the previous day's backup, and was back up and running in about 2 hours, during which I only had to walk away and come back when it was finished.

Other Features: So far I have only talked about backup. While that is in my opinion the coolest feature, it is not the only one. The HP MediaSmart Server also has the following features.

Media Storage and Streaming: You can set up the HP MediaSmart Server to search your computers on an interval for pictures, music, and videos and copy them over to the server. Music is then available to be streamed to Media Extender devices, an iPod, an Xbox, or Media Center machines, or accessed via the web interface.

Media Streamer: A page from which you can stream your music, videos, and photos.


Online Photo Album: Publish pictures that you can then give others access to.


Small Size: The unit is only 5.5” x 9.8” x 9.2” in size.

Remote Access: If you have the ability to make the device internet accessible, then you can also get remote access to any of your machines using Microsoft Remote Desktop services. This gives you the ability to access your machines from anywhere. Once connected over the internet, you can also get to your files, music, videos, and pictures. Music and videos are streamed to you as you play them, so you can start listening instantly rather than having to download them first, which also makes for a good experience if you don't have a lot of bandwidth.

HP MediaSmart Server Home: Default page showing what is available via a web browser.


Computer Access: Shows the list of machines in the network that are available to be connected to remotely.


User Accounts
Each person in your home or small office can have a user account they can use to store files in a central location. You can also share folders on the HP MediaSmart Server and give access only to those who need them.

Storage
You can have up to 4 drives in the HP MediaSmart Server, and they are hot-swappable. My unit came with a 750GB drive, and I purchased a 1TB drive to add more capacity and provide a backup drive. The HP MediaSmart Server does not use a traditional RAID configuration; instead it lets you define folders whose data should be stored on more than one drive. In a mirrored RAID configuration, if you put in another drive you do not get access to its capacity, as it is only used to hold a copy of the data. With the HP MediaSmart Server you get all of the extra capacity minus whatever is needed to keep a duplicate of the folders you marked to be mirrored. Maximum storage is limited only by the number of drive bays and USB ports. So put in four 2TB internal drives, attach four 2TB external USB drives, and you have a whopping 16TB of storage.

Many More Features
There are a lot more features that you can check out on Amazon. Here is a link to the one I purchased: HP EX490 1TB MediaSmart Home Server (Black).

Automatically attaching VHD files in Windows 7 and Windows Server 2008 R2

If you have played with the new VHD feature in Windows 7 or Windows Server 2008 R2, then you know just how cool a feature it is. The problem is that when you reboot your machine, you find that all your VHD files are no longer attached. Here is what I did to get around the issue.

  1. Create a batch file that will hold the following line:

    diskpart /s "c:\path to script\diskpartscript.txt"

    I named my batch file attachvhd.bat and placed it in the same folder as my VHD files.



  2. Create the script file that is being referenced by the attachvhd.bat batch file. Here is what the contents of that script needs to contain:

    select vdisk file="c:\path to vhd files\myvhddrive.vhd"
    attach vdisk
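    rem Repeat the select/attach pair above for each additional VHD you want attached at startup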

    I named my script file diskpartscript.txt and placed it in the same folder as my VHD files.


  3. Create a scheduled task that will automatically run when your machine starts up.

    • Go to Start / Administrative Tools / Task Scheduler
    • Click Create Basic Task
    • Fill in the name and description of the task and click Next
    • Select “When the computer starts” and click Next
    • Select the “Start a program” option and click Next
    • Browse to the batch file you created (attachvhd.bat), select it, and click Next
    • Click Finish


  4. You have now completed all the necessary steps. Restart your computer and you should find that your VHD files are automatically attached. One caveat: if you log into your machine quickly enough after a reboot, it is possible the task has not run yet. If the task runs after you are logged in, you will get an AutoPlay dialog; simply close it. This does not happen if the task runs before you log in.


Seattle .NET User Group - Sept 9th 2009 Meeting - .NET Troubleshooting in a Production Environment

For more information, visit the user group website: http://seattledotnet.org/

Topic: .NET Troubleshooting in a Production Environment

Abstract:
When your .NET application experiences a problem on a production or QA server, your very first task is to avoid or at least minimize end user inconvenience and negative business impact.

.NET troubleshooting in production is often an art.

How do you pre-configure your production environment in order to make the troubleshooting process easy and effective? What's the most straightforward way to understand the root cause of a problem? How do you avoid the costly and time-consuming process of reproducing the problem in your development environment?

There are a variety of troubleshooting tools and approaches: debuggers (traditional, remote, core, on-the-fly), custom instrumentation, profilers, monitoring. Which one do you use for your exact situation? When is the core debugger applicable and how do you read memory dumps? What is an on-the-fly debugger? How do you design helpful and valuable custom instrumentation?

We will cover these questions in this presentation.


Speaker: Alex Zakonov

Alex Zakonov is Chief Architect of AVIcode, the leading provider of .NET application monitoring and troubleshooting solutions. With expertise in software management and over ten years of experience as a software engineer and architect, Zakonov leads AVIcode's product strategy, development efforts, and customer support operations. His in-depth knowledge of the development, operations, and support phases of the application cycle allows him to contribute a unique perspective on these functional silos and the intersections between them. Under his leadership, AVIcode has demonstrated 50-75% year-to-year company growth and has established a recognized leadership position in the application performance management market.

Zakonov's experience includes developing and implementing manageable systems that span multiple vertical industries, including monitoring software for energy management, data processing software for the telecommunication industry, and business automation software for the legal industry. This combination of experience resulted in him co-founding AVIcode and developing the company's core technology, Intercept Studio, which is based on his patent-pending inventions in software monitoring. Zakonov is also actively involved with several Microsoft product groups, including working with the Windows Management team on the operations aspects of software manageability and with the Visual Studio Team System team on the development aspects.

Zakonov is fluent in Russian, and holds an MS in Math and Computer Science from St. Petersburg State University.

When:
Wednesday, September 9th, 2009
5:45 – 6:15 PM – Mixer, group kickoff and speaker introduction
6:15 – 7:45 PM – Presentation
7:45 – 8:00 PM – Spillover time and raffle

Where:

Starbucks Support Center
2401 Utah Ave S.
Seattle, WA 98134

Seattle .NET User Group - July 8th 2009 Meeting - ASP.NET MVC Framework by Brad Wilson

Topic:  ASP.NET MVC.

Abstract:
ASP.NET MVC is a new web development framework from the ASP.NET team at Microsoft. In this talk, Brad Wilson will discuss what the Model-View-Controller pattern is, why the team decided to create a new framework, how it compares to ASP.NET WebForms, and some of its primary benefits, and of course spend lots of time in Visual Studio walking through uses of MVC.

Speaker: Brad Wilson

Brad Wilson is a senior software developer at Microsoft on the ASP.NET team, with more than 15 years of experience writing software for everything from networking stacks to high-speed measuring machines, from startups to large enterprises, and most everything in between. He is currently working on the MVC team, helping to define and deliver the next version of ASP.NET MVC. He is co-creator of the xUnit.net unit testing framework.

When:
Wednesday, July 8th, 2009
5:45 – 6:15 PM – Mixer, group kickoff and speaker introduction
6:15 – 7:45 PM – Presentation
7:45 – 8:00 PM – Spillover time and raffle

Where:

Starbucks Support Center
2401 Utah Ave S.
Seattle, WA 98134

 

For more information visit seattledotnet.org

The next Seattle .NET User Group meeting: June 10th, 2009

For more information, visit the website: http://seattledotnet.org/

Topic:  Windows Azure.

Abstract:
In this session, David Lemphers, Senior Program Manager for Windows Azure, will provide an overview of Windows Azure, including how-to code demos on building cloud based applications using Windows Azure and Visual Studio.

Speaker: Dave Lemphers
David Lemphers is a Windows Azure program manager based in Redmond. Dave spends most of his time working on features and projects for Windows Azure, but also enjoys blogging and building robots in his spare time.
Originally from Australia, Dave spends his free time making vegemite sandwiches and eating meat pies and lamingtons at Cafe 41!

When:
Wednesday, June 10th, 2009
5:45 – 6:15 PM – Mixer, group kickoff and speaker introduction
6:15 – 7:45 PM – Presentation
7:45 – 8:00 PM – Spillover time and raffle

Where:

Starbucks Support Center
2401 Utah Ave S.
Seattle, WA 98134

Our meeting is open to everyone so bring your friends and co-workers. 
If you’re planning to come, please RSVP as soon as possible via email or at the Facebook Group.

Barcamp Seattle June 13th - June 14th 2009

BarCampSeattle is an ad-hoc gathering born from the desire for people to share and learn in an open environment. It is an intense event with discussions, demos, and interaction from attendees. BarCamp is an international network of user-generated, non-traditional conferences: open, participatory workshop events whose content is provided by participants, often focusing on early-stage web applications and related open source technologies, social protocols, and open data formats. Oh, and super fun!!

I will be going to this on Saturday to see how this type of event works. I always enjoy going to these types of things just for the networking opportunities.

To register and see more details visit: http://barcampseattle-09.pathable.com/

Compress and decompress using the .NET Framework and the built-in GZipStream

I recently had a project in which I wanted to compress log files I was transferring between servers. I did not realize until I did some research that the .NET Framework has a nice little library built in for creating GZIP files. GZipStream is documented as not suitable for data larger than 4GB (the gzip footer only stores the uncompressed length modulo 4GB, which the decompress code below relies on), but I am well under that.

Here are my compress/decompress methods and a quick usage example.

// Requires: using System; using System.IO; using System.IO.Compression;

byte[] startfile = File.ReadAllBytes("e:\\mylog.log");
byte[] comp = null;
byte[] decom = null;

comp = CompressBytes(startfile);
Console.WriteLine(comp.Length.ToString());

decom = Decompress(comp);
Console.WriteLine(decom.Length.ToString());

private byte[] CompressBytes(byte[] fileBytes)
{
    using (MemoryStream ms = new MemoryStream())
    {
        using (GZipStream gz = new GZipStream(ms,
                                              CompressionMode.Compress,
                                              true))
        {
            gz.Write(fileBytes, 0, fileBytes.Length);
            gz.Close();
        }

        // Return the compressed bytes. ToArray still works after the
        // MemoryStream has been closed.
        return ms.ToArray();
    }
}

private byte[] Decompress(byte[] fileBytes)
{
    int buffer_size = 100;

    using (MemoryStream ms = new MemoryStream(fileBytes))
    {
        using (GZipStream gz = new GZipStream(ms,
                                              CompressionMode.Decompress,
                                              true))
        {
            byte[] bufferFooter = null;
            int readOffset = 0;
            int totalBytes = 0;
            byte[] finalBuffer = null;
            int uncompLength = 0;

            // Get the last 4 bytes (footer) as they contain the
            // original length of the compressed bytes.

            // Byte array to hold the footer value.
            bufferFooter = new byte[4];

            // Set the position of the MemoryStream to the end of the
            // stream minus the 4 bytes needed.
            ms.Position = ms.Length - 4;

            // Fill bufferFooter with the last 4 bytes.
            ms.Read(bufferFooter, 0, 4);

            // Set the stream back to 0.
            ms.Position = 0;

            // Convert the footer bytes to the uncompressed length.
            uncompLength = BitConverter.ToInt32(bufferFooter, 0);

            // Set up a temporary buffer to hold the uncompressed data.
            // We make it slightly larger to ensure everything will fit;
            // the unused bytes are trimmed off later.
            finalBuffer = new byte[uncompLength + buffer_size];

            while (true)
            {
                // Read from the stream up to buffer_size bytes. The return
                // value is the actual number of bytes read, which may be
                // less than we requested.
                int bytesRead = gz.Read(finalBuffer,
                                        readOffset,
                                        buffer_size);

                // If no bytes were returned we are done.
                if (bytesRead == 0)
                {
                    break;
                }

                readOffset += bytesRead;
                totalBytes += bytesRead;
            }

            // Trim off the unused bytes.
            Array.Resize<byte>(ref finalBuffer, totalBytes);

            return finalBuffer;
        }
    }
}
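
Since GZipStream produces standard gzip output, the compressed bytes can also simply be written to disk with a .gz extension and opened with any gzip tool. A quick example using the comp array from the snippet above:

// Save the compressed log as a regular .gz file that gzip tools can open.
File.WriteAllBytes("e:\\mylog.log.gz", comp);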

Calling an ASMX .NET Web Service using jQuery

I just started picking up jQuery recently and was playing with calling a .NET web service from my page. It was really easy once I used Firefox's Firebug to do my debugging and figure out the property names to use to access my data.

Here is my final code.

JavaScript in the page:

<script type="text/javascript">
    $(document).ready(function() {
        $.ajax({
            type: "POST",
            url: "default.asmx/GetCatalog",
            cache: false,
            contentType: "application/json; charset=utf-8",
            data: "{}",
            dataType: "json",
            success: handleSuccess,
            error: handleError
        });
    });

    function handleSuccess(data, status) {
        // ASP.NET wraps the JSON result in a "d" property, so the array is data.d.
        for (var count in data.d) {
            // Use append so each book is added rather than overwriting
            // the previous one, as html() would.
            $('#bookTitles').append(
                ' <strong>Book:</strong> ' + data.d[count].BookName +
                ' <strong>Author:</strong> ' + data.d[count].Author +
                ' <br />');
        }
    }

    function handleError(xmlRequest) {
        alert(xmlRequest.status + ' \n\r '
            + xmlRequest.statusText + '\n\r'
            + xmlRequest.responseText);
    }
</script>

and the div that I write the content to:

<body>
<b>Books List</b>
<div id="bookTitles">
</div>
</body>

and my web service method and my Catalog class:


[WebMethod]
public Catalog[] GetCatalog()
{
    Catalog[] catalog = new Catalog[1];
    Catalog cat = new Catalog();
    cat.Author = "Jim";
    cat.BookName = "His Book";
    catalog[0] = cat;
    return catalog;
}

public class Catalog
{
    public string Author;
    public string BookName;
}
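
One thing worth noting: for the jQuery call above to get JSON back from an ASMX service, the service class itself has to be decorated with the ScriptService attribute. Roughly, the containing class looks something like this (the class name here is just illustrative; the page above posts to default.asmx):

using System.Web.Services;
using System.Web.Script.Services;

// [ScriptService] is what allows the ASMX methods to be invoked with
// JSON from client script, as the $.ajax call above does.
[WebService(Namespace = "http://tempuri.org/")]
[ScriptService]
public class CatalogService : WebService
{
    // ... the GetCatalog method shown above goes here ...
}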