I haven’t been writing much on the blog for a while. This is mainly because I’ve been too busy working and setting up my own company, Lumin Creative, and I haven’t had the time to lean back and describe what I was nerding out on while programming.

At the same time I’ve thought about a lot of other things that I would have liked to comment on, like watching a good movie, visiting a special place, or just anything I found important in my life.

So I’ve decided that from now on this blog will not only focus on technology but on my life in general. If you only want posts on NHibernate or ASP.NET, you’ll have to filter by the categories on the side.

Welcome to my life :)


In this post I’ll give an overall view of my preferred NHibernate / ASP.NET architecture and focus on some examples of how to handle the NHibernate session in an ASP.NET web application.

When I first started using NHibernate as my preferred object-relational mapper, it was very hard to find examples of how to architect an ASP.NET application. There were many concrete examples of how to make queries and how to create the mapping files, but it was difficult to find information on how to integrate NHibernate into a web application as a whole.

This meant that I went through a lot of trial and error along the way, and one particular element that I have handled in several different ways is the NHibernate session, primarily because it contains the connection to the database.

Overall I normally divide my applications into two layers:
- web
- business

The business layer will contain two main divisions: Data and Repository.
Everything within the Data namespace contains business classes and their NHibernate XML mappings (i.e. User.cs and User.hbm.xml).
The Repository namespace is inspired by Martin Fowler’s Repository pattern and normally contains one class per business class, handling the business methods relevant to it. For example, UserRepository.cs would have several ways of getting a concrete user (by id, or by username and password) or groups of users by specific criteria.

To handle NHibernate I have a NHibernateManager class. It looks like this:

public class NHibernateManager
{
    static Configuration configuration;
    static ISessionFactory factory;

    private ISession session;

    public ISession Session
    {
        get
        {
            if (session == null || !session.IsOpen)
            {
                try
                {
                    session = factory.OpenSession();
                }
                catch (Exception e)
                {
                    throw new Exception("The session or database connection could not be created", e);
                }
            }
            return session;
        }
    }

    static NHibernateManager()
    {
        try
        {
            configuration = new Configuration();
            // Read the embedded *.hbm.xml mappings from the business assembly
            // (the assembly name here is just an example)
            configuration.AddAssembly("MyApp.Business");
            factory = configuration.BuildSessionFactory();
        }
        catch (Exception ex)
        {
            throw new Exception("Could not create the NHibernate configuration", ex);
        }
    }
}

The web.config holds the necessary database connection information, and this class reads all the embedded XML mappings and builds a session factory that can be used to open and close connections to the database.
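For NHibernate 1.x, that web.config section might look something like the following sketch (the dialect, driver, and connection string are examples for SQL Server; adapt them to your own database):

```xml
<configuration>
  <configSections>
    <!-- NHibernate 1.x reads its settings from a name/value section -->
    <section name="nhibernate"
             type="System.Configuration.NameValueSectionHandler, System" />
  </configSections>
  <nhibernate>
    <add key="hibernate.connection.provider"
         value="NHibernate.Connection.DriverConnectionProvider" />
    <add key="hibernate.dialect"
         value="NHibernate.Dialect.MsSql2000Dialect" />
    <add key="hibernate.connection.driver_class"
         value="NHibernate.Driver.SqlClientDriver" />
    <!-- example connection string; replace with your own -->
    <add key="hibernate.connection.connection_string"
         value="Server=localhost;Database=MyApp;Integrated Security=true" />
  </nhibernate>
</configuration>
```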

The tricky part for me was how to handle the session in the best way.

I started out creating a new session in each repository method that needed to access the database, and closing it again when the query was done. It looked something like this:

public User GetUserByLogin(string username, string password)
{
    NHibernateManager mgr = new NHibernateManager();
    User user = null;

    try
    {
        string queryStmt = "from User where username = :username and password = :password";
        IQuery query = mgr.Session.CreateQuery(queryStmt);
        query.SetParameter("username", username);
        query.SetParameter("password", password);
        user = (User) query.UniqueResult();
    }
    catch (Exception ex)
    {
        throw new Exception("The user could not be retrieved from the database", ex);
    }

    return user;
}
(Notice that I’m using IQuery and not ICriteria to query the database, since I’ve found the performance to be much worse with ICriteria.)

Now this worked fine at the beginning, but when I began using lazy loaded collections I ran into a problem. Lazy loading basically means that a collection of objects related to a concrete object is only loaded the first time it is accessed. For example, if the user has a Messages collection of all his Message objects, these would only be loaded from the database at the moment they were actually needed. This improves performance a lot, but it doesn’t work if there is no connection to the database at the time the collection is accessed. NHibernate requires that the same session that loaded the object also loads its lazy collections, so I couldn’t just close the session after every query.
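For illustration, a lazy collection is declared in the mapping file. This is a minimal sketch of what a User.hbm.xml with a lazy Messages bag might look like (the class, table, and column names are examples):

```xml
<?xml version="1.0" encoding="utf-8" ?>
<hibernate-mapping xmlns="urn:nhibernate-mapping-2.0">
  <class name="MyApp.Business.Data.User, MyApp.Business" table="Users">
    <id name="Id" column="UserId" type="Int32">
      <generator class="native" />
    </id>
    <property name="Username" />
    <!-- lazy="true": the Messages collection is only fetched on first access,
         which requires the session that loaded the User to still be open -->
    <bag name="Messages" lazy="true">
      <key column="UserId" />
      <one-to-many class="MyApp.Business.Data.Message, MyApp.Business" />
    </bag>
  </class>
</hibernate-mapping>
```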

At first my solution was simply not to close the session, thinking that although I might have more sessions (i.e. database connections) open at the same time because of this, eventually the garbage collector would dispose of the unused sessions. Well, that was a very bad choice, because I began running out of database connections. There were way too many sessions open at the same time, and they would be kept open for much too long.

So I began keeping only one session per repository. The session would be created in the constructor and closed in the destructor of the repository class, so it was only a matter of managing the use of the repositories. For some reason this did not work well with Visual Studio 2003: closing the session objects in the destructor made it impossible for me to use the debugger. I’d get strange errors, and apart from that it was still too loose a way of handling the closing of the sessions.

So instead I created a CloseSession() method on each repository that I would call when I was done using the repository. It was a bit impractical and required discipline, but it seemed to work.
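A minimal sketch of that approach (the class and method names are illustrative): the repository keeps one session for its whole lifetime, and every caller has to remember to close it.

```csharp
public class UserRepository
{
    // One session shared by all methods of this repository instance
    private NHibernateManager manager = new NHibernateManager();

    public User GetById(int id)
    {
        // Uses the repository's session, so lazy collections on the
        // returned User keep working until CloseSession() is called
        return (User) manager.Session.Load(typeof(User), id);
    }

    // Must be called explicitly when the caller is done with the repository
    public void CloseSession()
    {
        if (manager.Session.IsOpen)
            manager.Session.Close();
    }
}
```

Every caller then has to pair its use of the repository with a call to CloseSession(), e.g. in a finally block; that is the discipline mentioned above.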

But I ran into a problem when two different repositories handled objects that were related. I have a Save method in each repository, but if I tried to save an object using a different repository than the one it was loaded with, I got problems (right now I can’t remember exactly what they were). I also still had more than one session open at a time per user, which was really not necessary, so I decided to handle the session at an even more general level.

At first I tried to do it at the Page level. It made sense to open the session when the page request was made and close it when the rendering was done. This meant that all the repository classes needed to receive the NHibernateManager in the constructor, which meant refactoring a lot of code, but since it was a better solution, I did it.

But I was not completely happy with it. I didn’t want to depend on the Page object and the page life cycle, so I ended up with an external class called GlobalManager that stores the NHibernateManager in the ASP.NET session:

public static NHibernateManager mgr
{
    get
    {
        if (HttpContext.Current.Session["NHibernateManager"] == null)
            HttpContext.Current.Session["NHibernateManager"] = new NHibernateManager();
        return (NHibernateManager) HttpContext.Current.Session["NHibernateManager"];
    }
}
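With this in place, a repository no longer creates its own NHibernateManager but reaches the shared, per-user session through GlobalManager. A hedged sketch (the method name is illustrative):

```csharp
public User GetUserById(int id)
{
    // One NHibernate session per ASP.NET session: all repositories share it,
    // so lazy loading and saves across repositories use the same session
    return (User) GlobalManager.mgr.Session.Load(typeof(User), id);
}
```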

So this is the solution I finally ended up with and am using now, and I’m pretty happy with it.

But I’d also love to hear other people’s approaches to this and possible flaws in my way of doing it, so post a comment if you want to add or correct anything.

The news regarding Wikipedia founder Jimmy Wales’s development of a search engine has gotten quite a lot of coverage. Whether it’s because people really believe the Wikiasari project will be able to compete with Google or just because it’s fun to imagine it, I don’t know.

I’m very much in favor of the idea it’s based on: a user-influenced, personalized, and generally web 2.0 inspired search that serves quality information! It sounds too good to be true. And I think it is.

As the article on LinuxInsider.com puts it: “Human weeding alone cannot keep up with the spam”.

That’s simply the main point. Other similar user-based ideas have their problems:
Open Directory is not a success (sites wait 3-12 months to be accepted), people are “gaming” digg.com, and the blogosphere is influenced by the pay-per-post issue. I doubt that such a lucrative area as search will succeed in battling spam without some very heavy automatic tools to help with it. And even then it’s hard…

But I welcome the initiative and I’m looking forward to seeing how they will do it in practice.

Search Engine Optimization is a complicated task and a never-ending process, so I thought I’d share some of the free tools I’m using.

The basics first (semantic structure, file size, top keywords and more)
Get a general idea of how well your site is optimized.
- SEO Analyzer – requires registration, which is free

Track your search engine listings
You can’t optimize your site if you don’t check that what you’re doing is working.
- SERP tracker – requires a Google API key, which is free to get

Get links
No tool can actually get you those links, but they are essential for your rankings and you can use this tool to see how well you’re doing.
- Backlink analyzer – requires a Google API key, which is free to get

More text – less code
The more real text you have per page, the more relevant it’s considered to be by the search engines. So use CSS as much as possible, and move it and your JavaScript away into external files.
- Code to text ratio calculator

Valid code
The browsers might be lenient on this, but the search engines aren’t.
- w3c validator

Keywords and phrases
You need to optimize your pages for specific keywords and phrases (3 max).
- Keyword density checker

If you’ve got any other tools to add to the list, please go ahead and post them.

I have moved my blog to my own domain name (david.givoni.com/blog) for two reasons:

- I like the personal domain name better
- I wanted more control over the look and content of the blog. This includes adding a few link advertisements and adapting the theme a bit.

I’m sure we’ll all get used to this new location :)
