Monday, August 31, 2009

Calling stored procedures using .net lambda expression trees

Calling stored procedures from ADO.Net leads to verbose code with lots of syntactic overhead. It tends to look something like this:

comm.CommandType = CommandType.StoredProcedure;
comm.CommandText = "MySProc";
comm.Parameters.Add(new SqlParameter()
{
    ParameterName = "@return",
    Direction = ParameterDirection.ReturnValue,
    DbType = DbType.Int32
});
comm.Parameters.Add(new SqlParameter("@param1", 17));
comm.Parameters.Add(new SqlParameter()
{
    ParameterName = "@outparam",
    Direction = ParameterDirection.Output,
    DbType = DbType.String,
    Size = -1
});
comm.ExecuteNonQuery();

And all this to represent a call more succinctly written as:

result = MySProc 17, out outparam

We can achieve something close to this in C#, using lambda expression trees. Assuming that we will use a static method, and that we also need to pass in a connection, our call will look like:

StoredProcedures.Exec(
    () => MySProc(17, out outparamvalue),
    connection);

For this to work, we will start by specifying the signature of the stored procedure with a normal .net method. Like so:

public int MySProc(int param1, out string outparam)
{
    // The body is never executed; only the signature is used.
    throw new NotImplementedException();
}

and so, the signature for our Exec-method becomes:

public static int Exec(Expression<Func<int>> expr,
SqlConnection conn)

The Expression<> thing is what causes the compiler to stop doing what it normally does, which is to compile C# into MSIL, and instead give you the expression tree.
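
A quick way to see the difference (my own illustration, not part of the original sample): the declared type alone decides whether you get executable code or inspectable data.

using System;
using System.Linq.Expressions;

class ExpressionVsDelegate
{
    static void Main()
    {
        int a = 17, b = 4;

        Func<int> asDelegate = () => a + b;           // compiled to MSIL; can be invoked
        Expression<Func<int>> asTree = () => a + b;   // handed to you as a data structure

        Console.WriteLine(asDelegate());              // 21
        Console.WriteLine(asTree.Body.NodeType);      // Add
    }
}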

Focusing on the interesting bits, and leaving the details for the full code sample at the end, we start by noting that we are only interested in expressions that are method calls:

var callExpr = (MethodCallExpression)expr.Body;

The method name is simply callExpr.Method.Name. On to find the arguments!

foreach (var arg in callExpr.Arguments)

The arguments are themselves expressions, and that’s not really what we want right now. Although “17” in our example would be easy enough to handle, this really could be an arbitrarily complex expression. Thankfully, .Net lets us compile lambda expressions into delegates that we can then call to get the value of the expression. There are two minor details to get this working: First, the expression is strongly typed (as e.g. integer) and we need the boxed value, so we wrap the expression in a cast to object. Second, we create a lambda expression with our expression as the body. The type of the lambda expression is a function taking no arguments and returning an object. The code is:

private static object GetValue(Expression expr)
{
    var valueExpr = Expression.Lambda<Func<object>>(
        Expression.Convert(expr, typeof(object)),
        new ParameterExpression[] { });
    var deleg = valueExpr.Compile();
    return deleg();
}
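
As a stand-alone illustration of the trick (my own example, not part of the original code), here the first argument of a call is evaluated even though it is an arbitrary expression rather than a constant:

using System;
using System.Linq.Expressions;

class EvaluateArgumentDemo
{
    static int Twice(int x) { return 2 * x; }

    static void Main()
    {
        int seed = 8;
        Expression<Func<int>> expr = () => Twice(seed + 9);

        var call = (MethodCallExpression)expr.Body;
        Expression arg = call.Arguments[0];      // the "seed + 9" subtree

        // Box the argument, wrap it in a parameterless lambda, compile, invoke.
        var value = Expression.Lambda<Func<object>>(
            Expression.Convert(arg, typeof(object))).Compile()();

        Console.WriteLine(value);                // 17
    }
}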

A bit more of a problem is handling out parameters. The argument – outparamvalue in our example above – is (typically) a local variable in the calling method. How do we assign a value to that? We need to look deeper into how the C# compiler treats anonymous methods. While the topic is worthy of a blog post of its own, the following brief summary will have to do: the compiler generates a class for the anonymous method, and any local variables it captures become fields on that class. Thus, the out parameter expression shows up in the tree as a reference to such a field.
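
To make that concrete, here is a small sketch of my own (the MySProc stand-in is just for illustration and the generated class name will vary) showing that the out argument really does surface as a field on a compiler-generated class:

using System;
using System.Linq.Expressions;

class ClosureDemo
{
    // Stand-in with the same shape as the stored procedure; never invoked.
    static int MySProc(int param1, out string outparam)
    {
        throw new NotImplementedException();
    }

    static void Main()
    {
        string outparamvalue = null;
        Expression<Func<int>> expr = () => MySProc(17, out outparamvalue);

        var call = (MethodCallExpression)expr.Body;
        var outArg = (MemberExpression)call.Arguments[1];

        // The "local variable" is really a field on a compiler-generated class.
        Console.WriteLine(outArg.Member.Name);                  // outparamvalue
        Console.WriteLine(outArg.Member.DeclaringType.Name);    // e.g. <>c__DisplayClass0_0
    }
}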

With that in mind, we can find the field and assign it using reflection:

private static void SetValue(Expression outarg, object value)
{
    var member = (MemberExpression)outarg;
    object source = GetValue(member.Expression);
    var field = (FieldInfo)member.Member;
    field.SetValue(source, value);
}

And we are done. The full code, as promised, is:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Linq.Expressions;
using System.Text;
using System.Data;
using System.Data.SqlClient;
using System.Reflection;

namespace StoredProcedureCaller
{
    public class StoredProcedures
    {
        // Evaluates an argument expression by wrapping it in a parameterless
        // lambda, compiling it, and invoking the resulting delegate.
        private static object GetValue(Expression expr)
        {
            var valueExpr = Expression.Lambda<Func<object>>(
                Expression.Convert(expr, typeof(object)),
                new ParameterExpression[] { });
            var deleg = valueExpr.Compile();
            return deleg();
        }

        // Writes a value back into the captured local behind an out argument,
        // which the compiler has turned into a field on its closure class.
        private static void SetValue(Expression outarg, object value)
        {
            var member = (MemberExpression)outarg;
            object source = GetValue(member.Expression);
            var field = (FieldInfo)member.Member;
            field.SetValue(source, value);
        }

        private static object ConvertToDbValue(object val)
        {
            if (val == null)
                return DBNull.Value;
            else
                return val;
        }

        private static object ConvertFromDbValue(object val)
        {
            if (val == DBNull.Value)
                return null;
            else
                return val;
        }

        public static int Exec(Expression<Func<int>> expr,
            SqlConnection conn)
        {
            if (expr.Body.NodeType != ExpressionType.Call)
                throw new ArgumentException("StoredProcedures.Exec() can only be used on method calls");

            var callExpr = (MethodCallExpression)expr.Body;

            var command = new SqlCommand()
            {
                CommandText = callExpr.Method.Name,
                CommandType = System.Data.CommandType.StoredProcedure,
                Connection = conn
            };

            // Parameter 0 holds the stored procedure's return value.
            command.Parameters.Add(new SqlParameter()
            {
                ParameterName = "@return",
                DbType = DbType.Int32,
                Direction = ParameterDirection.ReturnValue
            });

            // Map each argument of the call expression to a SqlParameter,
            // using the .net method's parameter names and in/out directions.
            var paramlist = callExpr.Method.GetParameters();
            var i = 0;
            foreach (var arg in callExpr.Arguments)
            {
                var param = paramlist[i];
                var sqlparam = new SqlParameter();
                sqlparam.ParameterName = "@" + param.Name;

                if (param.IsOut)
                {
                    sqlparam.Direction = ParameterDirection.Output;
                    sqlparam.Size = -1;
                }
                else
                {
                    sqlparam.Direction = ParameterDirection.Input;
                    sqlparam.Value = ConvertToDbValue(GetValue(arg));
                }
                command.Parameters.Add(sqlparam);

                i++;
            }

            command.ExecuteNonQuery();

            // Copy output parameter values back into the captured locals.
            // Parameters start at index 1; index 0 is the return value.
            i = 1;
            foreach (var arg in callExpr.Arguments)
            {
                var sqlparam = command.Parameters[i];

                if (sqlparam.Direction == ParameterDirection.Output)
                    SetValue(arg, ConvertFromDbValue(sqlparam.Value));

                i++;
            }

            return (int)command.Parameters[0].Value;
        }
    }
}
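
Putting it together, a hypothetical call site could look like the following; the connection string, the procedure name MySProc, and its parameters are of course placeholders for your own stored procedure:

using System;
using System.Data.SqlClient;
using StoredProcedureCaller;

class Program
{
    // Mirrors the stored procedure's signature; the body is never executed.
    static int MySProc(int param1, out string outparam)
    {
        throw new NotImplementedException();
    }

    static void Main()
    {
        using (var connection = new SqlConnection("<your connection string>"))
        {
            connection.Open();

            string outparamvalue = null;
            int result = StoredProcedures.Exec(
                () => MySProc(17, out outparamvalue),
                connection);

            Console.WriteLine("return = {0}, outparam = {1}", result, outparamvalue);
        }
    }
}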


Sunday, June 7, 2009

Asynchronously calling an SSIS package from ASP.Net

The following stored procedure allows you to invoke an Integration Services package asynchronously and, additionally, to send in parameters (variable values or any configurable property). I found it here, and have made some updates to it. In short, it creates a SQL Server job to run the package. Be sure to read the details in the original article. A sketch of the corresponding ASP.Net call site follows the script below.

Changes:

1. I now invoke the package using a job step of type SSIS, rather than calling dtexec. (dtexec was not found in the specified folder on my installation; presumably the SSIS step type is also more stable.)
2. Because of this, the syntax for sending in values to the package has changed.

I am also following Jeff Atwood's modest proposal to assign a guid to the code for future copy-paste reuse.

Enjoy!

-- codesnippet:5852E73F-5CE7-44CC-A5CE-D6420FF16B98

declare @package varchar(128)
set @package = '\File System\MyProject\MyPackage'

-- Value to pass to the package variable. In the original article this is a
-- parameter of the wrapping stored procedure.
declare @variablevalue varchar(1000)
set @variablevalue = '<value for MyVariable>'

-- Initialize command
declare @cmd varchar(4000)
set @cmd = N'/DTS "' + @package + N'" /SERVER "." /CHECKPOINTING OFF /SET "\Package.Variables[User::MyVariable].Properties[Value]";"' + @variablevalue + '" /REPORTING E'

-- ALLOW CONCURRENT DTS EXECUTIONS
declare @jname varchar(128)
set @jname = cast(newid() as char(36))

-- Create job
declare @jid uniqueidentifier
exec msdb.dbo.sp_add_job
@job_name = @jname,
@enabled = 1,
@category_name = '',
@delete_level = 1,
@job_id = @jid OUTPUT

exec msdb.dbo.sp_add_jobserver
@job_id = @jid,
@server_name = '(local)'

exec msdb.dbo.sp_add_jobstep
@job_id = @jid,
@step_name = N'Execute Package',
@step_id = 1,
@cmdexec_success_code = 0,
@on_success_action = 1,
@on_success_step_id = 0,
@on_fail_action = 2,
@on_fail_step_id = 0,
@retry_attempts = 0,
@retry_interval = 0,
@os_run_priority = 0,
@subsystem = N'SSIS',
@command = @cmd,
@database_name = N'master',
@flags = 0
-- Start job
exec msdb.dbo.sp_start_job
@job_id = @jid
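
To kick this off from ASP.Net, one option is to wrap the script above in a stored procedure and call it with plain ADO.Net. The procedure name dbo.StartMyPackage and its @variablevalue parameter below are my own placeholders, not part of the original article:

using System.Data;
using System.Data.SqlClient;

public static class PackageLauncher
{
    // Starts the SSIS package via the job-creating stored procedure and
    // returns immediately; the package keeps running on the server.
    public static void StartMyPackage(string connectionString, string variableValue)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("dbo.StartMyPackage", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.Add(new SqlParameter("@variablevalue", variableValue));

            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}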

Monday, September 29, 2008

Do-not-break-the-testcases 101

Before committing code to source control:

1. Ensure that you have the latest of everyone else's changes.
2. Build and compile everything.
3. Run ALL testcases.
4. Should any testcase fail, take care of it BEFORE committing.
5. Make sure to check in ALL files necessary, including .csproj, data files, and testcases. Do a "show all differences" to make sure you're not missing any changed or added files.
6. Do not be afraid to ask someone to retrieve your changes to verify that they are ok.

I wrote these rules on a recent project I was on. These are just the basic steps; others before me have said more, and said it better, such as "Commit early, Commit often" and "Do not commit and run", and there are tools such as CruiseControl to make your life easier.

Friday, September 5, 2008

Chrome

So, Google builds a web browser that is better able to handle JavaScript applications, such as Google Apps. The cynical part of me would say that Google is trying to solve a problem they themselves created.

I use GMail, Google Reader and Google Docs regularly and I'm absolutely delighted at the improved performance and stability. 

Of course, running Google's apps in Google's browser has the smell of monopoly attached to it. But I'm too high on all the fresh air to care.

All hail our new evil overlords!

Wednesday, May 7, 2008

Nullable enums

Enums in .net are represented as integers, right? This means you can do the following (yes, casting the integer to object is superfluous. It is added for clarity):
public enum Foo { X, Y };
[...]
public Foo Bar { set { ... } }
[...]
PropertyInfo prop = obj.GetType().GetProperty("Bar");
prop.SetValue(obj, (object)1, null);

I.e. you can use reflection to set the value of a property of an enum type using its integer value. (You could also just cast it like so: obj.Bar = (Foo)1;, but that's not quite as interesting, for reasons that will be revealed later.)

Also, a boxed value and a boxed nullable value have the same representation, right? Boxing a nullable actually boxes the underlying value (or produces a null reference if there is no value), so in both cases you end up with a reference to a small object containing the primitive. As the following code shows, boxed integers and boxed nullable integers are interchangeable:

int intValue = 1;
int? nullableIntValue = 1;
object boxedInt = intValue;
object boxedNullableInt = nullableIntValue;
intValue = (int)boxedNullableInt;
nullableIntValue = (int?)boxedInt;

And, not surprisingly, the same holds for enum values, now demonstrated using reflection as in our first example:

public Foo? Bar { set { ... } }
[...]
prop.SetValue(obj, (object)Foo.X, null);

Now, combining these two examples, what do you think the following code would do?

public Foo? Bar { set { ... } }
[...]
prop.SetValue(obj, (object)1, null);

If you, like me, thought it would happily set the property to the enum value corresponding to 1, in this case Foo.Y, you'd be wrong. Instead it throws the following exception:

Object of type 'System.Int32' cannot be converted to type 'System.Nullable`1

Huh? The .net designers may know why this is, and they may even have a good reason for it, but to me it just seems plain weird. It is worth noting that obj.Bar = (Foo?)1; works just fine.

This issue came up when mapping objects from database tables, using integers to represent enum values. I've reluctantly added the following workaround to my mapping code:

PropertyInfo prop = [...]
object val = [...]
if ((prop.PropertyType.IsGenericType) &&
    (prop.PropertyType.GetGenericTypeDefinition() == typeof(Nullable<>)))
{
    Type targetType = prop.PropertyType.GetGenericArguments()[0];
    if (targetType.IsEnum)
        val = Enum.ToObject(targetType, val);
}
prop.SetValue(obj, val, null);
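
For reference, here is a self-contained repro of both the failure and the workaround. The Widget class and the use of Nullable.GetUnderlyingType are my own illustration and shorthand, not the original mapping code:

using System;
using System.Reflection;

public enum Foo { X, Y }

public class Widget
{
    public Foo? Bar { get; set; }
}

public static class NullableEnumDemo
{
    public static void Main()
    {
        var obj = new Widget();
        PropertyInfo prop = obj.GetType().GetProperty("Bar");

        // This line throws: Object of type 'System.Int32' cannot be
        // converted to type 'System.Nullable`1[Foo]'.
        // prop.SetValue(obj, (object)1, null);

        // Workaround: convert to the enum type before assigning.
        object val = 1;
        Type targetType = Nullable.GetUnderlyingType(prop.PropertyType) ?? prop.PropertyType;
        if (targetType.IsEnum)
            val = Enum.ToObject(targetType, val);

        prop.SetValue(obj, val, null);
        Console.WriteLine(obj.Bar);    // Y
    }
}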

Sunday, April 20, 2008

Bad service

It appears there is a problem with SQL Server Service Broker when executing DROP SERVICE. I had it choke the server until it eventually failed with the message "There is insufficient system memory to run this query".

The problem is that Service Broker needs to end all of the service's open conversations, and it does so in a single transaction. If there are many open conversations, trouble arises.

I found the following link which describes a fix for the problem. I'm reprinting the fix here in case the link goes bad:

declare crsConversations cursor for
    select conversation_handle
    from sys.conversation_endpoints e
    join sys.services s on s.service_id = e.service_id
    where s.name = 'myservicename';
open crsConversations;
declare @dh uniqueidentifier;
declare @batch int;
select @batch = 0;
begin transaction
fetch next from crsConversations into @dh;
while @@fetch_status = 0
begin
    -- replace this with end conversation @dh with error ... if
    -- an error has to be sent to the peer
    --
    end conversation @dh with cleanup;
    -- commit every 1000 conversations ended
    select @batch = @batch + 1;
    if @batch > 999
    begin
        commit transaction;
        begin transaction;
    end
    fetch next from crsConversations into @dh;
end
commit transaction;
close crsConversations;
deallocate crsConversations;
go
drop service [myservicename]
go


Now, if you have this problem, the real question is: why are you leaking open conversations?

Saturday, April 19, 2008

(contd.)

My old blog can be found here:
http://blogs.codegear.com/andersivner.
I've since left Borland/CodeGear and now work for Sigma.

The product I worked on at Borland/CodeGear, ECO, is now available from CapableObjects, a company run by my friends and former colleagues. They have recently released ECO for Visual Studio. Way cool!