Build Dependency Question

On my current project we have two build trees, but they are not entirely separate. Build Tree A knows nothing about Build Tree B; Build Tree B, however, has dependencies on projects in Build Tree A.

The problem we encounter is that the build trees run on different build agents, and the output of Build Tree A is only made available to Build Tree B when all of Build Tree A builds successfully. When a change in Build Tree B depends on a change also made in Build Tree A, it can be hours before Build Tree A builds successfully all the way through, and in the meanwhile Build Tree B fails until the assemblies from Build Tree A are updated with the dependent change. This is because the change in Build Tree A isn’t made available until all of Build Tree A builds successfully.

I’m trying to work out solutions. The first that comes to mind is separating out the projects in Build Tree A that both Build Tree A and B depend on, and creating a third, much shorter build tree with only these assemblies. If both A and B depend on this much smaller third tree, the wait time for Build Tree B to synchronize dependencies it doesn’t own becomes significantly shorter. Another idea was simply to have all output from assemblies in Build Tree A that Build Tree B relies on made available to Build Tree B immediately when they are built, regardless of whether they cause issues for the rest of Build Tree A.
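The shared-tree idea amounts to making the dependency graph explicit: a small Tree C that both A and B depend on can build and publish first, unblocking both. A toy sketch of that ordering (the tree names and the tiny Kahn-style pass are illustrative only, not our actual build tooling):

```java
import java.util.*;

// Sketch of the proposed split: a small shared Tree C that both A and B
// depend on. A topological order shows C builds (and can publish) first,
// so Tree B no longer waits on all of Tree A.
public class BuildOrder {
    public static void main(String[] args) {
        Map<String, List<String>> dependsOn = new LinkedHashMap<>();
        dependsOn.put("TreeC", List.of());
        dependsOn.put("TreeA", List.of("TreeC"));
        dependsOn.put("TreeB", List.of("TreeC"));

        List<String> order = new ArrayList<>();
        // Simple pass: build anything whose dependencies have all built.
        while (order.size() < dependsOn.size()) {
            for (Map.Entry<String, List<String>> e : dependsOn.entrySet()) {
                if (!order.contains(e.getKey()) && order.containsAll(e.getValue())) {
                    order.add(e.getKey());
                }
            }
        }
        System.out.println(order); // [TreeC, TreeA, TreeB]
    }
}
```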

Any thoughts on a better way would be greatly appreciated via Twitter: @binzleyTwit.

How To Manage DI Registrations For Dependencies?

On a recent project I have been working on, we’ve had a dilemma over how to register dependencies in libraries that our application consumes but does not own. The development of these libraries is under our control, but separate from the development of the top-level consuming application. Using dependency injection at all in our environment was very controversial, and now we are experiencing similar controversy over how to manage the dependencies of the libraries we consume when those libraries also use dependency injection.

The problem appears to be how to coherently register implementations that live in assemblies the top-level application does not reference. My thought is that the top-level app should call registration modules in the assemblies it does reference, and allow those routines to register types in the assemblies they reference in turn; this keeps the top-level application free of references to assemblies it doesn’t directly know about. Others believe the top-level application should register all implementations discretely, and so should hold a reference to every assembly containing types the application will eventually use. I can see how that is appealing, but I tend to think of what my application is consuming as plugins, and I’d like the top-level application to not care so much about the details of their implementations.
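To make the registration-module idea concrete, here is a minimal sketch (in Java rather than our actual stack; the container, module interface, and logger types are all hypothetical): the top-level app calls only the module of the assembly it references directly, and that module registers the implementations it knows about, which could include calling the modules of assemblies it references in turn.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Supplier;

// Minimal container: maps an abstraction to a factory for an implementation.
class Container {
    private final Map<Class<?>, Supplier<?>> registrations = new HashMap<>();
    <T> void register(Class<T> type, Supplier<? extends T> factory) {
        registrations.put(type, factory);
    }
    @SuppressWarnings("unchecked")
    <T> T resolve(Class<T> type) {
        return (T) registrations.get(type).get();
    }
}

// Each consumed library exposes one registration module; the top-level
// app calls only the modules of libraries it references directly.
interface RegistrationModule {
    void register(Container c);
}

// Hypothetical library types the app only knows by abstraction.
interface Logger { String log(String msg); }
class FileLogger implements Logger {
    public String log(String msg) { return "file: " + msg; }
}

class LoggingModule implements RegistrationModule {
    public void register(Container c) {
        // A real module could also invoke modules of assemblies *it*
        // references, keeping the app unaware of those assemblies.
        c.register(Logger.class, FileLogger::new);
    }
}

public class Demo {
    public static void main(String[] args) {
        Container c = new Container();
        new LoggingModule().register(c); // app only knows the module
        System.out.println(c.resolve(Logger.class).log("hello")); // file: hello
    }
}
```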

Perhaps the right answer is to have separate containers for the things I want treated as plugins, and let them register what they need without the top-level application being aware of it at all?

OData, Data Services, or Roll Your Own

On my current project we have been debating how to query a service, potentially exposed over REST and/or SOAP, that talks in terms of domain objects but really queries against data entities. What this means is that queries at the business level are in terms of the domain object: for instance, give me all students that have a GPA property greater than 3. This query has to be transformed at the persistence layer, which may mean a SQL query something like,

‘SELECT * FROM Student WHERE std_gpa > 3’.

The persistence layer may not use our domain objects, but its own entities that more closely reflect the database structure. This means any query written in the context of the business objects must be transformed into the context of the data entities. What we have been trying to decide is whether we should make our service query in terms of business objects and transform underneath, or have our service expose the data entity objects directly and let callers transform on their own.

If we structure our service to talk in terms of business objects it will be harder to construct, since we’ll have to manage the transforms. The service will be easier to understand for any caller that already uses our business objects, however, and if our persistence mechanism changes in any way we can hide that from the callers as well. If we expose our data entities directly we will be able to publish the service faster, and it will potentially be more flexible, as callers won’t be limited to queries supported by the business object structure.
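The transform-underneath option boils down to mapping domain property names onto persistence column names before building the query. A minimal sketch, assuming a hypothetical mapping for the Student/GPA example above (the class, method, and column names are made up for illustration):

```java
import java.util.Map;

// Sketch: translate a domain-level filter (e.g. "Gpa > 3") into the
// persistence layer's column names before it reaches the database.
public class QueryTranslator {
    // Domain property name -> data entity column name.
    private static final Map<String, String> COLUMN_MAP =
        Map.of("Gpa", "std_gpa", "Name", "std_name");

    static String toSql(String table, String domainProperty, String op, String value) {
        String column = COLUMN_MAP.get(domainProperty);
        if (column == null)
            throw new IllegalArgumentException("Unmapped property: " + domainProperty);
        return "SELECT * FROM " + table + " WHERE " + column + " " + op + " " + value;
    }

    public static void main(String[] args) {
        System.out.println(toSql("Student", "Gpa", ">", "3"));
        // SELECT * FROM Student WHERE std_gpa > 3
    }
}
```

A real translator would cover the full expression tree rather than a single predicate, which is exactly the construction cost weighed above.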


Considering OAuth For Source Verification Requirement

I’m working on a RESTful API that has some funky security requirements.

  • The first is that it needs to be locked down so only registered users can access it
    • No problem: have users request a token with their credentials, securing the call to gettoken with SSL.  Have the token passed in a header on all subsequent calls, and verify it on each of those calls.
  • The next is that the registered users need to have a license on their machine for our product, and if it is not there they should be denied usage
    • No problem: no license, you are refused.
  • The next is that if the user does not have a license but their call is coming from a particular source then they should be allowed to run our product with no license
    • The only way I can figure to do it is to hand out a private key to each source I have to identify and have them sign something (like their token) so I can verify the source when they call the API; if it is a source that is allowed to have no license, then all is good.
      • Have to do some strange things to get the application to run with no license, but OK, no big deal.
    • A consumer of the API indicated that I did not need to do this, and that if I just used OAuth all would be fine.  Based on this I did a little checking into OAuth.  It seems OAuth lets me take tokens that are authenticated by a remote service, like Twitter or Facebook, tack authorization information onto a request token, and go to town.  This is all good, although more complicated than the use case I’m dealing with.  What I don’t see is a way for me to confirm that a call originated from a specific application.  I could be missing something, though, as the OAuth documentation is not all that clear to me.

Since the API is always authenticated against our own credential store, I don’t see where OAuth would give me much upside in the above scenario.  It handles my first two requirements, but in a more complex manner than the way I’m already handling them, and it doesn’t seem to satisfy the last requirement for source verification, at least not as far as I can tell.
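The per-source signing idea above can be sketched with an HMAC: each trusted source holds its own secret key and signs the token, and the API recomputes the signature to confirm the origin. This is a minimal sketch in Java; the key, token values, and class names are made up for illustration:

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Each identified source gets its own secret key and signs the access
// token; the API recomputes the HMAC to confirm the call's origin.
public class SourceVerifier {
    static String sign(String token, byte[] sourceKey) throws Exception {
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(sourceKey, "HmacSHA256"));
        return Base64.getEncoder()
                     .encodeToString(mac.doFinal(token.getBytes(StandardCharsets.UTF_8)));
    }

    static boolean verify(String token, String signature, byte[] sourceKey) throws Exception {
        return sign(token, sourceKey).equals(signature);
    }

    public static void main(String[] args) throws Exception {
        byte[] key = "per-source-secret".getBytes(StandardCharsets.UTF_8);
        String sig = sign("user-token-abc", key);
        System.out.println(verify("user-token-abc", sig, key)); // true
        System.out.println(verify("forged-token", sig, key));   // false
    }
}
```

A production version would compare signatures in constant time (e.g. `MessageDigest.isEqual`) rather than with `String.equals`, to avoid timing side channels.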

Automated Deployment Fun: First Item From a List in MSBuild

Last week was an education in making MSBuild jump through hoops to automate our deployments.  Having a development background, it is always strange switching context into the land of MSBuild.  I find myself wanting to create variables and control structures; alas, that is not how it works.

One interesting challenge I was able to overcome was figuring out how to pick the first item from an item when the item is a list.  We have multiple destination servers we deploy to, and we keep them in one item as a list, ‘DeployDestination’.  I only want to use one of them as the source for the backup, however.  To do the deployment I iterate through the server items, calling a task to delete the code at each destination and copy in the new code, per the below:



<PropertyGroup>
  <BackUpRoot>C:\BackupRoot</BackUpRoot>
  <SourceFileLocation>C:\WorkingBuildLocation\</SourceFileLocation>
</PropertyGroup>

<ItemGroup>
    <DeployDestination Include="\\serv1\root\">
      <ServerName>Web1</ServerName>
      <Environment>Production</Environment>
    </DeployDestination>
    <DeployDestination Include="\\serv2\root\">
      <ServerName>Web2</ServerName>
      <Environment>Production</Environment>
    </DeployDestination>
    <DeployDestination Include="\\serv3\root\">
      <ServerName>Web3</ServerName>
      <Environment>Production</Environment>
    </DeployDestination>
</ItemGroup>

<!-- Backup From One Server: trying to figure out how -->


<!-- Deploy To All Servers -->
<MSBuild Projects="$(MSBuildThisFile)" Targets="DeleteAndCopy" 
Properties="Env=%(DeployDestination.Environment);
MachineName=%(DeployDestination.ServerName);
SourceFiles=$(SourceFileLocation);
FolderDestination=%(DeployDestination.FullPath)"  />



<!--More deployment fun below-->

The above works fine.  However, for the purpose of taking a backup I want to grab the code from only one of the machines we are deploying to.  I was hoping to be able to grab the first item in the ‘DeployDestination’ list, perhaps using an index that would look like the below:




<MSBuild Projects="$(MSBuildThisFile)" Targets="Copy" 
Properties="SourceFiles=@(DeployDestination)[0];
FolderDestination=$(BackUpRoot)"  />



<!--More deployment fun below-->

Of course that is the developer in me, and the above does not work.  I knew I could just create another property with the backup source hard coded and reference that, but that seemed wrong: why update the location in two places?  Luckily I was using MSBuild 4, and the relatively new item functions were able to save me!  By adding a metadata item to the destination that should be the backup source, I was able to get just the one value I was looking for.

Now I can get what I’d like using the following:



<PropertyGroup>
  <BackUpRoot>C:\BackupRoot</BackUpRoot>
  <SourceFileLocation>C:\WorkingBuildLocation\</SourceFileLocation>
</PropertyGroup>

<ItemGroup>
    <DeployDestination Include="\\serv1\root\">
      <ServerName>Web1</ServerName>
      <Environment>Production</Environment>
      <BackupSource>True</BackupSource>
    </DeployDestination>
    <DeployDestination Include="\\serv2\root\">
      <ServerName>Web2</ServerName>
      <Environment>Production</Environment>
    </DeployDestination>
    <DeployDestination Include="\\serv3\root\">
      <ServerName>Web3</ServerName>
      <Environment>Production</Environment>
    </DeployDestination>
</ItemGroup>

<!--Backup From One Server -->
<MSBuild Projects="$(MSBuildThisFile)" Targets="Copy" Properties=
"SourceFiles=@(DeployDestination->WithMetadataValue('BackupSource','True'));
FolderDestination=$(BackUpRoot)"  />

<!-- Deploy To All Servers -->
<MSBuild Projects="$(MSBuildThisFile)" Targets="DeleteAndCopy" 
Properties="Env=%(DeployDestination.Environment);
MachineName=%(DeployDestination.ServerName);
SourceFiles=$(SourceFileLocation);
FolderDestination=%(DeployDestination.FullPath)"  />


<!--More deployment fun below-->