Monday, September 20, 2010

Free MSDN Ultimate subscriptions (+VS 2010) - Giveaway contest

@YaronNaveh

Last July I was nominated as a Microsoft MVP. A few days ago I got my "MVP Package":


It is a very cool package!

The package also includes this:



What's this?
Three MSDN Ultimate subscription keys. Each one includes:


  • Windows Azure Platform
  • Visual Studio 2010 Ultimate
  • Visual Studio TFS 2010
  • Expression Studio Ultimate
  • Windows 7, Windows Server 2008 R2 and SQL Server 2008
  • Microsoft Office Professional Plus 2010, Project Professional 2010, Visio Premium 2010
  • and much more...

    This package is worth $11,899. And you will get it for free.

    How do I get an $11,899 MSDN Ultimate subscription for free?
    Publish a comment on this post describing a project you were involved in where web services interoperability was required. The authors of the three most detailed descriptions posted by 8 October 2010 will each get an $11,899 MSDN Ultimate subscription for free!

    Edit: detailed = quality, not length :) Winners chosen by me.

    @YaronNaveh

    What's next? Get this blog's RSS updates or register for mail updates!

    15 comments:

    Ken Egozi said...

    When you say "web services", do you mean in the broader sense, or ASMX/WCF only?

    Yaron Naveh (MVP) said...

    JSON, protobuf or the like is also great :)

    Unknown said...

    First: wow, very nice prize, I must say. I had an MSDN subscription for a year, but it has now expired, and I miss it so much.
    Second (the experience):
    I am not the most experienced with web services, and I actually haven't had a project where I had to interop with an existing service (there is a future project that will). On all my projects for the last 2+ years I have had control over the server/service, but I did not have a choice of technology: I had to use WCF, I had to use netTcpBinding, and I had to use custom username validation. That is when I came across your blog and joined. The reason for the restrictions? Control, from what I was told. My boss tells me that TCP is much easier for him to control with firewalls, security and performance (and it did work out great, we have 4+ projects all using the same functionality). But what we gain in performance and such we lose in nice built-in features (ha ha).
    Over the last year, on one project (my first), I came across issues with slow connections (mainly on open) taking a long time, buffers being filled to their max (report queries), certificates expiring (face palm, great day) due to securing the transport layer with certificates, and routing from their ISP messing with businesses over the weekend (it shut down some of the ports we used). All of these experiences were extremely stressful when they occurred, but I was able to get through them since I relied mainly on my code to open and secure the connections instead of the config file (you know, the default calls). I did use the configurations, but I controlled in code which configuration to load, when, which binding config to use and how. I did this by creating my own CodeSmith templates and then editing as needed from there, and it has been a great experience since.

    Yaron Naveh (MVP) said...

    One more update: there will be an advantage to early submissions.

    Ladislav Mrnka said...

    This is a very nice opportunity. Thank you, Yaron. I will also try my luck. During the last two years I used WCF in 4 interoperability scenarios:

    1. Two applications written in Progress (I have no idea what it is) integrated through BizTalk with our application. I was responsible for the BizTalk development and I used a WCF port for communication with those applications. The advantage of WCF over a common SOAP port (based on ASMX) was support for wsdl:faults (FaultContracts), which was required.

    2. Integration with IBM WebSphere MQ, which exposed web services. The web services used HTTPS with a client certificate and the Username token profile with a digested password. We used a custom message inspector to add a SOAP header to outgoing messages. I helped my colleague make it work.

    3. Integration with a PHP + WSO2 application that exposed a web service secured with message security over a VPN tunnel. The service used WS-Security (signing and encryption), the X.509 certificate token profile, and the Username token profile with a digested password. We had to create a custom token and credentials to fully integrate the digested password into the WCF security pipeline.

    4. Integration with some IBM CMS. This looked very easy because the CMS exposed several services over HTTPS and access to the server was controlled by an ACL. But in production we faced a strange issue: we were not able to connect to the CMS. After several weeks we discovered that the problem was in TLS; the service worked when the protocol was switched to SSLv3. After some server / network maintenance everything worked as expected.
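Both the WebSphere and WSO2 integrations above used the Username token profile with a digested password. Per the WS-Security UsernameToken Profile, the digest is Base64(SHA-1(nonce + created + password)), so the password never travels in the clear. A minimal illustrative sketch in Python (not WCF itself; the function and field names are mine):

```python
import base64
import hashlib
import os
from datetime import datetime, timezone

def password_digest(password: str) -> dict:
    """Build the fields of a WS-Security UsernameToken with a digested password.

    Per the UsernameToken Profile: Digest = Base64(SHA-1(nonce + created + password)).
    """
    nonce = os.urandom(16)  # random bytes, sent Base64-encoded alongside the digest
    created = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
    digest = hashlib.sha1(nonce + created.encode() + password.encode()).digest()
    return {
        "Nonce": base64.b64encode(nonce).decode(),
        "Created": created,
        "PasswordDigest": base64.b64encode(digest).decode(),
    }

token = password_digest("s3cret")
# The server recomputes the digest from the received nonce and timestamp
# and compares it against the one computed from its stored password.
```

The nonce and timestamp also let the server reject replayed tokens, which is why both are part of the hash.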

    Anonymous said...

    I have done three projects involving WCF - Java interoperability.

    1. The first is a Java client calling a WCF service that required two-factor authentication. This required both the client and the server to mutually authenticate using client and server certificates, which also encrypt the SOAP calls (message security). On top of that, the caller identity from an LDAP server was passed to the WCF service for authorisation and auditing purposes. After some trial and error I used the NetBeans/Metro stack on the Java side, plus some custom validator development and web.config configuration, to get this working. I also had to customise the ServiceHostFactory to create a one-page WSDL file, as NetBeans cannot cope with a multi-file WSDL (the WCF default).
    The best thing was that the same WCF web service could very easily be configured with another endpoint using NetTcpBinding for a .NET client. This project made use of the high customisability and extension points available in WCF.

    2. The second was a WCF service that in turn calls a 3rd-party Java web service using HTTP/SSL with both client and server certificates. The security on the other end is configured by IBM Tivoli. On top of that, the server certificate provided didn't match the identity, and we had to handle the cryptographic exception to override the certificate name mismatch.


    3. A COBOL program calling a WCF web service that only understands SOAP RPC style. We had to use custom XML serialization to handle this scenario.
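The one-page WSDL workaround from the first project is essentially WSDL/schema flattening: inlining every xsd:import into a single document so tooling that cannot follow multi-file WSDLs (like the NetBeans stack mentioned) can consume it. A deliberately simplified Python sketch (namespace reconciliation and wsdl:import handling are omitted):

```python
import xml.etree.ElementTree as ET

XSD = "http://www.w3.org/2001/XMLSchema"

def flatten(wsdl_xml: str, external: dict) -> str:
    """Inline externally referenced schemas into a single WSDL document.

    Each xsd:import with a known schemaLocation is replaced by the element
    declarations of the referenced schema. Real flattening also has to
    reconcile target namespaces, which this sketch skips.
    """
    root = ET.fromstring(wsdl_xml)
    for schema in root.iter(f"{{{XSD}}}schema"):
        for imp in list(schema.findall(f"{{{XSD}}}import")):
            loc = imp.get("schemaLocation")
            if loc in external:
                schema.remove(imp)                      # drop the reference...
                for decl in ET.fromstring(external[loc]):
                    schema.append(decl)                 # ...and inline its contents
    return ET.tostring(root, encoding="unicode")

# Toy WSDL referencing one external schema file.
wsdl = (
    '<definitions xmlns:xsd="http://www.w3.org/2001/XMLSchema">'
    "<types><xsd:schema>"
    '<xsd:import schemaLocation="types.xsd"/>'
    "</xsd:schema></types></definitions>"
)
types_xsd = (
    '<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema">'
    '<xsd:element name="Employee" type="xsd:string"/>'
    "</xsd:schema>"
)
flat = flatten(wsdl, {"types.xsd": types_xsd})
```

In WCF the same effect is usually achieved by hooking WSDL generation (e.g. via a custom ServiceHostFactory, as the commenter did); this sketch only shows the transformation itself.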

    Travis Spencer said...

    Hello. Thanks for the contest.

    As far as a project that involves web service integration goes...I'm nearing completion of a project for a local construction company. The idea behind the project is to sync data coming from the HR database and Active Directory (AD). The program periodically checks to see if any new or updated employee records exist in the HR database. If there are any updated records, a locally hosted web service is called which populates/updates the appropriate employee AD object with the information from the HR database query. A similar process is used for new employees (although there are some extra steps before the AD objects are created).

    Basically, a web service was created to handle creating, updating, and querying Active Directory records. The great thing is this web service can be used for various projects, not just the one I'm completing.

    Anonymous said...

    I have implemented a number of solutions with WCF, but the most interesting one was also the most challenging.
    The system had to expose a number of services for the Danish government. They required that all data exchanged adhere to the OIOXML standard, which meant the message formats were pre-specified even before the project started.
    WSCF.blue proved to be a huge help in building the data contracts and message contracts. Without WSCF.blue we would probably still be working on the solution :-)
    The customer had specific requirements regarding security, validation and instrumentation, requiring implementation of an X509CertificateValidator for custom certificate validation, and EndpointBehaviors and a DispatchMessageInspector for schema validation and instrumentation.
    To prove the interoperability, a couple of reference client implementations were implemented on other platforms (Java etc.).

    tatman said...

    I had to implement a WCF to Java webMethods 1.1 interface. There were a lot of challenges to overcome. For starters, neither side understood the other's technology. When I asked for a WSDL, they did not have any tools to produce it, so they manually created a WSDL, which they provided to me as a single file. While they had the data contracts right, the bindings, WS-Security and ports were wrong, so SvcUtil ended up generating incorrect settings. WCF also has some limitations in its webMethods 1.1 support, and they were using some really old standards that pre-date WCF.

    There were other challenges as well. They could not support using BinarySecurityToken for security, and everything had to be "plain text". And (like Yaron's post from Wednesday, September 29, 2010, "WCF: Server signs response with a different certificate than client uses for encryption") their reply used different certificates.

    To get my bindings and security to work correctly, I had to abandon the config file; everything became code. I had to create custom bindings; modify AsymmetricSecurityBindingElement settings and message version settings; create custom X509Store certificate lookups; create a custom X509CertificateValidator and a custom message encoder (to get around the CRLF they put in the certificate signature in their replies); and finally (Yaron, thank you for that post, as it solved my final issue) apply a WCF patch.
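One common shape for the custom X509CertificateValidator and X509Store lookups described above is thumbprint pinning: a certificate's thumbprint is just the SHA-1 hash of its DER encoding, so a validator can accept a known certificate regardless of subject-name mismatches. An illustrative Python sketch of the idea (the real validator would be .NET code; the byte strings here are stand-ins, not real certificates):

```python
import hashlib

def thumbprint(der_bytes: bytes) -> str:
    """SHA-1 over the DER-encoded certificate, upper-case hex -- the same
    value the Windows certificate UI shows as the thumbprint."""
    return hashlib.sha1(der_bytes).hexdigest().upper()

def validate(der_bytes: bytes, pinned_thumbprints: set) -> bool:
    # Accept the peer certificate only if it is explicitly pinned,
    # ignoring any subject-name mismatch.
    return thumbprint(der_bytes) in pinned_thumbprints

# With a real certificate, der_bytes would come off the wire or from a .cer
# file; a stand-in byte string demonstrates the mechanism.
fake_cert = b"\x30\x82\x01\x00" + b"\x00" * 32
pinned = {thumbprint(fake_cert)}
assert validate(fake_cert, pinned)
assert not validate(fake_cert + b"\x01", pinned)
```

Pinning by thumbprint trades flexibility for certainty: the check breaks when the certificate is rotated, which is exactly the expiry face-palm described in an earlier comment.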

    To figure a lot of it out, much was trial and error: Google something, try something, use Fiddler to compare. Repeat. Yes, it was very tedious and, at times, stressful. But I also learned a lot about WCF.

    Matt

    Sahan Arampath said...

    Hi Yaron, thanks for this opportunity. I actually got to know about you and your fantastic blog while I was doing this project, and I still very much appreciate the support you gave me at the time, especially with the problem I had implementing a WCF custom authenticator with X.509 certificates and the problem I had hosting that WCF service in IIS (especially granting permission to the keys). So again, a big thank you for your support and help.

    The project was developing a centralized administration module for a banking system. The main objective was that, since the bank uses several tools and technologies and those applications run on multiple platforms, administrators need one place to control their systems' users. Branches can also be located in several places in a country or territory, hence the administration module is centralized.

    WCF is used as the main interface, with maximum security, to communicate with the external systems that use this as their administration system. The entire system is based on a Service Oriented Architecture, and client systems communicate through the services (API). In that manner, the admin module provides a very loosely coupled, easy way to maintain the system. The admin module allows you to create, add, edit and delete users for all the systems, manage users, declare the scope of each individual user or user group (roles/credit limit etc.), and monitor and log user activities across all the systems. All client systems implement this service as their admin module, which enables them to do all the related activities.

    All client logins are validated with username and password by the WCF custom authenticator with X.509.
    The admin module knows the possible IP range every client can log in from, so I obtain the IP and validate it against that range alongside the WCF custom validation. Even though X.509 encrypts the communications, I also encrypted the data passed using .NET cryptography.
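The per-client IP-range check described above can be sketched with Python's stdlib ipaddress module (the branch ranges here are made up for illustration):

```python
import ipaddress

# Hypothetical branch ranges a client system is allowed to log in from.
ALLOWED = [
    ipaddress.ip_network("10.20.0.0/16"),
    ipaddress.ip_network("192.168.5.0/24"),
]

def ip_allowed(client_ip: str) -> bool:
    # A login passes only if the caller's address falls inside one of the
    # ranges configured for that client system.
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in ALLOWED)

assert ip_allowed("10.20.3.7")
assert not ip_allowed("8.8.8.8")
```

In the scenario described, this check would run alongside the custom credential validation, as a second gate rather than a replacement for it.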

    felipe said...

    Thanks for the chance to win.
    I'm working on two WCF services that I will integrate with Windows Phone 7 apps that I am creating.
    The first is a service that mostly hosts a database and deals with all the access to it. It is for a running program, but can also be used for biking, walking, etc. On the app, you can design a run (a path that you plan to take) and then you can share it via the web service. The service takes a zip code and returns matches. It also handles uploads and downloads of the runs.
    My second webservice is part of a second app that allows a user to remotely access and control their computer from anywhere. Because WP7 does not allow socket access, I am developing this web service with WCF as well. This service will use WCF streaming for optimization. It has a lot of methods to get/set things like mouse position, what's on the screen, any text the user has entered, etc.
    I plan to integrate this with a Windows Service that will host it (even though it is currently hosted in IIS) and host a similar version in the cloud (Azure).

    These two services were my first experience with WCF. I've really liked it and found it pretty easy to use and learn. The only trouble I had was with MTOM and streaming, but I've mostly figured those out (still working on MTOM: http://social.msdn.microsoft.com/Forums/en-US/wcf/thread/90f2b596-8757-4d90-b863-e41d8310fbf5 )
    Also, the deleted comment was mine; I wanted to add more detail since I found out the contest is mostly based on the level of detail, but Blogger wouldn't let me edit my comment.

    Ken Egozi said...

    My story is too long to fit in one comment (Blogger is complaining), so I will break it into a few comments. Sorry for the time it took.

    Here's my integration story (PART I).

    On a project I was involved in, we had a mix of .NET and Java services. When designing the system, we had some characteristics in mind:
    1. auto-discovery
    2. easy monitoring and debugging
    3. high availability and fault tolerance
    4. speed speed speed
    5. side-by-side deployments

    The system also has a high-power in-memory computing grid that (amongst other things) supplies us with durable queues, which are the basis for async messages (on top of the many RPC-style calls we do, mainly for querying data).

    Ken Egozi said...

    PART II:


    here is part of the design process and the decisions we made:
    1. binary TCP/IP vs. HTTP:
    although bare TCP/IP is faster (at least for traffic heavy on non-textual data), we decided to go with HTTP. The reasons are purely pragmatic:
    a. There is a predefined place for metadata: the headers. This makes it easier to implement both ends of the technology (.NET and Java). It also makes it easier to plug in middleware such as load balancers, distributors etc., as they won't need to unpack the payloads.
    b. HTTP has solid containers that require minimal configuration and hassle on both sides: Tomcat and IIS. This reduces the amount of custom code and shortens delivery time.
    c. The overhead is minimal. We benchmarked prototypes with both (not thoroughly, though, as there were many benefits to HTTP).

    Ken Egozi said...

    PART III


    2. Serialization format - let's look at a few alternatives:
    Thrift - was awkward to work with. Back at the time (two years ago) the ecosystem was not very friendly, especially on the .NET side of things.
    protobuf - a very speedy (and well crunched) format. However, it suffers from a major shortcoming: it makes assumptions about the number and order of fields in each entity, which makes side-by-side deployment very difficult.
    BSON (binary JSON) - was not an option back then. Even today it has a flaky .NET implementation.
    All of the binary formats also suffer from opacity: it is not straightforward to plug in a sniffer and figure things out when there are problems.
    We finally settled on JSON as the primary* format. It has robust, well-supported implementations all around; it is schemaless, and therefore makes it easy to consume a service that returns extra data or to send extra data to a service; and it is fully transparent, so debugging is as simple as it gets.
    The downside of JSON is that metadata (field names) is passed along with every data entity, so if a payload contains a large amount of tiny members, the overhead becomes too high.
    On one of our services we discovered that we had to plug in a specialized serializer/deserializer in order to speed things up, so we encoded the data to be sent into an array of longs, which made the payload 60% smaller on average, and overall call time went down 80% on average (since we avoided reflection on reconstruction).
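The array-of-longs trick can be illustrated with a toy example: instead of a JSON array of objects, where every field name is repeated per entity, each entity is serialized as a fixed-order array of numbers. The schema below is hypothetical; the point is the size reduction from not repeating the names:

```python
import json

FIELDS = ["id", "price_cents", "quantity"]  # agreed fixed order (hypothetical schema)

def encode(entities):
    # Pack each entity into an array of longs in the agreed field order.
    return json.dumps([[e[f] for f in FIELDS] for e in entities])

def decode(payload):
    # Reconstruct dicts from the positional arrays.
    return [dict(zip(FIELDS, row)) for row in json.loads(payload)]

entities = [{"id": i, "price_cents": 100 + i, "quantity": 2} for i in range(1000)]
verbose = json.dumps(entities)      # field names repeated 1000 times
compact = encode(entities)          # names appear only in the shared schema
assert decode(compact) == entities  # lossless round trip
print(f"verbose={len(verbose)} bytes, compact={len(compact)} bytes")
```

In the commenter's case the bigger win was avoiding reflection during deserialization; in this sketch the saving comes only from dropping the repeated field names, and the cost is the protobuf-like fragility of a fixed field order.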

    Ken Egozi said...

    PART IV (and last)


    All in all, the lessons to be learned from this experience are:
    1. do not optimise prematurely.
    2. use debuggable and transparent formats; if not, build custom profilers for your custom serializations to make deep-level monitoring possible,
    3. so that you will be able to profile the heck out of your messaging subsystem,
    4. and so that you will be able to optimize just in time.