Search Results

Search found 59139 results on 2366 pages for 'david olivencia(at)oracle com'.


  • Top Tweets SOA Partner Community – November 2011

    - by JuergenKress
    Send your tweets @soacommunity #soacommunity and follow us at http://twitter.com/soacommunity soacommunity SOA Community Dutch ACEs SOA Partner Community award celebration wp.me/p10C8u-i9 OracleBPM Gauging Maturity of your BPM Strategy – part 1/2, bit.ly/vJE9UZ MagicChatzi Dutch ACE’s and ACE Directors had a small party: achatzia.blogspot.com/2011/11/celebr… leonsmiers #Capgemini #Oracle #BPM Blog index bit.ly/tUYtvD #yam lucasjellema Blog post by my colleague Emiel on the AMIS blog: Timeouts in Oracle SOA Suite 11g – tinyurl.com/73amo3r biemond Solving __OAUX_GENXSD_.TOP.XSD with BPEL: When you use an external web service in combination with a BPEL servic… t.co/Gzzatzrr OracleBlogs Jumpstart Fusion Middleware projects with Oracle User Productivity Kit ow.ly/1fJMev cpurdy on Oracle Coherence data grid, its new RESTful APIs, and Oracle Service Bus (OSB): blogs.oracle.com/slc/entry/orac… Accenture Learn how Service-Oriented Architecture can help public service agencies solve legacy system issues. bit.ly/sTteM4 #SOA eelzinga Thanks for organising it Andreas! #soacommunity eelzinga Had a nice drink with the fellow Dutch Oracle ACE members for a little celebration of the SOA Community Partner Award. #soacommunity EmielP Wrote a blogpost about timeouts in the #Oracle #SOA Suite: bit.ly/uhUcrX OracleBlogs Processing Binary Data in SOA Suite 11g t.co/Tzd1xBsY OracleBlogs Finding the Value in SOA by Stephen Bennett t.co/9MMLJoLz OTNArchBeat SOA All the Time; Architects in AZ; Clearing Info Integration hurdles t.co/5viNj8ib OracleBlogs Demo: Business Transaction Management with SOA Management Pack ow.ly/1fFBv3 OTNArchBeat SOA All the Time; Architects in AZ; Clearing Info Integration hurdles t.co/Dnfzo0PN oracletechnet Wikis.oracle.com lives leonsmiers A new #capgemini #oracle #blog, Measuring the Human Task activity in Oracle BPM bit.ly/uPan08 #yam @CapgeminiOracle OTNArchBeat 3 SOA business cases, explained in a 2-minute elevator speech | @JoeMcKendrick t.co/aYGNkZup OTNArchBeat Gartner, Inc. places Oracle SOA Governance in Magic Quadrant for SOA Governance Technologies t.co/bSG5cuTr Jphjulstad Red carpet to Oracle BPM – evita.no evita.no/ikbViewer/soa-… Oracle #Oracle Named a Leader in #SOA Governance Magic Quadrant by Leading Analyst Firm t.co/prnyGu2U soacommunity What presentations & topics do you like to see at the next SOA & BPM & Webcenter Community Forum early 2012? #soacommunity soacommunity Oracle BPM Suite 11g Handbook Released wp.me/p10C8u-hU OTNArchBeat SOA Development Virtual Developer Day (On Demand) | @soacommunity bit.ly/sqhQmX OracleBlogs SOA Development Virtual Developer Day (On Demand) t.co/MDrdnx0h 9 Nov Favorite Undo Retweet Reply OracleBlogs Specialized Partners Only! New Service to Promote Your Events t.co/qTgyEpY4 biemond @stevendavelaar this is for you t.co/hInKCcfY it explains your sso problem soacommunity SOA Development Virtual Developer Day (on demand) t.co/flXPWk4R soacommunity IPT Swiss SOA Experts – thanks for the nice ink wp.me/p10C8u-i3 soacommunity Enjoy #wjax specially the presentations from our #ACE @t_winterberg @myfear @AdamBien pic.twitter.com/m8VcBSG3 OTNArchBeat Discounts on books, more, for Oracle Technology Network members bit.ly/vRxMfB OracleSOA Justify the ROI of SOA in 10 seconds…a pic is worth 1000 words bit.ly/roi_of_soa_img #oraclesoa #soa #oow11 orclateamsoa A-Team SOA Blog: Case Management in BPM 11g -  Mark Foster Oracle BPM 11g & Case Management I’ve seen… t.co/l5zb6pFr t_winterberg Die nächste SIG #SOA steht an: 7.12. in Hamburg. 
Neues Tooling und Erfahrungen rund um Oracle FMW, SOA, BPM… (cont) deck.ly/~YC57v OracleBlogs Continuous Integration for SOA/BPM ow.ly/1fsekI OracleBlogs BPM Suite 11g Handbook Released ow.ly/1frlzv lucasjellema Iterating over collection (array) in BPM (and dispatching jobs for entries in array): t.co/1SEhSvWv – subprocesses are the key. lucasjellema Lucas Jellema Useful tip from Mark Nelson: BPM API documentation (as well as Human Workflow Service) available: redstack.wordpress.com/2011/09/28/api… OTNArchBeat SOA, cloud: it’s the architecture that matters | Joe McKendrick zd.net/tNCiTF orclateamsoa: Building a job dispatcher in BPM -or- Iterating over collections in BPM ow.ly/1frbrz orclateamsoa Using the Database as a Policy Store for SOA 11g ow.ly/1frbrA OracleBPM Oracle launches Process Accelerators for BPM: t.co/XPEE61QL Jphjulstad Human-Centric BPM Selection Checklist t.co/3TZXZHLH OracleBlogs Fusion Middleware General Session at OOW 2011: Missed It? Read On… t.co/aU5JvM6K gschmutz Great! The product page of the OSB 11g Development Cookbook is now online: t.co/5Jfbe6Ng Looking forward to get it, u too? brhubart Oracle IT Architecture Essentials; Lightweight Composite Service Development with SCA and Spring; Cloud Migration ow.ly/7esNg eelzinga New blogpost : Oracle Service Bus, Generic fault handling, bit.ly/sGr4UL #osb #oracleservicebus For regular information on Oracle SOA Suite become a member in the SOA Partner Community for registration please visit  www.oracle.com/goto/emea/soa (OPN account required) Blog Twitter LinkedIn Mix Forum Technorati Tags: soacommunity,twitter,Oracle,SOA Community,Jürgen Kress,OPN

    Read the article

  • Faster Applications Without Changing the Query – Part 2 – The Effect of Bind Variables

    - by david.krch
    In the previous installment, using the sample exercise of inserting 100,000 records, we showed how much load unnecessarily frequent committing can place on a database server. We cut the processing time of that operation from 167 to 105 seconds, i.e. by a third. Two more steps still separate us from the promised eighty-fold speed-up. At the end of the previous installment we found that parsing (analysing and optimizing) the statement took the server a full 74 of those 105 seconds. Today we will focus on minimizing parse time.
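    As a rough illustration of the idea in code (a minimal sketch only – it assumes the ODP.NET managed driver, Oracle.ManagedDataAccess.Client, and a hypothetical table T(ID, VAL); it is not the article's own example), binding values as parameters lets the server parse the INSERT statement once and reuse it for every row, instead of hard-parsing a new literal statement each time:

    // Sketch: one parameterized statement reused for all rows (parsed once),
    // versus building a new literal SQL string per row (parsed 100,000 times).
    // Table T(ID NUMBER, VAL VARCHAR2(100)) is a made-up example.
    using Oracle.ManagedDataAccess.Client;

    class BindVariableDemo
    {
        static void InsertWithBinds(string connectString)
        {
            using (var conn = new OracleConnection(connectString))
            {
                conn.Open();
                using (var cmd = new OracleCommand("INSERT INTO t (id, val) VALUES (:id, :val)", conn))
                {
                    cmd.BindByName = true;
                    var id  = cmd.Parameters.Add("id",  OracleDbType.Int32);
                    var val = cmd.Parameters.Add("val", OracleDbType.Varchar2);

                    for (int i = 0; i < 100000; i++)
                    {
                        id.Value  = i;          // only the bound values change,
                        val.Value = "row " + i; // the statement text stays identical
                        cmd.ExecuteNonQuery();
                    }
                    // Commit handling (the subject of part 1) is omitted here.
                }
            }
        }
    }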

    Read the article

  • You Are Hiring But Do Candidates Want to Work For You

    - by david.talamelli
    So here you are – it has happened, you are now interviewing for that position you either applied for or were called about. Whether you are an “active” candidate looking for a job or a “passive” candidate who was contacted about the opportunity, it doesn’t matter now. Regardless of how you got to the interview stage, how you and your potential new manager connect with each other at interview will play a part in whether you land the job. The best manager/employee relationships, I think, tend to be the ones where both the manager and the employee have a common goal and work together in unison to achieve it. Candidates – when you are interviewing for a role, remember that an interview is a two-way process. An interview shouldn’t just be a case of a company interviewing you to see if you are a good fit for a certain role. Don’t forget that in an interview process it is equally important for you to take the opportunity to interview the company, to see whether that role and company are the right next step in your career. I think an interview should not only be a chance for a Hiring Manager to get to know a candidate better and assess their capability and cultural fit for a team/company, but also a chance for the candidate to assess the company and manager and decide whether they are people they want to work with. Managers – I know Recruiters have been talking about the “war for talent” since before many of you were managers, but there is no denying it – it exists. You are not only competing with other companies for talented individuals, you are also competing with the companies those individuals already work for. Companies are not going to let the people they have identified as superstars resign without a fight (the classic Counter Offer scenario, which may be another blog post in itself). So how do we get these great people – their current employer will do all they can to keep them, everyone else wants them – does this mean all hope is lost? No, absolutely not. The reasons candidates have always been interested in other opportunities still exist: someone may be looking for career advancement, or the chance to work with new technology, or maybe you have an opportunity that is exactly what that person is looking to do. As a Hiring Manager, don’t just conduct your interviews in question/answer mode. Talk to the individual to work out what they are looking for, and then relate how your role addresses that. It is potentially going to be the two of you working together, so you two are the ones who have to be most comfortable with each other. Don’t oversell the role – set realistic expectations of what that candidate can expect working in your team – give them the good, the bad and the ugly so they can make an informed decision. Managers – think back to when you were last looking for a job and put yourself in the candidate’s shoes. What did you want to know about Oracle, and what did you want more information about? There are some great Business Leaders working here at Oracle – if you are one of them, it is likely you are already doing all these things anyway.
The good news is that you are also likely raising yourself head and shoulders above what many interviewers do – that in itself gives you a competitive advantage in this ‘war for talent’, but as a great Business Leader you already know that.

    Read the article

  • Cuppa Corner talk "A trip to First Normal Form" available - Domains, Functional Dependencies, Repeating Groups

    - by tonyrogerson
    It's 15 minutes; I talk about Domains, Functional Dependencies, Repeating Groups, Relational Valued Attributes and of course First Normal Form. http://sqlcontent.sqlblogcasts.com/video/cctr20100507dbdesign1nf/cctr20100507dbdesign1nf.html For questions just ask on the http://sqlserverfaq.com chat control or Twitter using the #sqlfaq tag. Slides are also available here: http://sqlcontent.sqlblogcasts.com/video/cctr20100507dbdesign1nf/cc_tr20100507_dbdesign1nf.pptx

    Read the article

  • Allow Incoming Responses from Curl On Ubuntu 11.10 - Curl

    - by Daniel Adarve
    I'm trying to get a Curl Response from an outside server, however I noticed I cant neither PING the server in question nor connect to it. I tried disabling the iptables firewall but I had no success. My server is running behind a Cisco Linksys WRTN310N Router with the DD-wrt firmware Installed. In which I already disabled the firewall. Here are my network settings: Ifconfig eth0 Link encap:Ethernet HWaddr 00:26:b9:76:73:6b inet addr:192.168.1.120 Bcast:192.168.1.255 Mask:255.255.255.0 inet6 addr: fe80::226:b9ff:fe76:736b/64 Scope:Link UP BROADCAST RUNNING MULTICAST MTU:1500 Metric:1 RX packets:49713 errors:0 dropped:0 overruns:0 frame:0 TX packets:30987 errors:0 dropped:0 overruns:0 carrier:0 collisions:0 txqueuelen:1000 RX bytes:52829022 (52.8 MB) TX bytes:5438223 (5.4 MB) Interrupt:16 lo Link encap:Local Loopback inet addr:127.0.0.1 Mask:255.0.0.0 inet6 addr: ::1/128 Scope:Host UP LOOPBACK RUNNING MTU:16436 Metric:1 RX packets:341 errors:0 dropped:0 overruns:0 frame:0 TX packets:341 errors:0 dropped:0 overruns:0 carrier:0 collisions:0 txqueuelen:0 RX bytes:27604 (27.6 KB) TX bytes:27604 (27.6 KB) /etc/resolv.conf nameserver 192.168.1.1 /etc/nsswitch.com passwd: compat group: compat shadow: compat hosts: files dns networks: files protocols: db files services: db files ethers: db files rpc: db files netgroup: nis /etc/host.conf order hosts,bind multi on /etc/hosts 127.0.0.1 localhost 127.0.0.1 callcenter # The following lines are desirable for IPv6 capable hosts ::1 ip6-localhost ip6-loopback fe00::0 ip6-localnet ff00::0 ip6-mcastprefix ff02::1 ip6-allnodes ff02::2 ip6-allrouters /etc/network/interfaces # The loopback network interface auto lo iface lo inet loopback # The primary network interface auto eth0 iface eth0 inet static address 192.168.1.120 netmask 255.255.255.0 network 192.168.1.1 broadcast 192.168.1.255 gateway 192.168.1.1 The Url to which im trying to get a connection to is https://www.veripayment.com/integration/index.php When I ping it on terminal heres what I get daniel@callcenter:~$ ping www.veripayment.com PING www.veripayment.com (69.172.200.5) 56(84) bytes of data. --- www.veripayment.com ping statistics --- 2 packets transmitted, 0 received, 100% packet loss, time 1007ms Thanks in Advance

    Read the article

  • Welcoming Gavin Payne to Coeo

    - by Christian
    I’m very pleased to announce that Gavin Payne starts with us today as a Senior Consultant! We’ve known Gavin for a while now through his work in the UK SQL Server Community and when a role came up in our consulting practice I took the opportunity to talk to him about it. Gavin brings a broad range of experience from his recent background as a Solution Architect and has a particular interest in virtualization which is very prominent in the work that we do so we’re thrilled to have him on board. He’s also presenting a couple of sessions at the upcoming SQLBits conference in Brighton where Coeo is once again sponsoring and exhibiting so be sure to congratulate him in person if you’re going to be there! Gavin has a prolific online presence so be sure to subscribe to his blog and follow him on twitter! Blog: http://blog.gavinpayneuk.com/ Twitter: http://twitter.com/gavinpayneuk SQLBits: http://sqlbits.com/Speakers/Gavin_Payne   Welcome Gavin! Christian Bolton  - MCA, MCM, MVP Technical Director http://coeo.com - SQL Server Consulting & Managed Services

    Read the article

  • ODI 11g – Scripting Repository Creation

    - by David Allan
    Here’s a quick post on how to create both master and work repositories in one simple dialog; it uses the groovy capabilities in ODI 11g and the groovy swing builder components, so if you want more or less, take the groovy script and change it – it’s easy stuff. The groovy script odi_create_repos.groovy is here; just open it in ODI before connecting and you will be able to create both master and work repositories with ease – or check the groovy out and script your own automation. You can construct the master, work and runtime repositories, so if you are embedding ODI as your DI engine this may be very useful. When you click ‘Create Repository’ you will see the following in the log as the master repository starts to be created; ====================================================== Repository Creation Started.... ====================================================== Master Repository Creation Started.... Then the completion message followed by the work repository creation and final completion message. Master Repository Creation Completed. Work Repository Creation Started. Work Repository Creation Completed. ====================================================== Repository Creation Completed Successfully ====================================================== Script exited. If any error is hit, the script just exits and prints the error to the log. For example, if I enter no passwords, I will get this error; ====================================================== Repository Creation Started.... ====================================================== Master Repository Creation Started.... ====================================================== Repository Creation Complete in Error ====================================================== oracle.odi.setup.RepositorySetupException: oracle.odi.core.security.PasswordPolicyNotMatchedException: ODI-10189: Password policy MinPasswordLength is not matched. ====================================================== Script exited. This is another example of using the ODI 11g SDK, showing how to automate the construction of your data integration environment. The main interfaces and classes used here are IMasterRepositorySetup / MasterRepositorySetupImpl and IWorkRepositorySetup / WorkRepositorySetupImpl.

    Read the article

  • how do you document your development process?

    - by David
    My current state is a mixture of spreadsheets, wikis, documents, and dated folders for my input/configuration and output files and bzr version control for code. I am relatively new to programming that requires this level of documentation, and I would like to find a better, more coherent approach. update (for clarity): My inputs are data used to generate configuration files with parameter values and my outputs are analyses of model predictions. I would really like to have an approach that allows me to associate particular configuration(s) with particular outputs, so that I can ask questions of my documentation such as "what causes over/under estimates?" or "what causes error 'X'"?
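    One lightweight way to get the configuration-to-output association described above (a sketch only – the file names and folder layout are made-up illustrations, not a recommendation of specific tools) is to fingerprint each configuration file and stamp that fingerprint into both the output folder name and a running manifest:

    // Sketch: hash a configuration file and record which output folder it produced,
    // so any analysis can be traced back to the exact inputs that generated it.
    using System;
    using System.IO;
    using System.Security.Cryptography;

    class RunLogger
    {
        static void Main(string[] args)
        {
            string configPath = args.Length > 0 ? args[0] : "config/model.params"; // hypothetical path

            string hash;
            using (var sha = SHA256.Create())
                hash = BitConverter.ToString(sha.ComputeHash(File.ReadAllBytes(configPath)))
                                   .Replace("-", "").Substring(0, 12).ToLowerInvariant();

            // Output folder named after timestamp + config fingerprint.
            string outDir = Path.Combine("output", DateTime.UtcNow.ToString("yyyyMMdd-HHmmss") + "-" + hash);
            Directory.CreateDirectory(outDir);
            File.Copy(configPath, Path.Combine(outDir, Path.GetFileName(configPath)));

            // One manifest line per run makes questions like "which configs caused error X?" a simple search.
            File.AppendAllText("runs.csv", $"{DateTime.UtcNow:o},{hash},{configPath},{outDir}{Environment.NewLine}");

            Console.WriteLine("Run recorded in " + outDir);
        }
    }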

    Read the article

  • Migrate ASP.Net web site from IIS6 to IIS7

    - by David.Chu.ca
    I have to migrate an ASP.Net web site from IIS6 to IIS7. I tried to copy all the files for the web site from IIS6 (c:\inetpub\wwwroot\MySite) to another box with Windows Server 2008 R2, where IIS7 is the default web server. However, simply copying the files does not seem to work. Should I rebuild the web site for IIS7, or should I make changes on the new box with IIS7, such as to web.config? Thanks for the comments. On further investigation I found that the http handlers section seems to have caused the exception: <!--httpHandlers> <add path="Reserved.ReportViewerWebControl.axd" verb="*" type="Microsoft.Reporting.WebForms.HttpHandler, Microsoft.ReportViewer.WebForms, Version=8.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" validate="false"/> </httpHandlers--> After I commented out the above handler in web.config, the web page works fine. This is just my initial test. I am not sure if I should rebuild the web site from source code or not. If so, is there anything I need to specify for IIS7?
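    For reference, the usual fix for this on IIS7 in integrated pipeline mode is to re-register the handler under system.webServer rather than deleting it. The fragment below is a sketch of what that section might look like – the handler name is arbitrary, and the Version/PublicKeyToken must match whatever ReportViewer assembly the site actually references:

    <!-- Sketch only: IIS7 integrated mode reads handlers from system.webServer,
         not from system.web/httpHandlers. Adjust Version/PublicKeyToken as needed. -->
    <system.webServer>
      <validation validateIntegratedModeConfiguration="false" />
      <handlers>
        <add name="ReportViewerWebControlHandler"
             path="Reserved.ReportViewerWebControl.axd" verb="*"
             preCondition="integratedMode"
             type="Microsoft.Reporting.WebForms.HttpHandler, Microsoft.ReportViewer.WebForms, Version=8.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" />
      </handlers>
    </system.webServer>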

    Read the article

  • API Management Solutions

    - by Mike
    I'm currently building an API and am looking for a tool that will let me monitor (in a GUI) and rate-limit usage. I've come across a few enterprise solutions including: http://apigee.com/ http://mashery.com/ http://www.layer7tech.com/ http://www.3scale.net/ The Apigee enterprise plan is exactly what I'm looking for, but plans start at $3000 / month, which is out of my price range. The other solutions are all either too expensive or do not provide the solution I'm looking for. This led me to look at some open source options including: http://apiaxle.com/ https://code.google.com/p/varnish-apikey/wiki/UsageManual Varnish seems like a fairly complete solution; however, I would need to build a GUI to visualise the data. My final option would be to build a solution from scratch using EventMachine and Ruby. Any advice?
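    If you do end up rolling your own, the rate-limiting core is the easy part; the sketch below (illustrative only – in-memory, per API key, fixed one-minute window) shows the idea you would then wrap with persistence and a monitoring GUI:

    // Sketch of per-key fixed-window rate limiting (in-memory, single process).
    // A real service would back this with a shared store and expose usage data for a GUI.
    using System;
    using System.Collections.Concurrent;

    class RateLimiter
    {
        private readonly int _limitPerMinute;
        private readonly ConcurrentDictionary<string, (long Window, int Count)> _usage =
            new ConcurrentDictionary<string, (long Window, int Count)>();

        public RateLimiter(int limitPerMinute) => _limitPerMinute = limitPerMinute;

        public bool Allow(string apiKey)
        {
            long window = DateTimeOffset.UtcNow.ToUnixTimeSeconds() / 60; // current minute
            var entry = _usage.AddOrUpdate(
                apiKey,
                _ => (window, 1),
                (_, old) => old.Window == window ? (window, old.Count + 1) : (window, 1));
            return entry.Count <= _limitPerMinute;
        }
    }

    // Usage: var limiter = new RateLimiter(100);
    //        if (!limiter.Allow(apiKey)) { /* reject with 429 Too Many Requests */ }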

    Read the article

  • OWB 11gR2 - Find and Search Metadata in Designer

    - by David Allan
    Here are some tools and techniques for finding objects, specifically in the design repository. There are ways of navigating and collating objects that are useful for day to day development and build-time usage - this includes features out of the box and utilities constructed on top. There are a variety of techniques to navigate and find objects in the repository, the first 3 are out of the box, the 4th is an expert utility. Navigating by the tree, grouping by project and module - ok if you are aware of the exact module/folder that objects reside in. The structure panel is a useful way of finding parts of an object, especially when large rather than using the canvas. In large scale projects it helps to have accelerators (either find or collections below). Advanced find to search by name - 11gR2 included a find capability specifically for large scale projects. There were improvements in both the tree search and the object editors (including highlighting in mapping for example). So you can now do regular expression based search and quickly navigate to objects within a repository. Collections - logically organize your objects into virtual folders by shortcutting the actual objects. This is useful for a range of things since all the OWB services operate on collections too (export/import, validation, deployment). See the post here for new collection functionality in 11gR2. Reports for searching by type, updated on, updated by etc. Useful for activities such as periodic incremental actions (deploy all mappings changed in the past week). The report style view is useful since I can quickly see who changed what and when. You can see all the audit details for objects within each objects property inspector, but its useful to just get all objects changed today or example, all objects changed since my last build etc. This utility combines both UI extensions via experts and the public views on the repository. In the figure to the right you see the contextual option 'Object Search' which invokes the utility, you can see I have quite a number of modules within my project. Figure out all the potential objects which have been changed is not simple. The utility is an expert which provides this kind of search capability. The utility provides a report of the objects in the design repository which satisfy some filter criteria. The type of criteria includes; objects updated in the last n days optionally filter the objects updated by user filter the user by project and by type (table/mappings etc.) The search dialog appears with these options, you can multi-select the object types, so for example you can select TABLE and MAPPING. Its also possible to search across projects if need be. If you have multiple users using the repository you can define the OWB user name in the 'Updated by' property to restrict the report to just that user also. Finally there is a search name that will be used for some of the options such as building a collection - this name is used for the collection to be built. In the example I have done, I've just searched my project for all process flows and mappings that users have updated in the last 7 days. The results of the query are returned in a table containing the object names, types, full path and audit details. The columns are sort-able, you can sort the results by name, type, path etc. 
One of the cool things here is that you can then perform operations on these objects – such as editing them, exporting a single selection or the entire results to MDL, creating a collection from the results (now you have a saved set of references in the repository, so you could do deploy/export etc.), creating a deployment script from the results... or even adding in your own ideas! You can see from this that you can do bulk operations on sets of objects based on search results. So, for example, selecting the 'Build Collection' option creates a collection with all of the objects from my search; you can subsequently deploy/generate/maintain this collection of objects. Under the hood, the expert is just basic OMB commands from the product and the use of the public views on the design repository. You can see how easy it is to build up macro-like capabilities that will help you do day-to-day as well as build-like tasks on sets of objects.

    Read the article

  • Czech Oracle Is Moving

    - by david.krch
    In case you are planning to visit us for a seminar or a meeting, it will be useful to know that at the end of last week we said goodbye to our building just off Wenceslas Square, and from today you will find us by the Chodov metro station at this address: V Parku 2308/8, 148 00 Praha 11 (map). If this address seems familiar, you have guessed correctly – we have moved in with our new colleagues who originally worked under the Sun Microsystems banner.

    Read the article

  • The C++ section's back-to-school programme: discover the articles that will be published soon

    September is often associated with the end of the holidays and of the beautiful sunny days, for those lucky enough to have had any. To make getting back to work easier, the C++ section of Developpez.com offers a simple remedy: dive into the series of fascinating articles translated by the editorial team. The Guru of the Week articles need no introduction. These reference articles, covering many aspects of the C++ language, remain essential reading for beginners and everyone else. This series of articles, started shortly before the summer, will continue until all of the articles have been translated...

    Read the article

  • Stretching an ADF Faces Component to (near) 100%

    - by Christian David Straub
    In the past, many users would want their component to stretch to fill 100% of a horizontal area. However, to account for scrollbars that may or may not have been there, they would set the percentage to 98%, etc. A much better way to do this is to use the new "AFStretchWidth" style class, which will do this automatically for you. For instance, avoid this:
    <af:foo inlineStyle="98%" />
    and instead do this:
    <af:foo styleClass="AFStretchWidth" />
    You can learn more about ADF Faces layout management here.

    Read the article

  • My old Conchango blog posts are currently not accessible

    - by jamiet
    Some of you reading this may be aware that I used to blog at http://blogs.conchango.com/jamiethomson. That URL later changed to http://consultingblogs.emc.com/jamiethomson after Conchango (my employer) got taken over by EMC. In my last post on that site: I stated that I had 676 blog posts on that site. Unfortunately, as of today, those 676 posts are inaccessible. If you try to get to http://consultingblogs.emc.com/jamiethomson today you will see this: I am not the only one affected either; it seems that EMC have taken the same action for many blog sites of my old colleagues (e.g. http://consultingblogs.emc.com/merrickchaffer is also inaccessible). Early indications are that EMC have removed all blog posts by any former employees although that is yet to be confirmed.   A few of us former employees are endeavouring to get this situation rectified so watch this space. I am aware that many people in the SSIS community still refer to those old blog posts so please be aware that any attempt to access any of them will be futile for the foreseeable future. @Jamiet UPDATE: Looks like I managed to get through to the right person. its back http://consultingblogs.emc.com/jamiethomson/ 

    Read the article

  • Using the Data Form Web Part (SharePoint 2010) Site Agnostically!

    - by David Jacobus
    Originally posted on: http://geekswithblogs.net/djacobus/archive/2013/10/24/154465.aspxAs a Developer whom has worked closely with web designers (Power users) in a SharePoint environment, I have come across the issue of making the Data Form Web Part reusable across the site collection! In SharePoint 2007 it was very easy and this blog pointed the way to make it happen: Josh Gaffey's Blog. In SharePoint 2010 something changed! This method failed except for using a Data Form Web Part that pointed to a list in the Site Collection Root! I am making this discussion relative to a developer whom creates a solution (WSP) with all the artifacts embedded and the user shouldn’t have any involvement in the process except to activate features. The Scenario: 1. A Power User creates a Data Form Web Part using SharePoint Designer 2010! It is a great web part the uses all the power of SharePoint Designer and XSLT (Conditional formatting, etc.). 2. Other Users in the site collection want to use that specific web part in sub sites in the site collection. Pointing to a list with the same name, not at the site collection root! The Issues: 1. The Data Form Web Part Data Source uses a List ID (GUID) to point to the specific list. Which means a list in a sub site will have a list with a new GUID different than the one which was created with SharePoint Designer! Obviously, the List needs to be the same List (Fields, Content Types, etc.) with different data. 2. How can we make this web part site agnostic, and dependent only on the lists Name? I had this problem come up over and over and decided to put my solution forward! The Solution: 1. Use the XSL of the Data Form Web Part Created By the Power User in SharePoint Designer! 2. Extend the OOTB Data Form Web Part to use this XSL and Point to a List by name. The solution points to a hybrid solution that requires some coding (Developer) and the XSL (Power User) artifacts put together in a Visual Studio SharePoint Solution. Here are the solution steps in summary: 1. Create an empty SharePoint project in Visual Studio 2. Create a Module and Feature and put the XSL file created by the Power User into it a. Scope the feature to web 3. Create a Feature Receiver to Create the List. The same list from which the Data Form Web Part was created with by the Power User. a. Scope the feature to web 4. Create a Web Part extending the Data Form Web a. Point the Data Form Web Part to point to the List by Name b. Point the Data Form Web Part XSL link to the XSL added using the Module feature c. Scope The feature to Site i. This is because all web parts are in the site collection web part gallery. So in a Narrative Summary: We are creating a list in code which has the same name and (site Columns) as the list from which the Power User created the Data Form Web Part Using SharePoint Designer. We are creating a Web Part in code which extends the OOTB Data Form Web Part to point to a list by name and use the XSL created by the Power User. Okay! Here are the steps with images and code! At the end of this post I will provide a link to the code for a solution which works in any site! I want to TOOT the HORN for the power of this solution! It is the mantra a use with all my clients! What is a basic skill a SharePoint Developer: Create an application that uses the data from a SharePoint list and make that data visible to the user in a manner which meets requirements! 
Create an Empty SharePoint 2010 Project Here I am naming my Project DJ.DataFormWebPart Create a Code Folder Copy and paste the Extension and Utilities classes (Found in the solution provided at the end of this post) Change the Namespace to match this project The List to which the Data Form Web Part which was used to make the XSL by the Power User in SharePoint Designer is now going to be created in code! If already in code, then all the better! Here I am going to create a list in the site collection root and add some data to it! For the purpose of this discussion I will actually create this list in code before using SharePoint Designer for simplicity! So here I create the List and deploy it within this solution before I do anything else. I will use a List I created before for demo purposes. Footer List is used within the footer of my master page. Add a new Feature: Here I name the Feature FooterList and add a Feature Event Receiver: Here is the code for the Event Receiver: I have a previous blog post about adding lists in code so I will not take time to narrate this code: using System; using System.Runtime.InteropServices; using System.Security.Permissions; using Microsoft.SharePoint; using DJ.DataFormWebPart.Code; namespace DJ.DataFormWebPart.Features.FooterList { /// <summary> /// This class handles events raised during feature activation, deactivation, installation, uninstallation, and upgrade. /// </summary> /// <remarks> /// The GUID attached to this class may be used during packaging and should not be modified. /// </remarks> [Guid("a58644fd-9209-41f4-aa16-67a53af7a9bf")] public class FooterListEventReceiver : SPFeatureReceiver { SPWeb currentWeb = null; SPSite currentSite = null; const string columnGroup = "DJ"; const string ctName = "FooterContentType"; // Uncomment the method below to handle the event raised after a feature has been activated. 
public override void FeatureActivated(SPFeatureReceiverProperties properties) { using (SPWeb spWeb = properties.GetWeb() as SPWeb) { using (SPSite site = new SPSite(spWeb.Site.ID)) { using (SPWeb rootWeb = site.OpenWeb(site.RootWeb.ID)) { //add the fields addFields(rootWeb); //add content type SPContentType testCT = rootWeb.ContentTypes[ctName]; // we will not create the content type if it exists if (testCT == null) { //the content type does not exist add it addContentType(rootWeb, ctName); } if ((spWeb.Lists.TryGetList("FooterList") == null)) { //create the list if it dosen't to exist CreateFooterList(spWeb, site); } } } } } #region ContentType public void addFields(SPWeb spWeb) { Utilities.addField(spWeb, "Link", SPFieldType.URL, false, columnGroup); Utilities.addField(spWeb, "Information", SPFieldType.Text, false, columnGroup); } private static void addContentType(SPWeb spWeb, string name) { SPContentType myContentType = new SPContentType(spWeb.ContentTypes["Item"], spWeb.ContentTypes, name) { Group = columnGroup }; spWeb.ContentTypes.Add(myContentType); addContentTypeLinkages(spWeb, myContentType); myContentType.Update(); } public static void addContentTypeLinkages(SPWeb spWeb, SPContentType ct) { Utilities.addContentTypeLink(spWeb, "Link", ct); Utilities.addContentTypeLink(spWeb, "Information", ct); } private void CreateFooterList(SPWeb web, SPSite site) { Guid newListGuid = web.Lists.Add("FooterList", "Footer List", SPListTemplateType.GenericList); SPList newList = web.Lists[newListGuid]; newList.ContentTypesEnabled = true; var footer = site.RootWeb.ContentTypes[ctName]; newList.ContentTypes.Add(footer); newList.ContentTypes.Delete(newList.ContentTypes["Item"].Id); newList.Update(); var view = newList.DefaultView; //add all view fields here //view.ViewFields.Add("NewsTitle"); view.ViewFields.Add("Link"); view.ViewFields.Add("Information"); view.Update(); } } } Basically created a content type with two site columns Link and Information. I had to change some code as we are working at the SPWeb level and need Content Types at the SPSite level! I’ll use a new Site Collection for this demo (Best Practice) keep old artifacts from impinging on development: Next we will add this list to the root of the site collection by deploying this solution, add some data and then use SharePoint Designer to create a Data Form Web Part. The list has been added, now let’s add some data: Okay let’s add a Data Form Web Part in SharePoint Designer. Create a new web part page in the site pages library: I will name it TestWP.aspx and edit it in advanced mode: Let’s add an empty Data Form Web Part to the web part zone: Click on the web part to add a data source: Choose FooterList in the Data Source menu: Choose appropriate fields and select insert as multiple item view: Here is what it look like after insertion: Let’s add some conditional formatting if the information filed is not blank: Choose Create (right side) apply formatting: Choose the Information Field and set the condition not null: Click Set Style: Here is the result: Okay! Not flashy but simple enough for this demo. Remember this is the job of the Power user! All we want from this web part is the XLS-Style Sheet out of SharePoint Designer. We are going to use it as the XSL for our web part which we will be creating next. Let’s add a web part to our project extending the OOTB Data Form Web Part. 
Add new item from the Visual Studio add menu: Choose Web Part: Change WebPart to DataFormWebPart (Oh well my namespace needs some improvement, but it will sure make it readily identifiable as an extended web part!) Below is the code for this web part: using System; using System.ComponentModel; using System.Web; using System.Web.UI; using System.Web.UI.WebControls; using System.Web.UI.WebControls.WebParts; using Microsoft.SharePoint; using Microsoft.SharePoint.WebControls; using System.Text; namespace DJ.DataFormWebPart.DataFormWebPart { [ToolboxItemAttribute(false)] public class DataFormWebPart : Microsoft.SharePoint.WebPartPages.DataFormWebPart { protected override void OnInit(EventArgs e) { base.OnInit(e); this.ChromeType = PartChromeType.None; this.Title = "FooterListDF"; try { //SPSite site = SPContext.Current.Site; SPWeb web = SPContext.Current.Web; SPList list = web.Lists.TryGetList("FooterList"); if (list != null) { string queryList1 = "<Query><Where><IsNotNull><FieldRef Name='Title' /></IsNotNull></Where><OrderBy><FieldRef Name='Title' Ascending='True' /></OrderBy></Query>"; uint maximumRowList1 = 10; SPDataSource dataSourceList1 = GetDataSource(list.Title, web.Url, list, queryList1, maximumRowList1); this.DataSources.Add(dataSourceList1); this.XslLink = web.Url + "/Assests/Footer.xsl"; this.ParameterBindings = BuildDataFormParameters(); this.DataBind(); } } catch (Exception ex) { this.Controls.Add(new LiteralControl("ERROR: " + ex.Message)); } } private SPDataSource GetDataSource(string dataSourceId, string webUrl, SPList list, string query, uint maximumRow) { SPDataSource dataSource = new SPDataSource(); dataSource.UseInternalName = true; dataSource.ID = dataSourceId; dataSource.DataSourceMode = SPDataSourceMode.List; dataSource.List = list; dataSource.SelectCommand = "" + query + ""; Parameter listIdParam = new Parameter("ListID"); listIdParam.DefaultValue = list.ID.ToString( "B").ToUpper(); Parameter maximumRowsParam = new Parameter("MaximumRows"); maximumRowsParam.DefaultValue = maximumRow.ToString(); QueryStringParameter rootFolderParam = new QueryStringParameter("RootFolder", "RootFolder"); dataSource.SelectParameters.Add(listIdParam); dataSource.SelectParameters.Add(maximumRowsParam); dataSource.SelectParameters.Add(rootFolderParam); dataSource.UpdateParameters.Add(listIdParam); dataSource.DeleteParameters.Add(listIdParam); dataSource.InsertParameters.Add(listIdParam); return dataSource; } private string BuildDataFormParameters() { StringBuilder parameters = new StringBuilder("<ParameterBindings><ParameterBinding Name=\"dvt_apos\" Location=\"Postback;Connection\"/><ParameterBinding Name=\"UserID\" Location=\"CAMLVariable\" DefaultValue=\"CurrentUserName\"/><ParameterBinding Name=\"Today\" Location=\"CAMLVariable\" DefaultValue=\"CurrentDate\"/>"); parameters.Append("<ParameterBinding Name=\"dvt_firstrow\" Location=\"Postback;Connection\"/>"); parameters.Append("<ParameterBinding Name=\"dvt_nextpagedata\" Location=\"Postback;Connection\"/>"); parameters.Append("<ParameterBinding Name=\"dvt_adhocmode\" Location=\"Postback;Connection\"/>"); parameters.Append("<ParameterBinding Name=\"dvt_adhocfiltermode\" Location=\"Postback;Connection\"/>"); parameters.Append("</ParameterBindings>"); return parameters.ToString(); } } } The OnInit method we use to set the list name and the XSL Link property of the Data Form Web Part. 
We do not have the link to XSL in our Solution so we will add the XSL now: Add a Module in the Visual Studio add menu: Rename Sample.txt in the module to footer.xsl and then copy the XSL from SharePoint Designer Look at elements.xml to where the footer.xsl is being provisioned to which is Assets/footer.xsl, make sure the Web parts xsl link is pointing to this url: Okay we are good to go! Let’s check our features and package: DataFormWebPart should be scoped to site and have the web part: The Footer List feature should be scoped to web and have the Assets module (Okay, I see, a spelling issue but it won’t affect this demo) If everything is correct we should be able to click a couple of sub site feature activations and have our list and web part in a sub site. (In fact this solution can be activated anywhere) Here is the list created at SubSite1 with new data It. Next let’s add the web part on a test page and see if it works as expected: It does! So we now have a repeatable way to use a WSP to move a Data Form Web Part around our sites! Here is a link to the code: DataFormWebPart Solution

    Read the article

  • Upgrading to SharePoint 2010? Get started by evaluating

    - by juanlarios
    I recently spoke at Tech Days 2010 in Winnipeg. These are some tools that I showcased to help you evaluate where you are now:
    PreUpgradeCheck: http://technet.microsoft.com/en-us/library/dd789638(office.12).aspx
    SharePoint BPA: http://www.microsoft.com/downloads/en/details.aspx?familyid=cb944b27-9d6b-4a1f-b3e1-778efda07df8&displaylang=en
    SPSReport: http://spsreport.codeplex.com/
    SPSFarmReport: http://spsfarmreport.codeplex.com/
    I also showed a Solution Downloader, found here: http://spsolutiondownloader.codeplex.com/
    I also wanted to give you some useful PowerShell commands for working with visual upgrade:
    Find out which UI version a site is at: $sc = Get-SPSite <URL>; $sc.GetVisualReport() | Format-Table
    Upgrade the UI for an entire web application: $webapp = Get-SPWebApplication <URL>; foreach ($s in $webapp.sites) { $s.VisualUpgradeWebs() }
    Upgrade the UI for a single site: $site = Get-SPSite <URL>; $site.VisualUpgradeWebs()
    Revert the UI for a single site: Get-SPSite <URL> | Get-SPWeb "webname" | Foreach{$_.UIVersionConfigurationEnabled=1;$_.UIVersion=3;$_.Update();}
    Revert the UI for all sites: Get-SPSite <URL> | Foreach{$_.UIVersionConfigurationEnabled=1;$_.UIVersion=3;$_.Update();}
    Hope it helps you out!

    Read the article

  • Building a Mafia&hellip;TechFest Style

    - by David Hoerster
    It’s been a few months since I last blogged (not that I blog much to begin with), but things have been busy.  We all have a lot going on in our lives, but I’ve had one item that has taken up a surprising amount of time – Pittsburgh TechFest 2012.  After the event, I went through some minutes of the first meetings for TechFest, and I started to think about how it all came together.  I think what inspired me the most about TechFest was how people from various technical communities were able to come together and build and promote a common event.  As a result, I wanted to blog about this to show that people from different communities can work together to build something that benefits all communities.  (Hopefully I've got all my facts straight.)  TechFest started as an idea Eric Kepes and myself had when we were planning our next Pittsburgh Code Camp, probably in the summer of 2011.  Our Spring 2011 Code Camp was a little different because we had a great infusion of some folks from the Pittsburgh Agile group (especially with a few speakers from LeanDog).  The line-up was great, but we felt our audience wasn’t as broad as it should have been.  We thought it would be great to somehow attract other user groups around town and have a big, polyglot conference. We started contacting leaders from Pittsburgh’s various user groups.  Eric and I split up the ones that we knew about, and we just started making contacts.  Most of the people we started contacting never heard of us, nor we them.  But we all had one thing in common – we ran user groups who’s primary goal is educating our members to make them better at what they do. Amazingly, and I say this because I wasn’t sure what to expect, we started getting some interest from the various leaders.  One leader, Greg Akins, is, in my opinion, Pittsburgh’s poster boy for the polyglot programmer.  He’s helped us in the past with .NET Code Camps, is a Java developer (and leader in Pittsburgh’s Java User Group), works with Ruby and I’m sure a handful of other languages.  He helped make some e-introductions to other user group leaders, and the whole thing just started to snowball. Once we realized we had enough interest with the user group leaders, we decided to not have a Fall Code Camp and instead focus on this new entity. Flash-forward to October of 2011.  I set up a meeting, with the help of Jeremy Jarrell (Pittsburgh Agile leader) to hold a meeting with the leaders of many of Pittsburgh technical user groups.  We had representatives from 12 technical user groups (Python, JavaScript, Clojure, Ruby, PittAgile, jQuery, PHP, Perl, SQL, .NET, Java and PowerShell) – 14 people.  We likened it to a scene from a Godfather movie where the heads of all the families come together to make some deal.  As a result, the name “TechFest Mafia” was born and kind of stuck. Over the next 7 months or so, we had our starts and stops.  There were moments where I thought this event would not happen either because we wouldn’t have the right mix of topics (was I off there!), or enough people register (OK, I was wrong there, too!) or find an appropriate venue (hmm…wrong there, too) or find enough sponsors to help support the event (wow…not doing so well).  Overall, everything fell into place with a lot of hard work from Eric, Jen, Greg, Jeremy, Sean, Nicholas, Gina and probably a few others that I’m forgetting.  We also had a bit of luck, too.  
But in the end, the passion that we had to put together an event that was really about making ourselves better at what we do really paid off. I’ve never been more excited about a project coming together than I have been with Pittsburgh TechFest 2012.  From the moment the first person arrived at the event to the final minutes of my closing remarks (where I almost lost my voice – I ended up being diagnosed with bronchitis the next day!), it was an awesome event.  I’m glad to have been part of bringing something like this to Pittsburgh…and I’m looking forward to Pittsburgh TechFest 2013.  See you there!

    Read the article

  • Android 5.0 (or 4.1) to be unveiled tonight at Google I/O; the "Jelly Beans" statue has been installed in Google's gardens

    Android 5.0 or 4.1 to be unveiled tonight at Google I/O – the "Jelly Beans" statue has been installed in Google's gardens. It is now all but official: the next version of Android will be presented tonight at Google I/O, Google's annual developer conference. The representation of the new dessert that gives this version its nickname has just been installed in the gardens of Google's headquarters. Its photo was published yesterday evening on one of the company's official Google+ accounts: [IMG]http://ftp-developpez.com/gordon-fowler/Jelly%20Beans%20Garden.jpg[/IMG] Jelly Beans ("bean-shaped candies") are the American equivalent of Dragibus.

    Read the article

  • Paying by Cash

    - by David Dorf
    I'll grant you paying by cash in the context of stores isn't particularly interesting, but in my quest to try new payment methods I decided to pay by cash at an online store. Using a credit card means I have to hoist myself off the couch, find the card, and enter all those digits. Google Checkout certainly makes that task easier by storing my credit card information, but what happens to all those people that don't have a credit card? What about the ones that are afraid to use credit cards over the internet. There are three main options for cash payment, not all of which are accepted by every merchant. The most popular is PayPal. The issue I have with them is that returns and disputes have to be handled with PayPal, not the merchant. I once used PayPal at a shady online store and lost my money. Yeah, my bad but they wouldn't help me at all. PayPal was purchased by eBay in 2002. BillMeLater is best for larger purchases, because at checkout they actually run a credit check to make sure you're credit worthy. Assuming you are, they pay the merchant on your behalf and mail you a bill, which you better pay quickly or interest will start to accrue. That's nice for the merchant because they get paid right away, and I presume there's no charge-backs. BillMeLater was purchased by eBay in 2008. Last night I tried eBillMe for the first time. After checkout, they send you a bill via email and expect you to pay either via online banking (they provide the instructions to set everything up) or walk-in locations across the US (typically banks). The process was quick and easy. The merchant doesn't ship the product until the bill is paid, so there's a day or two delay. For the merchant there are no charge-backs, and the fees are less than credit cards. For the shopper, they provide buyer protection similar to that offered by credit cards, and 1% cashback on purchases. Once the online bill-pay is setup, its easy to reuse in the future. Seems like a win-win for merchants and shoppers.

    Read the article

  • Why Haven’t NFC Payments Taken Off?

    - by David Dorf
    With the EMV 2015 milestone approaching rapidly, there’s been renewed interest in smartcards, those credit cards with an embedded computer chip.  Back in 1996 I was working for a vendor helping Visa introduce a stored-value smartcard to the US.  Visa Cash was debuted at the 1996 Olympics in Atlanta, and I firmly believed it was the beginning of a cashless society.  (I later worked on MasterCard’s system called Mondex, from the UK, which debuted the following year in Manhattan). But since you don’t have a Visa Cash card in your wallet, it’s obvious the project never took off.  It was convenient for consumers, faster for merchants, and more cost-effective for banks, so why did it fail?  All emerging payment systems suffer from the chicken-and-egg dilemma.  Consumers won’t carry the cards if few merchants accept them, and merchants won’t install the terminals if few consumers have cards. Today’s emerging payment providers are in a similar pickle.  There has to be enough value for all three constituents – consumers, merchants, banks – to change the status quo.  And it’s not enough to exceed the value, it’s got to be a leap in value, because people generally resist change.  ATMs and transit cards are great examples of this, and airline kiosks and self-checkout systems are to a lesser extent. Although Google Wallet and ISIS, the two leading NFC payment platforms in the US, have shown strong commitment, there’s been very little traction.  Yes, I can load my credit card number into my phone then tap to pay, but what was the incremental value over swiping my old card?  For it to be a leap in value, it has to offer more than just payment, which I can do very easily today.  The other two ingredients are thought to be loyalty programs and digital coupons, but neither Google nor ISIS really did them well. Of course a large portion of the mobile phone market doesn’t even support NFC thanks to Apple, and since it’s not in their best interest that situation is unlikely to change.  Another issue is getting access to the “secure element,” the chip inside the phone where accounts numbers can be held securely.  Telco providers and handset manufacturers own that area, and they’re not willing to share with banks.  (Host Card Emulation, which has been endorsed by MasterCard and Visa, might be a solution.) Square recently gave up on its wallet, and MCX (the group of retailers trying to create a mobile payment platform) is very slow out of the gate.  That leaves PayPal and a slew of smaller companies trying to introduce easier ways to pay. But is it really so cumbersome to carry and swipe (soon to insert) a credit card?  Aren’t there more important problems to solve in the retail customer experience?  Maybe Apple will come up with some novel way to use iBeacons and fingerprint identification to make payments, but for now I think we need to focus on upgrading to Chip-and-PIN and tightening security.  In the meantime, NFC payments will continue to struggle.

    Read the article

  • ODI 11g - Oracle Data Integrator 11g – A Hands-On Tutorial

    - by David Allan
    I have been asked by Packt Publishing to review a brand new book on Oracle Data Integrator: Getting Started with Oracle Data Integrator 11g – A Hands-On Tutorial. I'm waiting on this book to arrive to see what goodies are inside, and I'll blog a review later. The book can be found at Oracle Data Integrator 11g – A Hands-On Tutorial. Looking at the table of contents, it looks like it gives a good broad introduction (including various data formats) to the product: Chapter 1: Product Overview Chapter 2: Product Installation Chapter 3: Using Variables Chapter 4: ODI Sources, Targets, and Knowledge Modules Chapter 5: Working with Databases Chapter 6: Working with MySQL Chapter 7: Working with Microsoft SQL Server Chapter 8: Integrating File Data Chapter 9: Working with XML Files Chapter 10: Creating Workflows—Packages and Load Plans Chapter 11: Error Management Chapter 12: Managing and Monitoring ODI Components Chapter 13: Concluding Remarks Looking forward to it.

    Read the article

  • .NET 4.5 Is Now Supported on Windows Azure Web Sites

    - by Leniel Macaferi
    This week we finished installing .NET 4.5 on all of the clusters that host Windows Azure Web Sites. This means that you can now publish and run ASP.NET 4.5-based applications, and use .NET 4.5 libraries and features (for example: async, and the new spatial data type support in Entity Framework), with Windows Azure Web Sites. This enables a whole set of great scenarios – check out Scott Hanselman's post with videos (in English) highlighting some of these features. Visual Studio 2012 includes native support for publishing an application to Windows Azure, which makes it very easy to publish and deploy .NET 4.5-based sites from Visual Studio (you can publish applications + databases). With the Migrations feature of the Entity Framework Code First approach you can also make incremental database schema updates as part of the publishing process (which enables an extremely automated publishing workflow). Every Windows Azure account is eligible to host up to 10 web sites for free, using our "Shared" scaling tier. If you don't have a Windows Azure account yet, you can sign up for a free trial and start using these features today. In the next few days we will also be releasing .NET 4.5 and Windows Server 2012 support for Windows Azure Cloud Services (Web and Worker Roles) – along with some great new improvements to the Windows Azure SDK. Keep an eye on my blog for more information about these releases soon. Hope this helps, - Scott PS In addition to the blog, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/ScottGu Translated from the original post by Leniel Macaferi.
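    As a tiny illustration of the async support mentioned above (a generic C# 5 / .NET 4.5 sketch, not code specific to Azure Web Sites), a method can now await I/O instead of blocking a request thread:

    // Minimal async/await sketch enabled by .NET 4.5 (illustrative only).
    using System.Net.Http;
    using System.Threading.Tasks;

    public class StatusChecker
    {
        private static readonly HttpClient Http = new HttpClient();

        // The thread is released back to the pool while the HTTP call is in flight.
        public async Task<int> GetPageLengthAsync(string url)
        {
            string body = await Http.GetStringAsync(url);
            return body.Length;
        }
    }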

    Read the article
