Search Results

Search found 4 results on 1 page for 'sfdc'.

  • Designing a Content-Based ETL Process with .NET and SFDC

    - by Patrick
    As my firm makes the transition to using SFDC as our main operational system, we've spun up a couple of SFDC portals where we can post customer-specific documents to be viewed at will. As such, we've needed to implement pseudo-ETL applications that can extract metadata from the documents our analysts generate internally (most are industry-standard PDFs, XML, or MS Office formats) and place in networked "queue" folders. From there, our applications scoop up the queued documents and upload them to the appropriate SFDC CRM Content Library along with some select pieces of metadata. I've mostly used DbAmp to broker communication with SFDC (DbAmp is a Linked Server provider that lets you use SQL conventions to interact with your SFDC Org data). I've been able to create [console] applications in C# that work pretty well, and they're usually structured something like this:

        static void Main()
        {
            // Load parameters from app.config.
            // Get documents from the queue.
            var files = someInterface.GetFiles(someFilterOrRegexPattern);
            foreach (var file in files)
            {
                // Extract metadata from the file.
                // Validate some attributes of the file; add any validation errors
                // to an in-memory structure (e.g. List<ValidationError>).
                if (isValid)
                {
                    var fileData = File.ReadAllBytes(file);
                    // Upload using some wrapper for an ORM or DAL.
                    someInterface.Upload(fileData, meta.Param1, meta.Param2, ...);
                }
                else
                {
                    // Bounce the file.
                }
            }
            // Report any validation errors (via message bus, SMTP, or some such).
        }

    And that's pretty much it. Most of the time I wrap all these operations in a "Worker" class that takes the needed interfaces as constructor parameters. This approach has worked reasonably well, but I get this feeling in my gut that there's something awful about it and would love some feedback. Is writing an ETL process as a C# console app a bad idea? Are there design patterns that would be useful in this scenario that I'm clearly overlooking? Thanks in advance!
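
    The "Worker" class the poster describes (dependencies handed in as constructor parameters) might look something like the minimal sketch below. Every type and member name here (IFileQueue, IMetadataExtractor, IContentUploader, DocumentMetadata, ValidationError) is hypothetical, invented for illustration; the post specifies none of them:

        using System.Collections.Generic;
        using System.IO;

        // Hypothetical seams for the queue, the metadata extraction, and the
        // upload (e.g. a DbAmp-backed DAL); none of these names are from the post.
        public interface IFileQueue
        {
            IEnumerable<string> GetFiles(string pattern);
            void Bounce(string path);   // move an invalid file out of the queue
        }

        public interface IMetadataExtractor
        {
            DocumentMetadata Extract(string path);
        }

        public interface IContentUploader
        {
            void Upload(byte[] content, DocumentMetadata meta);
        }

        public class DocumentMetadata
        {
            public bool IsValid;
            public List<ValidationError> Errors = new List<ValidationError>();
        }

        public class ValidationError
        {
            public string File;
            public string Message;
        }

        public class EtlWorker
        {
            private readonly IFileQueue _queue;
            private readonly IMetadataExtractor _extractor;
            private readonly IContentUploader _uploader;

            public EtlWorker(IFileQueue queue, IMetadataExtractor extractor, IContentUploader uploader)
            {
                _queue = queue;
                _extractor = extractor;
                _uploader = uploader;
            }

            // Returns the validation errors so the caller can report them
            // (message bus, SMTP, or some such).
            public IList<ValidationError> Run(string pattern)
            {
                var errors = new List<ValidationError>();
                foreach (var file in _queue.GetFiles(pattern))
                {
                    var meta = _extractor.Extract(file);
                    if (meta.IsValid)
                    {
                        _uploader.Upload(File.ReadAllBytes(file), meta);
                    }
                    else
                    {
                        errors.AddRange(meta.Errors);
                        _queue.Bounce(file);
                    }
                }
                return errors;
            }
        }

    With the seams pulled out like this, Main() shrinks to wiring the dependencies together and calling Run().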

    Read the article

  • Salesforce deployment guideline using Sandbox

    - by ybbest
    1. Create a deployment connection between your source and destination environments.
    2. Enable inbound change sets on the destination environment you would like to deploy the solution to.
    3. Enable outbound change sets on the source environment where you package your application.
    4. Package your application as a change set. The best practice is to package everything in the change set; Salesforce will deploy only the changes into your destination environment. If you package only the changes, you could miss some of them. You can clone a change set on the source environment, but the initial packaging takes some time, as you need to go through everything and select the components manually.
    5. Once the change set is packaged, upload it so that the destination environment can see it in its inbound change set list.
    6. Validate the change set before deploying it (a programmatic equivalent of this validation step is sketched below).

    References: Development Lifecycle Guide; Change Sets Best Practices
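
    Change sets are driven through the Setup UI, but if the sandbox-to-production flow ever needs automating, the same validate-before-deploy idea is exposed by the Metadata API's deploy call with checkOnly set. A minimal hedged sketch in C#, assuming a service proxy generated from the Metadata API WSDL (the proxy type and member names below follow typical wsdl.exe output and are not taken from the original post):

        // Validation-only deploy: runs the full deployment, including tests,
        // without committing anything. Proxy types (MetadataService, DeployOptions,
        // SessionHeader, DeployResult) come from a WSDL-generated client.
        using System;
        using System.IO;
        using System.Threading;

        class ValidateChangeExample
        {
            static void Main()
            {
                var service = new MetadataService
                {
                    // Hypothetical endpoint and session id -- obtain both from a prior login() call.
                    Url = "https://yourInstance.salesforce.com/services/Soap/m/57.0",
                    SessionHeaderValue = new SessionHeader { sessionId = "<session id>" }
                };

                // A zip containing the metadata components plus package.xml.
                byte[] zip = File.ReadAllBytes(@"C:\deploy\package.zip");

                // checkOnly = true is the API's "Validate": nothing is saved.
                var options = new DeployOptions { checkOnly = true, rollbackOnError = true };
                var asyncResult = service.deploy(zip, options);

                // Deployment is asynchronous; poll until it finishes.
                DeployResult status;
                do
                {
                    Thread.Sleep(5000);
                    status = service.checkDeployStatus(asyncResult.id, true);
                } while (!status.done);

                Console.WriteLine(status.success ? "Validation passed." : "Validation failed.");
            }
        }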

    Read the article

  • Invite: Oracle Fusion Applications Partner Update Webcast

    - by mseika
    Oracle Fusion Applications: Thursday's Partner Updates. In order to keep you up to date with partner-specific news and information regarding Oracle Fusion Applications, we are expanding our Fusion Applications Webcast Series to include these additional Thursday sessions. All sessions will be recorded and replays will be posted to this Oracle PartnerNetwork page. Please mark your calendar for these NEW Fusion Partner Update specific sessions (Click Here for logistics and dial-in details for each webcast):

      11/29/12 - Win Cloud SFA with Fusion CRM: Sales Positioning
      12/6/12 - Win Cloud SFA with Fusion CRM: Fusion CRM against SFDC
      12/13/12 - Implementing Fusion Applications: ERP Cloud Services, Back Office Solutions that Keep You in Front
      12/20/12 - Understanding Fusion Supply Chain Management (SCM) Opportunities

    PLEASE NOTE: This webcast series is for Oracle Partners and Oracle Employees ONLY.

    Read the article

  • Oracle Fusion Applications Partner Update Webcasts

    - by Richard Lefebvre
    Every Thursday from November 29th through December 20th! In order to keep you up to date with partner-specific news and information regarding Oracle Fusion Applications, we are expanding our Fusion Applications Webcast Series to include these additional Thursday sessions. All sessions will be recorded and replays will be posted to this Oracle PartnerNetwork page. Please mark your calendar for these NEW Fusion Partner Update specific sessions (Click Here for logistics and dial-in details for each webcast):

      11/29/12 - Win Cloud SFA with Fusion CRM: Sales Positioning
      12/6/12 - Win Cloud SFA with Fusion CRM: Fusion CRM against SFDC
      12/13/12 - Implementing Fusion Applications: ERP Cloud Services, Back Office Solutions that Keep You in Front
      12/20/12 - Understanding Fusion Supply Chain Management (SCM) Opportunities

    PLEASE NOTE: This webcast series is for Oracle Partners and Oracle Employees ONLY.

    Read the article
