Search Results

Search found 1687 results on 68 pages for 'sharepoint2013 workflow'.

Page 15/68

  • Announcing SO-Aware Test Workbench

    - by gsusx
    Yesterday was a big day for Tellago Studios. After a few months of heads-down work, we announced the release of the SO-Aware Test Workbench, a tool which brings sophisticated performance testing and test visualization capabilities to the WCF world. This work has been the result of the feedback received from many of our SO-Aware and Tellago customers on how to improve WCF testing. More importantly, with the SO-Aware Test Workbench we are trying to address what has been one of the biggest challenges...(read more)

    Read the article

  • Tips/Process for web-development using Django in a small team

    - by Mridang Agarwalla
    We're developing a web app using Django and we're a small team of 3-4 programmers: some doing the UI work and some doing the backend. I'd love some tips and suggestions from the people here. This is our current setup: we're using Git as our SCM tool and following this branching model; we're following PEP8 as our style guide; Agile is our software development methodology and we're using Jira for that; we're using the Confluence plugin for Jira for documentation, and I'm going to be writing a script that also dumps the PyDocs into Confluence; we're using virtualenv for sandboxing; and we're using zc.buildout for building. This is whatever I can think of off the top of my head. Any other suggestions/tips would be welcome. I feel that we have a pretty good setup, but I'm also confident that we could do more. Thanks.
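
    The PyDoc-to-Confluence script mentioned above could start out as the sketch below. It is only an outline: the Confluence REST endpoint, space key and credentials are placeholders, and Confluence expects valid XHTML in its storage format, so pydoc's raw HTML may need cleanup first.

        # Sketch: render a module's pydoc as HTML and push it to Confluence.
        # URL, space key and credentials are placeholders, not real values.
        import pydoc
        import requests  # third-party; pip install requests

        CONFLUENCE_URL = "https://wiki.example.com/rest/api/content"
        SPACE_KEY = "DEV"

        def publish_module_docs(module_name: str) -> None:
            module = pydoc.locate(module_name)
            if module is None:
                raise ValueError(f"module not found: {module_name}")
            html = pydoc.HTMLDoc().docmodule(module)
            page = {
                "type": "page",
                "title": f"PyDoc: {module_name}",
                "space": {"key": SPACE_KEY},
                "body": {"storage": {"value": html,
                                     "representation": "storage"}},
            }
            resp = requests.post(CONFLUENCE_URL, json=page,
                                 auth=("bot-user", "bot-password"))
            resp.raise_for_status()

        publish_module_docs("myapp.models")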

    Read the article

  • Handling HumanTask attachments in Oracle BPM 11g PS4FP+ (II)

    - by ccasares
    Retrieving uploaded attachments (UCM). As stated in my previous blog entry, Oracle BPM 11g 11.1.1.5.1 (aka PS4FP) introduced a new cool feature whereby you can use Oracle WebCenter Content (previously known as Oracle UCM) as the repository for human task attached documents. For more information about how to use or enable this feature, have a look here. The attachment scope (either TASK or PROCESS) also applies to UCM attachments. But even with this feature, one question might arise: how can I get the attachments from within the process? The first answer would be to use the same getTaskAttachmentContents() XPath function already explained in my previous blog entry; in fact, that's the way it should be. But in Oracle BPM 11g 11.1.1.5.1 (PS4FP) and 11.1.1.6.0 (PS5) there's a bug that prevents you from doing that: if you invoke that function against a UCM attachment, you'll get a null content response (bug #13907552), even if the attachment was correctly uploaded. Until this bug gets fixed, I will show a workaround that lets me retrieve the UCM-attached documents from within a BPM process. Besides, the sample will show how to interact with the WCC API from within a BPM process. Aside note: I suggest you read my previous blog entry about Human Task attachments, where I briefly describe some concepts used next, such as the execData/attachment[] structure.

    Sample Process. I will be using the following sample process: a dummy UserTask using the "HumanTask2" Human Task, followed by an embedded subprocess that retrieves the attachments' payload. In this case, and here's the key point of the sample, we will retrieve that payload using the WebCenter Content WebService API (IDC), and once retrieved, we will write each attachment back to a file on the server using a File Adapter service.

    In detail: we will use the same attachmentCollection XSD structure and the same BusinessObject definition as in the previous blog entry; however, we create a separate variable, named attachmentsUCM, based on that BusinessObject. We will still need to keep a copy of the HumanTask output's execData structure, so we create a new variable of type TaskExecutionData (a different one than that used for non-UCM attachments). As in the non-UCM attachments flow, in the output tab of the UserTask mapping we keep a copy of the execData structure.

    Now we get into the embedded subprocess that retrieves the attachments' payload. First, using an XSLT transformation, we feed the attachmentsUCM variable with two pieces of information: the name of each attachment (from the execData/attachment/name element), and the WebCenter Content ID of the uploaded attachment. That ID is stored in the execData/attachment/URI element in the format ecm://<id>; as we just want the numeric <id>, we get rid of the protocol prefix ("ecm://") with a couple of XPath string functions (shown as screenshots in the original post). We again set the target payload element to an empty string, to get the <payload></payload> tag created. The complete XSLT transformation uses an XSLT for-each node to create as many target structures as necessary.

    Once the attachmentsUCM structure contains the name of each attachment along with each WCC unique id (dID), it is time to iterate through it and get the payload. For that we use a new embedded subprocess of type MultiInstance that iterates over the attachmentsUCM/attachment[] element. In each iteration we use a Service activity that invokes the WCC API through a WebService. Follow these steps to create and configure the needed Partner Link: log in to the WCC console with an administrator user (e.g. weblogic), go to the Administration menu and click on the "Soap Wsdls" link. We will use the GetFile service to retrieve a file based on its dID, so we need that service's WSDL definition, which can be downloaded by clicking the GetFile link. Save the WSDL file in your JDev project folder. In the BPM project's composite view, drag & drop a WebService adapter to create a new External Reference based on the just-added GetFile.wsdl, and name it UCM_GetFile.

    WCC services are secured through basic HTTP authentication, so we need to enable the just-created reference for that: right-click the reference, click on Configure WS Policies and, under the Security section, click "+" to add the "oracle/wss_username_token_client_policy" policy. The last step is to set the credentials for the security policy; for the sample we use the WCC admin user (weblogic/welcome1). Open the composite.xml file, select the Source view, search for the UCM_GetFile entry and add the highlighted elements into it:

        <reference name="UCM_GetFile" ui:wsdlLocation="GetFile.wsdl">
          <interface.wsdl interface="http://www.stellent.com/GetFile/#wsdl.interface(GetFileSoap)"/>
          <binding.ws port="http://www.stellent.com/GetFile/#wsdl.endpoint(GetFile/GetFileSoap)"
                      location="GetFile.wsdl" soapVersion="1.1">
            <wsp:PolicyReference URI="oracle/wss_username_token_client_policy"
                                 orawsp:category="security" orawsp:status="enabled"/>
            <property name="weblogic.wsee.wsat.transaction.flowOption"
                      type="xs:string" many="false">WSDLDriven</property>
            <property name="oracle.webservices.auth.username"
                      type="xs:string">weblogic</property>
            <property name="oracle.webservices.auth.password"
                      type="xs:string">welcome1</property>
          </binding.ws>
        </reference>

    Now the new external reference is ready and we should be able to use it from our BPM process. However, we hit a problem here. The WCC GetFile service operation that we will use, GetFileByID, accepts as input a structure similar to the following, where all element tags are optional:

        <get:GetFileByID xmlns:get="http://www.stellent.com/GetFile/">
          <get:dID>?</get:dID>
          <get:rendition>?</get:rendition>
          <get:extraProps>
            <get:property>
              <get:name>?</get:name>
              <get:value>?</get:value>
            </get:property>
          </get:extraProps>
        </get:GetFileByID>

    We need to fill in just the <get:dID> tag element. Due to some kind of restriction or bug in WCC, the rest of the tag elements must NOT be sent, not even empty (i.e. <get:rendition></get:rendition> or <get:rendition/>). A sample request that performs the query just by the dID must be in the following format:

        <get:GetFileByID xmlns:get="http://www.stellent.com/GetFile/">
          <get:dID>12345</get:dID>
        </get:GetFileByID>

    The issue here is that the simple mapping in BPM does create empty tags, a sample result being as follows:

        <get:GetFileByID xmlns:get="http://www.stellent.com/GetFile/">
          <get:dID>12345</get:dID>
          <get:rendition/>
          <get:extraProps/>
        </get:GetFileByID>

    Although the above structure is perfectly valid, it is not accepted by WCC, so we need to bypass the problem. The workaround we use (many others are available) is to add a Mediator component between the BPM process and the Service that simply copies the input structure from BPM while getting rid of the empty tags. Follow these steps to configure the Mediator: drag & drop a new Mediator component into the composite; uncheck the creation of the SOAP bindings, use the Interface Definition from WSDL template and select the existing GetFile.wsdl. Double-click the Mediator to edit it and add a static routing rule to the GetFileByID operation, of type Service, selecting the References/UCM_GetFile/GetFileByID target service. Then create the request and reply XSLT mappers: make sure you map only the dID element in the request, and do an auto-map of the whole response.

    Finally, we can add and configure the Service activity in the BPM process: drag & drop it into the embedded subprocess, select the NormalizedGetFile service and the getFileByID operation, and map both the input and the output. Once this embedded subprocess ends, we will have all attachments (name + payload) in the attachmentsUCM variable, which is the main goal of this sample.

    But in order to test that everything runs fine, we finish the sample by writing each attachment to a file. To that end we include a final embedded subprocess that concurrently iterates through each attachmentsUCM/attachment[] element. On each iteration we use a Service activity that invokes a File Adapter write service. Here we have two important parameters to set. First, the payload itself: the File Adapter expects binary data in base64 format (string), and we have to map it using XPath, as simple mapping doesn't recognize a String as a valid base64-binary target. Second, we must set the target filename using the Service Properties dialog box. Again, note how we make use of the loopCounter index variable to get the right element within the embedded subprocess iteration.

    The final blog entry about attachments will cover how to inject documents into Human Tasks from the BPM process and how to share attachments between different User Tasks; it will come soon. Once I finish all posts on this matter, I will upload the whole sample project to java.net.
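
    Outside of BPM, the GetFileByID call is easy to exercise on its own when debugging this kind of issue. Below is a minimal sketch in Python using the third-party zeep SOAP client; the WSDL location, host and credentials are placeholders (download the real WSDL from the "Soap Wsdls" admin page as described above). Note how only dID is supplied: optional elements that are never assigned are simply not serialized, which avoids the empty-tag problem described in the post.

        # Sketch: call WCC's GetFile service directly to test GetFileByID.
        # Host, port, WSDL path and credentials are placeholders.
        from requests import Session
        from requests.auth import HTTPBasicAuth
        from zeep import Client
        from zeep.transports import Transport

        # The attachment URI looks like "ecm://12345"; strip the prefix
        # just as the XPath functions in the post do.
        uri = "ecm://12345"
        d_id = int(uri.split("ecm://", 1)[1])

        session = Session()
        session.auth = HTTPBasicAuth("weblogic", "welcome1")
        client = Client("http://wcc-host:16200/GetFile.wsdl",
                        transport=Transport(session=session))

        # Pass only dID; omitted optional elements (rendition, extraProps)
        # are not serialized, so no empty tags are sent.
        result = client.service.GetFileByID(dID=d_id)
        print(result)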

    Read the article

  • Is WCF suitable for writing an application which is shared among applications?

    - by RPK
    I have developed and deployed a few ASP.NET applications. Sometimes I want to stop users from inserting or updating a record when maintenance is going on, or when operations are stopped due to a payment delay. In one of my recent applications I implemented this feature by first checking a lock status before database operations; if either of the above conditions is fulfilled, database operations like insert and update are not carried out. I now need this feature in all of my old applications and in the future applications I build. I want to know whether WCF is suitable for this scenario, as I want to share the methods, or an independent locking application, among various other applications. Is WCF appropriate for this type of scenario?
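
    For what it's worth, the shared "operations lock" described here boils down to a tiny service that every application queries before allowing writes. A minimal sketch of the idea follows, using a plain HTTP service in Python rather than WCF (the status flags are invented for illustration); the same contract could be exposed as a WCF service method.

        # Sketch: a minimal shared lock-status service. Each app asks this
        # service whether inserts/updates are currently allowed.
        import json
        from http.server import BaseHTTPRequestHandler, HTTPServer

        # In a real implementation this state would live in a database;
        # the flags here are illustrative placeholders.
        STATUS = {"maintenance": False, "payment_hold": True}

        class LockStatusHandler(BaseHTTPRequestHandler):
            def do_GET(self):
                locked = any(STATUS.values())
                body = json.dumps({"writes_allowed": not locked, **STATUS})
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.end_headers()
                self.wfile.write(body.encode())

        if __name__ == "__main__":
            HTTPServer(("localhost", 8080), LockStatusHandler).serve_forever()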

    Read the article

  • SSIS and StreamInsight Working Together.

    I have been thinking a lot recently about what it would be like to have StreamInsight and SSIS working together, and the CAT team have produced a paper on some of our options here. Here are some of my thoughts.

    There is of course a slight mismatch in their types of usage. StreamInsight is an event stream processing engine capable of operating on new data in the sub-second timeframe; the engine allows you to do real-time analytics and take decisions on events that have potentially only just happened. SSIS, on the other hand, is a batch processing engine. In general I do not like having to invoke the same package more than once every 90 seconds or so, as it can start to get expensive; usually when doing batch processing we have an hour or longer of grace before we have to move data from A to B. StreamInsight operates on streams of data. Before anyone mentions it: yes, I know StreamInsight is equally adept at using the IEnumerable interface, but I would argue that live streaming and real-time analytics are a primary goal of the product, whereas SSIS does not have an "Always On" button.

    I do not like the idea of embedding StreamInsight inside SSIS using a transform in particular, as it means StreamInsight becomes a batch processing engine: it can only operate when the SSIS package is running, and SSIS is in charge of when that happens. If I am to have StreamInsight within SSIS, then I prefer to have StreamInsight on the adapters. This way you can force the adapters to stay open and introduce events into your pipeline. SSIS has a much richer set of transforms out of the box than StreamInsight, and although "Always On" was not a design goal of SSIS, I have used it like this and it works just fine.

    SSIS being called from within StreamInsight, now that excites me (see below). For a while now I have been thinking what it would be like to decouple the Data Flow task from the SSIS package and expose it as something with which you can interact. Anything could instantiate this version of a DFT, as it would expose one or more input interfaces and one or more output interfaces. I can imagine that this would be a big hit when moving to "The Cloud" as well; I could see the Data Flow task being hosted in Azure AppFabric or some such layer. StreamInsight would be able to take advantage of this too. I am interested to see where this goes and will be pressing for more meat around the subject when I visit Redmond soon.
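
    The "SSIS called from within StreamInsight" idea boils down to a stream feeding a batch engine, so here is a toy sketch of that bridge, independent of either product: buffer events from a stream and hand each full buffer to a batch step. The batch step below just prints; in the author's scenario it would stand in for kicking off a decoupled Data Flow task.

        # Sketch: micro-batching bridge between a stream and a batch engine.
        # Generic illustration only; none of this is StreamInsight/SSIS API.
        from typing import Callable, Iterable, List

        def bridge(events: Iterable[dict],
                   run_batch: Callable[[List[dict]], None],
                   batch_size: int = 100) -> None:
            buffer: List[dict] = []
            for event in events:
                buffer.append(event)
                if len(buffer) >= batch_size:
                    run_batch(buffer)   # hand a full buffer to the batch step
                    buffer = []
            if buffer:                  # flush the remainder
                run_batch(buffer)

        # The lambda stands in for invoking a Data Flow task.
        bridge(({"id": i} for i in range(250)),
               run_batch=lambda batch: print(f"batch of {len(batch)} events"))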

    Read the article

  • How to improve designer and developer workflow?

    - by mbdev
    I work in a small startup with two front-end developers and one designer. Currently the process starts with the designer sending a PNG file with the whole page design, plus assets if needed. My task as a front-end developer is to convert it to an HTML/CSS page. My workflow currently looks like this: lay out the distinct parts using HTML elements; style each element very roughly (floats, minimal fonts and padding) so I can modify it using inspection; using Chrome Developer Tools, add/change CSS attributes while updating the CSS file; refresh the page after X amount of changes; use Pixel Perfect to refine the design further; and sit with the designer to make the last adjustments. Inferring the paddings, margins and font sizes by trial and error takes a lot of time, and I feel the process could become more efficient, but I'm not sure how to improve it. Using PSD files is not an option, since buying Photoshop for each developer is currently not considered. A design guide is also not available, since the design is still evolving and new features are being introduced. Ideas for improving the process above, and examples of how the process looks in your company, would be great.
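
    The "refresh the page after X amount of changes" step is the easiest one to automate. A small sketch using the third-party livereload package (an assumption; any file watcher plus a browser extension does the same job) so the browser reloads itself whenever a stylesheet is saved:

        # Sketch: auto-reload the browser on CSS/HTML changes.
        # Requires: pip install livereload (third-party package).
        from livereload import Server

        server = Server()
        server.watch("static/css/*.css")    # reload when styles change
        server.watch("templates/*.html")    # ...or when markup changes
        server.serve(root=".", port=5500)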

    Read the article

  • How can I find out a file's path in the text encoding used by PosteRazor?

    - by ændrük
    PosteRazor uses an apparently outdated GUI that is incapable of properly displaying my filenames. For the sake of convenience, I want to be able to open any file in PosteRazor by copying and pasting its path from Nautilus. This works in other applications, but sadly, PosteRazor is unable to understand the path. How can I convert the path that Nautilus generates into a text encoding that is compatible with PosteRazor?
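
    One workaround that sidesteps the encoding question entirely (a sketch, not something from the question): create an ASCII-only symlink to the awkwardly named file and hand that path to PosteRazor. The paths below are hypothetical.

        # Sketch: dodge the encoding problem with an ASCII-named symlink.
        import os

        src = "/home/user/Pictures/ændrük's photo.png"   # hypothetical path
        link = "/tmp/posterazor-input.png"
        if os.path.lexists(link):
            os.remove(link)
        os.symlink(src, link)
        print(f"Open {link} in PosteRazor instead of the original path.")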

    Read the article

  • How do you remember where in your code you want to continue next time?

    - by bitbonk
    When you interrupt the work on some code (be it because you have to work on something else, go on vacation, or simply because it is the end of the day), once you close that Visual Studio project, what is your preferred way to remember what you want to do next when you start working on that code again? Do you set a Visual Studio bookmark, or do you write down something like // TODO: continue here next time? Maybe you have a special tag like // NEXT:? Do you put a sticky note on your monitor? Do you use a cool tool or Visual Studio plugin I should know about? Do you have any personal trick that helps you find the place where you left off the last time you worked on your code?
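
    If you settle on a marker like // NEXT:, a few lines of script can list every occurrence at the start of the day. A throwaway sketch; the marker string and file extensions are whatever you choose:

        # Sketch: find "continue here" markers across a source tree.
        import os

        MARKER = "// NEXT:"
        EXTS = (".cs", ".vb", ".cpp", ".h")

        for root, _dirs, files in os.walk("."):
            for name in files:
                if not name.endswith(EXTS):
                    continue
                path = os.path.join(root, name)
                with open(path, encoding="utf-8", errors="replace") as fh:
                    for lineno, line in enumerate(fh, start=1):
                        if MARKER in line:
                            print(f"{path}:{lineno}: {line.strip()}")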

    Read the article

  • Git Workflow With Capistrano

    - by jerhinesmith
    I'm trying to get my head around a good Git workflow using Capistrano. I've found a few good articles, but I'm either not grasping completely what they're suggesting (likely) or they're somewhat lacking. Here's roughly what I had in mind so far, but I get caught up on when to merge back into the master branch (i.e. before moving to stage? after?) and on how to hook it into Capistrano for deployments:

        # Make sure you're up to date with all the changes made
        # on the remote master branch by other developers
        git checkout master
        git pull

        # Create a new branch for the particular bug you're trying to fix
        git checkout -b bug-fix-branch

        # Make your changes
        git status
        git add .
        git commit -m "Friendly message about the commit"

    So, this is usually where I get stuck. At this point, I have a master branch that is healthy and a new bug-fix-branch that contains my changes, untested other than unit tests. If I want to push my changes to stage (through cap staging deploy), do I have to merge them back into the master branch? I'd prefer not to, since it seems like master should be kept free of untested code. Do I even deploy from master, or should I be tagging a release first and then modifying my production.rb file to deploy from that tag? git-deployment seems to address some of these workflow issues, but I can't find out how on earth it actually hooks into cap staging deploy and cap production deploy. Thoughts? I assume there's likely a canonical way to do this, but I either can't find it or I'm too new to Git to recognize that I have found it. Help!
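
    One common answer (not from the question itself) is to keep master clean, merge the bug-fix branch only after it has passed staging, and deploy production from an annotated tag. A minimal sketch of the tag-then-deploy step in Python; it assumes your Capistrano production.rb is written to read the tag from a TAG environment variable, which is a convention you would have to set up yourself, not a built-in Capistrano feature:

        # Sketch: tag the current master and deploy that tag with Capistrano.
        # The TAG environment-variable convention is an assumption.
        import os
        import subprocess

        def tag_and_deploy(version: str) -> None:
            tag = f"release-{version}"
            for cmd in (["git", "checkout", "master"],
                        ["git", "pull"],
                        ["git", "tag", "-a", tag, "-m", f"Release {version}"],
                        ["git", "push", "origin", tag]):
                subprocess.run(cmd, check=True)
            # Hand the tag to Capistrano via the environment.
            env = {**os.environ, "TAG": tag}
            subprocess.run(["cap", "production", "deploy"], env=env, check=True)

        tag_and_deploy("1.4.2")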

    Read the article

  • Understanding the workflow of the messages in a generic server implementation in Erlang

    - by Chiron
    The following code is from "Programming Erlang, 2nd Edition". It is an example of how to implement a generic server in Erlang.

        -module(server1).
        -export([start/2, rpc/2]).

        start(Name, Mod) ->
            register(Name, spawn(fun() -> loop(Name, Mod, Mod:init()) end)).

        rpc(Name, Request) ->
            Name ! {self(), Request},
            receive
                {Name, Response} -> Response
            end.

        loop(Name, Mod, State) ->
            receive
                {From, Request} ->
                    {Response, State1} = Mod:handle(Request, State),
                    From ! {Name, Response},
                    loop(Name, Mod, State1)
            end.

        -module(name_server).
        -export([init/0, add/2, find/1, handle/2]).
        -import(server1, [rpc/2]).

        %% client routines
        add(Name, Place) -> rpc(name_server, {add, Name, Place}).
        find(Name) -> rpc(name_server, {find, Name}).

        %% callback routines
        init() -> dict:new().
        handle({add, Name, Place}, Dict) -> {ok, dict:store(Name, Place, Dict)};
        handle({find, Name}, Dict) -> {dict:find(Name, Dict), Dict}.

        server1:start(name_server, name_server).
        name_server:add(joe, "at home").
        name_server:find(joe).

    I tried hard to understand the workflow of the messages. Would you please help me understand the message flow of this server implementation during the execution of the functions server1:start, name_server:add and name_server:find?
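
    As a reading aid: server1:start spawns the loop holding the callback module's initial state (Mod:init()); rpc sends {self(), Request} to the registered name and blocks until a {Name, Response} tuple comes back; loop calls Mod:handle, sends the response to the caller, and recurses with the new state. The same shape can be sketched in Python with a thread and queues; this is purely illustrative and does not reproduce Erlang's semantics:

        # Sketch: the server1 pattern re-drawn with a thread and queues.
        import queue
        import threading

        def start(handle, init_state):
            inbox = queue.Queue()

            def loop():
                state = init_state
                while True:
                    reply_to, request = inbox.get()    # receive {From, Request}
                    response, state = handle(request, state)
                    reply_to.put(response)             # From ! {Name, Response}

            threading.Thread(target=loop, daemon=True).start()
            return inbox

        def rpc(inbox, request):
            reply_to = queue.Queue()                   # stands in for self()
            inbox.put((reply_to, request))
            return reply_to.get()                      # blocking receive

        # name_server-style callbacks
        def handle(request, d):
            if request[0] == "add":
                _, name, place = request
                return "ok", {**d, name: place}
            _, name = request
            return d.get(name, "error"), d

        srv = start(handle, {})
        print(rpc(srv, ("add", "joe", "at home")))     # -> ok
        print(rpc(srv, ("find", "joe")))               # -> at home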

    Read the article

  • .NET WF4: Should it be in the middle of everything?

    - by stimpy77
    I am aware that WF4 (Windows Workflow 4.0, part of .NET 4.0) is a significant rework and redesign of WF3, where much of what made WF3 a poor technology choice has been cleaned up in WF4. For example, as far as I can tell, WF4 (Windows Workflow 4.0) activities are more or less testable with [TestMethod] and mocking. This among other things, like improved performance, has grabbed my attention about the technology again, whereas I had previously pooh-poohed WF3. I'm working on a new architecture for essentially an n-tier collaborative application (not enterprise-class, just a smallish project with potential to grow significantly) where I'm already trying to discipline myself to use IoC and, to some extent, TDD, and I'm wondering, in general terms, whether it is wiser to just hand-code workflow logic or if I should delve into learning and integrating WF4 so that WF becomes literally the controller of the entire application, i.e. the practical C in "MVC" (not ASP.NET MVC but rather the pattern). So should workflow activities in WF4 be the primary controller for a highly expandable/growable web-based collaborative application? Or am I asking entirely the wrong question? This is a vague question, I'm sure, so abstract answers are as welcome as specific ones.

    Read the article

  • What workflow engines are companies using and would you use it again? [on hold]

    - by cbmeeks
    I've been asked to find out "what's out there" when it comes to workflow engines. We have projects where a workflow based development environment makes sense. I've looked a little into jBPM but it seemed to have a steep learning curve. Google seems to take me to commercial products or products that I think are open source but instead have very limited "community editions". I could simply be searching for the wrong terms. What I would like to know are what actual workflow based products have you used at your company and to what degree of success or failure was it? Would you use it again? Thanks.

    Read the article

  • Workflow Automation software for SVN

    - by KyleMit
    We're currently using IBM's ClearQuest for task management and ClearCase for change management, and they plug and play very well with each other. Users can create tasks in ClearQuest as defects and enhancements, and developers can use those tasks to check out and modify code in source control. We're looking to upgrade to a better, more modern source control system, like SVN, although we're not married to that choice. There are loads of source control systems out there, but I'm having difficulty finding one that also includes the ability for users to enter and track tasks, especially in a way that is native to the source control system itself. Are there any products that replace ClearQuest for systems like SVN? Are there any other cheap / open-source application pairs that handle both sides of the coin?

    Read the article

  • F# Async workflow

    - by akaphenom
    Is there a way to look at the definition of the Async workflow? What goes on under the hood that would make a line of code behave differently inside it than outside of it?

    Read the article

  • What's the workflow of Continuous Integration With Hudson?

    - by Satoru.Logic
    Hi, all. I was referred to Hudson today. I had heard about continuous integration before, but I have no idea what a CI server actually is. Hudson is really easy to install on Ubuntu, and in several minutes I managed to set up an instance of it. But I don't quite understand the workflow of a CI server, or how I am supposed to use it. Please share if you have experience with CI. Thanks in advance.

    Read the article

  • Describe your workflow of using version control (VCS or DVCS)

    - by edwin.nathaniel
    I'd like to learn other people's workflows when using either SVN or Git. Please describe your strategy for handling the following tasks: implementing a feature; fixing bugs (both during development and in a deployed app); code review; refactoring code (post code review); incorporating patches; and releasing a newer version of your app (desktop, web, mobile; would you treat them differently?). Feel free to organize your answer not by task but by whatever you think is relevant, but please organize it by VCS/DVCS (please don't mix them). Thank you.

    Read the article

  • Workflow for reading and writing files

    - by AIR_PhillipSenn
    In Workflow for reading and writing files, the authors use these two lines of code:

        var file = air.File.documentsDirectory;
        file = file.resolvePath("AIR Test/testFile.txt");

    But isn't that using one variable for two different meanings? Wouldn't it be better to write it as:

        var myDocumentsDirectory = air.File.documentsDirectory;
        var myTestFile = myDocumentsDirectory.resolvePath("AIR Test/testFile.txt");

    Read the article

  • MOSS Collect Data from user custom email

    - by nav
    Hi, I am trying to send out a custom email after the Collect Data from User step in my primary workflow, which starts when an item in list X is created. I have created a secondary workflow that starts when a new Task item is created (the task is created by the Collect Data from User action in the primary workflow). But I am having a problem retrieving the information from list X. I know the ID of the referenced item in list X is stored in a URL within the "Link" column of the Tasks list, but I can't see any string manipulation function that will grab this ID so I can use it to link back to the relevant item in list X. Is there an easier way to do this? Many thanks, Nav
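
    For reference, the "Link" column URL on a workflow task typically ends in something like DispForm.aspx?ID=42, so extracting the ID is ordinary query-string parsing; the URL shape below is an assumption based on that convention. SharePoint Designer itself lacks a string function for this, which is why a small external step (or a custom activity) tends to be the answer:

        # Sketch: pull the item ID out of a SharePoint task's Link URL.
        # The URL shape (...DispForm.aspx?ID=42) is an assumption.
        from urllib.parse import urlparse, parse_qs

        link = "http://server/sites/x/Lists/ListX/DispForm.aspx?ID=42"
        item_id = int(parse_qs(urlparse(link).query)["ID"][0])
        print(item_id)  # -> 42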

    Read the article

  • moss 2007 workflows

    - by nav
    Hi, I'm new to MOSS 2007. I need to create a workflow that looks at a document's review date (a select list predefined to values of 3, 6 or 12 months) and sends an email if the review date has passed. So the workflow needs to get the document's review period, convert it to a date offset, add it to the created date, and, if the resulting date is earlier than the current date, send an email. Can anyone tell me if it is possible to create this workflow using SharePoint Designer? I'd be grateful for any pointers. Many thanks, Nav
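
    The date check itself is simple arithmetic: created date plus the review period, compared against today. A sketch of just that logic, to pin down the comparison the workflow needs (the dates are made up):

        # Sketch: has the review date passed?
        from datetime import datetime
        from dateutil.relativedelta import relativedelta  # pip install python-dateutil

        created = datetime(2010, 1, 15)
        review_months = 6                    # from the 3/6/12 select list
        review_due = created + relativedelta(months=review_months)

        if datetime.now() > review_due:
            print("Review date has passed: send the email.")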

    Read the article

  • javascript flowchart library for workflow visualization

    - by jonny
    I need to generate a flowchart from a business process specification (tasks, their input and output points, roles applicable for each task...) stored in a database. What I need is a JavaScript (preferably open-source) library which can generate a shiny flowchart with swimlanes. Ideally I should be able to edit workflow item connections and send the changes back to the database. Any recommendations? UPDATE: By flowchart I mean something like this (image omitted from the original post). UPDATE: I found an open-source project that allows creating/editing basic flowcharts here. It seems abandoned since 2007.

    Read the article

  • Fossil gpg workflow for teams

    - by Alex_coder
    I'm learning Fossil and trying to reproduce a workflow for two people modifying the same source code tree. So, Alice and Bob both have local repositories of some source code, and both have autosync off. Alice hacks some more and does some commits, signing the check-ins with her GPG key. This part is fine: as Alice, I've managed to generate GPG keys, and Fossil asked me for the key password when committing. I'm also aware of gpg-agent but don't use it yet, because I'm trying to keep things as simple as possible for now. Now, at some point Bob pulls changes from Alice's Fossil repo. How would he verify Alice's signed check-ins?
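
    I believe Fossil stores a signed check-in as a PGP-clearsigned manifest, so one way Bob could check a commit, sketched below under that assumption (verify the details against the Fossil documentation), is to export the manifest artifact and run it through gpg after importing Alice's public key:

        # Sketch: verify a Fossil check-in's clearsigned manifest with gpg.
        # Assumes check-ins were clearsigned (Fossil's "clearsign" setting)
        # and Alice's public key is already in Bob's keyring.
        import subprocess

        artifact = subprocess.run(
            ["fossil", "artifact", "abc123"],   # check-in hash (placeholder)
            capture_output=True, check=True).stdout

        subprocess.run(["gpg", "--verify"], input=artifact, check=True)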

    Read the article

  • Using workflow with ASP.NET?

    - by haansi
    Hello, we have to do a project which is a web-based workflow application. I have never used WF (Windows Workflow Foundation), and I am wondering: should we use WF with ASP.NET? This application has 10 tasks with multiple options (each task may have 4 options) and a good amount of data, just like banks' customer-dealing workflow applications. I am not sure whether it will be hard to switch to WF (we have a short time to deliver and a learning curve to absorb). Please advise on this. Thanks.

    Read the article
