Search Results

Search found 3767 results on 151 pages for 'workflow foundation'.

Page 8/151 | < Previous Page | 4 5 6 7 8 9 10 11 12 13 14 15  | Next Page >

  • Content type with workflow and lookup column

    - by Sachin
    Hi All, I have a requirement where I want to upload a document based on a category and subcategory. I have added these as lookup columns which pull their data from the category and subcategory lists. I also want the document to pass through a series of approvals, so I have attached the SharePoint out-of-the-box Approval workflow to this document library. Now I want to create a content type which contains these two lookup columns and the approval workflow, so that I can reuse these settings for the rest of the document libraries. Can anyone tell me how to create a content type with a workflow and lookup columns? Thanks in advance, Sachin

    Read the article
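
    A minimal C# sketch of one way to approach what Sachin asks about, assuming the SharePoint 2007 server object model. It is not from the post: the list names ('Categories', 'Tasks', 'Workflow History'), the content type name, and the 'Approval' template lookup are placeholder assumptions, and the Subcategory lookup would follow the same pattern as the Category one.

    using System.Globalization;
    using Microsoft.SharePoint;
    using Microsoft.SharePoint.Workflow;

    // Hedged sketch: provision a content type that carries a lookup column and an
    // Approval workflow association. All names below are placeholders.
    public static class CategorizedDocumentProvisioner
    {
        public static void Provision(string siteUrl)
        {
            using (SPSite site = new SPSite(siteUrl))
            using (SPWeb web = site.RootWeb)
            {
                // Site column of type Lookup, pulling values from the Categories list
                // (a Subcategory column would be added the same way)
                SPList categories = web.Lists["Categories"];
                string categoryField = web.Fields.AddLookup("Category", categories.ID, false);

                // Content type derived from Document that carries the lookup column
                SPContentType ct = new SPContentType(
                    web.AvailableContentTypes[SPBuiltInContentTypeId.Document],
                    web.ContentTypes, "Categorized Document");
                ct = web.ContentTypes.Add(ct);
                ct.FieldLinks.Add(new SPFieldLink(web.Fields.GetFieldByInternalName(categoryField)));

                // Attach the out-of-the-box Approval workflow to the content type
                SPWorkflowTemplate approval = web.WorkflowTemplates
                    .GetTemplateByName("Approval", CultureInfo.CurrentCulture);
                SPWorkflowAssociation assoc = SPWorkflowAssociation.CreateListContentTypeAssociation(
                    approval, "Document Approval", web.Lists["Tasks"], web.Lists["Workflow History"]);
                ct.WorkflowAssociations.Add(assoc);
                ct.Update(true);
            }
        }
    }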

  • Weird Workflow Behavior in SharePoint 2007

    - by frbry
    I have a document library A and a list B. When a document is added to A, an item is created in B with Title = A.Url. Another workflow runs whenever a document is updated in A; it looks up the item in B where B.Title = A.Url and changes another column on the item it finds. The item-change workflow always fails with "Error Occurred: List item is not found". I modified the workflow to send me an e-mail containing the new (but unchanged) A.Url, and it sent me exactly the string that is already in list B. So why can't it find the item when the two columns are equal? Thanks in advance. Edit: I literally hate Microsoft SharePoint.

    Read the article

  • Start SharePoint workflow only with new file versions

    - by NeiOliver
    I am trying to create a workflow that sends an e-mail whenever a new version of a file is uploaded to a document library. The document library has lots of fields that, if updated, will create a new version of the list item, but I don't want to start the workflow in those cases. I want the workflow started only when a new version of the document itself is uploaded (including the first version). My document library does not require approval, only major versions are enabled, and files do not need to be checked out before editing. Is there a way of doing this, even programmatically?

    Read the article
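
    One possible programmatic direction, sketched in C# against the SharePoint 2007 server object model (not from the question): attach the e-mail workflow through an item event receiver and start it only when the file itself changed. The association name "Notify on new version" and the FileContentChanged helper are placeholders the implementer would need to supply.

    using System.Globalization;
    using Microsoft.SharePoint;
    using Microsoft.SharePoint.Workflow;

    // Hedged sketch, not a verified solution: start the workflow from an event
    // receiver instead of letting every metadata edit trigger it.
    public class NewFileVersionReceiver : SPItemEventReceiver
    {
        public override void ItemAdded(SPItemEventProperties properties)
        {
            StartNotificationWorkflow(properties.ListItem);    // first upload
        }

        public override void ItemUpdated(SPItemEventProperties properties)
        {
            SPListItem item = properties.ListItem;
            if (item.File != null && FileContentChanged(item)) // skip metadata-only edits
            {
                StartNotificationWorkflow(item);
            }
        }

        private static void StartNotificationWorkflow(SPListItem item)
        {
            SPWorkflowAssociation assoc = item.ParentList.WorkflowAssociations
                .GetAssociationByName("Notify on new version", CultureInfo.CurrentCulture);
            item.Web.Site.WorkflowManager.StartWorkflow(item, assoc, assoc.AssociationData);
        }

        private static bool FileContentChanged(SPListItem item)
        {
            // Placeholder: compare the current binary against the newest entry in
            // item.File.Versions (size, hash, ...) to decide whether the file changed.
            return true;
        }
    }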

  • Visualise Workflow Diagram from plain text

    - by Dmitriy Nagirnyak
    Assume there is a plain-text description of a workflow (just plain English in some predefined format). Are there any tools (preferably online) to visualize the flow based on that plain text? What for: to store the description of the workflow in a source control system and be able to quickly recall and understand it.

    Read the article

  • Use case for a Workflow Engine

    - by Icarus
    Hi, We have a case where a database table has to be updated with the status of a particular entity. Presently it's all Java code with a lot of if conditions followed by an update to the status. I was thinking along the lines of using a workflow engine, since there can be multiple flows in the future. Is it overkill to use a workflow engine here... where do you draw the line?

    Read the article
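
    As a hedged illustration of the middle ground (in C# rather than the poster's Java, with made-up statuses and events), the if-cascade can often be collapsed into a transition table first; a full workflow engine tends to earn its keep only once the flows need human tasks, timers, or persistence across restarts.

    using System;
    using System.Collections.Generic;

    // Hedged sketch: a status-transition table replacing nested if-conditions.
    // The statuses and events are invented for illustration only.
    public static class StatusTransitions
    {
        // "current status|event" -> next status
        private static readonly Dictionary<string, string> Table =
            new Dictionary<string, string>
            {
                { "Draft|Submit",          "PendingReview" },
                { "PendingReview|Approve", "Approved" },
                { "PendingReview|Reject",  "Draft" },
                { "Approved|Archive",      "Archived" }
            };

        public static string Next(string currentStatus, string evt)
        {
            string next;
            if (!Table.TryGetValue(currentStatus + "|" + evt, out next))
                throw new InvalidOperationException(
                    "No transition from " + currentStatus + " on " + evt);
            return next;   // the caller persists this value to the database
        }
    }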

  • Mercurial Workflow for small team

    - by Tarski
    I'm working in a team of 3 developers and we have recently switched from CVS to Mercurial. We are using Mercurial by having local repositories on each of our workstations and pulling/pushing to a development server. I'm not sure this is the best workflow, as it is easy to forget to push after a commit, and three-way merge conflicts can cause a real headache. Is there a better workflow we could use? I think the complexity of distributed version control is outweighing the benefits at the moment. Thanks

    Read the article

  • How do I remotely run a Powershell workflow that uses a custom module?

    - by drawsmcgraw
    I have a custom PowerShell module that I wrote for various tasks. Now I want to craft a workflow whose activities use commands from the module. Here's my test workflow:

    workflow New-TestWorkflow {
        InlineScript {
            Import-Module custom.ps1
            New-CommandFromTheModule
        }
    }

    Then I run the workflow with:

    New-TestWorkflow -PSComputerName remoteComputer

    When I do this, the import fails because it can't find the module. I imagine this is because the workflow is executing on the remote machine, where my module does not exist. I can see myself running this across many machines, so I'd really rather not have to install and maintain this module on all of them. Is there some way to have my module in a central place and use it in workflows?

    Read the article

  • Git workflow for two tight-knit projects

    - by Pioul
    Two very similar projects: I'm maintaining an online Markdown editor using Git as its RCS (and, incidentally, made available on GitHub). From this web app I've created a Chrome app: the code is the same, aside from some Chrome technicalities. I care about open-sourcing these two projects. Still, since the Chrome app's code is the same as the web app's except for some dull details, I initially chose to (1) not publish the Chrome app on GitHub, and (2) not use Git to manage its code. Instead, I would manually review the web app's commits, then replicate the few changes in the Chrome app.

    … slightly drifting apart: However, I've decided to add a feature to the Chrome app only. So, even though both codebases will remain broadly similar, they'll be diverging enough to make me reconsider the rationale behind my initial decision not to version control or share the Chrome app's source code. Since I'm now willing to use Git to version control both apps, and I want to share both of them on GitHub, how should I go about it? Should I use two different repositories, or one repo with two long-running branches? What would be the pros and cons of each approach in this context? What would be the easiest/fastest way to regularly "import" commits from the web app to the Chrome app, since the web app is going to remain the master branch? Is cherry-picking the only solution?

    Read the article

  • How to pass parameters to start a workflow through WCF

    - by Rubens Farias
    It's possible to define some start values for a workflow using WorkflowRuntime.CreateWorkflow, like this:

    using (WorkflowRuntime runtime = new WorkflowRuntime())
    {
        Dictionary<string, object> parameters = new Dictionary<string, object>();
        parameters.Add("First", "something");
        parameters.Add("Second", 42);
        WorkflowInstance instance = runtime.CreateWorkflow(typeof(MyStateMachineWorkflow), parameters);
        instance.Start();
        waitHandle.WaitOne();
    }

    This way, a MyStateMachineWorkflow instance is created and its First and Second public properties get the dictionary values. But I'm using WCF; so far, I have managed to create a Start method which accepts those two arguments, and I set the required fields by using binding on my ReceiveActivity:

    using (WorkflowServiceHost host = new WorkflowServiceHost(typeof(MyStateMachineWorkflow)))
    {
        host.Open();
        ChannelFactory<IMyStateMachineWorkflow> factory = new ChannelFactory<IMyStateMachineWorkflow>("MyStateMachineWorkflow");
        IMyStateMachineWorkflow proxy = factory.CreateChannel();
        // set these values through binding on my ReceiveActivity
        proxy.Start("something", 42);
    }

    While this works, it creates an anomaly: that method must be called exactly once. How can I start a workflow instance through WCF while passing those arguments? In my tests, I only actually interact with my workflow over the wire after I call that proxy method. Is there another way?

    Read the article

  • Best workflow with Git & GitHub

    - by Tom Schlick
    Hey guys, I'm looking for some advice on how to properly structure the workflow for my team with Git & GitHub. We are recent SVN converts and it's kind of confusing how we should best set up our day-to-day workflow. Here is a little background: I'm comfortable with the command line, and my team is pretty new to it but can follow commands. We are all working on the same project with 3 environments (development, staging, and production). We are a mix of developers & designers, so some use the Git GUI and some the command line. Our setup in SVN went something like this: we had a branch for development, staging and production. When people were confident with code they would commit and then merge it into staging. The server would update itself, and on release day (weekly) we would do a diff and push the changes to the production server. Now I have set up those branches and got the process with the server running, but it's the actual workflow that is confusing the hell out of me. It seems like overkill that every time someone makes a change to a file they would create a new branch, commit, merge, and delete that branch... from what I have read, they would be able to do it on a specific commit (using the hash), do I have that right? Is this an acceptable way to go about things with Git? Any advice would be greatly appreciated.

    Read the article

  • SVN on Team Foundation Server - Delphi 2010

    - by BahaiResearch.com
    We're looking to upgrade to Delphi 2010 and have Team Foundation Server as our source control. Is there a plug-in for TFS that allows clients to talk to it via SVN? I noticed that CodePlex, Microsoft's open source project hosting site, supports both TFS and SVN, so I am hoping that there is an SVN plug-in for TFS. Ian

    Read the article

  • Team Foundation Server Setup/Access

    - by Angel Brighteyes
    What I need: a TFS 2010 setup that allows 2 application developers to access the TFS from remote locations.

    How it is set up: Server 2008 Standard, 2 GB RAM, 300 GB HD space; SharePoint Server 2007, using SQL Server 2005; SQL Server 2008 Standard; Team Foundation Server 2010; IIS 7. SharePoint bindings: TFS.DynAccount.Me:80; TFS:80. TFS bindings: TFS.DynAccount.Me:8080; TFS:8080. Using the DynDNS service to account for the dynamic IP address being used; this is a requirement for the moment until I can get a better ISP package. Access uses local accounts. The server is not set up on a domain, or as a domain; consequently I did not set up AD services.

    Problem: When logged into TFS using my credentials TFS\AdminUser through the DynDNS account TFS.DynAccount.Me, I receive the 'Red X of Death' on the Documents and Reports folder. When logged into TFS through the local peer-to-peer network using the same credentials TFS\AdminUser, I do not receive the 'Red X of Death' problem.

    Further troubleshooting: When users right-click 'TeamProject1' and click 'Show Project Portal', it tries to take them to http://TFS:8080 instead of http://TFS.DynAccount.Me:8080. From further research I am assuming this is because Team Foundation Server was set up with a local name of TFS instead of 'TFS.DynAccount.Me', as described in Visual Studio Magazine's 'The Red X of Death'. Users can access the team portal for SharePoint via http://TFS.DynAccount.Me/TeamCollection/TeamProject, so it is not like we are dead in the water or anything. However, as most employees/staff are prone to do, they have expressed a great distaste for having to do it this way and for being asked to be patient until the current project is finished, since we are under a very strict deadline. Is there a way to set this up differently, change some settings someplace, reinstall it, point a CNAME record for our domain website to the DynDNS account (e.g. TFS.OurDomain.com points to TFS.DynAccount.Me, which consequently does allow access to the HTTP site without issues), or something? After all the time and effort I have put in (first the cost, second the bloody install, third learning SharePoint well enough, fourth the hours turning into days spent on this, fifth more troubleshooting, sixth employee headaches), I really don't feel like letting it lie where it is. I figure in my spare/off time I will keep trying to get this to work, so I really appreciate any help anyone can give me. I know this is probably something really simple that I will 'face palm' over, but at the moment the stress and frustration just have me beat. Thank you again; this community has always been a great help.

    Read the article

  • Apple Core Foundation license

    - by Shane
    Hi all, A short but sweet question: can I use Apple's open source Core Foundation (CF classes) in a commercial product for free? That is, can I compile and link against the libraries without open-sourcing my own application's code? Obviously, if I altered the original CF code, I would submit the changes. It's a very well-constructed API and I'd hate to have to reinvent the wheel. Cheers, Shane

    Read the article

  • How can I prevent IIS from trying to load a dll?

    - by Abtin Forouzandeh
    My project is a speech server application using Windows Workflow. It runs as an app under IIS and supports a plug-in system. Here is what is happening: I load a DLL into memory and set the type on an InvokeWorkflow control. When the InvokeWorkflow control runs, it appears to correctly instantiate the workflow from the loaded assembly - it completes the Initialize method. Then everything crashes and burns; the target workflow is never executed. I can resolve this by putting a copy of the DLL in the application's executing directory; the workflow then executes correctly. So it appears that IIS is trying to reload the assembly, even though it's already in memory. Is there any way to alter or disable this behavior in IIS? Perhaps a hook I can write that will intercept the request to load the DLL and use my own logic to do so?

    Read the article
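
    A minimal sketch of one workaround (not from the post), assuming the plug-in DLLs live in a known folder outside the web application's bin: hook AppDomain.AssemblyResolve, for example from Application_Start, so that when ASP.NET tries to resolve the plug-in assembly by name it is handed the copy from that folder. The pluginDirectory path below is a placeholder.

    using System;
    using System.IO;
    using System.Reflection;

    // Hedged sketch: resolve plug-in assemblies from a custom folder so the
    // already-loaded type can be re-resolved without copying DLLs into bin.
    public static class PluginAssemblyResolver
    {
        private static readonly string pluginDirectory = @"C:\SpeechServer\Plugins";

        public static void Install()   // call once, e.g. from Application_Start
        {
            AppDomain.CurrentDomain.AssemblyResolve += (sender, args) =>
            {
                string name = new AssemblyName(args.Name).Name;
                string candidate = Path.Combine(pluginDirectory, name + ".dll");
                return File.Exists(candidate) ? Assembly.LoadFrom(candidate) : null;
            };
        }
    }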

  • How to solve concurrency problems with ASP.NET, Windows Workflow and ActiveRecord/NHibernate?

    - by Famous Nerd
    I have found that ActiveRecord uses the SessionScope object within the ASP.NET application, and that if the web site is read-write we can get a tug-of-war between the workflow's own data-access SessionScope and that of the ASP.NET site. I would really like the Windows Workflow runtime to use the same session object as the web site; however, they have different lifetimes. Sometimes a web request saves a very simple piece of data, which executes quickly. But if the web site kicks off a workflow process, how can that workflow make data modifications while still allowing Application_EndRequest to dispose the ASP.NET SessionScope? It is as if ownership of the SessionScope should be shared between the workflow runtime and the ASP.NET site. The ManualWorkflowSchedulerService may be the savior: if a workflow is synchronous and merely uses CallExternalMethod to interact with the host, then we could constrain all data access to the host, and the SessionScope would exist only once. This, however, won't solve the problem of a delay activity: if the delay fires, we may need to update data, in which case we'd need an isolated SessionScope, and concurrency issues may arise. This differs from SharePoint workflows, where it seems the workflow can save data from both the web and the workflow and concurrency is handled through other means. Can anyone offer suggestions on how to let the workflow manage data and play nicely with ASP.NET web sites?

    Read the article
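
    Following the poster's own ManualWorkflowSchedulerService idea, a hedged C# sketch of a host that keeps workflow execution on the ASP.NET request thread so it can share the request-scoped SessionScope; the workflow type and the way SessionScope is opened are left out as assumptions, and delay activities would still need separate handling.

    using System;
    using System.Workflow.Runtime;
    using System.Workflow.Runtime.Hosting;

    // Hedged sketch: run the workflow synchronously on the request thread with
    // ManualWorkflowSchedulerService, so its data access happens inside the same
    // ASP.NET-scoped SessionScope.
    public static class SynchronousWorkflowHost
    {
        private static ManualWorkflowSchedulerService scheduler;
        private static readonly WorkflowRuntime Runtime = CreateRuntime();

        private static WorkflowRuntime CreateRuntime()
        {
            WorkflowRuntime runtime = new WorkflowRuntime();
            scheduler = new ManualWorkflowSchedulerService();
            runtime.AddService(scheduler);
            runtime.StartRuntime();
            return runtime;
        }

        public static void Run(Type workflowType)   // call from inside the web request
        {
            WorkflowInstance instance = Runtime.CreateWorkflow(workflowType);
            instance.Start();
            scheduler.RunWorkflow(instance.InstanceId);   // blocks until idle or complete
        }
    }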
