Search Results

Search found 32922 results on 1317 pages for 'control path'.

Page 225/1317 | < Previous Page | 221 222 223 224 225 226 227 228 229 230 231 232  | Next Page >

  • Having a fork match the original repo when the original master branch can't be merged in?

    - by a2h
    The related questions that SO offers me only cover simple cases that can be solved with a pull - however, that won't work in my case. I've forked a repository that has just a master branch, and I've worked in both my master and a new branch of my own, rw-style. The owner of the original repository has committed some of my changes but not others; the black dots on the top right below represent commits from both my master and rw-style branches. I'm aware that using the fork queue is not a good idea, so I'm staying away from it. Using git pull does work, but it creates a conflict that I then need to resolve, and it also leaves duplicate history on my master branch, which doesn't look particularly pretty. The only other option I can think of right now is to create patches from the two commits I haven't yet pushed, delete my fork, re-create it from the original, and apply my patches on top of it. Is that the only solution?
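
    One alternative, sketched below on the assumption that the original repository is configured as a remote named upstream (the URL and commit hashes are placeholders), is to reset the fork's master to match the original and cherry-pick the unmerged commits, instead of deleting and re-creating the fork. Note that the final push rewrites the fork's published master history.

        git remote add upstream git://example.com/original/repo.git   # placeholder URL
        git fetch upstream
        git checkout master
        git branch backup-master              # keep the old history around, just in case
        git reset --hard upstream/master      # make master identical to the original repo
        git cherry-pick <commit1> <commit2>   # re-apply the two unmerged commits (placeholder hashes)
        git push --force origin master        # rewrites the fork's published history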

    Read the article

  • Subversion: svn status displays tons of undesired .metadata files

    - by FarmBoy
    I'm trying to set up Subversion on Ubuntu Linux. It seems to be working, except that when I made one change and tried svn status, I found about 100 files had been changed, in the .metadata directory. My ~/.subversion/config file currently contains the following line: global-ignores = *.o *.lo *.la *.al .libs *.so *.so.[0-9]* *.a *.pyc *.pyo *.rej *~ .*.swp .DS_Store What do I need to add to ignore the .metadata files? The directory under consideration is used by Eclipse for Python development using PyDev, if that matters.
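
    A minimal sketch of the change being asked about: global-ignores is a space-separated pattern list, so the Eclipse workspace directory can simply be appended. Note that global-ignores only affects unversioned files; anything already added to the repository will keep showing up in svn status until it is removed from version control.

        global-ignores = *.o *.lo *.la *.al .libs *.so *.so.[0-9]* *.a *.pyc *.pyo *.rej *~ .*.swp .DS_Store .metadata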

    Read the article

  • Continuous integration with multiple branch development

    - by ryanprayogo
    In the project that I'm working on, we are using SVN with a 'stable trunk' strategy. What that means is that for each bug that is found, QA opens a bug ticket and assigns it to a developer. The developer fixes the bug and checks the fix into a branch off trunk (let's call this the bug branch); that branch contains only the fixes for that particular ticket. When we decide to do a release, a developer merges the fixes we want to ship to the customer from the various bug branches into trunk, and we proceed with the normal QA cycle. The problem is that we use trunk as the codebase for our CI job (Hudson, specifically), so all commits to the bug branches miss the daily build until they are merged to trunk when we decide to release a new version of the software. Obviously, that defeats the purpose of having CI. What is the proper way to fix this issue?

    Read the article

  • How to manage IoC containers in tests?

    - by frosty
    I'm very new to testing and IoC containers and have two projects: MySite.Website (MVC) and MySite.WebsiteTest. Currently I have an IoC container in my website. Should I create another IoC container for my tests, or is there a way to use the same container in both?
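
    A common answer is to keep the registrations in a single bootstrapper class that both projects call; the sketch below assumes Unity purely as an example (the question doesn't name a container), and IClock/SystemClock are made-up types standing in for real registrations.

        using System;
        using Microsoft.Practices.Unity;

        public interface IClock { DateTime Now { get; } }
        public class SystemClock : IClock { public DateTime Now { get { return DateTime.Now; } } }

        // One place that knows every production registration.
        public static class ContainerBootstrapper
        {
            public static IUnityContainer Build()
            {
                var container = new UnityContainer();
                container.RegisterType<IClock, SystemClock>();
                return container;
            }
        }

    The website calls ContainerBootstrapper.Build() once at startup; a test calls the same method and overrides individual registrations with fakes (for example container.RegisterInstance<IClock>(fakeClock)), so the registrations are written once and shared.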

    Read the article

  • Introduction to Subversion for Developers

    - by wandiscoGeorge
    The second course in the series, "Introduction to Subversion for Developers", will take place on Wednesday, May 5, 2010 at 9 AM PDT. Subversion's architecture and design principles will be covered, and attendees will be introduced to using Subversion for software development. http://wandisco.com/webinar/subversion/training/intro_for_devs

    Read the article

  • Why is it necessary to put comments on check-ins? [closed]

    - by Mik Kardash
    In fact, I always have something to put in when I check in my code. However, the question I have is: is it really so necessary? Does it help that much, and how? From one point of view, comments help you keep track of the changes made with every check-in, so I would be able to analyze the changes and identify a hypothetical problem a little more quickly. On the other hand, it takes time to write useful information into a check-in. Is it worth it? What are the pros and cons of writing comments for every check-in? Is there any way to write "efficient" check-in comments?
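
    Purely as an illustration of one common convention (not something prescribed by any particular tool), an "efficient" check-in comment usually pairs a one-line summary with the ticket reference and a sentence on why the change was made. The ticket number and names below are made up.

        BUG-1234: Prevent null reference when the order list is empty

        OrderService.GetTotals assumed at least one order; added a guard so the
        report page no longer crashes for brand-new customers.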

    Read the article

  • Unity view Registered Types during debug

    - by Jon Archway
    Possibly a stupid question, but during debugging I simply want to see the types that have been registered with my Unity container. I have tried digging through the container in the watch window, but I can't seem to find what I am looking for. I am expecting there to be a list of registered types somewhere. Thanks in advance.
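
    A minimal sketch, assuming Unity 2.0 or later, where the container exposes a Registrations property; the same property can be expanded in the watch window, or dumped from code like this:

        using System.Diagnostics;
        using Microsoft.Practices.Unity;

        public static class ContainerDebug
        {
            // Writes one line per registration to the Output window.
            public static void DumpRegistrations(IUnityContainer container)
            {
                foreach (ContainerRegistration r in container.Registrations)
                {
                    Debug.WriteLine(string.Format("{0} -> {1} ({2})",
                        r.RegisteredType.Name,
                        r.MappedToType.Name,
                        r.Name ?? "[default]"));
                }
            }
        }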

    Read the article

  • How can I set the value of HTML hidden fields from ASP.NET

    - by arunendra
    Hi, I have a scenario where there are HTML hidden fields on a page that can be redirected to itself with parameters, and I also have session state. Depending on a session value, I want to set some hidden field values so that they can be picked up from JavaScript to perform certain operations. The problem is that I have no idea how to get/set values in HTML controls from ASP.NET, and I don't know whether this is possible at all. Please note that I need some way to hold data that can be set from ASP.NET and read from JavaScript; since the session cannot be used for this purpose, I need some other way. Please enlighten me! Thanks and regards, Arunendra
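
    A minimal Web Forms sketch, assuming the hidden field can be given runat="server" (the control name hdnUserRole and the session key are made up): the field is set from the code-behind and read back on the client via its rendered ClientID.

        <!-- markup: a plain HTML hidden input promoted to a server control -->
        <input type="hidden" id="hdnUserRole" runat="server" />

        // code-behind (C#): copy the session value into the hidden field
        protected void Page_Load(object sender, EventArgs e)
        {
            hdnUserRole.Value = (string)(Session["UserRole"] ?? "guest");
        }

        <script type="text/javascript">
            // client side: read the rendered value
            var role = document.getElementById('<%= hdnUserRole.ClientID %>').value;
        </script>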

    Read the article

  • Multiple plugin instance loading with MEF

    - by Dave
    In my last application, using MEF to load plugins went just fine, but now I'm running into a new issue. I have a solution for it that I explain at the end of this question, but I'm looking for other ways to do it. Let's say I have an interface called ApplianceInterface. I also have two plugins that inherit from ApplianceInterface; let's call them Blender and Processor. Now, I would like to have multiple Blenders and Processors in my application, but I am not sure how to instantiate them properly. Before, I would simply use the ImportMany attribute, and upon calling ComposeParts my application would load Blender and Processor. For example: [ImportMany(typeof(ApplianceInterface))] private IEnumerable<ApplianceInterface> Appliances { get; set; } and my Blender and Processor plugins would be attributed like this: [PartCreationPolicy(CreationPolicy.NonShared)] [Export(typeof(ApplianceInterface))] public class Blender : ApplianceInterface { ... } but what this ends up doing is populating Appliances with one Blender and one Processor. I need to be able to create an arbitrary number of Blender and Processor objects. From the documentation I understand that [PartCreationPolicy(CreationPolicy.NonShared)] is what allows MEF to create a new instance each time, but is there a similar "magical" way to create a specific number of instances of something using MEF? Up until now I've relied on [Import] and [ImportMany] to resolve the assemblies. Is my only option to use a global container and resolve the exports manually with GetExportedValue<T>()? I have tried GetExportedValue<T>() and that implementation does work fine for me, but I was curious whether there is a better, more accepted way to do it.
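
    One option, sketched here on the assumption that MEF 2 (.NET 4.5) is available, is to import factories instead of instances; each call to CreateExport() produces a fresh instance for NonShared parts. The names mirror the question, while ApplianceHost and countPerType are made up.

        using System.Collections.Generic;
        using System.ComponentModel.Composition;

        public class ApplianceHost
        {
            // MEF fills this with one factory per discovered appliance part.
            [ImportMany]
            private IEnumerable<ExportFactory<ApplianceInterface>> ApplianceFactories { get; set; }

            public List<ApplianceInterface> CreateAppliances(int countPerType)
            {
                var appliances = new List<ApplianceInterface>();
                foreach (var factory in ApplianceFactories)
                {
                    for (int i = 0; i < countPerType; i++)
                    {
                        // CreateExport() returns a new instance for NonShared parts.
                        appliances.Add(factory.CreateExport().Value);
                    }
                }
                return appliances;
            }
        }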

    Read the article

  • Tools for managing code deployment/versioning for IIS / Windows environments

    - by RizwanK
    I've got a strong background in Linux and OS X, and just left a job where I was architecting systems based on those platforms. Now I've got a Windows Server running IIS that hosts a number of different websites. Most of them are just a bunch of HTML, JS, and images, with some ASP for customer tools. (Each website has a different set of customer tools, or the same tools with minor code changes between them.) I'm also adding a development web server with the same code, but the 'bleeding edge' stuff. I need an effective way of managing changes and updates to the overall codebase (henceforth referring to the images, the HTML, and the ASP, for all the sites). When a dev (or webmaster) checks in changes, they should show up automatically on the development server, but should be pushed manually to the live server. I'd be tempted to just make the websites SVN working copies, but I'm concerned about the overhead of the web developer having to log into the server and trigger an SVN update via the command line or TortoiseSVN (and, heaven forbid, manage tags). Ideally I'd also manage IIS profile settings between the systems, but the major need is to manage the process and expose it to our ASP developer and our webmaster, both of whom are used to just FTPing the files up to the live site. So, any recommendations on tools (beyond some SVN hacking with BAT files plus teaching the webmaster how to log into the server and do updates) or workflows that would help? I even considered an RPM-type package (or some Windows equivalent, of course) to manage the live server, but that seems like a bit too much overhead. Thanks.
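
    For the 'automatically on the development server' half, one low-tech sketch (assuming the development site is an SVN working copy, and the paths, user name, and password below are placeholders) is a post-commit hook in the repository that updates that working copy after every commit, while the live server is still updated by hand:

        rem hooks\post-commit.bat in the repository on the Windows server
        @echo off
        set WORKING_COPY=D:\wwwroot\dev-site
        "C:\Program Files\Subversion\bin\svn.exe" update "%WORKING_COPY%" --non-interactive --username builduser --password secret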

    Read the article

  • ASP.NET MVC & Windsor.Castle: working with HttpContext-dependent services

    - by Igor Brejc
    I have several dependency-injection services that depend on things like the HTTP context. Right now I'm configuring them as singletons in the Windsor container in the Application_Start handler, which is obviously a problem for such services. What is the best way to handle this? I'm considering making them transient and then releasing them after each HTTP request, but what is the best way/place to inject the HTTP context into them? The controller factory, or somewhere else?
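
    One common approach, sketched below for a 2010-era Windsor version: resolve HttpContextBase from the current request and give context-dependent services a per-web-request lifetime instead of singleton. IMyService/MyService are placeholders, and the PerWebRequest lifestyle also requires Windsor's PerWebRequestLifestyleModule to be registered as an HTTP module in web.config (the exact type/assembly name depends on the Windsor version).

        using System.Web;
        using Castle.MicroKernel.Registration;
        using Castle.Windsor;

        public interface IMyService { }
        public class MyService : IMyService
        {
            public MyService(HttpContextBase context) { /* uses the current request's context */ }
        }

        public static class ContainerSetup
        {
            public static IWindsorContainer Build()
            {
                var container = new WindsorContainer();
                container.Register(
                    // a fresh HttpContextBase is pulled from the current request on each resolve
                    Component.For<HttpContextBase>()
                        .UsingFactoryMethod(() => new HttpContextWrapper(HttpContext.Current))
                        .LifeStyle.Transient,
                    // context-dependent services live for one request instead of the whole app
                    Component.For<IMyService>().ImplementedBy<MyService>()
                        .LifeStyle.PerWebRequest);
                return container;
            }
        }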

    Read the article

  • How to automatically check out a database file in a source-controlled web application?

    - by TheRHCP
    Hello, I am working on an ASP.NET web application. We are a small team (4 students) and do not have access to a dedicated server to host the database instance, so for this web application we decided to just put the database file in the App_Data folder. The problem is that our project is source controlled on TFS, so every time we open the solution and try to launch the web application, we get an exception saying that the database is read-only. That is logical, because the database file is not automatically checked out. Is there a workaround to avoid a manual check-out of the database file every time we open the solution? Thanks.
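
    One workaround sketch, assuming Team Explorer is installed so that tf.exe is available (the database file name below is a placeholder): add a pre-build event to the web project that checks the file out before every build, so it is writable by the time the site starts.

        rem Project properties > Build Events > Pre-build event command line
        "$(DevEnvDir)tf.exe" checkout "$(ProjectDir)App_Data\MyDatabase.mdf"

    If the file is already checked out, or the machine has no TFS workspace, the command may fail, so it may need to be wrapped so that a failure does not break the build.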

    Read the article

  • Versioned cloud-based social code snippet management

    - by Chapso
    It seems a lot to ask, but I'm looking for a cloud-based solution for managing code snippets. I am looking for: tags; user accounts (I want to be able to see all of my snippets on a single page); syntax highlighting; versioning (I or others should be able to edit my snippets to improve them); and a straightforward UI with minimal advertising, if any. Does anyone know of a solution that meets these requirements? If not, would anyone be interested in something like this? As a software engineer, after step zero (does it already exist?), I'm perfectly willing to go on to step one (would other people use it? If so, make it).

    Read the article

  • TortoiseSVN - When I delete a folder I run into trouble

    - by Mendy
    A lot of times I need to delete a folder and copy another one with the same name into its place, and this always leads to trouble. What is the best way to do this? The error I get when I try to commit: Error: Directory '..\trunk\bin\MVCContrib\InputBuilderTemplates\.svn' containing working copy admin area is missing. Error: Please execute the 'Cleanup' command. The error I get when I try to clean up: '..\trunk\bin\MVCContrib\InputBuilderTemplates\.svn' is not a working copy directory.
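
    A sketch of one way to avoid this with pre-1.7 working copies, where every subfolder carries its own .svn directory and deleting the folder in Explorer also destroys the working-copy metadata: let Subversion do the delete and the add instead of replacing the folder on disk (paths are placeholders).

        svn delete trunk\bin\MVCContrib\InputBuilderTemplates
        svn commit -m "Remove old InputBuilderTemplates" trunk
        xcopy /E /I C:\incoming\InputBuilderTemplates trunk\bin\MVCContrib\InputBuilderTemplates
        svn add trunk\bin\MVCContrib\InputBuilderTemplates
        svn commit -m "Add replacement InputBuilderTemplates" trunk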

    Read the article

  • No more memory available in Mathematica: fitting the parameters of a system of differential equations

    - by user1058051
    I encountered a memory problem in Mathematica when I tried to process my experimental data. It's a system of two differential equations, and I need to find the most suitable parameters. Unfortunately I am not a pro in Mathematica, so the program uses a lot of memory when the parameter epsilon is more than 0.4; when it is less than 0.4, the program works properly. The command 'historylength = 0' and attempts to reduce AccuracyGoal and WorkingPrecision didn't help. I can't use 'Clear Cache', because there isn't a loop. I'm trying to understand what mistakes I made and how I can limit the memory usage. I have already bought extra RAM (it's now 4 GB), and there are no free memory slots left on the motherboard.

        Remove["Global`*"];
        T = 13200; L = 0.085; e = 0.41; v = 0.000557197; q = 0.1618; C0 = 0.0256; R = 0.00075;
        data = {{L,600,0.141124587},{L,1200,0.254134509},{L,1800,0.342888644},
                {L,2400,0.424476295},{L,3600,0.562844542},{L,4800,0.657111356},
                {L,6000,0.75137817},{L,7200,0.815876516},{L,8430,0.879823594},
                {L,9000,0.900771775},{L,13200,1}};
        model[(De_)?NumberQ, (Kf_)?NumberQ, (Y_)?NumberQ] :=
          model[De, Kf, Y] = yeld /. Last[Last[
            NDSolve[{
              v (Ci^(1,0))[z,t] + (Ci^(0,1))[z,t] ==
                -((3 (1-e) Kf (Ci[z,t]-C0))/(R e (1-(R Kf (1-R/r[z,t]))/De))),
              (r^(0,1))[z,t] ==
                (R^2 Kf (Ci[z,t]-C0))/(q r[z,t]^2 (1-(R Kf (1-R/r[z,t]))/De)),
              (yeld^(0,1))[z,t] == Y*(v e Ci[z,t])/(L q (1-e)),
              r[z,0]==R, Ci[z,0]==0, Ci[0,t]==0, yeld[z,0]==0},
             {r[z,t],Ci[z,t],yeld}, {z,0,L}, {t,0,T}]]]
        (* fit De, Kf and Y to the data within the given bounds *)
        fit = FindFit[data,
          {model[De, Kf, Y][z, t],
           {Y > 0.97, Y < 1.03, Kf > 10^-6, Kf < 10^-4, De > 10^-13, De < 10^-9}},
          {{De, 7*10^-13}, {Kf, 10^-5}, {Y, 1}}, {z, t}, Method -> NMinimize]
        data = {{600,0.141124587},{1200,0.254134509},{1800,0.342888644},
                {2400,0.424476295},{3600,0.562844542},{4800,0.657111356},
                {6000,0.75137817},{7200,0.815876516},{8430,0.879823594},
                {9000,0.900771775},{13200,1}};
        YYY = model[De /. fit[[1]], Kf /. fit[[2]], Y /. fit[[3]]];
        Show[Plot[Evaluate[YYY[L,t]], {t,0,T}, PlotRange->All],
             ListPlot[data, PlotStyle->Directive[PointSize[Medium], Red]]]

    The link to the .nb file: http://www.4shared.com/folder/249TSjlz/_online.html

    Read the article
