Search Results

Search found 3100 results on 124 pages for 'svn externals'.

Page 95 of 124

  • Mercurial repository narrow clone?

    - by Berry Langerak
    Hi. I'm currently in the process of moving from Subversion to Mercurial, and I have to say I don't regret that decision. However, when trying to convert my project, I ran into a problem with Mercurial that I can't seem to solve. I have two distinct projects: one is a framework, and the other is an application that relies on that framework. Here's what the repositories look like: The Framework repository: docs/ deploy/ lib/ tests/ The Application repository: application/ config/ lib/ tests/ www/ What I'd like is for the application's lib directory to contain a copy of the framework's lib/ directory. I used to do this using svn:externals. Now, I am aware that Mercurial supports the concept of subrepositories, but that doesn't seem like the "correct" solution, as it doesn't actually pull in the lib/ directory like I wanted: you still have to pull and push changes manually. Plus, once you clone the framework repository, you get all of it, not just the lib/ directory. I only need the lib/ directory, not the tests or the docs. I have thought up two different solutions to this problem, but I wonder which is the best. The first would be to clone the framework into a different directory altogether and create a symlink in the application's lib/ directory which points to the framework's lib/ directory. Putting the symlink in .hgignore should make sure all is well, I think? That means you could edit the framework's code and commit that, and you could edit the application's code and commit that, too. The other option is to have multiple repositories. The framework gets pulled as a whole, which means you get the docs/, deploy/, tests/ etc. directories, which are not needed for using the framework. I thought maybe creating a repository purely for the library might be a solution, although I sincerely doubt it, as the unit tests are very dependent on the library itself. Does anyone know a decent solution for this problem?
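    For reference, if subrepositories turn out to be acceptable after all, the mapping lives in a .hgsub file at the root of the application repository; a minimal sketch (the repository URL is hypothetical) would be:

        lib = https://hg.example.com/framework-lib

    This still requires the framework's lib/ to live in its own repository, which is exactly the trade-off described above.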

    Read the article

  • Linking Error Building 64bit Qt app on 32bit XP machine.

    - by photo_tom
    I'm trying to build a 64-bit version of my application (and yes, I really do need the memory) on my 32-bit XP dev box, for production testing on our Vista 64 server. Previously, I built the Qt 4.6.2 DLLs in 64-bit mode without any errors; that step went very smoothly. Just to get started on production builds, I'm trying to rebuild Qt's Star Delegate demo in 64-bit mode. I converted the app from 32-bit to 64-bit by changing the application configuration and adjusting the libraries to the 64-bit versions. Now, when I go to link, I get the following error:
        1>------ Build started: Project: stardelegate, Configuration: Release x64 ------
        1>Linking...
        1>MSVCRT.lib(crtexew.obj) : error LNK2001: unresolved external symbol WinMain
        1>release64\stardelegate.exe : fatal error LNK1120: 1 unresolved externals
    Suggestions? Edit: after some more searching, I discovered that if I link as a console app it will build and run, but not as a Windows app. And I don't have this problem in 32-bit mode.
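    A plausible culprit (an assumption, since the project settings aren't shown) is that the 64-bit configuration is still linking the 32-bit qtmain.lib, the small static library that supplies WinMain for Windows-subsystem Qt applications; console builds don't need it, which would explain why the console variant links. A quick check is to point the linker at the 64-bit Qt build explicitly, roughly:

        Additional Library Directories:  C:\Qt\4.6.2-x64\lib        (hypothetical path to the 64-bit Qt build)
        Additional Dependencies:         qtmain.lib QtCore4.lib QtGui4.lib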

    Read the article

  • Cannot access Network Shares on Windows Server 2008 running VisualSVN Server

    - by mwillmott
    Hi, I have installed VisualSVN Server on Windows Server 2008. The server is part of a domain but is not the domain controller; it is just a data server and now an SVN server. VisualSVN uses port 80 and can only be accessed from inside the network (I do that by going to the DNS name of the server). However, ever since I installed this, other computers on the network can no longer access the shared folders on the server, EXCEPT the domain controller, which has no problems accessing the shares. I am stumped. I am guessing it is something to do with Apache running and not using host headers (or whatever the Apache equivalent may be), but just being bound to the server's DNS name or IP. Any suggestions?
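    Since Windows file sharing runs over SMB (TCP 445 and 139) rather than port 80, one quick check (an assumption, not a confirmed fix) is whether those ports are still listening and which process owns them after the install, for example:

        netstat -ano | findstr ":445 :139 :80"

    If 445/139 are still listening, the next thing to rule out would be a firewall or network-profile change made around the same time as the install.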

    Read the article

  • TextMate GetBundle stopped working in Snow Leopard

    - by Jauder Ho
    It seems that I am no longer able to Get Bundles using TextMate after upgrading to Snow Leopard. I get the following error message. Googling shows no solutions. I have updated to the latest GetBundle via svn to no avail.
        /Applications/TextMate.app/Contents/SharedSupport/Support/lib/osx/plist.bundle: dlopen(/Applications/TextMate.app/Contents/SharedSupport/Support/lib/osx/plist.bundle, 9): no suitable image found. Did find: (LoadError)
        /Applications/TextMate.app/Contents/SharedSupport/Support/lib/osx/plist.bundle: no matching architecture in universal wrapper - /Applications/TextMate.app/Contents/SharedSupport/Support/lib/osx/plist.bundle
        from /Users/jauderho/Library/Application Support/TextMate/Bundles/GetBundles.tmbundle/Support/getBundles.rb:4
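    The "no matching architecture" wording suggests that Snow Leopard's Ruby is running 64-bit while the bundled plist.bundle only ships 32-bit (i386/ppc) images; one way to confirm that, before hunting for a rebuilt bundle, is to inspect the binary:

        file /Applications/TextMate.app/Contents/SharedSupport/Support/lib/osx/plist.bundle

    The output lists the architectures actually present in the universal binary.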

    Read the article

  • Best way to automatically synchronize files between Linux and Windows

    - by Gregory
    My first choice was rsync, but it caused some issues and is too manual. My second choice, currently under evaluation, is Unison. Are there any other good options for bi-directional auto-syncing? The syncing tool cannot add its own files to the directories to be synced, which removes CVS/SVN as a choice; plus they are too manual. The requirements are: a user-level program on both sides, no root account access available; only scanning on Linux (on Windows it could be a virtual drive/path); very fast and efficient, like rsync. Some other requirements: the machines are not on the same network, and the files cannot fall into the wrong hands, nor can they be handled by third parties, which pretty much excludes all online storage sites.
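    For the Unison evaluation, a minimal profile kept in ~/.unison/ on the Linux side (hostnames and paths below are hypothetical) would look roughly like this; Unison keeps its state under ~/.unison rather than inside the replicas, which satisfies the "no extra files in the synced directories" requirement:

        # ~/.unison/work.prf
        root = /home/greg/work
        root = ssh://greg@winbox.example.com//cygdrive/c/work
        auto = true
        batch = true
        fastcheck = true
        prefer = newer

    It would then be run as "unison work" from cron or a scheduled task.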

    Read the article

  • Slow git clone and fetch

    - by EtienneT
    I set up gitosis on a Linux server following this tutorial: http://scie.nti.st/2007/11/14/hosting-git-repositories-the-easy-and-secure-way We are using git on our Windows machines with TortoiseGit and msysgit. Pushing changes to the server is pretty fast, but when we want to clone or fetch changes from the remote server, it begins really fast (800 KB/s) and then drops quickly to around 3 to 30 KB/s, and it can take forever to update. git pull for small updates is fast, but as soon as we have to download anything larger than a few MB, it is slow. We are switching from SVN to git and this is holding us back from using git full time. Thanks!
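    Nothing above pinpoints the bottleneck, but two cheap things to try are repacking the repository on the server (so clones are served from a well-packed object store rather than packs built on the fly) and shallow-cloning where full history isn't needed:

        # on the server, inside the bare repository
        git gc --aggressive

        # on a client that only needs recent history (URL is hypothetical)
        git clone --depth 1 git@server:myproject.git

    If the slowdown persists even for a well-packed repository, it is worth timing a plain scp of a large file over the same link to rule out the network or the SSH setup itself.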

    Read the article

  • Configure FastCGI for Python

    - by Rob
    I have nginx running on a VM and I want to run a Trac site, so I need to run a Python FastCGI server, but I cannot tell which server to use. I have found the following: Lighttpd's spawn-fcgi (http://redmine.lighttpd.net/projects/spawn-fcgi), but this seems to require that you compile lighttpd just to get the fcgi server, which is weird. fcgi.py (http://svn.saddi.com/py-lib/trunk/fcgi.py), but this one seems to be deprecated; at the very least it is poorly documented. flup (http://trac.saddi.com/flup), which comes with dependencies on Ubuntu (python-cheetah, python-mysqldb, python-webpy) that seem unnecessary; also poorly documented. Are there any recent guides for setting this up? Trac's own FastCGI setup page seems to miss some steps.
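    For what it's worth, flup is the one usually paired with nginx despite the packaging noise; a minimal sketch of serving Trac through it (the environment path and port are assumptions, and nginx would fastcgi_pass to the same address) looks like:

        # run-trac-fcgi.py -- minimal sketch, not a drop-in replacement for Trac's own frontends
        import os
        from flup.server.fcgi import WSGIServer       # flup's FastCGI-to-WSGI bridge
        from trac.web.main import dispatch_request    # Trac's WSGI application

        os.environ['TRAC_ENV'] = '/var/trac/myproject'   # hypothetical Trac environment path
        WSGIServer(dispatch_request, bindAddress=('127.0.0.1', 8000)).run()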

    Read the article

  • Invalidating unused ssh keys

    - by JH
    I am using one ssh account for all my Subversion users. They send me their public keys and I put them in the .ssh/authorized_keys file of the svn account, and then they can check out the code from Subversion over an ssh tunnel. So far everything works fine. The problem, though, is that I want to invalidate keys that have not been used for some time (say one month). Does anyone know a way to make sshd log the public key when a user signs in? Thanks.
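    One approach that should work with a reasonably recent OpenSSH is to raise sshd's log level; at VERBOSE it records the fingerprint of the key that authenticated each login, which can then be matched against the fingerprints of the keys on file:

        # /etc/ssh/sshd_config
        LogLevel VERBOSE

        # list the fingerprints of the keys in the svn account, to compare with the auth log
        ssh-keygen -l -f ~svn/.ssh/authorized_keys

    Any fingerprint that never shows up in the log over a month is a candidate for removal.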

    Read the article

  • ForwardAgent in Jenkins

    - by r_2
    I'm trying to enable ForwardAgent in the "Publish over SSH" Jenkins plugin. This would allow Jenkins to execute deployments, rsyncs and svn+ssh checkouts on remote servers, but there's no option for this in the GUI. ForwardAgent is set to yes in /etc/ssh/ssh_config and in /var/lib/jenkins/.ssh/config, but when Jenkins jobs log in over ssh, the remote session does not have the key loaded in the agent ("Could not open a connection to your authentication agent."). Is there a way to force ForwardAgent, or a better way to do this (via a Jenkins slave)? Thanks for any ideas, much appreciated!
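    One workaround, assuming the plugin's own transport simply never consults ~/.ssh/config, is to skip Publish over SSH for the steps that need forwarding and have the job call the system ssh client, which does honour agent forwarding:

        # "Execute shell" build step (host and command are hypothetical)
        ssh -A deploy@remote.example.com 'svn checkout svn+ssh://repo.example.com/srv/svn/project trunk'

    This requires the Jenkins user on the node running the job to have an ssh-agent with the key loaded (the SSH Agent plugin can provide one).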

    Read the article

  • Remote file access.

    - by Rob Rob
    Hi, We need to provide remote (read/write) access to a number of files on our network for several users (some technical, some non-technical) who will be running Windows. The non-technical users will need to be able to access their files in an easy-to-use manner. From previous experience, we could do this with: (some sort of) VPN; SSH plus something like Dokan (I've only previously done this on Linux with sshfs); WebDAV; or FTP. VPN and SSH access are more open than we need at present, so I'm leaning towards WebDAV; however, I only have limited experience of it (setting up an SVN server several years ago), but my understanding is that users can access it through Windows Explorer. FTP I haven't had much experience of, as I've always used SFTP via ssh, but I'd imagine we could make this work in a similar way to ssh. So my question is: have I missed any obvious candidates for this task, and if WebDAV is (or isn't) suitable, what are the security implications of using it for this (obviously HTTPS will be used for the transfers, etc.)? Thanks, Rob.
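    If WebDAV does win out, a minimal Apache mod_dav setup behind HTTPS is not much configuration; a sketch with placeholder paths and basic authentication:

        DavLockDB /var/lock/apache2/DavLock
        Alias /files /srv/shared-files
        <Directory /srv/shared-files>
            Dav On
            SSLRequireSSL
            AuthType Basic
            AuthName "File share"
            AuthUserFile /etc/apache2/dav.passwd
            Require valid-user
        </Directory>

    The usual security caveats are Basic auth (acceptable only over HTTPS), files on disk being owned by the Apache user rather than per-user accounts, and Windows' built-in WebDAV client being picky about certificates and authentication schemes.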

    Read the article

  • Problem modifying read-only files on Samba NAS

    - by Felix Dombek
    Hi, I have files on a Samba server on the local company network and I access them from a Windows Vista machine. Usually, if I want to delete a directory containing write-protected (read-only) files, Windows would ask "This file is read-only, are you sure?". However, when I do this with a directory on the server, Windows just tells me that I need permissions. The workaround is to remove the read-only flag from the directory and all contained files and then delete. However, I have a TortoiseSVN-versioned directory on the server, and the .svn dirs contain read-only files. I need to remove the read-only flags from the directory before every commit, or else it fails. This is quite distressing and shouldn't be so. Does someone know how to attack this problem? (If someone knows how to tell TortoiseSVN not to make its files read-only, that would probably be OK as well) ... Thanks!
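    Samba has a share-level option for exactly this Windows-vs-POSIX mismatch; a sketch of the relevant smb.conf section (the share name and path are placeholders) would be:

        [projects]
            path = /srv/projects
            delete readonly = yes

    With "delete readonly = yes", Samba allows files carrying the read-only attribute to be deleted, matching the prompt-and-delete behaviour described above for local folders.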

    Read the article

  • Installing Windows Server 2003 on AMD environment

    - by santhosh kumar
    Hi all, In our organisation we have 25 computers and we are trying to set up Windows Server 2003. We are planning to configure: Active Directory, NAT, a DNS server, Visual Studio Team Foundation Server, Subversion (SVN), Trac (bug tracking tool) and an FTP server. Our hardware configuration is an AMD Athlon 64 X2 dual-core processor (5600+, 2.60 GHz), an Asus motherboard (M2N series), 4 GB of Transcend RAM (800 MHz) and a 500 GB hard disk (RAID enabled). But my colleague advises me that AMD is not suitable for server platforms and that we should use an Intel environment. They also say we can't install all of these services on one server. I'm confused about what to do. Is it really true that the above services can't be installed on an AMD computer? Thanks...

    Read the article

  • sudoers file cleanup and consolidation tool/script

    - by Prashanth Sundaram
    Hello all, I am curious to know what other folks out there might be using to keep the sudoers file in a sane state. I am looking for a tool that removes redundant entries and overlapping permissions, and/or presents the sudoers file in an organized way (for example sorted by permissions/users/aliases). I use SVN and a configuration-management tool to version-control and deploy it, respectively. Is there any add-on/plugin you would recommend/use?
        User_Alias  RT1123       jappleseed, sjobs
        Host_Alias  HOST_RT1123  wdc101.domain.com, wdc104.domain.com
        Cmnd_Alias  .....
    Our sudoers file is simple but has a lot of entries and it needs to be cleaned up. Does anyone know/have a tool/script to fix/present it? Thanks!
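    Whatever script ends up doing the reorganizing, the result can at least be validated with visudo's check mode before it is deployed:

        # syntax-check a candidate sudoers file without installing it
        visudo -c -f /path/to/sudoers.candidate

    That catches syntax breakage; semantic overlap (one entry shadowing another) still needs a human or a custom script.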

    Read the article

  • VisualSVN Server won't work with AD, will with local accounts

    - by frustrato
    We decided recently to switch VisualSVN from local users to AD users, so we could easily add other employees. I added myself, gave Read/Write privileges across the whole repo, and then tried to log in. Whether I'm using TortoiseSVN or the web client, I get a 403 Forbidden error: "You don't have permission to access /svn/main/ on this server." I Googled a bit, but only found mention of phantom groups in the authz file; I don't have any of those. Any ideas? It works just fine with local accounts. EDIT: I don't know why I didn't try this earlier, but adding the domain before the username makes it work, i.e. MAIN/Bob. This normally only works when there are conflicting usernames (one local, one in AD), but for whatever reason it works here too. Kinda silly, but I can live with it.

    Read the article

  • Using TortoiseHg's Repository explorer

    - by krebstar
    Hi, I'm coming from a TortoiseSVN background and decided to give TortoiseHg a try. One thing I got really used to with TortoiseSVN was the SVN Repo-Explorer, which worked quite similarly to Windows Explorer. However, when I tried TortoiseHg's Repository Explorer, what I got was something else: it was more like TortoiseSVN's Show Log. It showed me what the recent commits were and what files were changed, and even had nifty graphs. However, I'm still left wanting TortoiseSVN's Repo-Explorer. Does TortoiseHg have anything like this? How am I supposed to poke around the repository if I can only view changed stuff? Thanks, kreb
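    Two stand-ins for browsing a repository without walking the working copy, assuming command-line access to it, are Mercurial's manifest listing and its built-in web interface:

        hg manifest -r tip     # list every file tracked in a given revision
        hg serve -p 8000       # built-in web UI with a browsable file listing per revision

    Neither is a drag-and-drop explorer, but the web UI comes closest to the Repo-Explorer workflow described above.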

    Read the article

  • I can't uninstall Git

    - by Tom
    I was researching Git, so I downloaded the Windows version to test it out on a repository on GitHub. After about 30 minutes I couldn't work out how to use it, so I decided I probably wouldn't need a distributed repository (as our projects aren't that big) and went back to what I know: SVN. I thought I had uninstalled all the Git-related stuff I'd put on my PC, but I now have an irritating problem where, if I open any folder, I get an error message saying: Hello [ERROR] Could not find git path As you can imagine, this is a real pain. Does anyone have any ideas on how to fix it?
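    An error on every folder open points at a leftover Explorer shell extension rather than Git itself; one way to see which context-menu handlers are still registered (and spot a Git-related one to remove via its uninstaller) is:

        reg query "HKCR\Directory\shellex\ContextMenuHandlers" /s

    The equivalent keys under HKCR\Folder and HKCR\*\shellex are worth checking too; the exact culprit here is an assumption, since the message doesn't name its source.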

    Read the article

  • Capistrano deploying to different servers with different authentication methods

    - by marimaf
    I need to deploy to 2 different servers, and these 2 servers have different authentication methods (one is my university's server and the other is an Amazon Web Services (AWS) instance). I already have Capistrano running for my university's server, but I don't know how to add the deployment to AWS, since for that one I need to add ssh options, for example to use the .pem file, like this:
        ssh_options[:keys] = [File.join(ENV["HOME"], ".ssh", "test.pem")]
        ssh_options[:forward_agent] = true
    I have browsed Stack Overflow and no post mentions how to deal with different authentication methods (this and this). I found a post that talks about 2 different keys, but it refers to a server and a git repository, both using different .pem files, which is not my case. I also got to this tutorial, but couldn't find what I need. I don't know if this is relevant to what I am asking: I am working on a Rails app with Ruby 1.9.2p290 and Rails 3.0.10, and I am using an SVN repository. Any help is welcome. Thanks a lot
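    One way to keep the two authentication schemes apart (a sketch assuming Capistrano 2 with the capistrano-ext multistage extension; host names are hypothetical) is to give each target its own stage file and set ssh_options only where they apply:

        # config/deploy.rb
        require 'capistrano/ext/multistage'
        set :stages, %w(university aws)
        set :default_stage, 'university'

        # config/deploy/university.rb -- uses whatever auth already works today
        server 'www.uni.example.edu', :app, :web, :db, :primary => true

        # config/deploy/aws.rb -- key-based auth for the EC2 box
        server 'ec2-xx-xx-xx-xx.compute-1.amazonaws.com', :app, :web, :db, :primary => true
        ssh_options[:keys] = [File.join(ENV['HOME'], '.ssh', 'test.pem')]
        ssh_options[:forward_agent] = true

    Deploys then become "cap university deploy" and "cap aws deploy", each picking up its own ssh_options.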

    Read the article

  • development server?

    - by ajsie
    For a project there will be me and one more programmer developing a web service, and I wonder what the development environment should be like. We need central storage (documents, pictures, business materials, etc.), file version handling, LAMP (for testing the web service), and so on. I have never set up an environment for this before and would like suggestions from experienced people on which tools to use for effective collaboration. What crossed my mind: separate applications: Google Wave (for communication back and forth, setting up guidelines, other information), TeamViewer (desktop sharing), Skype (calling); and a VPS (Ubuntu server) running: SVN (version tracking), FTP (central storage), LAMP (testing the web service), SSH (managing the VPS). Is this an appropriate programming environment? And regarding the VPS, is it best practice to use ONE VPS for all the tasks listed above? All suggestions and feedback are welcome!

    Read the article

  • Why use multiple partitions on a RHEL server?

    - by Jakobud
    I'm about to reformat and reinstall CentOS onto an old server. The server runs on a modest 30-node small-business network and has a variety of responsibilities, including MySQL, a Samba share, DHCPd and SVN/Trac. The old sysadmin had this server set up with almost a dozen different partitions for various things. I'm trying to understand what the advantages of multiple partitions are, as opposed to just one filesystem at /. Speed? Flexibility? Security? It seems like if you misjudge the necessary size for any given partition and it ends up filling up too fast, it requires a sysadmin to go in and expand the partition, etc. It seems like it would be easier if everything was just one flat / filesystem, but I'm sure there are some advantages I'm not aware of. The server is currently running a handful of HDDs raided to ~2 TB (RAID 0).
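    One concrete advantage worth spelling out: separate filesystems can carry separate mount options, and a runaway log or upload can only fill its own partition rather than the root filesystem. A hypothetical /etc/fstab excerpt illustrating the idea:

        /dev/vg0/tmp    /tmp    ext3    noexec,nosuid,nodev    0 2
        /dev/vg0/var    /var    ext3    defaults               0 2
        /dev/vg0/home   /home   ext3    nosuid,nodev           0 2

    Using LVM logical volumes instead of raw partitions also softens the "misjudged the size" problem, since volumes can be grown later.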

    Read the article

  • Differences in memory consumption between two identical D7 sites?

    - by aendrew
    I'm running Drupal on a news site that has a lot of different View blocks on the front page (~5 total, all cached). In trying to reduce the memory footprint of the site, I've checked out source from SVN to a local development install to try to convert some of those blocks into more optimized code. Here's the weird thing: the Devel module lists memory consumption at 50 MB on the production site (running nginx, PHP 5.2.17, XCache and Zend Optimizer) but only 14 MB on my development site (running Apache 2, PHP 5.2.13 and XCache). These are nearly identical versions of the same site; frankly, the production site should use even less memory, as I've disabled some of the modules running on the dev site. Any idea why this might be the case?
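    Since the code base is the same, the first suspects are the two PHP stacks themselves; a quick, hedged check rather than a definitive answer is to compare what each PHP actually loads and how it is configured (run on both machines):

        php -v
        php -m | sort
        php -i | grep -E 'memory_limit|Zend Optimizer|XCache'

    Differences in loaded Zend extensions and ini settings can change what the Devel module reports as peak memory.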

    Read the article

  • Are SANs unreliable?

    - by chaos
    So at the place where I wear one of my various hats, this one representing a development rather than admin role, there's been an initiative to move to SANs. So far, I have been spectacularly unimpressed. First it was this behavior where, when MySQL databases are on the SAN, the first few tables that anything tries to hit after the system boots come up as nonexistent and MySQL has to be restarted before it realizes they're actually there. Then today, on multiple systems (including the primary SVN repository, ever-so-wonderfully) we get SAN mounts spewing IO errors and the filesystems going into read-only, which is the kind of behavior I expect from directly mounted naked disks, not fault-tolerant managed storage. Right now, I'm at the point where if I were putting together a project and somebody said "hey we should use SANs", my response would be "GTFO". So basically I want to know whether my experience is typical or even common, or whether I'm having some kind of freakishly bad luck with SANs. The systems these SANs are attached to are all CentOS machines, if that's relevant.

    Read the article

  • Centralized sudo sudoers file?

    - by Stefan Thyberg
    I am the admin of several different servers, and currently there is a different sudoers file on each one. This is getting slightly out of hand, as quite often I need to give someone permission to do something with sudo but it only gets done on one server. Is there an easy way of editing the sudoers file just on my central server and then distributing it to the other servers, by SFTP or something like that? I'm mostly wondering how other sysadmins solve this problem, since the sudoers file doesn't seem to be remotely accessible with NIS, for example. The operating system is SUSE Linux Enterprise Server 11 64-bit, but it shouldn't matter. EDIT: Every machine will, for now, have the same sudoers file. EDIT2: The accepted answer's comment was the closest to what I actually went ahead and did. I am right now using an SVN-backed Puppet installation and, after a few headaches, it's working very well.
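    To make the Puppet route concrete, a minimal sketch of a managed sudoers file (module name and source path are hypothetical; validate_cmd needs a reasonably recent Puppet) looks like:

        file { '/etc/sudoers':
          ensure       => file,
          owner        => 'root',
          group        => 'root',
          mode         => '0440',
          source       => 'puppet:///modules/sudo/sudoers',
          validate_cmd => '/usr/sbin/visudo -c -f %',
        }

    The validation step matters: a syntactically broken sudoers file pushed to every machine at once would lock sudo out everywhere.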

    Read the article

  • Possible to get IIS on Windows Server 2008 R2 to "port-forward" port 80 for certain domains to other

    - by Lasse V. Karlsen
    I have IIS set up on my server, but also two Apache instances (other products which come with their own servers and cannot be integrated into IIS). Is it possible for me to "port-forward" certain domains on port 80 (which IIS handles) to those other ports? For instance: www.vkarlsen.no to IIS, svn.vkarlsen.no to port 81 on the same machine, teamcity.vkarlsen.no to port 82 on the same machine. Or do I just need to set up those domains and redirect to the correct port? I'd like the domain name and URL to be transparent to the user, but perhaps that won't work. Can anyone shed some light on this?
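    IIS can act as the front door for those host names if the URL Rewrite module and Application Request Routing (ARR, with its proxy feature enabled) are installed; a sketch of the web.config for an IIS site bound to svn.vkarlsen.no, proxying to Apache on port 81 (the port is taken from the question, the rest is an assumption about the setup):

        <configuration>
          <system.webServer>
            <rewrite>
              <rules>
                <rule name="Proxy to Apache on 81" stopProcessing="true">
                  <match url="(.*)" />
                  <action type="Rewrite" url="http://localhost:81/{R:1}" />
                </rule>
              </rules>
            </rewrite>
          </system.webServer>
        </configuration>

    Because IIS terminates the connection and proxies it, the host name and URL stay transparent to the user, unlike a plain HTTP redirect to :81.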

    Read the article

  • Are whole VM images backed up on Amazon EC2/S3?

    - by John
    I've been trying to get my head around Amazon Web Services as a VPS provider. My understanding is that an EC2 instance running Windows is basically a Windows VM, very similar to renting a VPS from a more traditional hosting provider. I don't want complex backups, either to administer or to restore; if my restore involves installing SVN, MySQL, Jira, etc. on a new box before I can even try to restore the backup, then it's not great for me. What I really want is a service which backs up my entire VM: if the PC running the VPS dies, the VM image is installed on a new PC and off we go again. With Amazon being all about flexibility and elasticity, I wondered if they have this service? I can't figure it out from reading their docs.
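    There is no automatic whole-VM backup by default, but the building block for one is creating your own Amazon Machine Image (AMI) from the running instance; for an EBS-backed instance that is a single call (instance ID and name below are placeholders), and the resulting AMI can later be launched on fresh hardware:

        aws ec2 create-image --instance-id i-0123456789abcdef0 --name "full-server-backup"

    Scheduling that call (or the equivalent "Create Image" action in the console) effectively gives the whole-VM backup described above, at the cost of EBS snapshot storage.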

    Read the article
