Search Results

Search found 25998 results on 1040 pages for 'home folder'.

Page 206/1040

  • Branching and Merging Improvements in TFS2010

    - by jehan
    Introducing the concept of “first class branches” is a significant improvement in the 2010 release with respect to version control. Not only does it help to distinguish between folders and branches, it also enables branch visualizations. Let us see the improvements in detail.
    · In TFS2008 you cannot tell which folders are branches: all folders look the same and share the same folder icon. In TFS2010 there is a new icon that shows which folders are branches.
    · TFS2008 has no visual means to manage branches: there is no way to identify which branches are related or what the relationship is. In TFS2010 you have visual tools to see the branch hierarchy. To see a branch hierarchy, right-click the branch and choose Branching and Merging –> View Hierarchy.
    · In TFS2008 there is no option to track the path of a change between branches: if a merge was made into a branch, you cannot track which branch the merge came from. TFS2010 provides tools that show the path of a change between branches, and you can also see where a change was added on a timeline. To track a change, do the following:
    Step 1: Right-click the branch and click View History.
    Step 2: Choose a changeset to track and click the “Track Changeset” button.
    Step 3: Choose the branches that will be in the view and click “Visualize”. In the resulting visualization you can see that changesets 108, 109, 110 and 119 were merged from Main to the Release_1.0 branch, and that Release_1.0 was then branched to Dev1.0.
    Step 4: You can also see the merges on a timeline by clicking the “Timeline Tracking” button.
    Creating New Branches: In TFS2010 the creation of branches has been streamlined compared to 2008. In 2008, creating a new branch was like every other action in the system – changes were pended on the client and then checked in to the server. Because of this, creating a new branch in TFS2008 was a time-consuming process. In TFS2010 the pending-changes step has been bypassed and branch creation is performed entirely on the server. With this approach, the round trip of downloading a copy of each file on the branch and then uploading each file again has been eliminated. Note: in TFS2010 the new branch is created and committed as a single operation on the server. No pending changes are created, no check-in is required, and the operation cannot be cancelled.
    Manage Branch Permissions: The properties view for branches is also different from that of ordinary folders or files; it contains some metadata for the branch, relationship information, and permissions for the branch. In TFS2008, any user with Check-Out and Check-In permissions can create a branch; in TFS2010 you can control the permissions for branches using Manage Branch Permissions.
    Reparent option in TFS2010: In TFS2008, if we have two branches that do not have a parent-child relationship and we want to perform a merge between them, we have to perform a baseless merge using the tf.exe command line. I have two branches, Release_1.0 and Dev1.0_F2, with no relationship between them; that is why, when I click the Merge option on Release_1.0, the Target Branch list does not show the Dev1.0_F2 branch.
    Let us see what we can do about this in TFS2010. First perform a baseless merge to establish a relationship between the parent branch and the child branch. It will only merge the folder, not its contents. Baseless merges are performed from the VS2010 command prompt: tf merge /baseless <ParentBranch> <childBranch>. Check in your pending changes; this creates the link between the branches, but the relationship is still not complete. Now select the child branch in Source Control Explorer and, from the File menu, choose Source Control –> Branching and Merging –> Reparent. In the dialog box, choose the appropriate branch as the new parent. Click Reparent, then go to the parent branch and click Merge. You will now see that the Dev1.0_F2 branch has been added to the Target Branch option.
    Converting Folders to Branches and Branches to Folders: You can convert any folder to a branch from the context menu by right-clicking the folder –> Branching and Merging –> Convert to Branch. Similarly, you can convert a branch back to a folder using the Convert to Folder option available in the File menu (File –> Source Control –> Branching and Merging –> Convert to Folder). This option is not available in the context menu.
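    As a minimal sketch of the baseless merge and check-in described above (the server paths $/MyProject/Release_1.0 and $/MyProject/Dev1.0_F2 are hypothetical placeholders), run from a VS2010 command prompt:

        tf merge /baseless /recursive $/MyProject/Release_1.0 $/MyProject/Dev1.0_F2
        tf checkin /comment:"Baseless merge to relate Release_1.0 and Dev1.0_F2"

    After the check-in, the Reparent step in the IDE completes the relationship as described above.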

    Read the article

  • Backup files with rsync: error 23

    - by maria
    Hi, I'm trying to make a backup of my /home to transfer all data from one computer to another. I wanted to save the backup on the same computer and then transfer it to the other one. For safety reasons, I'm trying to learn how it works on the computer without a lot of data (the new one), to be sure I won't delete something instead of copying it. I've run in a terminal:
    sudo rsync -avz /home/maria /home/guest/backup
    and I got as the result:
    sent 58797801 bytes  received 23050 bytes  4705668.08 bytes/sec
    total size is 100202958  speedup is 1.70
    rsync error: some files/attrs were not transferred (see previous errors) (code 23) at main.c(1060) [sender=3.0.7]
    I've tried once again, with the same result. I have no idea which files were not transferred, which makes the whole backup useless for me (I wanted to do it automatically in order not to forget about something and lose it). On both computers I have the same system (Ubuntu 10.04). Rsync version: 3.0.7-1ubuntu1. Thanks for any tips.
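    A hedged sketch of how the skipped files could be identified, using the same source and destination as above: rerun rsync with itemized output and capture the error stream to a log, then inspect the log. Exit code 23 usually means a handful of files could not be read (often a permissions issue), not that the whole backup failed.

        sudo rsync -avz --itemize-changes /home/maria /home/guest/backup 2> /home/guest/rsync-errors.log
        cat /home/guest/rsync-errors.log   # each "rsync: ..." line names a file that was not transferred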

    Read the article

  • MySQL vs. SQL Server on GoDaddy: What is the difference between a hosted DB and an App_Data DB?

    - by Nate Gates
    I'm using GoDaddy for site hosting, and I'm currently using MySQL because there are fewer limits on size, etc. My question is: what is the difference between using a hosted GoDaddy DB such as MySQL and creating a SQL Server database in the App_Data folder? My guess is security? Would it be a bad idea to use a SQL Server DB that's located in the App_Data folder? Additional: I am able to create a .mdf (SQL Server DB file) in the App_Data folder, but I'm really unsure whether I should use that or not. If I did use it, it would simplify using some of the Microsoft tools. Like I said, my guess is that it would be less secure, but I don't really know. I know I have a 10 GB file system limit, so I'm assuming my DB would have to share that space.

    Read the article

  • Deeper Unity search indexing

    - by Chris Bauer
    Unity is currently only indexing and displaying a shallow set of file results. Suppose I want to open the file "/home/Music/Creedence Clearwater Revival/Willy and the Poor Boys/The-Midnight-Special.mp3". I open the "Files and Folders" lens and type "The Midnight Special". Unfortunately, the song doesn't display. I try "Willy and the Poor Boys" but that folder doesn't display either. The only folder that does display in the lens is "Music". Therefore I must open the "Music" folder then navigate through the entire directory tree to open the file I want. How do I get a deeper index of files to display in the "Files and Folders" lens? Thanks for your help!

    Read the article

  • Where are bluetooth received files saved

    - by Luis Alvarado
    I have used Blueman and the default Bluetooth manager, and every time I accept the file transfer from the phone to the PC, it shows that it is transferring and even notifies me that the transfer was successful, but I cannot find the image anywhere. Where are files stored after sending them via Bluetooth to the PC? I already checked my Home folder, then the Pictures folder (which I thought would be the one for images), then the Documents folder, and afterwards I did a huge check of all folders under /home. No luck. It is nowhere. I even checked this answer, with no luck.
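    A small sketch that may help locate a recently received file, assuming it landed somewhere under the home directory (the 30-minute window is arbitrary):

        find "$HOME" -type f -mmin -30 -exec ls -lh {} \;   # list files modified in the last 30 minutes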

    Read the article

  • TOR Error permission issue

    - by LeChiffte
    I've tried reinstalling, updating, and removing and then reinstalling. Nothing seems to work. The output of gedit /home/skynet/.tor-browser-en/LOG (the installation log) is:
    /usr/bin/tor-browser-en.sh: Your version in /home/skynet/.tor-browser-en is outdated or you do not have installed tor-browser-en yet.
    /usr/bin/tor-browser-en.sh: Extracting files to /home/skynet/.tor-browser-en/INSTALL.
    tar (child): /opt/tor-browser-en/tor-browser-linux64-3.6.2_en-US.tar.xz: Cannot open: No such file or directory
    tar (child): Error is not recoverable: exiting now
    tar: Child returned status 2
    tar: Error is not recoverable: exiting now
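    One possible way forward, sketched under the assumption that tor-browser-en was installed from a package repository (e.g. a PPA) and that reinstalling the package restores the tarball the wrapper script expects under /opt/tor-browser-en; the package name is inferred from the wrapper script shown in the log above:

        ls -l /opt/tor-browser-en/                        # check whether the .tar.xz the script wants is actually there
        sudo apt-get install --reinstall tor-browser-en   # put the tarball back if the package provides it
        tor-browser-en.sh                                 # re-run the wrapper so it extracts into ~/.tor-browser-en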

    Read the article

  • Bad Bot blocking Revisited

    - by Tom
    I've read a lot about bad bot blocking: PHP scripts, .htaccess techniques, etc. Is this a valid method? Since .htaccess can rewrite and send a bad bot a 403 Deny, or forward it to something like spam poison, is it possible to Disallow a folder in robots.txt and then, through the .htaccess in that specific folder, redirect to spam poison? Since Apache reads each .htaccess independently and follows its specific instructions, a bad bot that does not follow robots.txt would just be redirected, as would anyone trying to access /badbot/ or whatever I choose to call my trap folder. Thanks, Tom
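    A minimal sketch of the idea, assuming a document root of /var/www/html, a trap folder named /badbot/, and a placeholder spam-poison URL: robots.txt tells well-behaved crawlers to stay out, and the .htaccess inside the trap folder redirects anything that ignores it. This assumes mod_rewrite is enabled and AllowOverride permits .htaccess rewrites for that folder.

        # tell well-behaved bots to stay away from the trap folder (appended to the site's robots.txt)
        printf 'User-agent: *\nDisallow: /badbot/\n' >> /var/www/html/robots.txt

        # redirect everything that enters the trap anyway (placeholder target URL)
        printf 'RewriteEngine On\nRewriteRule ^ http://example.com/spam-poison/ [R=302,L]\n' > /var/www/html/badbot/.htaccess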

    Read the article

  • Unable to ping ip address between two locations

    - by Derek
    I have two locations, office and home. From home I am unable to access my mail server, and from the office I am unable to access my personal web server. These two locations are about 100 yards apart and share the same connection from the ISP; it's paid for on the same account. They also both have different static public IP addresses: the office is 216.248.94.xxx and home is 216.51.158.xxx. I cannot ping either address from the other location.

    Read the article

  • Why is my partition claiming to be out of space?

    - by Dr C
    My file system claims to have only 4.5 GB left, while my OS (a folder within the file system) still has 75.2 GB left. I put something near 130 GB on my Ubuntu partition; it should have enough space. I confirmed that I can put things in OS that exceed the space available in the file system, but that makes no sense: OS is listed as a folder inside of the file system, so why would it have more space than its parent folder? What is going on? Here is the output of df:
    Filesystem     1K-blocks      Used Available Use% Mounted on
    /dev/sda5      113773200 103741440   4252408  97% /
    udev             2004600         4   2004596   1% /dev
    tmpfs             804756       848    803908   1% /run
    none                5120         0      5120   0% /run/lock
    none             2011884       436   2011448   1% /run/shm
    /dev/sda2      127526908  54045584  73481324  43% /media/OS
    /dev/sda3       39144708     89016  39055692   1% /media/DATA
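    For what it's worth, the df output above shows that the 97% full filesystem is /dev/sda5 mounted at /, while "OS" (/media/OS) is a different partition (/dev/sda2), which is why the two figures disagree. A hedged sketch for finding what is consuming the root partition; the -x flag stops du from descending into the other mounted partitions:

        sudo du -xh --max-depth=1 / 2>/dev/null | sort -h   # per-directory usage on the root filesystem only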

    Read the article

  • Ping works, but unable to do ssh

    - by gpuguy
    I disabled the firewall with sudo ufw disable. I can ping the server and the server can ping me, but I can't ssh to it:
    root@ubuntu:/home/acme# ssh 192.168.1.6
    ssh: connect to host 192.168.1.6 port 22: Connection refused
    I removed ssh and reinstalled:
    sudo apt-get remove openssh-client openssh-server
    sudo apt-get install openssh-client openssh-server
    But ssh is still not working and I get the error "connection refused". How do I tackle this issue? Here is some other stuff I have tried so far:
    root@ubuntu:/home/acme# sudo service ssh start
    start: Job is already running: ssh
    root@ubuntu:/home/acme# ps aux | grep ssh
    acme      6548  0.0  0.0  12576   320 ?      Ss  04:09  0:00 /usr/bin/ssh-agent /usr/bin/dbus-launch --exit-with-session gnome-session --session=ubuntu
    root     22219  0.0  0.1  50040  2852 ?      Ss  05:10  0:00 /usr/sbin/sshd -D
    root     22277  0.0  0.0   8116   896 pts/0  S+  05:17  0:00 grep --color=auto ssh
    Update for future visitors: removing and reinstalling ssh on the server worked for me:
    sudo apt-get remove openssh-client openssh-server
    sudo apt-get install openssh-client openssh-server
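    A few hedged checks that usually help narrow down a "connection refused" on port 22, run on the server itself (192.168.1.6 in the question):

        sudo /usr/sbin/sshd -t            # validate /etc/ssh/sshd_config; prints nothing when the config is fine
        sudo netstat -tlnp | grep :22     # confirm sshd is actually listening on port 22
        sudo service ssh restart          # restart the daemon after any configuration change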

    Read the article

  • Recover data in Linux removed with rm

    - by user3717896
    Today I deleted my home directory (by mistake) with:
    sudo rm -rf *
    And when I use extundelete I get this message:
    root@ubuntu:~# sudo extundelete --restore-directory /home/hamed/ /dev/sda2
    WARNING: Extended attributes are not restored.
    Loading filesystem metadata ... 746 groups loaded.
    Loading journal descriptors ... 29931 descriptors loaded.
    Searching for recoverable inodes in directory /home/hamed/ ... 498 recoverable inodes found.
    Looking through the directory structure for deleted files ... 498 recoverable inodes still lost.
    No files were undeleted.
    Why can't it recover anything? Can anyone help me get back my Desktop, Documents, etc.? I have Ubuntu 14.04.
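    A hedged recovery sketch, assuming /dev/sda2 is the ext3/ext4 partition that held the home directory: stop writing to it (boot a live CD or at least unmount it), write recovered files to a different disk, and note that extundelete's directory argument is relative to the root of that filesystem, so it may be "home/hamed" or just "hamed" depending on whether /dev/sda2 is the root partition or a separate /home. The mount point /mnt/usb is a placeholder.

        sudo umount /dev/sda2                                                      # do not keep using the damaged filesystem
        cd /mnt/usb && sudo extundelete /dev/sda2 --restore-directory home/hamed   # recovered files land in ./RECOVERED_FILES
        sudo extundelete /dev/sda2 --restore-all                                   # fallback: recover everything still recoverable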

    Read the article

  • Flaws in my PHP development setup - sharing sources causing lags

    - by Wiktor
    I have the following development setup for my PHP projects: a workstation running Windows 7 with the PhpStorm IDE, Git for version control, and CentOS on a virtual machine (VirtualBox) with Apache and MySQL (a copy of the production server). So far I've been sharing the project's source folders between the host and guest systems, and it was working quite well, only really slowly. The reason is that Apache was reading files from a remote folder (mounted locally). After doing some research, I found out that this setup can be improved by using disk mapping (Samba) instead of folder sharing, so I made that change. I configured PhpStorm to automatically deploy files to the mapped drive. Everything works like a charm now, except for one problem: when I change branches I need to synchronize the project's local folder with the one on the mapped drive, and that takes time, a lot of time (like branching in SVN). Is there another way to handle this than just working on the files directly on the mapped drive?

    Read the article

  • How can I force Google to re-index my site?

    - by Matthias
    I changed the structure of my URLs. The pages are already indexed by Google and have the following structure: http://mypage.com/myfolder/page.aspx. The new structure is http://mypage.com/page.aspx. Now all the URLs that Google knows are wrong. How can I tell Google to re-index and that the structure has changed? Internally I redirect in ASP.NET when the URL contains myfolder, but I want Google to update the URLs. Thanks for the answers - I use IIS 6 and I do not know how to configure a redirect of all pages that contain the folder to the page one folder below. So I did the trick in the Begin_Request method and did a Context.Response.Redirect. This is not a 301 redirect, only a redirect done with ASP.NET via code. Will this also do the trick so that Google notices that the URL /folder/page1.aspx is now redirected to /page1.aspx?

    Read the article

  • Best tool to recover removed files

    - by plua
    Using Ubuntu 10.10, I have a startup script that automatically removes my 'working directory'. This is a simple folder on my desktop where I place a bunch of files that I use throughout the day; these are temporary files I need to store just for that one session. In order to keep things clean, my startup script does:
    rm -rf /home/user/Desktop/workdir
    mkdir /home/user/Desktop/workdir
    Works great, until the moment I had some important files there and forgot to move them before shutting down. This happened a few (2-3) sessions ago, and I now realize I need to recover the "workdir" directory, but several new ones have been created and removed in the meantime. What is the best way to recover this, if possible? I read about tools like scalpel, but it seems they will scan my whole HD. I know the name of the folder and would like to just look for this workdir folder. What is best?
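    Before any recovery attempt it is safest to stop writing to that partition and work from a read-only image; a hedged sketch, where the device name, the external mount point, and an ext3/ext4 filesystem are all assumptions (ddrescue comes from the gddrescue package):

        sudo ddrescue /dev/sda1 /mnt/external/home-part.img /mnt/external/home-part.log          # image the partition that held the folder
        sudo extundelete /mnt/external/home-part.img --restore-directory home/user/Desktop/workdir   # search only that directory, not the whole disk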

    Read the article

  • What is the benefit of writing to a temp location, And then copying it to the intended destination?

    - by Devdatta Tengshe
    I am writing an application that works with satellite images, and my boss asked me to look at some of the commercial applications and see how they behave. I found a strange behavior, and as I kept looking I found it in other standard applications as well. These programs first write to the temp folder and then copy the result to the intended destination. Example: 7zip first extracts to the temp folder, and then copies the extracted data to the location that you asked it to extract the data to. I see several problems with this approach:
    1. The temp folder might not have enough space, while the intended location does have that much space.
    2. If it is a large file, the copy operation can take a non-negligible amount of time.
    I thought about it a lot, but I couldn't see one single positive point to doing this. Am I missing something, or is there a real benefit to doing this?
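    One benefit that is often cited is atomicity: if a program writes directly to the destination and is interrupted halfway, a truncated file sits where a good one is expected, whereas writing elsewhere and moving the finished file means the destination only ever holds complete files. A minimal shell sketch of that pattern (paths and the producing command are placeholders); note the rename is only atomic when the temporary file lives on the same filesystem as the destination, which is not the case when a system-wide temp folder on another drive is used:

        tmp=$(mktemp --tmpdir=/data/output result.XXXXXX)    # temp file created on the destination filesystem
        generate_image > "$tmp"                               # hypothetical long-running command producing the file
        mv "$tmp" /data/output/result.tif                     # atomic rename: readers never see a partial file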

    Read the article

  • ASP.Net Development Tips

    Opening ASP.NET 3.5 websites in VWD 2010: VWD Express 2010 by default supports ASP.NET 4.0. If you are opening old projects that are based on either ASP.NET 3.5 or ASP.NET 2.0, you need to make some adjustments. Refer to the steps below: 1. Back up the folder containing your ASP.NET 3.5 website files and place it in another directory. For example, suppose this is the path of your original ASP.NET website that needs to be opened in VWD 2010: L:\aspdotnetprojects\areaofcirclefunction. Copy that folder (do not cut it) and put it in a separate folder that can be accessed by VWD 2010. By copying the fo...

    Read the article

  • Framework Folders and Duplicate File Names

    - by Kevin Smith
    I have been working with Framework Folders a little bit in the past few days and found one unexpected behavior that differs from Contribution Folders (Folders_g). If you try to check a file into a Framework Folder and a file with that name already exists in the folder, the check-in is allowed and the file is renamed for you. In Folders_g this would have generated an error and prevented you from checking in the file. A quick check of the Framework Folders configuration settings in the Application Administrator's Guide for Content Server does not show a configuration parameter to control this. I'm still thinking about this and not sure whether I like the new behavior. I guess from a user perspective it aligns Framework Folders more closely with how Windows handles duplicate file names, but if you are migrating from Folders_g and expect a duplicate file name to be rejected, this might cause you some problems.

    Read the article

  • Configuring a Subdomain (cPanel) - www works, subdomain on its own doesn't

    - by Matthew
    I've created a subdomain on my website using cPanel at test.mydomain.com, and this created a folder in my main 'www' directory called test. In this folder is a folder called cgi-bin, and the page seems to redirect to say "It works!", but when I upload my own index.html file to the test directory it keeps showing http://test.mydomain.com/cgi-sys/defaultwebpage.cgi instead of the index file. If I go to www.test.mydomain.com then it works OK. How do I host my content at the subdomain? It's my first time setting one up, so I'm a bit lost.

    Read the article

  • Where can I find programming work online?

    - by explorest
    I have set up an ideal, quiet, interruption-free environment at home. I am extremely productive here. I don't want to leave my home, not my room, not even my couch. How and where do I find work online so that I don't have to travel to it? Kindly post about your own personal experiences. Have you done it full time from home? Where and how? I am outside the United States in a third-world country, so lower pay is not an issue. The issue is the work environment.

    Read the article

  • Ubuntu One messes up with my thunderbird folders

    - by xpanta
    I have added the .thunderbird folder to my Ubuntu One folders to be synced, to protect my mail and my Thunderbird settings and plugins. However, U1 constantly keeps messing with my folders, and a lot of .u1conflict files appear in various places in my .thunderbird folder. Some of them I see in my Thunderbird application when I start it up. Why is that? Can't I just select my home folder's files to be uploaded and not synced? Or set U1, in case of conflicts, to leave the home folder structure and files as is? PS: I have subscribed to the U1 service and this is important to me.

    Read the article

  • Use your own domain email and tired of SPAM? SPAMfighter FTW

    - by Dave Campbell
    I wouldn't post this if I hadn't tried it... and I paid for it myself, so don't anybody be thinking I'm reviewing something someone sent me! Long ago and far away I got very tired of local ISPs and 2nd phone lines and took the plunge and got hooked up to cable... yeah, I know the 2nd phone line concept may be hard for everyone to understand, but that's how it was in 'the old days'. To avoid having to change email addresses all the time, I decided to buy a domain name, get minimal hosting, and use that for all email into the house. That way if I changed providers, all the email addresses wouldn't have to change. Of course, about a dozen domains later, I have LOTS of POP email addresses and even an Exchange address to my client's server... times have changed. What also has changed is the fact that we get SPAM... 'back in the day' when I was a beta tester for the first ISP in Phoenix, someone tried sending an ad to all of us, and what he got in return for his trouble was a bunch of core dumps that locked up his email... if you don't know what a core dump is, ask your grandfather. But in today's world, we're all much more civilized than that, and as with many things, the criminals seem to have many more rights than we do, so we get inundated with email offering all sorts of wild schemes that you'd have to be brain-dead to accept, but yet... if people weren't accepting them, they'd stop sending them. I keep hoping that survival of the smartest would weed out the mental midgets that respond and then the junk email would stop, but that hasn't happened yet, any more than finding high-quality hearing aids at the checkout line of Safeway because of all the dimwits playing music too loud inside their car... but that's another whole topic and I digress. So what's the solution for all the spam? And I mean *all*... on that old personal email address, I am now getting over 150 spam messages a day! Yes, I know that's why God invented the delete key, but I took it on as a challenge, and it's a matter of principle... why should I switch email addresses, or convert from [email protected] to something else, or have all my email filtered through some service just because some A-Hole somewhere has a site up trying to phish Ma & Pa Kettle (ask your grandfather about that too) out of their retirement money? Well... I got an email from my cousin the other day while I was writing yet another email rule, and there was a banner on the bottom of his email that said he was protected by SPAMfighter. SPAMfighter, huh... so I took a look at their site, and found yet one more of the supposed tools to help us. But... I read that they're a Microsoft Gold Partner... and that doesn't come lightly... so I took a gamble and here's what I found: I installed it, and had to do a couple of things: 1) SPAMfighter stuffed the SPAMfighter folder into my client's Exchange address... I deleted it, made a new SPAMfighter folder where I wanted it to go, then in the SPAMfighter Client settings for Outlook, I told it to put all spam there. 2) It didn't seem to be doing anything. There's a ribbon button where you can select "Block", and I did that, wondering if I was 'training' it, but it wasn't picking up duplicates. 3) I sent email to support, and wrote a post on the forum (note to self: reply to that post). By the time the folks from the home office responded, it was the next day, and first up, SPAMfighter knocked down everything that came through when Outlook opened... two thumbs up!
    I disabled my 'garbage collection' rule in Outlook, and told Outlook not to use the junk folder, thinking it was interfering. 4) Day 2 seemed to go about like Day 1... but I hung in there. 5) Day 3 is now a whole new day... I had left Outlook open and hadn't looked at the PC since sometime late yesterday afternoon, and when I looked this morning, *every bit* of spam was in the SPAMfighter folder!! I'm a new paying customer. After watching SPAMfighter work this morning, I've purchased a 1-year license, and I can now sit and watch as emails come in and disappear from my inbox into the SPAMfighter folder. No more continual tweaking of the rules. I've got SPAMfighter set to 'Very Hard' filtering... personally I'd rather pull the few real emails out of the SPAMfighter folder than pull spam out of the real folders. Yes, this is simply another way of using the delete key, but you know what? ... it feels good :) Here's a screenshot of the stats after just about 48 hours of being onboard: note that all the ones blocked by me were during Days 1 and 2... I've blocked none today, and everything is blocked. Stay in the 'Light'!

    Read the article

  • How do I get rid of the drive mount confirmation question for sshFS on boot?

    - by Dave M G
    With help from this site, I was able to set up an SSHFS connection between two computers on my LAN so that one auto-mounts on the other at boot time. Everything works, but this annoying confirmation comes up whenever I boot:
    An error occurred while mounting /home/dave/Mythbuntu. Press S to skip mounting or M for manual recovery.
    If I press S, booting continues and my drive is mounted as hoped, so it seems that even though I "skipped" it, maybe it tried again and succeeded later in the boot process. I followed the instructions here to set up "if up / if down" scripts, and here is my current /etc/fstab entry:
    sshfs#[email protected]:/home/mythbuntu /home/dave/Mythbuntu fuse auto,users,exec,uid=1000,gid=1000,allow_other,reconnect,transform_symlinks,BatchMode=yes 0 0
    Although the mounting is working, having to press S every time I boot is obviously kind of a hassle. How do I configure my computer so I don't have to do that, and so that my other computer will still auto-mount?
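    A hedged sketch of one way to stop the boot-time prompt on Ubuntu releases that use mountall (an assumption for this setup): mark the fstab entry so boot never waits on it, and let the if-up script mentioned above (or a manual mount) bring it up once the network is available.

        # in /etc/fstab, replace "auto" with "noauto,nobootwait" so boot does not block on this entry:
        sshfs#[email protected]:/home/mythbuntu /home/dave/Mythbuntu fuse noauto,nobootwait,users,exec,uid=1000,gid=1000,allow_other,reconnect,transform_symlinks,BatchMode=yes 0 0

        # then mount it from the existing if-up script (or by hand) once the network is up:
        mount /home/dave/Mythbuntu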

    Read the article

  • Wiped data, and folders duplicated into files

    - by Kaustubh P
    Something weird happened today, and I don't know how. Within a folder, all folders have a file by the same name, with a colon appended to it. And all the files from the innermost directory in my home have been dumped to ~, with a size of 0 bytes. I have not executed any scripts or anything. I was just checking out some easter eggs, namely "gegls from outer space" and "free the fish", was away from the computer, and was logged out because of the screensaver. I couldn't log back in with my password, so I just reset the PC, and while booting, the PC went into a drive check. BUT, IIRC, I saw the duplicate "folder files" before I had logged out, so that's not the reason! All the files have a timestamp of 14 Jan. Also, the contents of my eclipse folder have been dumped into ~, right down to the jars and ini files. HELP!

    Read the article

  • The layout page "~/Views/Shared/_Layout.cshtml" could not be found

    - by Rei Brazilva
    I got this error and I can't figure out what is going on. I am positive that _Layout.cshtml resides in the Shared folder, and for the sake of trying things out I moved it to the Home folder; it then told me that Views/Home/_Layout.cshtml couldn't be found there either. So now I'm thinking the problem is in the call for this file for some reason. I'm not going to pretend I know ASP.NET MVC4, so please, when you answer, explain it as you would to someone who is not familiar with the system at all. Believe it or not, this error came from tutorial #1, ha ha. Here's the code to show that I did code it right:
    @{
        ViewBag.Title = "Home Page";
        Layout = "~/Views/Shared/_Layout.cshtml";
    }
    And here is a picture of the location. P.S. I did my research: Google has nothing, and there is another question here, but it was asked in 2008 with MVC3, which is completely different. I am running ASP.NET MVC4 on Azure. Thanks

    Read the article

  • Whenever I try to remove a Debian package I receive an Error

    - by Brenton Horne
    Whenever I type the following command into the terminal:
    sudo dpkg -r '/home/brentonhorne/Downloads/virtualbox.deb'
    I receive the error:
    dpkg: error: --remove needs a valid package name but '/home/brentonhorne/Downloads/virtualbox.deb' is not: illegal package name in specifier '/home/brentonhorne/Downloads/virtualbox.deb': must start with an alphanumeric character
    Type dpkg --help for help about installing and deinstalling packages [*];
    Use `dselect' or `aptitude' for user-friendly package management;
    Type dpkg -Dhelp for a list of dpkg debug flag values;
    Type dpkg --force-help for a list of forcing options;
    Type dpkg-deb --help for help about manipulating *.deb files;
    Options marked [*] produce a lot of output - pipe it through `less' or `more' !
    How do I get around this problem?
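    For context, dpkg -r expects the name of an installed package, not the path to a .deb file; a .deb is installed with dpkg -i and later removed by its package name. A short hedged sketch (the actual package name may differ from "virtualbox"):

        sudo dpkg -i /home/brentonhorne/Downloads/virtualbox.deb                # install from the downloaded file
        dpkg-deb --field /home/brentonhorne/Downloads/virtualbox.deb Package    # see what the package calls itself
        sudo dpkg -r virtualbox                                                 # remove it by that name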

    Read the article
