Search Results

Search found 16616 results on 665 pages for 'home sharing'.


  • Rejoining two partitions

    - by Alex
    I was curious about Ubuntu, so I decided I would give it a chance and installed it on a partition on my hard drive. Now, a couple of months later, I haven't used Windows once, so I decided to go with Ubuntu only. I deleted my Windows partition with GParted and thought that was all there was to it. BUT the disk space that used to be home to Windows is now just an empty, formatted partition. How do I join it to the partition where my Ubuntu installation lies, in other words, go back to having a hard drive that is not split into multiple partitions?
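
    For reference, a minimal sketch of one way to merge the freed space back in, assuming the old Windows partition (/dev/sda2 here, a made-up device name) sits after the Ubuntu root partition (/dev/sda1), that you boot from a live USB so the root filesystem is not mounted, and that you have a backup. If the freed space sits in front of the Ubuntu partition instead, GParted has to move the partition, which is slower and riskier, so the graphical GParted route is usually preferred in that case.

        sudo parted /dev/sda print                 # confirm which number is the leftover Windows partition
        sudo parted /dev/sda rm 2                  # delete the leftover partition (number 2 is an assumption)
        sudo parted /dev/sda resizepart 1 100%     # grow the Ubuntu partition to the end of the disk (needs parted 3.2+)
        sudo e2fsck -f /dev/sda1                   # check the filesystem before growing it
        sudo resize2fs /dev/sda1                   # grow the ext4 filesystem to fill the enlarged partition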

    Read the article

  • Ubuntu 12.04 installer does not recognize Windows 7

    - by trainofk
    I recently purchased an ASUS N56VZ-ES71 laptop which came with Windows 7 Home Premium installed on it. I wish to dual boot Windows 7 and Ubuntu 12.04 on it. I shrank the hard drive partitions to leave about 150 GB unallocated for Ubuntu 12.04. When I boot the Live CD of Ubuntu and attempt to install, the installer does not recognize any other operating systems. Through reading a few questions, I have found that this is due to a GPT partitioning table that Windows uses. I ran boot-repair as per other threads' suggestions. This was my output: http://paste.ubuntu.com/1176988/ I suppose my question is: how do I proceed in order to get the installer to recognize Windows, so that I don't have to erase the current partition table and can get a safe install? Thanks in advance.
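
    For reference, two quick checks that can be run from the live session before deciding how to proceed; whether the disk really uses GPT and whether the installer was booted in UEFI or legacy BIOS mode usually determines whether it can see the existing Windows install:

        sudo parted -l                      # look for "Partition Table: gpt" (vs. "msdos") on the Windows disk
        [ -d /sys/firmware/efi ] && echo "booted in UEFI mode" || echo "booted in legacy BIOS mode"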

    Read the article

  • How to set up Ubuntu Server as a NAS?

    - by rifferte
    I am looking to set up Ubuntu Server as a headless NAS for my home. I would like to have file storage there, as well as a central hub for my MP3s and pictures. What are the best packages out there to handle this? Can someone post a link to a good tutorial or post some tips? One constraint I have is that it has to be Windows 7 friendly. By that I mean the shares and streaming should work for a Windows machine.
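
    For reference, a minimal sketch of the common Samba route for the Windows 7-friendly shares (share name, path and user below are placeholders); streaming to other devices is usually added separately with a DLNA server such as minidlna:

        sudo apt-get install samba
        sudo mkdir -p /srv/media
        sudo smbpasswd -a youruser           # the account Windows 7 will log in with

    then add a share like the following to /etc/samba/smb.conf and run "sudo service smbd restart":

        [media]
            path = /srv/media
            browseable = yes
            read only = no
            valid users = youruser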

    Read the article

  • How to Unlock Applications folder

    - by Mark Coleman
    This question relates to Ubuntu 12.04 LTS. I wish to move a folder from the desktop into the applications folder at home/usr/share/applications. When I attempt to drag the folder from the desktop to the applications folder, I get the following message: "Error moving file: Permission denied" Permission or no permission, I want to move a folder from the desktop to the above said applications folder. How do I authenticate so I can make this move? There is no opportunity to authenticate when I get the error message, only "Skip" or "Cancel". I don't want to skip or cancel, I want to authenticate and move the folder. How do I do this? Thank you!
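
    For reference, assuming the intended target is the system-wide /usr/share/applications directory (the folder name below is a placeholder), the move needs root privileges, either on the command line or via a root file manager on 12.04:

        sudo mv ~/Desktop/MyFolder /usr/share/applications/
        # or, for a one-off graphical move with authentication:
        gksudo nautilus /usr/share/applications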

    Read the article

  • Fluent MVC Route Testing Helper

    - by Nettuce
    // Required namespaces: System, System.Linq.Expressions, System.Web.Mvc, System.Web.Routing
    static class GetUrlFromController<T> where T : Controller
    {
        public static string WithAction(Expression<Func<T, ActionResult>> expression)
        {
            // Derive the controller and action names from the strongly typed lambda
            var controllerName = typeof(T).Name.Replace("Controller", string.Empty);
            var methodCall = (MethodCallExpression)expression.Body;
            var actionName = methodCall.Method.Name;

            // Map each action argument onto the matching route value
            var routeValueDictionary = new RouteValueDictionary();
            for (var i = 0; i < methodCall.Arguments.Count; i++)
            {
                routeValueDictionary.Add(methodCall.Method.GetParameters()[i].Name, methodCall.Arguments[i]);
            }

            // Build the URL against the application's real route table;
            // ContextMocks supplies a mocked RequestContext elsewhere in the test project
            var routes = new RouteCollection();
            MvcApplication.RegisterRoutes(routes);
            return UrlHelper.GenerateUrl(null, actionName, controllerName, routeValueDictionary, routes, ContextMocks.RequestContext, true);
        }
    }

    I'm using FluentAssertions too, so you get this:

        GetUrlFromController<HomeController>.WithAction(x => x.Edit(1)).Should().Be("/Home/Edit/1");

    Read the article

  • Is It Possible for My Router to Wear Out?

    - by Jason Fitzpatrick
    Day after day your humble and hard-working router holds your home network together and links it to the greater internet. Is it possible to work it to death? Today's Question & Answer session comes to us courtesy of SuperUser, a subdivision of Stack Exchange, a community-driven grouping of Q&A web sites.

    Read the article

  • Tweaking Hudson memory usage

    - by rovarghe
    Hudson 3.1 has some performance optimizations that greatly reduce its memory footprint. Prior to this, Hudson always held the entire data model (all jobs and all builds) in memory, which affected scalability; some installations configured heap sizes in excess of 1 GB to counteract this. Hudson 3.1.x maintains an MRU cache and only loads jobs and builds as they are required. Because of the inability to change existing APIs and remain backward compatible with plugins, there were limits to how far we could go with this approach. Memory optimizations almost always come with a related cost; in this case it is the additional I/O that has to be performed to load data on request. On a small site that has frequent traffic this is usually not noticeable, since the MRU cache will usually hold on to all the data. A large site with infrequent traffic might experience some delays when the first request hits the server after a long gap. If you have a large heap and are able to allocate more memory, the cache settings can be adjusted to take advantage of this and even go back to pre-3.1 behavior. All the cache settings can be passed as options to the JVM container (Tomcat or the default Jetty container) using the -D option. There are two caches, independent of each other: one for jobs and the other for builds.

    For the jobs cache:

    hudson.jobs.cache.evict_in_seconds (default: 60) - Seconds from last access (whether from a servlet request or a background cron thread) after which a job should be purged from the cache. Set this to 0 to never purge based on time.

    hudson.jobs.cache.initial_capacity (default: 1024) - Initial number of jobs the cache can accommodate. Setting this to the number of jobs you typically display on your Hudson landing page or home page will speed up consecutive access to that page. If the default is too large, you may consider downsizing and using that memory for the builds cache instead.

    hudson.jobs.cache.max_entries (default: 1024) - Maximum number of jobs in the cache. The default is large enough for most installations, but if you see I/O activity whenever the Hudson home page is accessed you might consider increasing this. First verify whether the I/O is caused by frequent eviction (see above) rather than by the cache not being large enough.

    For the builds cache: the builds cache is used to store Build objects as they are read from storage. Typically this happens when a user drills down into the details of a particular job from the Hudson home page. The cache is shared among builds for different jobs, since in most installations not all jobs are accessed with the same frequency, so a per-job builds cache would be a waste of memory.

    hudson.job.builds.cache.evict_in_seconds (default: 60) - Same as the equivalent jobs cache setting, applied to builds.

    hudson.job.builds.cache.initial_capacity (default: 512) - Same as the equivalent jobs cache setting. Note the smaller initial size. If your site stores a large number of builds and frequently accesses many of them, you might consider bumping this up.

    hudson.job.builds.cache.max_entries (default: 10240) - The default maximum is large enough for most installations. The builds cache holds larger objects, so be careful about increasing this upper limit; see the section on monitoring below.
    Sample usage:

        java -jar hudson-war-3.1.2-SNAPSHOT.war \
            -Dhudson.jobs.cache.evict_in_seconds=300 \
            -Dhudson.job.builds.cache.evict_in_seconds=300

    Monitoring cache usage

    The 'jmap' tool that comes with the JDK can be used to monitor cache performance indirectly, by looking at the number of Job and Build objects in each cache. Find the PID of the Hudson instance and run:

        $ jmap -histo:live <pid> | grep 'hudson.model.*Lazy.*Key$'

    Here's a sample output:

         num    #instances    #bytes    class name
         523:           28       896    hudson.model.RunMap$LazyRunValue$Key
        1200:            3        96    hudson.model.LazyTopLevelItem$Key

    These are the keys to the jobs (LazyTopLevelItem$Key) and builds (RunMap$LazyRunValue$Key) in the caches, so counting the keys is a good indicator of the number of items in each cache at any given moment. The sizes in bytes can be ignored; they are just the sizes of the keys, not the actual sizes of the objects they refer to. Those sizes can only be obtained with a profiler. From the output above we can conclude that there are 3 jobs and 28 builds in memory. The 28 builds could all belong to one job or be spread across all 3 jobs. Over time, on an idle system, these should get evicted and the caches should empty out. In practice, because of background cron threads and triggers, jobs rarely drop to zero: any access of a job or a build by a cron thread resets its eviction timer.
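
    If Hudson runs inside Tomcat rather than the bundled Jetty container, the same system properties are typically passed through CATALINA_OPTS; a sketch (the file location and values are illustrative):

        # $CATALINA_HOME/bin/setenv.sh (create the file if it does not exist)
        export CATALINA_OPTS="$CATALINA_OPTS \
            -Dhudson.jobs.cache.evict_in_seconds=300 \
            -Dhudson.job.builds.cache.evict_in_seconds=300"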

    Read the article

  • How to manually list set of urls for search engines to index

    - by MarutiB
    So I have created a video website which has thousands of videos, and thousands more get added to it on a daily basis. Here is my problem: the site basically loads a skeleton in HTML and puts all the content in through JavaScript and Ajax. As a result, search engines aren't going anywhere except the home page. Is there a way, say in robots.txt, to give a link to a single HTML page which has links to all these videos? I agree my site is not accessible to a non-JavaScript user, but stats show that this ratio is very low (0.2%). Is there a way I can keep the complete Ajax website and still get each individual video listed on Google?
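
    For reference, the standard mechanism for exactly this is an XML sitemap that lists every video URL, referenced from robots.txt (the sitemap itself can be generated server-side from the database); all URLs below are placeholders:

        # robots.txt
        Sitemap: http://www.example.com/sitemap.xml

        <!-- sitemap.xml -->
        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url><loc>http://www.example.com/videos/12345</loc></url>
          <url><loc>http://www.example.com/videos/12346</loc></url>
        </urlset>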

    Read the article

  • Internet (HTTP) is not working on Sony Vaio

    - by vijay
    I switched my Sony Vaio laptop from XP to Ubuntu 10.04 recently. It is connected through a router at home, and the internet was working fine with XP. With Ubuntu, I am not able to browse the internet, even though I am able to update using apt and I am able to ping. It seems like there is a DNS issue: when I try to go to sites from Firefox, it doesn't work. I tried disabling IPv6 in Firefox's config; that didn't help. My router is on 192.168.2.1 instead of 192.168.1.1. Any ideas on what configuration I might need to change to make this work? Or could this be a driver issue?
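
    For reference, a few quick checks that separate a DNS problem from a routing or driver problem (the router address comes from the question):

        cat /etc/resolv.conf         # which nameservers is the system actually using?
        host www.google.com          # does name resolution work at all?
        ping -c 3 192.168.2.1        # can the router be reached?
        ping -c 3 8.8.8.8            # can the internet be reached by IP address?
        ping -c 3 www.google.com     # ...and by name? If only this last one fails, it is DNS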

    Read the article

  • How to copy a file from source to destination only once?

    - by Viswa
    I have to copy a file from my desktop to a mounted directory. I was using the following command to do it: os.system("cp -f /home/Desktop/filename /media/folder_1"). It works fine. The problem is that while copying the file from the source to the mounted directory (folder_1), if any interruption happens, such as the network going down, the system keeps on trying and cannot skip that operation; when the network comes back, the file is copied to the mounted directory again. Because of this continuous retrying, the next time I try to move the content it throws a "permission denied" error. How do I copy the file only once, so that if a network issue happens it does not keep trying to copy but throws an error instead? If you know, let me know; it would be very useful to me.
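
    For reference, os.system gives no control over a copy that hangs on a dead network mount; one workaround is to verify the mount first and give the copy a hard time limit, then act on the exit status instead of retrying (from Python the same command line can be run with subprocess.call). A rough sketch, using the paths from the question and an illustrative 30-second limit:

        if mountpoint -q /media/folder_1; then
            timeout -k 5 30 cp -f /home/Desktop/filename /media/folder_1/ \
                || echo "copy failed or timed out, not retrying" >&2
        else
            echo "/media/folder_1 is not mounted, skipping copy" >&2
        fi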

    Read the article

  • Where can someone store >100GB of pictures online? [closed]

    - by sbi
    A person who is not very computer-savvy needs to store 130 GB of photos. The key parameters are:
    - a non-negligible probability that the company selling the storage will still exist, and the data still be accessible, for at least five years
    - data should be considered safe once uploaded
    - reasonable terms of service: Google Drive reserving the right to do literally anything they want with their users' data is not acceptable; the possibility that the CIA might look at those pictures is not considered a threat
    - easy to use from Windows, preferably as a drive
    - no nerve-wracking limitations ("cannot upload more than 10 GB/day" or "no files over 500 MB" etc.) that serve no purpose other than pushing the user to the next-higher price plan
    - some upgrade path: there are currently 10-30 GB of new photos per year, with a tendency to increase, which might bust a 150 GB limit next January
    - the ability to somehow sort the pictures: currently they are sorted into folders, but something like tags would be just as good, if easy enough to apply
    - of course, the pricing is important (although there is a reason this is the last bullet; reasonable data safety is considered more important)

    Nice to have, but not necessary, would be:
    - additional features related to photos (thumbnail generation, album sharing etc.)
    - access from the web and from platforms other than Windows (smart phones)

    Let me stress this again: the person in need of this is able to copy pictures from the camera to the computer, can copy files in Explorer, and uses a web email service. That's about it; there is almost no understanding of what happens under the hood.

    Read the article

  • Windows 7 and Ubuntu Boot issue

    - by user115137
    I had the idea to dual boot Windows 7 and Ubuntu, and what I did was the following: I made a clean install of Windows 7 using all of my hard drive, then I used the Ubuntu live CD and GParted to partition the drive like this:

        /dev/sda1  ext4       20 GB  (Linux root)
        /dev/sda2  ntfs      100 GB  (Windows 7)
        /dev/sda3  ext4      350 GB  (home)
        /dev/sda4  extended    4 GB  (swap)

    The thing is, when installing Ubuntu I deleted the partition Windows 7 creates for its boot sector and recovery, then resized the drive to match the layout above, and Ubuntu installed GRUB to the MBR. When GRUB boots I can see Ubuntu but not Windows. How can I chainload it? Or should I fix the Windows MBR with the Windows 7 installation disc and try to set up the dual boot from there? I don't really care which of the two bootloaders I end up using, I just want the dual boot to work. Thanks
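
    For reference, the deleted System Reserved partition held the Windows boot files, so GRUB may genuinely have nothing left to chainload. It is still worth letting GRUB rescan from the installed Ubuntu first; if Windows still does not show up, its boot files usually have to be rebuilt from the Windows 7 install disc (Startup Repair / bootrec) before GRUB can offer it.

        sudo os-prober       # lists other bootable systems; empty output means no Windows boot files were found
        sudo update-grub     # regenerates /boot/grub/grub.cfg with whatever os-prober detected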

    Read the article

  • My Computer in Ubuntu?

    - by Casey Hungler
    I was wondering if Ubuntu has an equivalent to the Windows feature "My Computer", which lists all available drives/storage devices. Typically, My Computer shows C:, which can be opened to view all of your directories and files. At this point, it is very similar to Ubuntu's Home Folder, but I am looking for something that allows me to view/select all available HDD's/Partitions. Here's my reason: I found a 6gb IDE HDD in my basement, and got an IDE cable for it. My desktop computer has a SATA drive in it, but has an IDE slot, so I wanted to plug it in and see what might be on it. The drive seems to be recognized in BIOS, but I can't find it in Ubuntu to view files, and Ubuntu is the only OS on that particular computer. If anyone has any ideas as to how to view the contents of that drive WITHOUT formatting it or tampering with the contents in any way, I would greatly appreciate your help. Thanks in advance!
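
    For reference, the graphical near-equivalent is the "Computer" location in the file manager (or the Disk Utility tool), but the command line makes the read-only requirement explicit; /dev/sdb1 is an assumption, so use whatever the listing shows for the 6 GB drive:

        sudo fdisk -l                                # list every drive and partition the kernel can see
        sudo mkdir -p /mnt/olddrive
        sudo mount -o ro /dev/sdb1 /mnt/olddrive     # mount read-only so nothing on the old drive is altered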

    Read the article

  • How productive do you think/know you are?

    - by leinad
    I'm a good programmer, and I know it. Yet, I often find myself thinking that I actually work very slowly. I worked on a feature from Monday to Wednesday this week. On the way home on Wednesday I was wondering what had taken so long. It seemed like I should have been done in a single day but I wasted three to finish it up. This is not the first time I had this feeling. Do you ever feel this or the opposite?

    Read the article

  • localhost .htaccess not working?

    - by diff
    I installed WordPress and then BuddyPress. After installing BuddyPress I ran the site: the home page works, but when I click any other page link the page does not work and shows this error:

        Not Found
        The requested URL /Repo/website/groups/ was not found on this server.
        Apache/2.2.17 (Ubuntu) Server at localhost Port 80

    I then checked my local .htaccess handling and everything looks fine to me, so why is this happening? Here is my local Apache configuration:

        <Directory />
            Options FollowSymLinks
            AllowOverride all
        </Directory>

        <Directory /var/www/>
            Options Indexes FollowSymLinks MultiViews
            AllowOverride all
            Order allow,deny
            allow from all
        </Directory>

        ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/
        <Directory "/usr/lib/cgi-bin">
            AllowOverride None
            Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch
            Order allow,deny
            Allow from all
        </Directory>
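
    For what it's worth, on a stock Ubuntu Apache the usual culprit for WordPress/BuddyPress permalinks returning 404 is that mod_rewrite is not enabled, so the rewrite rules in WordPress's .htaccess never run even though AllowOverride permits them; a sketch of the typical fix:

        sudo a2enmod rewrite            # enable Apache's URL rewriting module
        sudo service apache2 restart    # reload Apache so the module and the .htaccess rules take effect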

    Read the article

  • DualBoot 32Win & 64Mac from Live USB +Persistance

    - by josephsmendoza
    So I have a 32-bit live USB with persistence that I use to write code, cause that's just how I roll. I can boot it on my school computers (32-bit Windows) no problem, but obviously not on my Mac (2007 iMac, 64-bit). Currently I use VMware Fusion 6 Pro and a Plop Linux image to use the USB at home, but I can only get 900 MB of RAM for my VM. I was hoping to make a live USB that can boot both 32-bit and 64-bit for the Mac, with a shared persistence file, so that I can use my computer's full 2 GB. Also, I'm not allowed to modify any part of this Mac. Do not reply telling me it's impossible; please only solutions. Thank you, and have a unicorntastic day!

    Read the article

  • How can I effectively use a netbook and a desktop computer together for programming?

    - by Mana
    Currently, in my workspace, I have a netbook sitting off to the side gathering dust while I write code on my desktop. As a result, the only use my netbook gets coding-wise is when I'm writing up a quick Python script to model a given problem or concept in class; I never use it at home for coding, or for anything at all, as it is all possible and faster on my (much more powerful) desktop. I feel like this is wrong and that I should be making better use of my netbook. What effective uses have you found for a netbook and a desktop together when programming (or for software development in general)? What are the merits of this practice?

    Read the article

  • How do I secure a Tomcat installation?

    - by spangeman
    I have installed Tomcat on my Ubuntu 11.10 home system and can successfully access the test page online after forwarding port 8080 on my router. I have not made any other changes to the router, Ubuntu or the Tomcat install; everything else has remained standard. I intend to use this to play around with Java servlets and basic web development for my own personal use. What steps, if any, would you suggest I take to ensure this is secure? Should I change anything within the Tomcat configuration? This seemed like a good idea for limiting access: http://www.seankilleen.com/2010/09/how-to-allow-only-specific-ip-addresses_30.html But I am open to any other recommendations.
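
    For reference, a sketch of a few low-effort first steps rather than a complete hardening guide; the webapps path depends on which Tomcat package is installed (tomcat6 or tomcat7 on 11.10), and the address range is illustrative. The linked article's approach of whitelisting client IPs inside Tomcat is a reasonable complement.

        sudo rm -rf /var/lib/tomcat7/webapps/docs /var/lib/tomcat7/webapps/examples   # drop the sample apps
        sudo ufw allow from 192.168.1.0/24 to any port 8080 proto tcp                 # only accept connections from trusted addresses
        sudo ufw enable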

    Read the article

  • Unable to connect FileZilla to ubuntu ec2

    - by user1775063
    I have a micro Ubuntu instance on EC2. I have run passwd to set a simple password. I have installed vsftpd on the EC2 instance, imported the EC2 .pem file via FileZilla > Settings > SFTP, and configured vsftpd.conf as follows:

        listen=YES
        anonymous_enable=NO
        local_enable=YES
        write_enable=YES
        local_umask=022
        dirmessage_enable=YES
        use_localtime=YES
        xferlog_enable=YES
        connect_from_port_20=YES
        secure_chroot_dir=/var/run/vsftpd/empty
        pam_service_name=vsftpd
        rsa_cert_file=/etc/ssl/private/vsftpd.pem
        local_root=/home/ubuntu
        pasv_enable=YES
        pasv_max_port=12100
        pasv_min_port=12000
        port_enable=YES

    I am using username ubuntu, the password I set, and port 21. I get the following error:

        Error: Critical error
        Error: Could not connect to server
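
    Two things commonly bite here: the EC2 security group must open port 21 and the passive range 12000-12100 in addition to SSH, and FileZilla's Settings > SFTP keys only apply to SFTP over port 22, not to plain FTP on port 21, so the .pem key is ignored for an FTP connection. For reference, a couple of quick checks from the local machine (hostnames and key names are placeholders):

        nc -vz ec2-xx-xx-xx-xx.compute-1.amazonaws.com 21                      # is port 21 reachable at all?
        sftp -i my-key.pem ubuntu@ec2-xx-xx-xx-xx.compute-1.amazonaws.com      # SFTP rides on SSH (port 22), no vsftpd needed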

    Read the article

  • CPU heating up too much and locking the notebook [12.10 32bits]

    - by Lucas Coutinho
    I have an AMD A6-3400M. When installing Ubuntu 12.10 the CPU gets far too hot. When I look at CPU usage I don't see anything unusual; I think it is running at or above the CPU's specification. I cannot finish the installation because the notebook heats up so much that the system shuts down, even with the fan at maximum. I don't know if it's just me, but I have never had a good experience using Linux on AMD; when I used Intel these problems did not happen. What could be going on? PS: the laptop works perfectly in Windows 7 Home Premium 64-bit. No crashes, no overheating, it simply works.
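
    For reference, a first step is to get real temperature readings rather than guessing; on AMD APUs of that era the open-source graphics driver's limited power management was a frequently reported cause of heat, so trying the proprietary fglrx driver once the system is installed was a common suggestion. A minimal monitoring sketch:

        sudo apt-get install lm-sensors
        sudo sensors-detect        # accept the defaults when prompted
        sensors                    # watch the reported temperatures while idle and under load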

    Read the article

  • How much difference is there between building PHP sites with SVN and without it?

    - by user1315279
    I have been developing PHP sites for about 5 years but have never used SVN. I am going to start at a new company and they use SVN; tomorrow is my first day. I want to know how site development differs with SVN. Without SVN: I make a test account on a Linux server via cPanel, I create the database tables, I put all the files in /home/user/public_html, and I use PHP Designer as my IDE to edit files, so I can see the results on test.mydomain.com. I just want to know how different this will be with SVN, i.e. what the steps will be.
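
    For reference, the day-to-day workflow usually boils down to a handful of commands around the editing you already do; the repository URL and file name below are placeholders:

        svn checkout http://svn.example.com/website/trunk website   # one-time: get a working copy
        cd website
        svn update                               # pull in other people's changes
        # ...edit files in your IDE exactly as before...
        svn status                               # list what you changed locally
        svn add newfile.php                      # register brand-new files
        svn commit -m "describe the change"      # publish your changes to the repository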

    Read the article

  • Is there an application which organizes my "Downloads" folder automatically?

    - by rearlight
    I'm looking for an application which, at the press of a button or automatically, moves all files from my Downloads folder into a newly generated folder (named after the date), or which can move each file to its destination directory automatically (e.g. *.png files should go into /home/user/pictures/random/, *.avi into /videos/, ...). If you are familiar with the DayFolder application: I'm looking for something like that, but for any folder (not only the Desktop). In my case that's Downloads, because this folder gets cluttered very quickly on my PC. Thanks for your advice/help!
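
    If no ready-made tool fits, a few lines of shell run from cron (or bound to a launcher) can approximate this; a minimal sketch using the destination folders named in the question:

        #!/bin/sh
        # Sort ~/Downloads by extension; destination folders follow the question's examples
        cd "$HOME/Downloads" || exit 1
        mkdir -p "$HOME/pictures/random" "$HOME/videos"
        mv -n -- *.png "$HOME/pictures/random/" 2>/dev/null
        mv -n -- *.avi "$HOME/videos/" 2>/dev/null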

    Read the article

  • Actinic and Google Analytics: does it mess with my stats?

    - by tjcss
    My father's website is built using Actinic, not by me but by a local company, and since he went with them some years ago his traffic has never gone down; it has stayed more or less consistent, which is fine. My question is this: does using Actinic somehow confuse Analytics? It shows that 99% of all visitors come "direct", as in, not via organic search. Before using Actinic he would get 70 to 85% of new hits exclusively from organic search terms. So I'm wondering if Actinic somehow messes with these new hits and redirects them to a "home" page. I'm not sure exactly what I mean, but this change in stats is concerning and I'm struggling to find an explanation.

    Read the article

  • Backup networks

    - by MegaBrutal
    My USB Wi-Fi adapter tends to overheat, and when that happens it stops working until I pull it out and plug it in again. Since I often access my machine remotely from other places, when this happens my machine is inaccessible to me until I get home to fix it. I have another USB Wi-Fi adapter spare: would it be possible to plug that in too, but tell Ubuntu to use it only when the primary Wi-Fi adapter goes offline? (I won't use the other Wi-Fi adapter as the primary because its quality is significantly worse.)
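
    There is no built-in toggle for this, but one low-tech approach is a watchdog script that pings through the primary adapter and brings the spare up only when that fails; a rough sketch (interface names and the test address are assumptions, and the backup connection still needs its own wireless configuration):

        #!/bin/sh
        # Crude failover sketch: run as root from cron every minute
        PRIMARY=wlan0
        BACKUP=wlan1
        if ! ping -c 2 -W 3 -I "$PRIMARY" 8.8.8.8 >/dev/null 2>&1; then
            ifconfig "$BACKUP" up          # the backup still needs its own wpa_supplicant setup
            dhclient "$BACKUP"
        fi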

    Read the article

  • Cannot unlock login screen in 14.04

    - by LittleNooby
    When my computer boots, entering the correct password won't start my session. I found out the problem is the ownership of /home/user/.Xauthority: root owns this file, and giving ownership back to the user solves the problem... for a while. I don't know how or when, but the ownership reverts to root quite often; it can happen after just one boot, or after ten. Is there a definitive solution to this problem?
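
    For reference, the usual way to reclaim the file, plus a check to run whenever the lockout returns; a commonly cited cause of root ending up as the owner is launching graphical programs with plain sudo (rather than gksudo or sudo -H), which can recreate root-owned dotfiles in the home directory:

        sudo chown "$USER":"$USER" ~/.Xauthority    # give the file back to your user
        ls -l ~/.Xauthority                         # check the owner again when the problem reappears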

    Read the article
