Search Results

Search found 19157 results on 767 pages for 'shared folder'.


  • Updating iOS application content which includes images

    - by azamsharp
    I am working on a vegetable gardening application. Apart from the vegetable name and description, I also have a vegetable image. Currently, I have all the images in the Supporting Files folder in the Xcode project. But later on I want to update the application dynamically, without the user having to download a new version. When the user updates the application or downloads new data from the server, that data will include the images. Can I store those images in the Supporting Files folder, or somewhere where they can be referenced by just their name?

    RELATED QUESTION: I will also allow the user to take pictures of their vegetables and then write notes about them, like "just planted" or "about to harvest". What is the recommended approach for storing pictures/photos? I could store them in the user's photo library and then store a reference in the local database, then fetch and display the picture using that reference. The problem with that approach is that if the user accidentally deletes the picture from the library, it will no longer be displayed in my application. The only way around that I see is to store the picture in the app's local database as a BLOB.
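
    One note worth adding: the app bundle (where the Supporting Files group ends up) is read-only on the device, so downloaded images are usually written to the app's Documents directory instead, where they can indeed be referenced by name alone and survive app updates. A minimal sketch; the file name and the imageData variable are illustrative:

        // write a downloaded image into Documents and read it back by name
        NSString *docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                              NSUserDomainMask, YES) objectAtIndex:0];
        NSString *path = [docs stringByAppendingPathComponent:@"tomato.png"];
        [imageData writeToFile:path atomically:YES];   // imageData: NSData from the server
        // ...later, look it up by just the file name:
        UIImage *img = [UIImage imageWithContentsOfFile:path];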


  • What is the best way to configure the number of workers in Apache?

    - by rbm
    My site receives a lot of traffic for 2 hours during the day (2000 hits per minute). The rest of the day it receives less traffic (500 hits per minute). I have been experimenting with the MaxClients and MaxSpareServers values, but I still get some downtime during peak hours. How can I calculate the best values for my configuration based on the amount of RAM that I have? Each process uses about 36-40 MB of memory.

                           total       used       free     shared    buffers     cached
        Mem:                3096        793       2302          0          0          0
        -/+ buffers/cache:   793       2302
        Swap:                  0          0          0

    Values that I am using now:

        <IfModule prefork.c>
            StartServers        10
            MinSpareServers     22
            MaxSpareServers     60
            ServerLimit         90
            MaxClients          90
            MaxRequestsPerChild 400
        </IfModule>
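
    A rough prefork sizing rule: MaxClients is about (RAM available to Apache) divided by (size of one child). With 3096 MB total and, say, about 500 MB held back for the OS and other services, (3096 - 500) / 40 gives roughly 64 workers. The block below is a hedged starting point built from that arithmetic, not tuned values; the 500 MB reserve is an assumption:

        <IfModule prefork.c>
            # (3096 MB - ~500 MB reserved) / 40 MB per child = ~64
            StartServers        16
            MinSpareServers     16
            MaxSpareServers     32
            ServerLimit         64
            MaxClients          64
            MaxRequestsPerChild 400
        </IfModule>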


  • Ruby on Rails: Uploading a modified site.

    - by Califer
    I'm having a heck of a time getting a site I modified to work correctly. I didn't set the site up originally, and since the person who set it up no longer works with me, I had to learn Ruby just to make some changes. I made all the changes on the development server and everything worked fine. Then I did a diff between production and development and moved only my changes over.

    Unfortunately, when I loaded my changes onto the production server I got a lot of errors. I've changed all of the permissions to 755, which took care of being able to access anything at all, but after that I started getting a lot of 500 errors. Nothing showed up in the production.log file. I really have no clue what's going wrong, except that perhaps things are not noticing the new changes. I moved the old site to a backup folder, and the new site crashes whenever it goes to anything that I've changed. In particular, I added a link to a new setup with an extra controller/model/view group. It works fine on development, but in production it gives me a 404. Yes, I did copy all the files up.

    I even put everything back how it was, but the website is still showing the broken version of it. I checked the tmp/cache folder but it was empty. Running dispatch.fcgi shows the old site (which I expected), but it still shows the flawed new site when I connect through a browser. I've been tearing my hair out trying to get this to work. Any ideas as to how I can get this to work?
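
    One pattern the symptoms fit (correct output from dispatch.fcgi on the shell, stale or broken output through the browser) is long-lived FastCGI processes still serving previously loaded code. A hedged sketch of forcing a reload; process names and paths depend on how the host runs Rails:

        # kill the running Rails FastCGI dispatchers; the web server will
        # respawn them against the freshly deployed code
        pkill -f dispatch.fcgi
        # on Passenger-based hosts the equivalent is:
        # touch /path/to/app/tmp/restart.txt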


  • How to exclude IP from htaccess domain redirect

    - by ijujym
    I'm trying to write a custom redirect rule for testing purposes on 2 domains with exactly the same site. The code I am using is:

        RewriteEngine on
        RewriteCond %{REMOTE_ADDR} !^1\.2\.3\.4$
        RewriteCond %{HTTP_HOST} ^.*site1.com [NC]
        RewriteRule ^(.*)$ http://www.site2.com/$1 [R=301,L]

    What I want is to redirect all requests for site1 to site2, except for requests from IP address 1.2.3.4. But currently, requests from that IP are also being redirected to site2. Is there something I've missed in the settings? (Note: both domains are on the same shared hosting account.)
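
    A hedged guess: on shared hosting behind a reverse proxy or load balancer, %{REMOTE_ADDR} holds the proxy's address rather than the visitor's, so the exclusion never matches. A sketch that also checks the X-Forwarded-For header; whether the host actually sets that header is an assumption:

        RewriteEngine On
        # skip the redirect when either the direct address or the
        # proxy-supplied client address is 1.2.3.4
        RewriteCond %{REMOTE_ADDR} !^1\.2\.3\.4$
        RewriteCond %{HTTP:X-Forwarded-For} !^1\.2\.3\.4
        RewriteCond %{HTTP_HOST} ^(www\.)?site1\.com [NC]
        RewriteRule ^(.*)$ http://www.site2.com/$1 [R=301,L]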


  • Permission denied when trying to execute a binary burned to a CD-R

    - by user16654
    On an Ubuntu 9.10 (Karmic Koala) machine, I burned a CD from the command prompt using:

        cdrecord -v speed=16 dev=0,1,0 /FPS.iso

    The CD now contains an executable and some files. I tested the CD by loading it onto another machine (Red Hat 5.3), and when I try to run the program I get the following message:

        bash: ./FPS1_1: Permission denied

    I can open other files like text documents (the executable also comes with shared libraries). I realized I had burned the CD as root, so I burned another one as another user, but I still have the same problem. How can I remove this permission or what is the problem? P.S. the image was in / if that helps.
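
    Two likely culprits, offered as guesses: the disc is mounted with the noexec option, or the ISO was mastered without Rock Ridge extensions (mkisofs -R), in which case Unix permission bits, including execute, are not preserved on the disc. A sketch for checking and working around both; the mount point is illustrative:

        # check how the disc is mounted; 'noexec' here would explain the error
        mount | grep -i cdrom
        # try remounting with exec enabled
        sudo mount -o remount,exec /media/cdrom
        # or run the binary from a writable copy instead
        cp /media/cdrom/FPS1_1 /tmp/ && chmod +x /tmp/FPS1_1 && /tmp/FPS1_1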


  • My Local Fileshare ClickOnce Update Is Not Working, Help?

    - by Soo
    I have a C# application that I'm trying to get to update automatically via ClickOnce. After publishing newer versions of the software, I see the new versions in my publish folder, but when I open the application, it checks for updates and does nothing (even though there are new files in the publish folder). What do I need in place for updates to be made automatically?

    Edit:
    - What version of Visual Studio are you using? Visual Studio 2008
    - Are you deploying the upgrades to the same location as the old version? They are being published to the same location (not sure about deployed)
    - Is the installation URL the same? Have you incremented the version number? Yes
    - In the Updates dialog reached by clicking the Updates button in the Publish page, do you have "The application should check for updates" checked? Yes
    - Do you have "Before the application starts" selected? Yes
    - How are you deploying the files? Not sure
    - Are you copying them over to the file share or publishing them directly? Publishing directly
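
    A hedged diagnostic: the automatic startup check fails silently, but calling the deployment API directly surfaces the underlying error. A minimal sketch using System.Deployment (WinForms assumed; add a reference to System.Deployment):

        using System.Deployment.Application;
        using System.Windows.Forms;

        static void CheckForUpdateVerbose()
        {
            if (!ApplicationDeployment.IsNetworkDeployed)
            {
                MessageBox.Show("Not running as a ClickOnce install.");
                return;
            }
            try
            {
                ApplicationDeployment deploy = ApplicationDeployment.CurrentDeployment;
                UpdateCheckInfo info = deploy.CheckForDetailedUpdate();
                if (info.UpdateAvailable)
                {
                    deploy.Update();          // download and apply the new version
                    Application.Restart();    // restart into it
                }
                else
                {
                    MessageBox.Show("No update found at " + deploy.UpdateLocation);
                }
            }
            catch (DeploymentException ex)
            {
                // the exception text usually names the real problem
                // (wrong update URL, unreachable share, manifest mismatch)
                MessageBox.Show(ex.Message);
            }
        }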


  • How to move mail accounts when migrating webhosting

    - by pkswatch
    I am migrating my website abc.com from one web hosting company to another, in a shared hosting environment. Both have cPanel, and the second hosting account I am preparing to move to is my multi-domain hosting account, with 3 domains already in it.

    The problem is, I have many email accounts associated with my website abc.com, which are accessed using webmail. So if I move it to the other host, will I lose all those accounts and their emails? If yes, then how should I synchronise the email accounts so that all the accounts and the contained emails remain intact? I saw several sync tools like IMAP Sync, etc., but these require two hosts while synchronizing, and as you see, I have just one domain name to be synchronized over 2 servers.

    PS: I do not have any ssh access on either of them, and I have made a complete backup of all files using the backup wizard in cPanel.
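
    The two "hosts" an IMAP sync tool wants are just the two servers' addresses; pointing host1 at the old server's IP and host2 at the new server's IP sidesteps the fact that the domain name can only resolve to one of them at a time. A hedged imapsync sketch, run once per mailbox from any machine that can reach both servers (no ssh needed); addresses and credentials are placeholders:

        imapsync \
          --host1 203.0.113.10  --user1 info@abc.com --password1 'oldpass' \
          --host2 198.51.100.20 --user2 info@abc.com --password2 'newpass'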


  • Homegroup libraries visible but can't be opened

    - by majocha
    I have 2 Windows 7 PCs in a home network. When I create a homegroup, the libraries I share are visible on the opposing machine, but when I double-click them nothing happens: no error message, nothing at all. It is the same on both computers. The homegroup works to the extent that the printer I shared is visible and can print from the other computer. Only the libraries are inaccessible.

    What I tried and checked so far, without result:
    - disabling the firewall
    - file and printer sharing and the MS Networks client are enabled in the network properties
    - deleting and recreating the homegroup
    - resetting the homegroup by deleting C:\Windows\ServiceProfiles\LocalService\AppData\Roaming\PeerNetworking
    - unsharing and sharing the libraries again
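
    One more check, offered as a guess since the question doesn't mention it: HomeGroup file sharing depends on several peer-networking services that the printer path does not, so a stopped service can produce exactly this split behavior. The service names below are the Windows 7 ones:

        sc query HomeGroupListener
        sc query HomeGroupProvider
        sc query p2psvc
        sc query PNRPsvc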


  • Type 1 Hypervisor on the desktop

    - by Blazemore
    I have a powerful home PC, and I've used VirtualBox to run Linux distros in Windows (and vice versa). I'm interested in trying out a lightweight type 1 hypervisor to run all my operating systems (Windows 7, Debian, Arch) and was looking for suggestions on which to pick and how to implement this. From what I gather, a type 1 hypervisor is a lightweight OS which simply provides VM management functionality. Will I get reasonable performance under each guest OS? Can all the guest OSes have access to a shared data drive, or is it best to have a storage server in another guest OS and mount it over the virtual network? What about gaming: is this feasible, or will I realistically need to run Win7 on bare metal? I'd appreciate any input.


  • Is it possible, in ASP.NET, to reference a private assembly in a non-virtual subdirectory?

    - by Bago
    Is it possible to reference a private assembly in ASP.NET from a sub-folder that is not set up as a virtual directory? In other words, my page is set up in ~/subdir, I don't have access to ~/, and I am not an IIS admin. Can I reference a private assembly? How would I do this?

    I've tried <%@ Assembly Src="/subdir/bin/Assembly.dll" %> and <%@ Assembly Src="/subdir/bin/Assembly.dll" %>, but I get the messages "There is no build provider to match the extension .dll" or "Failed to map to path" respectively. Here is my folder structure:

        /
        |-- subdir
            |-- Bin
            |   |-- Assembly.dll
            |-- Default.aspx

    I've heard that <probing> in web.config might do the trick, but when I've tried it, it doesn't seem to work. Furthermore, I've read that <probing> only works in the application .config file (i.e., the one in ~/). Anyhow, I already tried adding the following to web.config:

        <runtime>
          <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
            <probing privatePath="/subdir/bin" />
            <dependentAssembly>
              <codeBase href="/subdir/bin/Assembly.dll"/>
            </dependentAssembly>
          </assemblyBinding>
        </runtime>

    For more background on my problem: I am simply using a shared host, all I have access to is that subdirectory, and I am trying to use FCKeditor.
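
    A note on the snippet above, as a hedged observation: privatePath entries are interpreted relative to the application base, so a leading slash like "/subdir/bin" does not probe correctly. If the application base is the site root (which the poster cannot administer), the relative form would look like this; whether the host honors it is an assumption:

        <configuration>
          <runtime>
            <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
              <!-- paths are relative to the application root;
                   separate multiple directories with semicolons -->
              <probing privatePath="subdir/bin" />
            </assemblyBinding>
          </runtime>
        </configuration>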


  • addField type image and thumbnail path

    - by Christiaan
    Hello, I have a Magento webshop and just created a custom module with the Modulecreator extension. This module comes with a standard admin interface that can handle file uploads. I found out that if you want to show thumbnails, you can use the 'image' field type (addField('myfield', 'image')) instead of the 'file' type field.

    But now I've got a problem. When I upload files, I save them in a subdirectory called 'slides' in the 'media' directory. But when I edit the item, the image path for the thumbnail next to the upload field is set to the 'media' folder, not my folder, as I set it to 'media/slides/'. I use the following code:

        $fieldset->addField('filename', 'image', array(
            'label'    => Mage::helper('slideshow')->__('File'),
            'required' => true,
            'name'     => 'filename',
        ));

    I tried to set a 'path' key in the array, but this doesn't get picked up by Magento. I hate the lack of good, easy-to-use documentation for Magento... Maybe you can help me out to find a solution?
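
    A hedged sketch of one common workaround: subclass the image form element so the thumbnail URL is built against the real media subdirectory, then register that class for the field type. Class and module names here are hypothetical; the overridden method follows Magento 1's Varien_Data_Form_Element_Image:

        // app/code/local/.../Helper/Image.php (hypothetical location)
        class My_Slideshow_Block_Adminhtml_Slide_Helper_Image
            extends Varien_Data_Form_Element_Image
        {
            protected function _getUrl()
            {
                if ($this->getValue()) {
                    // build the thumbnail URL under media/slides/
                    return Mage::getBaseUrl(Mage_Core_Model_Store::URL_TYPE_MEDIA)
                        . 'slides/' . $this->getValue();
                }
                return parent::_getUrl();
            }
        }

        // in the form block, register the element type before addField():
        $fieldset->addType('image', 'My_Slideshow_Block_Adminhtml_Slide_Helper_Image');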


  • Serve PHP page in web root but show contents in subdirectories

    - by David
    I have a web site on a shared hosting server. My directory layout looks like this:

        /home
          /user
            /public_html
              /pics
                /family

    There is an index.php file in public_html. I need help writing .htaccess rules that will:

    1. Serve the index.php file when www.domain.org is requested
    2. Force the user back to public_html when www.domain.org/pics is requested
    3. Allow the user to see the directory contents when www.domain.org/pics/family is requested

    I experimented with a lot of combinations of RewriteCond and RewriteRule, but I don't understand the documentation and examples well enough to know if what I want to do is even possible. The web server application is some version of Apache.
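
    For what it's worth, this may not need mod_rewrite at all; a hedged sketch with per-directory .htaccess files, assuming the host's AllowOverride permits Indexes/Options and mod_autoindex is loaded:

        # public_html/.htaccess  (requirement 1)
        DirectoryIndex index.php

        # public_html/pics/.htaccess  (requirement 2: no listing here,
        # send requests for the bare directory back to the site root)
        Options -Indexes
        RedirectMatch 302 ^/pics/?$ http://www.domain.org/

        # public_html/pics/family/.htaccess  (requirement 3: allow listing)
        Options +Indexes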


  • Linux Apache development configuration

    - by Jeffrey Vandenborne
    Recently I reinstalled my system and came to a point where I need Apache and PHP. I've been searching a long time, but I can't figure out how to configure Apache the best way for a developer machine. The plan is simple: I want to install Apache 2 and a MySQL server so I can develop some PHP websites. I don't want to install a LAMP bundle though, just apache2, php5 and mysql.

    The problem that I've been looking for an answer to is the permissions on the /var/www/ folder. I've tried making it my folder using the chown command, followed by chmod -R 755 /var/www. Most things work then, but fwrite for example won't, because I would need to give write permissions to everyone, and unless I change my global umask to 000 I'm not sure what I can do.

    In short: I want to install apache2, php5 and mysql-server without using a LAMP bundle, configured in a way so I can open up NetBeans, start a project with its root in /var/www/, and run every single function without permission faults. Does anyone have experiences or workarounds for this?

    Extra:
    OS: Ubuntu 10.04
    ARCH: x86_64
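
    A minimal sketch of the usual arrangement on Ubuntu, assuming Apache runs as www-data (the default): share the directory through the group instead of loosening the umask, so both your user and PHP's fwrite() can write.

        # add yourself to Apache's group and give the group write access
        sudo adduser "$USER" www-data
        sudo chown -R "$USER":www-data /var/www
        sudo chmod -R g+w /var/www
        # setgid on directories so new files inherit the www-data group
        sudo find /var/www -type d -exec chmod g+s {} \;
        # log out and back in for the new group membership to take effect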


  • Can't copy files from network drive

    - by user630320
    I have a weird problem with copying files. When I copy a file from the network drive into the C drive nothing happens, but when I copy a file from the network drive to the desktop, the copy works. Also, if I copy files from the desktop into C it works fine. I have full local admin permissions on this PC and the network drive.

    I have tried these things:
    - created a new profile
    - ran Windows Update
    - ran chkdsk

    I'm using Windows XP Pro, 32-bit.

    Update:
    Network path: \\server1\shared\folder
    PC: C:\ (this doesn't work)
    C:\Documents and Settings\Userid\Desktop (this works fine)
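
    A hedged diagnostic step: Explorer sometimes fails silently where the command line reports the actual error (access denied, quota, path problems). Trying the same copy from cmd.exe may name the cause; the file name below is a placeholder:

        copy "\\server1\shared\folder\somefile.txt" "C:\"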


  • Optimize Windows file access over network

    - by Djizeus
    At my company I frequently need to access shared files over a Windows network. These files are located on the other side of the planet, so I guess the file share goes through some kind of VPN over the Internet, but I don't control this and it is supposed to be "transparent" to me. However, it is extremely slow: displaying the contents of a directory in the file explorer takes about 10 s. Even over the Internet, I did not expect that retrieving a list of file names would take that long. Are there any settings to optimize this from my Windows XP workstation, or is it mostly related to the way the network is configured? The only thing I have found so far is to cache all file names, while by default only short file names are cached (http://support.microsoft.com/kb/843418).


  • How to keep programs from source up to date?

    - by wizard
    I'm designing a new server setup for hosting multiple websites (shared hosting for my clients over at SliceHost). I've recently moved away from the traditional LAMP setup and chosen Ubuntu, Nginx, php-fpm and MySQL. I like it a lot better than my old Apache, suphp, MySQL setup: it works great, provides encapsulation between sites and uses substantially less memory.

    However, I have one major maintenance problem. In order to have a recent version of Nginx, and in order to use php-fpm, I've had to compile these programs from source. The reason I see this as a problem is that keeping track of updates and build configurations will end up being a lot of work. For two programs (and a patch) I can handle it, but it seems like this setup would not scale to many packages and servers. Are there good ways to manage this situation? I'm sure people do this all the time.
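
    A hedged approach for the tracking half of this: wrap each source build with checkinstall, which turns 'make install' into a .deb that dpkg can list, version and cleanly remove. Version numbers and configure flags below are placeholders:

        sudo apt-get install checkinstall
        cd nginx-0.8.54
        ./configure --with-http_ssl_module
        make
        sudo checkinstall --pkgname=nginx-custom --pkgversion=0.8.54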


  • Windows Server 2003 SP2 as LDAP replica master for Mac OS X 10.6

    - by FrancoR
    Hello there, we have a single Windows 2003 domain controller with a few child domains. All the users are in the main DC. We have already created a connection from AD to a Mac Xserve running 10.6 and can read all the users, but:

    1. If the DC goes down (or the network does), the Mac loses all the users, so no file access, no email, no nothing.
    2. The users are read-only. The Mac admin cannot reset passwords, change attributes and so on.

    What we need is a stable environment where both AD admins and LDAP admins can manage the users; if one server goes offline, the users of the other server should still work (email, shared folders) just fine. Thanks in advance.

    P.S. We already tried to connect the Mac OS X server to Windows LDAP instead of AD, but we're unable to do it: Mac OS X requires the DNS IP (gotcha), an admin user and password (OK), and a root LDAP password, for which we're unable to find any reference in Windows 2003.


  • Running Sitecore Production Site under a Virtual Directory

    - by danswain
    We are using Sitecore 6 on a Windows Server 2003 (32-bit) dev machine. I know it's not recommended for the CMS editing site, but we've been told it is possible to get the front-end Sitecore websites to run from within a virtual directory. Here's the issue: we'd like to achieve what the poor man's diagram below shows. We have a website (.net 1.1):

        /WebSiteRoot (.net 1.1)
        |
        |---- Custom .net 1.1 Web Application
        |
        |---- SiteCore frontend WebApplication (.net 2.0)
        |
        |---- Custom .net 2.0 WebApplication

    The Sitecore web application would contain the Sitecore pipeline in its web.config, and we'd make use of the <sites> section to configure the virtual folder to allow for where our Sitecore app sits and point it to the appropriate place in the content tree. Is it possible to pull this off?

    This is just the customer-facing website; there will be no CMS editing functionality on these servers. That will be done from a more standard Sitecore install inside the firewall on a different server.

    The errors we're encountering are centered around loading the various config files in the App_Config folder. It seems to do a Server.MapPath on "/" initially (which is wrong for us), so we've tried putting absolute paths in the web.config and still no joy (I think there must be some hardcoded piece that looks for the Include directory). Any help would be greatly appreciated. Thanks
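
    For reference, a hedged sketch of what a site definition in Sitecore's <sites> section looks like, since that is where virtualFolder is set; attribute values here are illustrative, not taken from the poster's setup:

        <sites>
          <site name="website"
                virtualFolder="/sitecorefrontend"
                physicalFolder="/sitecorefrontend"
                rootPath="/sitecore/content"
                startItem="/home"
                database="web"
                domain="extranet" />
        </sites>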


  • git submodule svn external

    - by Jason
    Let's say I have 3 git repositories, each with a lib and tests folder in the root. All 3 repositories are part of what I want to be a single package; however, it is important to me to keep the repositories separate. I am new to git, coming from svn, so I have been reading up on submodules and how they differ from svn:externals.

    In SVN I could have a single lib/vendor/package directory, and inside package I could set up 3 externals pointing to each of my 3 repositories' lib directory, renaming it appropriately, like:

        lib/vendor/package/a -> repo1/lib
        lib/vendor/package/b -> repo2/lib
        lib/vendor/package/c -> repo3/lib

    but from my understanding this is not possible with git. Am I missing something? Really I'm hoping this can be solved in one of two ways:

    1. Someone will point out how to create a 4th git repository which has the other 3 as submodules, organized as I have mentioned above (where I can have an a, b, and c folder inside the root), or
    2. Someone will point out how to set this up using svn:externals in combination with GitHub's svn support, referencing the lib directory within each git repository (from my understanding this is impossible).
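
    On option 1, a hedged sketch: a submodule always checks out the whole repository, not a subdirectory, so the usual workaround is a superproject that adds the three repos as submodules and exposes their lib directories through symlinks. Repository URLs below are placeholders:

        git init superproject && cd superproject
        git submodule add git://example.com/repo1.git vendor/repo1
        git submodule add git://example.com/repo2.git vendor/repo2
        git submodule add git://example.com/repo3.git vendor/repo3
        mkdir -p lib/vendor/package
        # symlinks stand in for svn:externals' "mount a subdirectory" trick
        ln -s ../../../vendor/repo1/lib lib/vendor/package/a
        ln -s ../../../vendor/repo2/lib lib/vendor/package/b
        ln -s ../../../vendor/repo3/lib lib/vendor/package/c
        git add .gitmodules vendor lib
        git commit -m "superproject layout with submodules + symlinks"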


  • Gnome+NX clipboard behavior; auto-copy on select?

    - by threecheeseopera
    I am having issues with the Gnome (Linux/Debian+Ubuntu) clipboard when connected remotely; its default behavior appears to be to automatically add text to a clipboard buffer when that text is selected. This is not usually a problem, until I need to log into one of these systems remotely (with a GUI) and attempt to use a shared clipboard. If I 'copy' text on the local machine (destined to replace some text on the remote machine), that copy buffer is overwritten as soon as I select the text on the remote machine to be replaced. Is there some way around this? It sort of drives me nuts. Thanks!

    UPDATE: This is really an NX server issue; X11 supports multiple clipboards ("selections": clipboard, primary, secondary; see this excellent article) that behave differently, and it appears that my problem is related to how NX server translates this over to the host machine.
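
    The two buffers can be inspected directly with xsel, which makes the PRIMARY/CLIPBOARD distinction visible (xsel must be installed; xclip has equivalent flags):

        # PRIMARY is filled by merely selecting text with the mouse;
        # CLIPBOARD only by an explicit copy (Ctrl+C / Edit -> Copy)
        xsel --primary --output
        xsel --clipboard --output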


  • Graphing/charting of CPU utilisation

    - by Peter
    Nagios can be good at graphing particular resource utilisation or other metrics, but I'm looking for a tool that can output a chart or other graphical representation of how much CPU time/CPU utilisation all services on a server are currently consuming. I think New Relic could probably achieve this to an extent, but I was wondering if there is a popular open source app used for this.

    In case I am explaining this badly, my actual problem is that I have a shared server with suexec enabled (i.e. httpd CGI running under multiple user accounts). I'd like to know which users are using the most CPU during periods of the day.
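
    Not a graphing app, but standard tools can already answer "which account is consuming CPU right now"; a one-liner sketch that sums the %CPU column per user:

        ps -eo user,pcpu --no-headers \
          | awk '{cpu[$1] += $2} END {for (u in cpu) printf "%-12s %6.1f\n", u, cpu[u]}' \
          | sort -k2 -rn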


  • Time on files differs by 1 sec, causing Robocopy sync to fail

    - by csmba
    I am trying to use Robocopy to sync (/IMG) a folder on my PC and a shared network drive. The problem is that the file attributes differ by 1 sec between the two locations (creation, modified and access), so every time I run Robocopy, it syncs the files again. BTW, the problem is the same if I delete the target file and robocopy it fresh: the new file still has properties that differ by 1 sec.

    Env details:
    Source: Windows 7, 64-bit
    Target: WD My Book World Edition NAS, 1 TB, which takes its time from the online NTP pool.ntp.org (I don't know whether its file system is FAT or not)
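
    This matches a known quirk: many NAS devices store timestamps with FAT-style 2-second granularity, so times get rounded and never compare equal. Robocopy's /FFT switch tells it to compare times with that coarser granularity; a hedged sketch (paths are placeholders, and /MIR is an assumption about the intended mirror switch):

        robocopy C:\source \\MyBookWorld\share /MIR /FFT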


  • Directory permissions on Ubuntu Server 10.04 LTS

    - by SebastianOpperman
    I have set up a second drive on Ubuntu Server. The directory displays correctly, but Windows users cannot write or create files in the directory. I have Samba set up so Windows can access the drives. Here is the last bit of my /etc/samba/smb.conf:

        [personeel]
            path = /media/windows
            browsable = yes
            guest ok = yes
            writable = yes
            read only = no
            create mask = 0775
            directory mask = 0775

    I want the directory to be shared with writable permissions for everyone who can access the Ubuntu Server. I have tried sudo chmod, but to no success. Any help would be appreciated.
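
    A hedged pointer: Samba can never grant more access than the underlying filesystem allows, and with guest ok = yes access maps to the guest account (commonly 'nobody' on Ubuntu), so that account needs write permission on the directory. A sketch, assuming /media/windows is a Unix filesystem (for FAT/NTFS volumes, permissions are set via mount options instead):

        sudo chown -R nobody:nogroup /media/windows
        sudo chmod -R 0775 /media/windows
        # alternatively, add 'force user = nobody' to the share definition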


  • How can I let my users set PHP.ini settings for wordpress?

    - by jldugger
    I set up a WordPress server from a fairly standard Ubuntu 9.10 for a class, and they're constantly running into problems with the default php.ini settings. First the memory settings were too low, then the file upload limits were too small, etc. More concerning was a WordPress-wide blank page that I suspect was killed for RAM consumption, but turning on PHP errors in php.ini didn't reveal anything! I'm not familiar with shared hosting, but I feel there's a way such places allow users to edit these things without needing me to intervene and restart Apache.
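
    With mod_php, the usual shared-hosting pattern is to let users override selected directives per directory from .htaccess, with no Apache restart needed. A hedged sketch (requires AllowOverride Options, or All, for the directory; the values are examples):

        php_value memory_limit 128M
        php_value upload_max_filesize 16M
        php_value post_max_size 20M
        php_flag display_errors on

    Anything that must stay under the admin's control can instead be pinned with php_admin_value in the vhost, which .htaccess cannot override.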

