Search Results

Search found 4900 results on 196 pages for 'upload'.

Page 127/196

  • How can I run my program on a large number of computers? [closed]

    - by zenpoy
    I'm looking for a (preferably free) service for running an executable I wrote. It's not malicious, it's not a virus, it's not a scam, and if it's really important I can upload the Python source code instead. I wrote a small crawler to gather information regarding the style of web pages for my MA project, and I need a lot more data. EDIT: Here is more information on my problem, how I'm approaching it, and where I'm stuck. As part of my research I'm trying to classify text based on its style (font-family, for now). My data comes from web pages, so I wrote a client/server application: the client is a crawler that gathers this data and sends it to the server. The problem is that something like 99% of the internet is Arial, Verdana and Helvetica; other fonts are far rarer, so I need to spend a very long time gathering enough data on them. Hope this explains it.

    Read the article

  • Retrieving multiple rows from a loop-created form... Stuck.

    - by hangston
    Let me start by saying that I'm new to PHP, but I'm here to learn and would really appreciate your help. I use the following code to pull in data and create a form. This creates up to 40 lines for a user to fill out. Each line consists of the same information: Description, Amount, and Frequency. The remainder of the information needed is generated by the database (see the hidden fields).

        <?php
        $row = 0;
        do {
            $optid = $row_options['option_id'];
            echo "<tr>\n\t<td>" . htmlentities($row_options['option']) . "</td>\n";
            echo "\t<td>" . "<input name='description' type='text' size='40' maxlength='120'/>" . "</td>\n";
            echo "\t<td>" . "<input name='option_id' type='hidden' value='$optid' />$<input name='amount' type='text' size='10' maxlength='7'/>" . "</td>\n";
            echo "\t<td>" . "<select name='assisted_frequency'>
                <option value='Monthly'>Monthly</option>
                <option value='Weekly'>Weekly</option>
                <option value='Daily'>Daily</option>
                <option value='Hourly'>Hourly</option>
                <option value='One-Time'>One-Time</option>
                </select>" . "</td>\n</tr>\n";
            $array[$row] = array(
                $arraydesc[$row] = $_POST['description'],
                $arrayamto[$row] = $_POST['amount'],
                $arrayoptid[$row] = $optid,
                $arrayfreq[$row] = $_POST['frequency'],
            );
            $row++;
        } while ($row_options = mysql_fetch_assoc($options));
        $counter = $row - 1;
        ?>

    I'm having trouble retrieving the information that the user inputs. My intent is to loop through each row after the user has submitted the form, then insert the mix of my database information and the user's information into another table. For example, the user would see, albeit prettier:

        form1
        Option 1: description [input box] amount [input box] frequency [option box]
        Option 2: description [input box] amount [input box] frequency [option box]
        Option 3: description [input box] amount [input box] frequency [option box]
        Option 4: description [input box] amount [input box] frequency [option box]
        [submit]

    Upon submitting the form above, I'm using a query similar to the following to insert the data into the database:

        for ($row = 0; $row <= $counter; $row++) {
            $insertSQL2 = sprintf("INSERT INTO table (option_id, amount, description, frequency) VALUES (%s, %s, %s, %s)",
                GetSQLValueString($arrayoptid[$row], "int"),
                GetSQLValueString($arrayamto[$row], "int"),
                GetSQLValueString($arraydesc[$row], "text"),
                GetSQLValueString($arrayfreq[$row], "text"));
            // code to submit query
        }

    I've tried for, foreach, and arrays (what feels like everything I know) to post each row into the database. I either get just the last row of data, or no data at all. I also worry that the [$row] technique is adding characters to my data. What is the best way to retrieve each row of the user's input, then insert that data (row by row) into the database? Also, I would really appreciate your suggestions for improving my coding technique and the approach I'm taking.
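
    One common explanation, for what it's worth: every generated row reuses the same input names, and PHP keeps only the last posted value for each name, which matches the "just the last row" symptom. A minimal sketch of the usual fix, array-style names, shown against hypothetical handler code rather than the script above:

        // In the form loop: array-style names preserve every row in $_POST.
        echo "<input name='description[]' type='text' size='40' maxlength='120'/>";
        echo "<input name='option_id[]' type='hidden' value='$optid' />";
        echo "<input name='amount[]' type='text' size='10' maxlength='7'/>";
        echo "<select name='frequency[]'> ... </select>";

        // In the submit handler: each field arrives as a parallel array.
        foreach ($_POST['description'] as $row => $description) {
            $optid  = $_POST['option_id'][$row];
            $amount = $_POST['amount'][$row];
            $freq   = $_POST['frequency'][$row];
            // build and run one INSERT per row here
        }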

    Read the article

  • What is a good solution for an intranet video portal (YouTube-like) site?

    - by Ken Pespisa
    I would like an easy-to-setup site to handle videos to be viewed internally by my company. YouTube is essentially the perfect solution except for its being public. I'm looking for a place where a few people can upload videos, and the system will return a page where they can watch that video in a browser. I figure this would involve a dedicated Web server to run the Web application and process the videos. I've searched and I don't think such a system exists, but perhaps there's one out there in its infancy that doesn't rank highly on Google yet. Essentially the site I'm looking for is what MediaWiki is to wikis, or what StackExchange is to Q&A sites, but for videos. Thanks in advance!

    Read the article

  • FTP user configuration on OpenSuse 12

    - by chieroz
    I usually work with Mac OS X servers, but this time I need to set up an FTP service on an OpenSuse 12.2 server and I am a little lost. I am using the remote YaST2 tool via SSH. I created several users who can connect via SSH and/or FTP, so the basic setup is OK. But when connecting via FTP, none of my users has write permissions. The FTP directory for authenticated users is /srv/www/htdocs, which has root:root ownership. The OpenSuse manual says it's bad practice to change these permissions, but my normal users (even the ones in the sudoers list) cannot upload files. So I am stuck: as a workaround I use rsync, but from time to time I just need a working FTP connection. What's the right approach to user permissions in this scenario? Thanks a lot.
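
    One conventional approach, sketched below with an example group name, is to grant write access through a dedicated group instead of loosening root's ownership of the tree:

        groupadd ftpusers                    # example name
        usermod -a -G ftpusers someuser      # repeat for each FTP user
        chgrp -R ftpusers /srv/www/htdocs
        chmod -R g+rwX /srv/www/htdocs
        chmod g+s /srv/www/htdocs            # new files inherit the group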

    Read the article

  • How can I set up an FTP user with a home directory inside another user's home folder?

    - by simon180
    Hi. I have an Ubuntu (Hardy) server which I am using to host multiple websites. All of the sites are stored in subfolders of a public_html folder for my main login to the server and accessed via a single SSH account. I now have a website user who wants FTP (or similar) access to upload various files to the directory where their website is situated; however, I still need the SSH account to have access to this directory, as I may need to make changes using my master account. Basically, I want to create an FTP account (I have VSFTPD installed) for a user whose home directory sits inside my own user account's tree, but they should only be able to read/write in this folder and its subfolders, not climb further up the directory tree. How can I achieve this? Thanks.
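
    A minimal sketch with vsftpd, using hypothetical user and path names: home the FTP user inside the site folder and chroot local users so they cannot climb out. Shared group ownership (with g+s on the folder) then keeps the master SSH account writable too.

        # create the FTP-only user, homed inside the main account's tree
        sudo useradd -d /home/mainuser/public_html/clientsite -s /bin/false clientuser
        sudo passwd clientuser

        # /etc/vsftpd.conf
        chroot_local_user=YES
        write_enable=YES
        # note: if PAM checks shells, /bin/false may need adding to /etc/shells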

    Read the article

  • Why is the resulting file unplayable when I use FFmpeg to convert an MP4 video to ASF?

    - by Fuupoo
    I want to convert an .mp4 file to an .asf file, but I need to keep the size of the file below 150 MB (the server I need to upload the file to can only take files under 150 MB). Basically, I want to recontain the .mp4 in an .asf file. The command I'm currently using is

        ffmpeg -i input.mp4 -c copy output.asf

    The command runs; however, when I try to play the file in Windows Media Player (I'm on Windows XP), I get an error message saying "Windows Media Player has encountered a problem while playing the file." Does anyone have any idea why this is happening, and what other ways I can accomplish this task?
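
    A likely cause, for what it's worth: -c copy keeps the original H.264/AAC streams, which Windows Media Player on XP cannot decode even inside an ASF container, so transcoding to WMV codecs is more likely to play. A sketch (the bitrate is only an example, picked to keep the size down):

        ffmpeg -i input.mp4 -c:v wmv2 -c:a wmav2 -b:v 1200k output.asf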

    Read the article

  • How to allow Mac OS X's native Apache/PHP installation to access WebServer directories?

    - by Martin Bean
    I have a problem bugging me with Mac OS X's native Apache/PHP installation. With my PHP scripts, I have to alter the file permissions on each folder I want to access. For example, in an upload script I would have to set the destination directory to 'read & write' for the group 'everyone'. However, I believe this is not best practice, and I would like all of my directories to be readily writable by PHP. My scripts are stored in /Library/WebServer/Documents/, which is Mac OS X's default directory for serving web pages locally.
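
    One less-permissive alternative, sketched below, is to hand just the writable directories to the Apache user instead of granting 'everyone' write access; on recent Mac OS X versions the bundled Apache runs as _www, and the uploads path here is only an example:

        sudo chown -R _www:_www /Library/WebServer/Documents/uploads
        sudo chmod -R 755 /Library/WebServer/Documents/uploads

    The trade-off is that your own account then needs sudo (or shared group membership) to touch those folders directly.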

    Read the article

  • How do I use a URL path instead of a file path in an Open File dialog in Mac OS X or Chromium OS?

    - by Chris
    In Windows 7 (and perhaps earlier), the default "Open File" dialog box allows you to type a full URL into the "File name" field as if it were a file path, e.g. "http://www.example.com/pic.gif" instead of "C:/windows/pictures/pic.gif". When uploading a file to a website on the client side, say, an image, this allows the client to upload a picture located on a server reachable via that URL, instead of downloading the image, saving it locally, and then referencing the local copy in the "Open File" dialog. It's a great option for Windows users. I have three separate questions: (1) What is this procedure formally called? (2) How do I describe this succinctly so that my searches for more information are fruitful? (3) Can something similar be done in Mac OS X, Chromium OS, or a Linux environment? If so, how? Thanks!

    Read the article

  • How does Amazon's ec2-user get its sudo rights?

    - by Johan
    I am looking for where the default Amazon AMI Linux image sets up the privileges for the default ec2-user account. After logging in with this account I can use sudo successfully. Checking the sudoers file, which I open by running visudo (with no other options), I only see a few default settings and the permission line for root ("root ALL=(ALL) ALL"). So where are the permissions for ec2-user assigned? I have not yet tried to add a new permission, but ultimately I want to reserve ec2-user for systems-management tasks and use a non-root user for administering the applications (stopping and starting mysql and httpd, editing Apache's vhost files, and uploading/editing web content under the web root).
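
    On Amazon Linux the ec2-user grant doesn't usually live in /etc/sudoers itself: cloud-init installs a drop-in under /etc/sudoers.d/ containing a line along the lines of "ec2-user ALL = NOPASSWD: ALL" (the exact filename varies with the AMI version). For the second goal, a sketch of a restricted admin entry, with a hypothetical username and command list:

        # create/edit safely with: visudo -f /etc/sudoers.d/webadmin
        webadmin ALL = NOPASSWD: /sbin/service httpd *, /sbin/service mysqld *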

    Read the article

  • WordPress makes duplicate images of different sizes

    - by Mark
    I have a WordPress website. I upload my pictures to the media library, and when I insert a picture into a post, you get options, one of which asks what size to display it at. Doing this creates a new image file at a different size, rather than WordPress using HTML to control the displayed size of the image. The following image shows how differently sized copies are made of the same picture: http://imgur.com/uVqTN Thank you all in advance.
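
    Those copies are WordPress's intermediate sizes (thumbnail/medium/large), generated at upload time and configurable under Settings > Media; setting the dimensions there to 0 suppresses most of them. A theme-level sketch that stops the resized copies entirely (goes in the active theme's functions.php; the function name is arbitrary):

        // Ask WordPress not to generate any resized copies at upload time.
        function no_intermediate_sizes($sizes) {
            return array();
        }
        add_filter('intermediate_image_sizes_advanced', 'no_intermediate_sizes');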

    Read the article

  • How do I know if 'hg clone' is doing the work remotely?

    - by jjfine
    I've got a very simple Windows install of Mercurial on my machine. The 'central' repository is located at //mymachine/hg-repos/central. I want remote (VPN) users to be able to create clones of this repository inside the hg-repos directory, because that directory gets daily backups. I have given these users full control of the hg-repos directory. My question is this: if I'm on a remote machine and I run the command

        hg clone //mymachine/hg-repos/central //mymachine/hg-repos/central-copy

    is the remote machine doing most of the work? I don't want the client to have to download all of the central repository and then upload it all back, because people are going to be using this from across the country. But I suspect this is what's happening here, since it works so easily.
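
    For context: with a UNC/file path there is no remote Mercurial process at all. The client treats the share as a local filesystem, so it reads the whole source repository and writes the whole destination across the wire, which confirms the suspicion. One way to keep the work on the server is to run the clone on the server itself; a sketch, assuming the share corresponds to a local path like C:\hg-repos on mymachine:

        :: run on mymachine itself (e.g. via Remote Desktop or a scheduled task),
        :: so no repository data crosses the network:
        hg clone C:\hg-repos\central C:\hg-repos\central-copy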

    Read the article

  • Exporting a virtual machine from Windows Azure to Amazon

    - by Somebody
    As the documentation says: "You can import VMware ESX and VMware Workstation VMDK images, Citrix Xen VHD images and Microsoft Hyper-V VHD images for Windows Server 2003, Windows Server 2003 R2, Windows Server 2008 and Windows Server 2008 R2." What about a virtual machine with Ubuntu on it? Will it import successfully? Has anyone tried to do it? Is it possible to import the VM directly from Windows Azure, without needing to download it to my machine and then upload it to Amazon? Thanks.

    Read the article

  • Saving small text files is slow over Win Server 2008 R2 VPN

    - by Buckers
    We have a VPN connection to our Windows Server 2008 R2 machine, and the connection works fine. Large files go back and forth fairly quickly, but we use the connection mainly for working on small text files (.aspx, .asp, .php, etc.). What we find very annoying is that even with the smallest of files, there is a noticeable delay of between 2-5 seconds when saving any changes. As we often make changes to code and are constantly saving, this is becoming a problem. Is there anything that might be causing this delay? Or is there anything we can do to speed it up? The connection is definitely not the issue, as we have a constant 5 Mbps upload from our server and 20 Mbps+ down on the remote machines. Thanks, Chris.

    Read the article

  • Software to monitor internet usage on an XP PC (browser + non-browser)?

    - by user39316
    Hi. Is there any (ideally open source) software for Windows that can be used to monitor internet usage from a PC? It would need to include both browser and non-browser sources (e.g. a service that syncs a calendar to Gmail). Any software on the PC that uses the internet would need to be configured to point to this local monitoring software/proxy, and the monitoring software/proxy would in turn be configured to point to the company proxy server (address, port and credentials). Things that come to mind that might be close, but aren't really focused on solving this, are Charles Proxy, Fiddler 2 and Squid. The idea is that it could give you a daily/weekly/monthly report of internet upload/download usage on a per-program/process/service basis for the PC it is being run on. Thanks.

    Read the article

  • Using trickle to slow down a browser

    - by tester
    According to trickle's man page (http://linux.die.net/man/1/trickle), I can limit the download speed of a process, e.g. trickle -u 10 -d 20 ncftp to launch ncftp(1), limiting its upload capacity to 10 KB/s and its download capacity to 20 KB/s. How would I go about limiting google-chrome or firefox with trickle? Edit: For those of you asking why I asked such an obvious question: I tried trickle -u 10 -d 20 firefox and I'm getting the error "trickle: Could not reach trickled, working independently: No such file or directory". Firefox opens right after, but is definitely not rate limited...
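
    For what it's worth, that error only means the trickled daemon isn't running; trickle then works standalone anyway (-s makes that explicit). Two other gotchas: trickle only shapes dynamically linked programs, and if a Firefox instance is already running, the new firefox process just hands over the URL and exits, leaving the original, unshaped process to do the downloading. A sketch:

        trickle -s -u 10 -d 20 firefox -no-remote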

    Read the article

  • How to give write permissions to multiple users?

    - by Daniel Rikowski
    I have a web server, and I'm uploading files to it using an FTP client. Because of that, the owner and group of each file are those of the user used during the upload. Now I have to make these files writable by the web server (apache/apache). One way would be to just change the owner and group of the uploaded file to apache/apache, but then I cannot modify the file using the FTP account. Another way would be to give the file 777 permissions. Neither approach seems very professional, and both are a little risky. Are there any other options? In Windows I can just add another user to the file. Can something similar be done in Linux?
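
    POSIX ACLs are the close Linux equivalent of adding another user to a file in Windows, provided the filesystem is mounted with ACL support; the paths below are examples:

        setfacl -m u:apache:rw uploaded-file.php          # one file
        setfacl -R -m u:apache:rwX /var/www/uploads       # an existing tree
        setfacl -R -m d:u:apache:rwX /var/www/uploads     # default ACL for new files
        getfacl uploaded-file.php                         # inspect the result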

    Read the article

  • Automate uploading to YouTube

    - by John
    Here's the problem: I would like to keep lots of home-made videos. Of course, they are subject to being lost, or somebody could steal the computer, or water or fire could destroy them. Secondly, I have to plug in my hard drive every time I want to watch something, which I find slow and cumbersome. I was thinking that perhaps I could upload the videos to YouTube with the privacy set to invite-only and then delete the videos from the hard drive automatically. Could this be done?

    Read the article

  • Slow uploads to a Server 2003 share

    - by G V
    I am running an '03 box with shares active. When uploading to the share, the speed is average, about 15-20 Mbps, which is actually bad when you think about it, because it's a direct connection to a couple of machines. When uploading to another server, the connection speed is twice that of the direct storage. When uploading a massive folder (250 GB), the upload starts as normal, but as it progresses it drops in speed; it is now sitting at around 2-7 Mbps. Any ideas on how I can boost the transfer rate? On a side note, the download speed is great, the kind of speed you would expect from this setup; the main problem is uploading and whatever is causing the extreme slowness. Any help would be great.

    Read the article

  • SSL on Apache seems to significantly affect WebDAV performance

    - by takesides
    I'm using Apache 2.2 running on Windows Server 2008 R2 as a WebDAV server for clients to upload large media files (roughly 100-2000 MB). I am finding that when I have SSL enabled (OpenSSL 0.9.8o) and use HTTPS for the uploads, the throughput is around 13 Mbps, but when I disable it and just use HTTP I get around 80 Mbps. I can't understand why this is happening, as my understanding was that the heavy SSL work was done at the beginning of the connection. Does anyone have any idea why the performance is so drastically affected by enabling SSL? Cheers.
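
    The handshake is only the setup cost: every byte of a 100-2000 MB upload is also bulk-encrypted, so the negotiated cipher can dominate throughput. A diagnostic-plus-tuning sketch (the cipher list is illustrative only; weigh speed against your security requirements):

        # compare raw bulk-cipher throughput on the server
        openssl speed rc4 aes-128-cbc des-ede3

        # httpd ssl.conf: prefer a cipher the benchmark shows is fast
        SSLCipherSuite RC4-SHA:AES128-SHA:HIGH:!aNULL:!MD5
        SSLHonorCipherOrder on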

    Read the article

  • ConfigurationErrorsException when serving images via UNC on IIS6

    - by Mark Richman
    I have a virtual directory in my web app which connects to a Samba share via UNC. I can browse the files via Windows Explorer without issue, but my web app throws a yellow screen with the following message: "Description: An error occurred during the processing of a configuration file required to service this request. Please review the specific error details below and modify your configuration file appropriately. Parser Error Message: An error occurred loading a configuration file: Could not find file '\\cluster\cms\qa-images\120400\web.config'." What makes no sense to me is why it's looking for a web.config in that location. I know it's not an authentication issue, because the virtual directory can serve images from its root (i.e. \\cluster\cms\qa-images\test.jpg serves as http://myserver/upload/test.jpg just fine).

    Read the article

  • How do I get these permissions working right so Apache can work with the files?

    - by cosmicbdog
    I am having a go at setting up my own Apache server and can't seem to get my head around the permissions. Let's say I grab a file from somewhere off the web and it has permissions of 600. I then upload this file via FTP to a user directory, which is also an Apache virtual host, and the file retains its permissions of 600. This means that the user can read the file, but Apache can't: it will be forbidden. What is the simplest solution so that Apache can read and write whatever files end up in the user's directory? Can Apache be granted some sort of root power over files in a directory?
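
    If the uploads arrive through an FTP daemon, the permissions on new files come from the daemon's umask, and no root powers for Apache are needed; the sketch below assumes vsftpd and an example path:

        # /etc/vsftpd.conf: new uploads become 644 (dirs 755), readable by Apache
        local_umask=022

        # one-off fix for files already sitting there with 600:
        chmod -R u=rwX,go=rX /home/someuser/public_html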

    Read the article

  • What is the proper way to set up the Apache document root in terms of privileges?

    - by racl101
    I have just installed Ubuntu 9.10 Server Edition on my machine and I wish to run my own personal local server with other users on the same LAN. First, what directory structure is best for the web root? Should I just use /var/www/ and start throwing web documents there, or should I create a folder elsewhere (maybe under a home directory)? Second, only the root user can create documents in /var/www/; however, I wish to have other users be able to create files in the document root and upload them via FTP. Should I change the permissions of the www/ folder? Or, again, should I create the document root elsewhere with different permissions? What is the safest way of doing this?
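
    A common Ubuntu pattern, sketched below with an example group and user, keeps /var/www as the docroot but gives a publishing group write access; Apache (www-data) only needs read access to serve the files:

        sudo groupadd webdev
        sudo usermod -a -G webdev alice                   # each user who publishes
        sudo chgrp -R webdev /var/www
        sudo chmod -R 775 /var/www
        sudo find /var/www -type d -exec chmod g+s {} +   # new files keep the group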

    Read the article

  • Using the Dropbox API instead of an FTP server

    - by Somebody still uses you MS-DOS
    This is a small application scenario. Usually, when you have to back up source code and databases on your server, you use a second FTP server, plus a cron job to tar.gz your DB dumps and source files and send the archive from your application server to the FTP server. Dropbox has created an API to use its infrastructure. Since they provide 2 GB for free accounts, I thought about uploading to it instead of an FTP server. So, if you do some freelance work, you could create a free account for each client and use this approach, perhaps encrypting the files you send. You even gain a revision for each uploaded file, like a revision control system, for free, covering the last 30 days. What do you think of this approach? Is it possible? And, more importantly, what are the security risks involved? (That's why I'm asking this on Server Fault, since sysadmins' point of view will be more accurate.) Thanks!

    Read the article
