Search Results

Search found 69034 results on 2762 pages for 'file locking'.


  • Merge n files using a C program

    - by Amal
    I am writing a download accelerator, so I download a file from the web server in n parts. Now I want to merge those parts into a single file, using the following code. The file names are in the correct order, but the output file I am getting is different from the original download file. Can you tell me where the error could be?

        int cbd_merge_files(const char** filenames, int n, const char* final_filename)
        {
            FILE* fp = fopen(final_filename, "wb");
            if (fp == NULL)
                return 1;
            char buffer[4097];
            for (int i = 0; i < n; ++i) {
                const char* fname = filenames[i];
                FILE* fp_read = fopen(fname, "rb");
                if (fp_read == NULL)
                    return 1;
                int n;
                while ((n = fread(buffer, sizeof(char), 4096, fp_read))) {
                    int k = fwrite(buffer, sizeof(char), n, fp);
                    if (!k)
                        return 1;
                }
                fclose(fp_read);
            }
            fclose(fp);
            return 0;
        }
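    A minimal sketch of the same merge written a little more defensively: it does not shadow the outer n, it treats a short fwrite as an error, and it checks ferror and fclose so a silently truncated part cannot go unnoticed. If the merged file still differs from the original, the parts themselves were probably saved wrongly (overlapping or missing byte ranges) rather than the merge. The helper name and chunk size are illustrative, not taken from the question.

        #include <stdio.h>

        /* Append the whole of src_path to dst; 0 on success, 1 on any I/O error. */
        static int append_file(FILE *dst, const char *src_path)
        {
            char buffer[4096];
            size_t nread;
            FILE *src = fopen(src_path, "rb");
            if (src == NULL)
                return 1;
            while ((nread = fread(buffer, 1, sizeof buffer, src)) > 0) {
                if (fwrite(buffer, 1, nread, dst) != nread) {   /* short write = error */
                    fclose(src);
                    return 1;
                }
            }
            if (ferror(src)) {                                  /* read error, not EOF */
                fclose(src);
                return 1;
            }
            return fclose(src) == 0 ? 0 : 1;
        }

        int merge_files(const char **filenames, int count, const char *final_filename)
        {
            FILE *dst = fopen(final_filename, "wb");
            if (dst == NULL)
                return 1;
            for (int i = 0; i < count; ++i) {
                if (append_file(dst, filenames[i]) != 0) {
                    fclose(dst);
                    return 1;
                }
            }
            return fclose(dst) == 0 ? 0 : 1;                    /* flush errors surface here */
        }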


  • Set Mime_type validation in Symfony

    - by Ngu Soon Hui
    I want to make sure that during file upload only files of the formats JPEG, PNG and GIF are allowed, so the "File of type:" line in the screenshot must show jpeg, png and gif. I did the following for my validator in Symfony:

        $this->setValidator('upload your filehere', new sfValidatorFile(
            array(
                'required'   => true,
                'mime_types' => array('image/jpeg, image/png, image/gif')
            ),
            array('mime_types' => 'only jpeg')
        ));

    but when I click on the upload button, the files are not filtered accordingly; what did I do wrong? I also tried

        $this->setValidator('upload your filehere', new sfValidatorFile(
            array(
                'required'   => true,
                'mime_types' => 'image/jpeg, image/png, image/gif'
            ),
            array('mime_types' => 'only jpeg')
        ));

    but that doesn't work either. I tried the following in my form class, but unfortunately it doesn't work as well:

        <?php
        class UploadCaseForm extends sfForm
        {
            public function configure()
            {
                $this->setWidgets(array('Documents' => new sfWidgetFormInputFile()));
                $this->widgetSchema->setNameFormat('UploadCase[%s]');
                $this->validatorSchema['Documents'] = new sfValidatorFile(
                    array('mime_types' => 'web_images'),
                    array(
                        'invalid'    => 'Invalid file.',
                        'required'   => 'Select a file to upload.',
                        'mime_types' => 'The file must be of JPEG, PNG or GIF format.'
                    )
                );
            }
        }
        ?>

    Here's the action code:

        public function executeIndex(sfWebRequest $request)
        {
            $this->form = new UploadCaseForm();
            if ($this->getRequest()->getMethod() == sfRequest::POST) {
                var_dump($request->getParameter('UploadCase'));
            }
        }

    Edit: The accepted answer is the answer as far as server-side validation goes. If anyone wants client-side validation, i.e. filtering the type of file that can be uploaded before the request ever hits the server, then there is probably no reliable way to do it, because of browser constraints.
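    For what it's worth, sfValidatorFile expects mime_types either as an array of individual type strings or as the name of one of its built-in groups (such as web_images); a single comma-separated string inside an array matches nothing. A sketch of the first validator written that way, reusing the question's own field name:

        $this->setValidator('upload your filehere', new sfValidatorFile(
            array(
                'required'   => true,
                'mime_types' => array('image/jpeg', 'image/png', 'image/gif'),
            ),
            array(
                'mime_types' => 'The file must be a JPEG, PNG or GIF image.',
            )
        ));

    Note this is server-side only; the browser's file picker and the "File of type:" hint are not driven by the validator.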


  • How do I cover unintuitive code blocks?

    - by naivedeveloper
    For some reason, I'm having a hard time trying to cover the block of code below. This code is an excerpt from the UNIX uniq command. I'm trying to write test cases to cover all blocks, but can't seem to reach this one:

        if (nfiles == 2) {
            // Generic error routine
        }

    In context:

        int main (int argc, char **argv)
        {
            int optc = 0;
            bool posixly_correct = (getenv ("POSIXLY_CORRECT") != NULL);
            int nfiles = 0;
            char const *file[2];

            file[0] = file[1] = "-";
            program_name = argv[0];
            skip_chars = 0;
            skip_fields = 0;
            check_chars = SIZE_MAX;

            for (;;) {
                /* Parse an operand with leading "+" as a file after "--" was seen;
                   or if pedantic and a file was seen; or if not obsolete. */
                if (optc == -1
                    || (posixly_correct && nfiles != 0)
                    || ((optc = getopt_long (argc, argv, "-0123456789Dcdf:is:uw:",
                                             longopts, NULL)) == -1)) {
                    if (optind == argc)
                        break;
                    if (nfiles == 2) {
                        // Handle errors
                    }
                    file[nfiles++] = argv[optind++];
                } else
                    switch (optc) {
                    case 1: {
                        unsigned long int size;
                        if (optarg[0] == '+'
                            && posix2_version () < 200112
                            && xstrtoul (optarg, NULL, 10, &size, "") == LONGINT_OK
                            && size <= SIZE_MAX)
                            skip_chars = size;
                        else if (nfiles == 2) {
                            // Handle error
                        } else
                            file[nfiles++] = optarg;
                    } break;
                    }
            }
        }

    Any help would be greatly appreciated. Thanks.
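    In case it helps to see it in terms of inputs: nfiles only reaches 2 after two file operands have been stored, so the block fires when a third operand turns up on a path where operands are consumed directly via optind (after "--", or in POSIXLY_CORRECT mode) rather than handed back by getopt_long. A hedged sketch of invocations that should drive both guards; I have not run these against your exact build:

        # after "--", getopt_long returns -1 and the remaining operands are
        # consumed by the first branch, so the third one lands on nfiles == 2
        ./uniq -- one two three

        # without "--" (and without POSIXLY_CORRECT) operands come back through
        # getopt_long as option 1, so a third operand should exercise the
        # 'else if (nfiles == 2)' inside case 1 instead
        ./uniq one two three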


  • Issue changing innodb_log_file_size

    - by savageguy
    I haven't done much tweaking in the past, so this might be relatively easy, but I am running into issues. This is what I do:

        1. Stop MySQL
        2. Edit my.cnf (changing innodb_log_file_size)
        3. Remove ib_logfile0/1
        4. Start MySQL

    It starts fine, however all InnoDB tables give the ".frm file is invalid" error and the status shows the InnoDB engine is disabled, so I obviously go back, remove the change, and everything works again. I was able to change every other variable I've tried, but I can't seem to find out why InnoDB fails to start even after removing the log files. Am I missing something? Thanks.

    Edit: Pasting the log below. It looks like it still finds the log file even though the files are not there?

    Shutdown:

        090813 10:00:14  InnoDB: Starting shutdown...
        090813 10:00:17  InnoDB: Shutdown completed; log sequence number 0 739268981
        090813 10:00:17 [Note] /usr/sbin/mysqld: Shutdown complete

    Startup after making the changes:

        InnoDB: Error: log file ./ib_logfile0 is of different size 0 5242880 bytes
        InnoDB: than specified in the .cnf file 0 268435456 bytes!
        090813 11:00:18 [Warning] 'user' entry '[email protected]' ignored in --skip-name-resolve mode.
        090813 11:00:18 [Note] /usr/sbin/mysqld: ready for connections.
        Version: '5.0.81-community-log'  socket: '/var/lib/mysql/mysql.sock'  port: 3306  MySQL Community Edition (GPL)
        090813 11:00:19 [ERROR] /usr/sbin/mysqld: Incorrect information in file: './XXXX/User.frm'
        090813 11:00:19 [ERROR] /usr/sbin/mysqld: Incorrect information in file: './XXXX/User.frm'
        090813 11:00:19 [ERROR] /usr/sbin/mysqld: Incorrect information in file: './XXXX/User.frm'

    It's just a spam of the same error until I correct it. When it did start it recreated the log files, so it must be looking in the same spot I am.


  • .NET Application with SQL Server CE Database

    - by blu
    I just started using SQL Server CE 3.5 in my WinForms application (C# in VS 2008 SP1). I've noticed a couple of interesting things I'd like some input on:

    1. Copying of the sdf file to bin
    My sdf file is located inside an Infrastructure project that houses my repository implementations. When the application is first debugged the sdf was copied to debug\bin, and this is where all future reads/writes operate. At some point, when this is deployed, the file will go into a data folder using Click Once, but during development where should I be putting this sdf? Is having it in the bin typical, or are there any other recommendations?

    2. Updating the sdf
    It appears that writing to the sdf file does not immediately update the database. I am using Linq-to-SQL and am calling SubmitChanges, but on read the values are not returned. However, if I close the application and re-open it, the added value is there. Is there an additional flush step I need to take? What is causing this: file locking, buffering, something else?

    Update
    3. Unit tests
    I have an MS test project, and the sdf file is not being copied to the correct output directory. I have the settings:

        Build Action: Content
        Copy to Output Directory: Copy Always

    The message is:

        System.Data.SqlServerCe.SqlCeException: The database file cannot be found. Check the path to the database.

    I appreciate any guidance on these questions, thanks. If there is a tutorial other than what is on MSDN that you know about, that would be great too. Working with CE is proving to be a difficult task and I welcome any help I can find.


  • Serving .docx files through Php

    - by user275074
    Hi, I'm having issues when attempting to serve a .docx file using PHP. When uploading the file I detect the MIME type and save the file with the correct extension based on that type, e.g.:

        application/msword                                                      - doc
        application/vnd.openxmlformats-officedocument.wordprocessingml.document - docx

    When attempting to serve the files for download, I do the reverse, detecting the extension and serving based on the MIME type, e.g.:

        public static function fileMimeType($extention)
        {
            if(!is_null($extention)) {
                switch($extention) {
                    case 'txt':
                        return 'text/plain';
                        break;
                    case 'odt':
                        return 'application/vnd.oasis.opendocument.text';
                        break;
                    case 'doc':
                        return 'application/msword';
                        break;
                    case 'docx':
                        return 'application/vnd.openxmlformats-officedocument.wordprocessingml.document';
                        break;
                    case ('jpg' || 'jpeg'):
                        return 'image/jpeg';
                        break;
                    case 'png':
                        return 'image/png';
                        break;
                    case 'pdf':
                        return 'application/pdf';
                        break;
                    default:
                        break;
                }
            }
        }

    All files appear to download correctly and open fine, but when attempting to open a .docx file, Word (on multiple files) throws an error stating the file is corrupt. Any ideas would be great, thanks.
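    A pattern worth comparing against: a .docx is a zip container, so even a single stray byte before or after the binary is enough for Word to call it corrupt, which means the download script must emit nothing but the headers and the file contents. A minimal serving sketch, with the path and filename made up; the MIME map above plugs into the Content-Type line:

        <?php
        $path = '/var/uploads/report.docx';   // hypothetical stored file
        header('Content-Type: application/vnd.openxmlformats-officedocument.wordprocessingml.document');
        header('Content-Disposition: attachment; filename="report.docx"');
        header('Content-Length: ' . filesize($path));
        while (ob_get_level()) {              // drop any buffered output (whitespace, BOM, debug echoes)
            ob_end_clean();
        }
        readfile($path);
        exit;                                 // stop here; a closing ?> could append a trailing newline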


  • Passing System classes as constructor parameters

    - by mcl
    This is probably crazy. I want to take the idea of Dependency Injection to extremes. I have isolated all System.IO-related behavior into a single class so that I can mock that class in my other classes, and thereby relieve my larger suite of unit tests of the burden of worrying about the actual file system. But the FileIO class I end up with can only be tested with integration tests, which, of course, introduces complexity I don't really want to deal with when all I really want to do is make sure my FileIO class calls the correct System.IO stuff. I don't need to integration-test System.IO. My FileIO class is doing more than simply wrapping System.IO functions; every now and then it does contain some logic (maybe this is the problem?). So what I'd like is to be able to test my FileIO class to ensure that it makes the correct system calls by mocking the System.IO classes themselves. Ideally this would be as easy as having a constructor like so:

        public FileIO(
            System.IO.Directory directory,
            System.IO.File file,
            System.IO.FileStream fileStream)
        {
            this.Directory = directory;
            this.File = file;
            this.FileStream = fileStream;
        }

    And then calling in methods like:

        public GetFilesInFolder(string folderPath)
        {
            return this.Directory.GetFiles(folderPath)
        }

    But this doesn't fly, since the System.IO classes in question are static classes. As far as I can tell they can neither be instantiated in this way nor subclassed for the purposes of mocking.
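    The usual way out is to put your own interface in front of the static calls and inject that: FileIO's logic is then unit-tested against a mock of the interface, and only the trivial pass-through wrapper needs an integration test. A hedged sketch of the shape (the names are invented; the System.IO.Abstractions package is a ready-made version of the same idea):

        using System.IO;

        public interface IFileSystem
        {
            string[] GetFiles(string folderPath);
            bool FileExists(string path);
            Stream OpenRead(string path);
        }

        // The only class that touches the real disk; too thin to be worth unit testing.
        public class PhysicalFileSystem : IFileSystem
        {
            public string[] GetFiles(string folderPath) { return Directory.GetFiles(folderPath); }
            public bool FileExists(string path) { return File.Exists(path); }
            public Stream OpenRead(string path) { return File.OpenRead(path); }
        }

        public class FileIO
        {
            private readonly IFileSystem fileSystem;

            public FileIO(IFileSystem fileSystem) { this.fileSystem = fileSystem; }

            // The logic you care about now runs against whatever IFileSystem you hand it.
            public string[] GetFilesInFolder(string folderPath)
            {
                return fileSystem.GetFiles(folderPath);
            }
        }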


  • Upload image from URL to FTP server using PHP

    - by user1807556
    I want to upload a picture from another site to my FTP server using PHP. Example: file to upload ("http://page.mi.fu-berlin.de/krudolph/stuff/stackoverflow.png"), FTP path ("pictures/"). This is what I've already tried:

    1.

        $image = file_get_contents("http://img.youtube.com/vi/Rz8KW4Tveps/1.jpg");
        file_put_contents("imgfolder/imgID.jpg", $image);

    2.

        copy('http://img.youtube.com/vi/Rz8KW4Tveps/1.jpg', 'imgfolder/imgID.jpg');

    3.

        <?php
        set_time_limit (24 * 60 * 60);
        if (!isset($_POST['submit'])) die();
        $file = fopen ($url, "rb");
        if ($file) {
            $newf = fopen ($newfname, "wb");
            if ($newf)
                while(!feof($file)) {
                    fwrite($newf, fread($file, 1024 * 2000 ), 1024 * 2000 );
                }
        }
        if ($file) {
            fclose($file);
        }
        if ($newf) {
            fclose($newf);
        }
        ?>

    4. http://www.teckdevil.com/php-server-to-server-transfer-script-to-remotely-transfer-files/

    5. (kinda the same as the first linked) Download files directly to my server

    I don't get any errors when I'm running the scripts, and I have chmod'ed the directory to 777.
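    One thing all of the attempts have in common is that they only ever write to a local folder; none of them actually speaks FTP. If "pictures/" really lives on a different server, something along these lines is needed (the host, credentials and the passive-mode choice are placeholders):

        <?php
        $src = 'http://img.youtube.com/vi/Rz8KW4Tveps/1.jpg';
        $tmp = tempnam(sys_get_temp_dir(), 'img');
        file_put_contents($tmp, file_get_contents($src));   // fetch the image first

        $conn = ftp_connect('ftp.example.com');             // placeholder host
        ftp_login($conn, 'username', 'password');           // placeholder credentials
        ftp_pasv($conn, true);                              // often needed behind NAT/firewalls
        if (!ftp_put($conn, 'pictures/imgID.jpg', $tmp, FTP_BINARY)) {
            echo "FTP upload failed\n";
        }
        ftp_close($conn);
        unlink($tmp);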


  • running php script manually and getting back to the command line?

    - by Simpson88Keys
    I'm running a PHP script via the command line, and it works just fine, except that when it's finished it doesn't go back to the command line. It just sits there, so I never know when it's done... This is the script:

        $conn_id = ftp_connect(REMOTE) or die("Couldn't connect to ".REMOTE);
        $login_result = ftp_login($conn_id, 'OMITTED', 'OMITTED');
        if ((!$conn_id) || (!$login_result)) die("FTP Connection Failed");

        $dir = 'download';
        if ($dir != ".") {
            if (ftp_chdir($conn_id, $dir) == false) {
                echo ("Change Dir Failed: $dir<BR>\r\n");
                return;
            }
            if (!(is_dir($dir))) mkdir($dir);
            chdir ($dir);
        }

        $contents = ftp_nlist($conn_id, ".");
        foreach ($contents as $file) {
            if ($file == '.' || $file == '..') continue;
            if (@ftp_chdir($conn_id, $file)) {
                ftp_chdir ($conn_id, "..");
                ftp_sync ($file);
            } else
                ftp_get($conn_id, $file, $file, FTP_BINARY);
        }

        ftp_chdir ($conn_id, "..");
        chdir ("..");
        ftp_close($conn_id);


  • Preventing a child process (HandbrakeCLI) from causing the parent script to exit

    - by Chris
    I have a batch conversion script to turn .mkvs of various dimensions into ipod/iphone sized .mp4s, cropping/scaling to suit. Determining the original dimensions, required crop and output file is all working fine. However, on successful completion of the first conversion, HandbrakeCLI causes the parent script to exit. Why would this be? And how can I stop it? The code, as it currently stands:

        #!/bin/bash
        find . -name "*.mkv" | while read FILE
        do
            # What would the output file be?
            DST=../Touch/$(dirname "$FILE")
            MKV=$(basename "$FILE")
            MP4=${MKV%%.mkv}.mp4

            # If it already exists, don't overwrite it
            if [ -e "$DST/$MP4" ]
            then
                echo "NOT overwriting $DST/$MP4"
            else
                # Stuff to determine dimensions/cropping removed for brevity
                HandbrakeCLI --preset "iPhone & iPod Touch" --vb 900 --crop $crop -i "$FILE" -o "$DST/$MP4" > /dev/null 2>&1
                if [ $? != 0 ]
                then
                    echo "$FILE had problems" >> errors.log
                fi
            fi
        done

    I have additionally tried it with a trap, but that didn't change the behaviour (although the last trap did fire):

        trap "echo Handbrake SIGINT-d" SIGINT
        trap "echo Handbrake SIGTERM-d" SIGTERM
        trap "echo Handbrake EXIT-d" EXIT
        trap "echo Handbrake 0-d" 0
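    This smells like the classic stdin problem rather than a signal: everything run inside a "while read" loop shares the loop's stdin, and if HandbrakeCLI reads from it, it swallows the rest of find's output, so the loop (and apparently the script) ends after the first file. Two common fixes, sketched; the second needs bash for the process substitution:

        # 1. starve HandbrakeCLI of the loop's stdin
        HandbrakeCLI --preset "iPhone & iPod Touch" --vb 900 --crop $crop \
            -i "$FILE" -o "$DST/$MP4" < /dev/null > /dev/null 2>&1

        # 2. feed the loop on a separate file descriptor so stdin stays untouched
        while read -u3 -r FILE
        do
            echo "would convert: $FILE"
        done 3< <(find . -name "*.mkv")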


  • Reading input files in FORTRAN

    - by lollygagger
    Purpose: create a program that takes two separate files, opens and reads them, assigns their contents to arrays, does some math with those arrays, creates a new array with the product numbers, and prints it to a new file. Simple enough, right? My input files have comment characters at the beginning. One trouble is, they are '#', which is a comment character for most plotting programs, but not for FORTRAN. What is a simple way to tell the computer not to look at these characters? Since I have no previous FORTRAN experience, I am plowing through this with two test files. Here is what I have so far:

        PROGRAM gain
        IMPLICIT NONE
        REAL, DIMENSION (1:4, 1:8) :: X, Y, Z

        OPEN(1, FILE='test.out', &
             STATUS='OLD', ACTION='READ')   ! opens the first file
        READ(1,*), X

        OPEN(2, FILE='test2.out', &
             STATUS='OLD', ACTION='READ')   ! opens the second file
        READ(2,*), Y

        PRINT*, X, Y
        Z = X*Y
        ! PRINT*, Z

        OPEN(3, FILE='test3.out', STATUS='NEW', ACTION='WRITE')   ! creates a new file
        WRITE(3,*), Z

        CLOSE(1)
        CLOSE(2)
        CLOSE(3)
        END PROGRAM

    PS. Please do not overwhelm me with a bunch of code monkey gobbledygook. I am a total programming novice. I do not understand all the lingo; that is why I came here instead of searching for help on existing websites. Thanks.
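    One plain way to deal with the '#' lines is to read each record into a character buffer first, skip it while it starts with '#', and only then do the list-directed read of the numbers. A hedged sketch for one of the two files (the array shape, file name and buffer length are assumptions):

        PROGRAM skip_comments
        IMPLICIT NONE
        CHARACTER(LEN=256) :: line
        REAL, DIMENSION(1:4, 1:8) :: X
        INTEGER :: ios

        OPEN(1, FILE='test.out', STATUS='OLD', ACTION='READ')
        DO
           READ(1, '(A)', IOSTAT=ios) line
           IF (ios /= 0) STOP 'ran out of file while skipping comments'
           IF (line(1:1) /= '#') EXIT     ! first data line found
        END DO
        BACKSPACE(1)                      ! step back so the data read starts on that line
        READ(1,*) X                       ! list-directed read of the whole array
        PRINT *, X
        CLOSE(1)
        END PROGRAM skip_comments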


  • Netbeans, JPA Entity Beans in separate projects. Unknown entity bean class

    - by Stu
    I am working in NetBeans and have my entity beans and web services in separate projects. I include the entity beans in the web services project, however the ApplicaitonConfig.java file keeps getting overwritten, removing the entries I make for the entity beans in the associated jar file. My question is: is it required to have both the entity beans and the web services share the same project/jar file? If not, what is the appropriate way to include the entity beans which are in the jar file?

        <?xml version="1.0" encoding="UTF-8"?>
        <persistence version="2.1" xmlns="http://xmlns.jcp.org/xml/ns/persistence"
                     xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                     xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/persistence http://xmlns.jcp.org/xml/ns/persistence/persistence_2_1.xsd">
            <persistence-unit name="jdbc/emrPool" transaction-type="JTA">
                <provider>org.eclipse.persistence.jpa.PersistenceProvider</provider>
                <jta-data-source>jdbc/emrPool</jta-data-source>
                <exclude-unlisted-classes>false</exclude-unlisted-classes>
                <properties>
                    <property name="eclipselink.ddl-generation" value="none"/>
                    <property name="eclipselink.cache.shared.default" value="false"/>
                </properties>
            </persistence-unit>
        </persistence>

    Based on Melc's input I verified that the transaction type is set to JTA and the jta-data-source is set to the value of the GlassFish JDBC resource. Unfortunately the problem still persists. I have opened the WAR file and validated that the EntityBean.jar file is the latest version and is located in the WEB-INF/lib directory of the WAR. I think it is tied to the fact that the entities are not being "registered" with the entity manager; however, I do not know why they are not being registered.
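    In case it is useful: JPA itself offers two ways to pull entities that live in a separate jar into a persistence unit without merging the projects, either list them with <class> elements or point at the jar with <jar-file>. A sketch against the persistence.xml above; the jar path and class name are placeholders:

        <persistence-unit name="jdbc/emrPool" transaction-type="JTA">
            <provider>org.eclipse.persistence.jpa.PersistenceProvider</provider>
            <jta-data-source>jdbc/emrPool</jta-data-source>
            <!-- either reference the entity jar as packaged in WEB-INF/lib ... -->
            <jar-file>lib/EntityBeans.jar</jar-file>
            <!-- ... or name the entity classes explicitly -->
            <class>com.example.entities.User</class>
            <exclude-unlisted-classes>false</exclude-unlisted-classes>
        </persistence-unit>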


  • fgets throwing unhandled exception while parsing stl

    - by user3478400
    I am new to C++. I am trying to parse an STL file which is about 64 MB and has roughly 18K lines in it. The code works fine for the first few hundred lines, but then fgets throws the following exception: "Unhandled exception at 0x77B0BAC5 (ntdll.dll) in STLparser.exe: 0xC0000024: There is a mismatch between the type of object required by the requested operation and the type of object that is specified in the request." I have manually checked the line for which fgets throws the exception; there is nothing out of the ordinary there. I am out of options for now. Any help to fix this issue will be greatly appreciated.

        #include<fstream>
        #include<iostream>
        #include"ParseString.h"
        #include"Vectors.h"
        using namespace std;

        int main(void)
        {
            //Define variables
            FILE *file;
            char *line = new char;
            parsestring oneline;
            int n_Vols = 0, n_Elms = 0, n_nods = -1, E = 0;
            Nod *nodes = new Nod();
            Nod dummy;
            Elm *elements = new Elm();
            int mycounter = 0;

            //Open file
            fopen_s(&file, "sample.stl", "r");

            while (fgets(line, 1024, file) != NULL)   //**********Getting Error Here*************
            {
                // populate required data
            }
            fclose(file);

            printf("%d,%d,%d", n_Vols, n_Elms, n_nods);
            getchar();
            return 0;
        }

    When broken, execution resumes at this function (not my function, something internal):

        void __cdecl _unlock (int locknum)
        {
            /*
             * leave the critical section.
             */
            LeaveCriticalSection( _locktable[locknum].lock );
        }
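    The line to look at is probably char *line = new char;, which allocates room for exactly one character while fgets is told it may write up to 1024. Every call overruns that heap block, and accumulated heap corruption would also explain why the crash eventually surfaces inside the CRT's own locking code (_unlock) rather than in your code. A sketch of the loop with a properly sized buffer; nothing else needs to change:

        #include <cstdio>

        int main()
        {
            FILE *file = nullptr;
            if (fopen_s(&file, "sample.stl", "r") != 0 || file == nullptr)
                return 1;                          // open failed

            char line[1024];                       // as much room as fgets may use
            while (fgets(line, sizeof line, file) != NULL)
            {
                // populate required data
            }
            fclose(file);
            return 0;
        }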


  • Sha256 is giving junk output

    - by user1746617
    Hey, can you be more specific on how to convert the binary digest to hex?

        bool HashStatus::calculate_digest_value(char * path, unsigned char * output)
        {
            FILE* file = fopen(path, "rb");
            if(!file) {
                g_message("SignatureValidator::VerifyReferences,file not opened");
                return -1;
            }

            unsigned char hash[SHA256_DIGEST_LENGTH];
            SHA256_CTX sha256;
            SHA256_Init(&sha256);

            const int bufSize = 32768;
            unsigned char* buffer = malloc(bufSize);
            int bytesRead = 0;
            if(!buffer) return NULL;

            while((bytesRead = fread(buffer, 1, bufSize, file)))
            {
                g_message("calculate digest value,verify.cpp::%s", buffer);
                SHA256_Update(&sha256, buffer, bytesRead);
            }
            SHA256_Final(hash, &sha256);
            g_message("verify.cpp,after final");

            sha256_hash_string(hash, output);
            g_message("verify.cpp,after sha256_hash_string %s", hash);
            fclose(file);
            free(buffer);
            return true;
        }

    This is my code to hash file data with the OpenSSL SHA-256 functions. My output is 1d54e12333988471354907a760b9cde861423615bb5255ee837e3b27b32366, but the expected output is HVThAjM5iEcTVJB6dgC5zehhQjYVu1JV7oN+OyezI2Y=. Can you please help me work out what's wrong with this code? I'm new to this, so please guide me step by step and in detail.
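    The two values are not junk versus correct; they are the same digest in two different encodings. Yours is hexadecimal, while the value you are comparing against (note the trailing '=') is Base64 of the raw 32 bytes. Your hex is also 62 characters instead of 64, which is what happens when the formatter uses "%x" instead of "%02x" and drops the leading zero of small bytes. A sketch of both conversions, assuming hash holds the SHA256_DIGEST_LENGTH raw bytes from SHA256_Final:

        #include <stdio.h>
        #include <openssl/sha.h>
        #include <openssl/evp.h>

        /* hex: 2 chars per byte, zero-padded, 64 chars + NUL */
        void digest_to_hex(const unsigned char *hash, char out[2 * SHA256_DIGEST_LENGTH + 1])
        {
            for (int i = 0; i < SHA256_DIGEST_LENGTH; i++)
                sprintf(out + 2 * i, "%02x", hash[i]);
        }

        /* Base64: the "HVTh...=" form you are comparing against */
        void digest_to_base64(const unsigned char *hash, char out[45])
        {
            EVP_EncodeBlock((unsigned char *)out, hash, SHA256_DIGEST_LENGTH);
        }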


  • Error in django using Apache & mod_wsgi

    - by Ignacio
    Hey, I've been doing some changes to my Django development environment, as some of you suggested. So far I've managed to configure and run it successfully with Postgres. Now I'm trying to run the app using Apache2 and mod_wsgi, but I ran into this little problem after I followed the guidelines from the Django docs. When I access localhost/myapp/tasks this error is raised:

        Request Method: GET
        Request URL: http://localhost/myapp/tasks/
        Exception Type: TemplateSyntaxError
        Exception Value: Caught an exception while rendering: argument 1 must be a string or unicode object

        Original Traceback (most recent call last):
          File "/usr/local/lib/python2.6/dist-packages/django/template/debug.py", line 71, in render_node
            result = node.render(context)
          File "/usr/local/lib/python2.6/dist-packages/django/template/defaulttags.py", line 126, in render
            len_values = len(values)
          File "/usr/local/lib/python2.6/dist-packages/django/db/models/query.py", line 81, in __len__
            self._result_cache = list(self.iterator())
          File "/usr/local/lib/python2.6/dist-packages/django/db/models/query.py", line 238, in iterator
            for row in self.query.results_iter():
          File "/usr/local/lib/python2.6/dist-packages/django/db/models/sql/query.py", line 287, in results_iter
            for rows in self.execute_sql(MULTI):
          File "/usr/local/lib/python2.6/dist-packages/django/db/models/sql/query.py", line 2369, in execute_sql
            cursor.execute(sql, params)
          File "/usr/local/lib/python2.6/dist-packages/django/db/backends/util.py", line 19, in execute
            return self.cursor.execute(sql, params)
        TypeError: argument 1 must be a string or unicode object
        ...

    It then highlights a {% for t in tasks %} template tag, as if the source of the problem were there, but it worked fine on the built-in server. The view associated with that page is really simple: just fetch all Task objects. And the template just displays them in a table. Also, some pages get rendered OK. I don't want to fill this question with code, so if you need some more info I'd be glad to provide it. Thanks


  • C++ map performance - Linux (30 sec) vs Windows (30 mins) !!!

    - by sonofdelphi
    I need to process a list of files. The processing action should not be repeated for the same file. The code I am using for this is:

        using namespace std;

        vector<File*> gInputFileList;            // Can contain duplicates, File has member sFilename
        map<string, File*> gProcessedFileList;   // Using map to avoid linear search costs

        void processFile(File* pFile)
        {
            File* pProcessedFile = gProcessedFileList[pFile->sFilename];
            if(pProcessedFile != NULL)
                return;   // Already processed

            foo(pFile);   // foo() is the action to do for each file
            gProcessedFileList[pFile->sFilename] = pFile;
        }

        void main()
        {
            size_t n = gInputFileList.size();

            // Using array syntax (iterator syntax also gives identical performance)
            for(size_t i = 0; i < n; i++) {
                processFile(gInputFileList[i]);
            }
        }

    The code works correctly, but... My problem is that when the input size is 1000, it takes 30 minutes, HALF AN HOUR, on Windows/Visual Studio 2008 Express (both Debug and Release builds). For the same input, it takes only 40 seconds to run on Linux/gcc! What could be the problem? The action foo() takes only a very short time to execute when used separately. Should I be using something like vector::reserve for the map?
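    Two things may be worth checking before blaming std::map itself. First, the lookup uses operator[], which default-inserts an entry on a miss and then looks the same key up again on the assignment after foo(); find()/insert is the more idiomatic shape, sketched below with the question's own types assumed. Second, and more likely to explain minutes versus seconds: if the Windows timings are taken by launching from the Visual Studio IDE, the Windows debug heap is enabled even for Release builds, so try running the .exe from a plain command prompt (or with _NO_DEBUG_HEAP=1) and time it again.

        void processFile(File* pFile)
        {
            std::map<std::string, File*>::iterator it =
                gProcessedFileList.find(pFile->sFilename);
            if (it != gProcessedFileList.end())
                return;                                    // already processed, nothing inserted

            foo(pFile);                                    // the per-file action
            gProcessedFileList.insert(std::make_pair(pFile->sFilename, pFile));
        }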


  • Why isn't the pathspec magic :(exclude) excluding the files I specify from git log's output?

    - by Jubobs
    This is a follow-up to "Ignore files in git log -p" and is also related to "Making 'git log' ignore changes for certain paths". I'm using Git 1.9.2. I'm trying to use the pathspec magic :(exclude) to specify that some patches should not be shown in the output of git log -p. However, patches that I want to exclude still show up in the output. Here is a minimal working example that reproduces the situation:

        cd ~/Desktop
        mkdir test_exclude
        cd test_exclude
        git init
        mkdir testdir
        echo "my first cpp file" >testdir/test1.cpp
        echo "my first xml file" >testdir/test2.xml
        git add testdir/
        git commit -m "added two test files"

    Now I want to show all patches in my history except those corresponding to XML files in the testdir folder. Therefore, following VonC's answer, I run

        git log --patch -- . ":(exclude)testdir/*.xml"

    but the patch for my testdir/test2.xml file still shows up in the output:

        commit 37767da1ad4ad5a5c902dfa0c9b95351e8a3b0d9
        Author: xxxxxxxxxxxxxxxxxxxxxxxxx
        Date:   Mon Aug 18 12:23:56 2014 +0100

            added two test files

        diff --git a/testdir/test1.cpp b/testdir/test1.cpp
        new file mode 100644
        index 0000000..3a721aa
        --- /dev/null
        +++ b/testdir/test1.cpp
        @@ -0,0 +1 @@
        +my first cpp file
        diff --git a/testdir/test2.xml b/testdir/test2.xml
        new file mode 100644
        index 0000000..8b7ce86
        --- /dev/null
        +++ b/testdir/test2.xml
        @@ -0,0 +1 @@
        +my first xml file

    What am I doing wrong? What should I do to tell git log -p not to show the patch associated with all XML files in my testdir folder?
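    One way to separate "the pathspec does not match" from "-p ignores it" is to drop --patch and just list the touched file names with the same pathspec arguments; if test2.xml still appears in --name-only output, the pathspec itself is not doing what you expect (quoting is a frequent culprit, since the shell must pass the asterisk through to git untouched). A sketch:

        # same pathspec, but only file names: does test2.xml still show up?
        git log --oneline --name-only -- . ':(exclude)testdir/*.xml'

        # the short-hand spelling of the same magic, for comparison
        git log --oneline --name-only -- . ':!testdir/*.xml'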


  • Execute files included in Resource folder

    - by Sumeet Pujari
    In a WPF application where I have included some files as resources, I want to execute them on a button click. How do I specify the path in Process.Start()?

        private void button1_Click_2(object sender, RoutedEventArgs e)
        {
            Process.Start("test.txt");
        }

    Or is there any other way?

        private void button1_Click_2(object sender, RoutedEventArgs e)
        {
            string path = System.IO.Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location) + @"\test.txt";
            if (File.Exists(path))
            {
                Process.Start(new ProcessStartInfo(path));
            }
            else
            {
                MessageBox.Show("No file found" + path);
            }
        }

    I added a message box and it showed "No file found". :(

    EDIT: I tried to check the path after publishing and this is what I got: "No File Found" with a path of C:\Users\Administrator\AppData\Local\Apps\2.0... test.txt. Before I published the application I got "No File Found" at ..project..\bin\Debug\test.txt, which is obvious, since my resource file is not included there: it's under a Resources folder and not Debug. When I add a test file to Debug it opens without any problem. Can someone help throw some light on this case?

    EDIT: I want to open a file from the Resources directory at C:\Users\Administrator\Documents\Visual Studio 2010\Projects\FastFix\FastFix\Resources, which is included in my project. When I publish it, it is going to run as a standalone application without installation.
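    The underlying issue is that Process.Start needs a real file on disk, and an item that only exists as a resource inside the assembly never gets one. Two hedged options: mark test.txt as Content with "Copy to Output Directory" so it is published next to the executable and can be opened by path, or keep it as an embedded resource and extract it to a temporary file before starting it. A sketch of the second route; the manifest resource name is a guess and depends on your default namespace and folder, so check it with GetManifestResourceNames():

        var asm = System.Reflection.Assembly.GetExecutingAssembly();
        string tempPath = System.IO.Path.Combine(System.IO.Path.GetTempPath(), "test.txt");

        // "FastFix.Resources.test.txt" is a guess at the generated resource name
        using (var res = asm.GetManifestResourceStream("FastFix.Resources.test.txt"))
        using (var file = System.IO.File.Create(tempPath))
        {
            res.CopyTo(file);
        }
        System.Diagnostics.Process.Start(tempPath);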


  • Single-entry implementation gone wrong

    - by user745434
    I'm doing my first single-entry site and, based on the result, I can't see the benefit. I've implemented the following:

        - .htaccess redirects all requests to index.php at the root
        - The URL is parsed and each /segment/ is stored as an element in an array
        - The first segment indicates which folder to include (e.g. "users" » "/pages/users/index.php")
        - The index.php file of each folder parses the remaining elements in the segments array until the array is empty
        - The content.php file of each folder is included if there are no more elements in the segments array, indicating that the destination file is reached

    Sample file structure (folders in []):

        [root]
            index.php
            [pages]
                [users]
                    index.php
                    content.php
                    [profile]
                        index.php
                        content.php
                        [edit]
                            index.php
                            content.php
                [other-page]
                    index.php
                    content.php

    Request: http://mysite.com/users/profile/

        1. .htaccess redirects the request to http://mysite.com/index.php
        2. The URL is parsed and the segments array contains: [1] users, [2] profile
        3. index.php maps [1] to "pages/users/index.php", so it includes that file
        4. pages/users/index.php maps [2] to pages/users/profile/index.php, so it includes that file
        5. Since there are no other elements in the segments array, the content.php file in the current folder (pages/users/profile) is included

    I'm not really seeing the benefit of doing this over having functions that include components of the site (e.g. include_header(), include_footer(), etc.), so I conclude that I'm doing something terribly wrong. I'm just not sure what it is.


  • java.awt.Desktop.open doesn’t work with PDF files?

    - by Jason S
    It looks like I cannot use Desktop.open() on PDF files, regardless of location. Here's a small test program:

        package com.example.bugs;

        import java.awt.Desktop;
        import java.io.File;
        import java.io.IOException;

        public class DesktopOpenBug {
            static public void main(String[] args) {
                try {
                    Desktop desktop = null;
                    // Before more Desktop API is used, first check
                    // whether the API is supported by this particular
                    // virtual machine (VM) on this particular host.
                    if (Desktop.isDesktopSupported()) {
                        desktop = Desktop.getDesktop();
                        for (String path : args) {
                            File file = new File(path);
                            System.out.println("Opening " + file);
                            desktop.open(file);
                        }
                    }
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }

    If I run DesktopOpenBug with the arguments c:\tmp\zz1.txt c:\tmp\zz.xml c:\tmp\ss.pdf (3 files I happen to have lying around), I get this result (the .txt and .xml files open up fine):

        Opening c:\tmp\zz1.txt
        Opening c:\tmp\zz.xml
        Opening c:\tmp\ss.pdf
        java.io.IOException: Failed to open file:/c:/tmp/ss.pdf. Error message: The parameter is incorrect.
            at sun.awt.windows.WDesktopPeer.ShellExecute(Unknown Source)
            at sun.awt.windows.WDesktopPeer.open(Unknown Source)
            at java.awt.Desktop.open(Unknown Source)
            at com.example.bugs.DesktopOpenBug.main(DesktopOpenBug.java:21)

    What the heck is going on? I'm running WinXP; I can type "c:\tmp\ss.pdf" at the command prompt and it opens up just fine.

    Edit: if this is an example of Sun Java bug #6764271, please help by voting for it. What a pain. :(
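    A workaround people reach for when Desktop.open() trips over one particular file association is to hand the path to the Windows shell instead, which resolves the association the same way typing the name at a prompt does. A hedged sketch of that fallback; it is Windows-only, and the empty "" argument is the window title that start expects when the path may be quoted:

        import java.awt.Desktop;
        import java.io.File;
        import java.io.IOException;

        public class OpenWithFallback {
            static void open(File file) throws IOException {
                try {
                    Desktop.getDesktop().open(file);
                } catch (IOException e) {
                    // fall back to cmd.exe's "start", which uses the shell association
                    new ProcessBuilder("cmd", "/c", "start", "\"\"", file.getAbsolutePath()).start();
                }
            }

            public static void main(String[] args) throws IOException {
                open(new File(args[0]));
            }
        }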


  • displaying a physical webpage with frames in iframe

    - by ksa
    I have an iframe in my web page, and I have done the coding for browsing folders and viewing files in the iframe. Each folder has an index.html, and now I need to display that index.html in the iframe. The index.html page contains frames divided into two, each from a different source. I tried to display a sample web page without frames and it was a success; a page with frames spoils the party.

    My code for getting the HTML file:

        <%
        String path = request.getParameter("name");
        File directory = new File(path);
        FileFilter fileFilter = new FileFilter() {
            @Override
            public boolean accept(File file) {
                return file.getName().endsWith("html");
            }
        };
        File[] files = directory.listFiles(fileFilter);
        for (int index = 0; index < files.length; index++) {
            String s = files[index].getName();
            String s1 = files[index].getAbsolutePath();
        %>
        <a href="fileview?name=<%=s1%>" target="sss"><%=s%></a>
        <%
        }
        %>

    My code for displaying the page in the iframe:

        public void doGet(HttpServletRequest req, HttpServletResponse res)
                throws ServletException, IOException
        {
            res.setContentType("text/html");
            PrintWriter out = res.getWriter();
            String name = req.getParameter("name");
            res.setHeader("Content-Disposition", "inline; filename=\"" + name + "\"");
            java.io.FileInputStream fis = new java.io.FileInputStream(name);
            int i;
            while ((i = fis.read()) != -1) {
                out.write(i);
            }
            fis.close();
            out.close();
        }


  • Accessing the Request object causes ReadEntityBody to return 0 (in an HttpHandler class)

    - by EBAG
    I created an HttpHandler that successfully implements IHttpHandler for handling file uploads. It works perfectly fine: you send the file with the form, the class receives it and saves it to the hard disk. It reads chunks of the file with the ReadEntityBody function of the HttpWorkerRequest class.

    Here is the situation I'm faced with. If at any stage before trying to read the file with ReadEntityBody I try to access the Request object (even Request.InputStream.Length!), ReadEntityBody will return 0, meaning it won't read from the request stream. After further testing I found the reason behind it: accessing the Context.Current.Request object triggers some sort of functionality that causes ASP.NET to handle the file upload at that moment by itself! I believe this is a bug. For example, exactly after this line of code, ASP.NET will read the uploaded file completely, and so there will be nothing left for ReadEntityBody to read later:

        int FileSize = context.Request.InputStream.Length;

    Can anybody tell me how to stop this?
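    For reference, the streaming pattern that avoids this is to take everything from the HttpWorkerRequest and not touch context.Request at all until the body has been consumed: read the preloaded chunk first, then loop on ReadEntityBody for the remainder. A condensed, hedged sketch; Save() is a placeholder for your own write-to-disk code and error handling is omitted:

        HttpWorkerRequest wr = (HttpWorkerRequest)((IServiceProvider)context)
                                   .GetService(typeof(HttpWorkerRequest));

        long total = long.Parse(wr.GetKnownRequestHeader(HttpWorkerRequest.HeaderContentLength));
        long read = 0;

        byte[] preloaded = wr.GetPreloadedEntityBody();      // ASP.NET has usually buffered this much already
        if (preloaded != null) { Save(preloaded, preloaded.Length); read += preloaded.Length; }

        if (!wr.IsEntireEntityBodyIsPreloaded())
        {
            byte[] buffer = new byte[64 * 1024];
            while (read < total)
            {
                int n = wr.ReadEntityBody(buffer, buffer.Length);   // raw bytes from the client
                if (n <= 0) break;
                Save(buffer, n);
                read += n;
            }
        }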


  • why does vector.size() read in one line too little?

    - by ace
    When running the following code, the number of lines read is one less than there actually are (whether the input file is main itself, or otherwise). Why is this, and how can I change that fact (besides just adding 1)?

        #include <fstream>
        #include <iostream>
        #include <string>
        #include <vector>
        using namespace std;

        int main()
        {
            // open text file for input
            string file_name;
            cout << "please enter file name: ";
            cin >> file_name;

            // associate the input file stream with a text file
            ifstream infile(file_name.c_str());

            // error checking for a valid filename
            if ( !infile )
            {
                cerr << "Unable to open file " << file_name << " -- quitting!\n";
                return( -1 );
            }
            else
                cout << "\n";

            // some data structures to perform the function
            vector<string> lines_of_text;
            string textline;

            // read in text file, line by line
            while (getline( infile, textline, '\n' ))
            {
                // add the new element to the vector
                lines_of_text.push_back( textline );

                // print the 'back' vector element - see the STL documentation
                cout << "line read: " << lines_of_text.back() << "\n";
            }

            cout << lines_of_text.size();
            return 0;
        }
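    A quick way to see where the missing line went is to number the lines as they are read and print the final count on its own line (as written, the size is printed with no trailing newline, so it can run straight into the next shell prompt and be misread). A sketch of the loop with that change; everything else stays the same:

        size_t lineno = 0;
        while (getline(infile, textline))          // '\n' is already the default delimiter
        {
            lines_of_text.push_back(textline);
            cout << ++lineno << ": " << lines_of_text.back() << "\n";
        }
        cout << "total lines stored: " << lines_of_text.size() << "\n";

    Comparing the last numbered line against the end of the file (and against what your editor calls the last line, since editors often count the empty line after a final newline) usually shows whether a line was really skipped or just counted differently.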


  • Compat Wireless Drivers Centrino N-2230

    - by user2699451
    So I am using linux and am having trouble installing the Compat Wireless drivers Hardware: Intel Centrino N-2230 OS: Linux Mint 64bit (kernel 13.08-generic) I followed this link http://www.mathyvanhoef.com/2012/09/compat-wireless-injection-patch-for.html Output: apt-get install linux-headers-$(uname -r) Reading package lists... Done Building dependency tree Reading state information... Done linux-headers-3.8.0-19-generic is already the newest version. 0 upgraded, 0 newly installed, 0 to remove and 19 not upgraded. charles-W55xEU compat-wireless-2010-10-16 # cd ~ charles-W55xEU ~ # dir adt-bundle-linux-x86_64-20130917.zip Desktop known_hosts_backup charles-W55xEU ~ # wget http://www.orbit-lab.org/kernel/compat-wireless-3-stable/v3.6/compat-wireless-3.6.2-1-snp.tar.bz2 --2013-10-29 10:28:23-- http://www.orbit-lab.org/kernel/compat-wireless-3-stable/v3.6/compat-wireless-3.6.2-1-snp.tar.bz2 Resolving www.orbit-lab.org (www.orbit-lab.org)... 128.6.192.131 Connecting to www.orbit-lab.org (www.orbit-lab.org)|128.6.192.131|:80... connected. HTTP request sent, awaiting response... 200 OK Length: 4443700 (4,2M) [application/x-bzip2] Saving to: ‘compat-wireless-3.6.2-1-snp.tar.bz2’ 100%[======================================>] 4 443 700 13,5KB/s in 11m 3s 2013-10-29 10:39:27 (6,55 KB/s) - ‘compat-wireless-3.6.2-1-snp.tar.bz2’ saved [4443700/4443700] charles-W55xEU ~ # tar -xf compat-wireless-3.6.2-1-snp.tar.bz2 charles-W55xEU ~ # cd compat-wireless-3.6-rc6-1 bash: cd: compat-wireless-3.6-rc6-1: No such file or directory charles-W55xEU ~ # dir adt-bundle-linux-x86_64-20130917.zip Desktop compat-wireless-3.6.2-1-snp known_hosts_backup compat-wireless-3.6.2-1-snp.tar.bz2 charles-W55xEU ~ # cd compat-wireless-3.6.2-1-snp/ charles-W55xEU compat-wireless-3.6.2-1-snp # dir code-metrics.txt defconfigs linux-next-pending pending-stable compat drivers MAINTAINERS README config.mk enable-older-kernels Makefile scripts COPYRIGHT include net udev crap linux-next-cherry-picks patches charles-W55xEU compat-wireless-3.6.2-1-snp # wget http://patches.aircrack-ng.org/mac80211.compat08082009.wl_frag+ack_v1.patch --2013-10-29 10:40:52-- http://patches.aircrack-ng.org/mac80211.compat08082009.wl_frag+ack_v1.patch Resolving patches.aircrack-ng.org (patches.aircrack-ng.org)... 213.186.33.2, 2001:41d0:1:1b00:213:186:33:2 Connecting to patches.aircrack-ng.org (patches.aircrack-ng.org)|213.186.33.2|:80... connected. HTTP request sent, awaiting response... 200 OK Length: 1049 (1,0K) [text/plain] Saving to: ‘mac80211.compat08082009.wl_frag+ack_v1.patch’ 100%[======================================>] 1 049 --.-K/s in 0s 2013-10-29 10:40:56 (180 MB/s) - ‘mac80211.compat08082009.wl_frag+ack_v1.patch’ saved [1049/1049] charles-W55xEU compat-wireless-3.6.2-1-snp # patch -p1 < mac80211.compat08082009.wl_frag+ack_v1.patch patching file net/mac80211/tx.c Hunk #1 succeeded at 792 (offset 115 lines). charles-W55xEU compat-wireless-3.6.2-1-snp # wget -Ocompatwireless_chan_qos_frag.patch http://pastie.textmate.org/pastes/4882675/download --2013-10-29 10:43:18-- http://pastie.textmate.org/pastes/4882675/download Resolving pastie.textmate.org (pastie.textmate.org)... 178.79.137.125 Connecting to pastie.textmate.org (pastie.textmate.org)|178.79.137.125|:80... connected. HTTP request sent, awaiting response... 301 Moved Permanently Location: http://pastie.org/pastes/4882675/download [following] --2013-10-29 10:43:20-- http://pastie.org/pastes/4882675/download Resolving pastie.org (pastie.org)... 
96.126.119.119 Connecting to pastie.org (pastie.org)|96.126.119.119|:80... connected. HTTP request sent, awaiting response... 200 OK Length: 2036 (2,0K) [application/octet-stream] Saving to: ‘compatwireless_chan_qos_frag.patch’ 100%[======================================>] 2 036 --.-K/s in 0,001s 2013-10-29 10:43:21 (3,35 MB/s) - ‘compatwireless_chan_qos_frag.patch’ saved [2036/2036] charles-W55xEU compat-wireless-3.6.2-1-snp # patch -p1 < compatwireless_chan_qos_frag.patch patching file drivers/net/wireless/rtl818x/rtl8187/dev.c patching file net/mac80211/tx.c Hunk #1 succeeded at 1495 (offset 8 lines). patching file net/wireless/chan.c charles-W55xEU compat-wireless-3.6.2-1-snp # make ./scripts/gen-compat-autoconf.sh /root/compat-wireless-3.6.2-1-snp/.config /root/compat-wireless-3.6.2-1-snp/config.mk > include/linux/compat_autoconf.h make -C /lib/modules/3.8.0-19-generic/build M=/root/compat-wireless-3.6.2-1-snp modules make[1]: Entering directory `/usr/src/linux-headers-3.8.0-19-generic' CC [M] /root/compat-wireless-3.6.2-1-snp/compat/main.o LD [M] /root/compat-wireless-3.6.2-1-snp/compat/compat.o CC [M] /root/compat-wireless-3.6.2-1-snp/drivers/bcma/main.o In file included from /root/compat-wireless-3.6.2-1-snp/include/linux/bcma/bcma.h:8:0, from /root/compat-wireless-3.6.2-1-snp/drivers/bcma/bcma_private.h:8, from /root/compat-wireless-3.6.2-1-snp/drivers/bcma/main.c:8: /root/compat-wireless-3.6.2-1-snp/include/linux/bcma/bcma_driver_pci.h:217:23: error: expected ‘=’, ‘,’, ‘;’, ‘asm’ or ‘__attribute__’ before ‘bcma_core_pci_init’ In file included from /root/compat-wireless-3.6.2-1-snp/include/linux/bcma/bcma.h:10:0, from /root/compat-wireless-3.6.2-1-snp/drivers/bcma/bcma_private.h:8, from /root/compat-wireless-3.6.2-1-snp/drivers/bcma/main.c:8: /root/compat-wireless-3.6.2-1-snp/include/linux/bcma/bcma_driver_gmac_cmn.h:95:23: error: expected ‘=’, ‘,’, ‘;’, ‘asm’ or ‘__attribute__’ before ‘bcma_core_gmac_cmn_init’ In file included from /root/compat-wireless-3.6.2-1-snp/drivers/bcma/main.c:8:0: /root/compat-wireless-3.6.2-1-snp/drivers/bcma/bcma_private.h:25:15: error: expected ‘=’, ‘,’, ‘;’, ‘asm’ or ‘__attribute__’ before ‘bcma_bus_register’ /root/compat-wireless-3.6.2-1-snp/drivers/bcma/main.c:152:15: error: expected ‘=’, ‘,’, ‘;’, ‘asm’ or ‘__attribute__’ before ‘bcma_bus_register’ /root/compat-wireless-3.6.2-1-snp/drivers/bcma/main.c:17:21: warning: ‘bcma_bus_next_num’ defined but not used [-Wunused-variable] /root/compat-wireless-3.6.2-1-snp/drivers/bcma/main.c:93:12: warning: ‘bcma_register_cores’ defined but not used [-Wunused-function] make[3]: *** [/root/compat-wireless-3.6.2-1-snp/drivers/bcma/main.o] Error 1 make[2]: *** [/root/compat-wireless-3.6.2-1-snp/drivers/bcma] Error 2 make[1]: *** [_module_/root/compat-wireless-3.6.2-1-snp] Error 2 make[1]: Leaving directory `/usr/src/linux-headers-3.8.0-19-generic' make: *** [modules] Error 2 charles-W55xEU compat-wireless-3.6.2-1-snp # make install Warning: You may or may not need to update your initframfs, you should if any of the modules installed are part of your initramfs. To add support for your distribution to do this automatically send a patch against ./scripts/update-initramfs. If your distribution does not require this send a patch against the '/usr/bin/lsb_release -i -s': LinuxMint tag for your distribution to avoid this warning. 
make -C /lib/modules/3.8.0-19-generic/build M=/root/compat-wireless-3.6.2-1-snp modules make[1]: Entering directory `/usr/src/linux-headers-3.8.0-19-generic' CC [M] /root/compat-wireless-3.6.2-1-snp/drivers/bcma/main.o In file included from /root/compat-wireless-3.6.2-1-snp/include/linux/bcma/bcma.h:8:0, from /root/compat-wireless-3.6.2-1-snp/drivers/bcma/bcma_private.h:8, from /root/compat-wireless-3.6.2-1-snp/drivers/bcma/main.c:8: /root/compat-wireless-3.6.2-1-snp/include/linux/bcma/bcma_driver_pci.h:217:23: error: expected ‘=’, ‘,’, ‘;’, ‘asm’ or ‘__attribute__’ before ‘bcma_core_pci_init’ In file included from /root/compat-wireless-3.6.2-1-snp/include/linux/bcma/bcma.h:10:0, from /root/compat-wireless-3.6.2-1-snp/drivers/bcma/bcma_private.h:8, from /root/compat-wireless-3.6.2-1-snp/drivers/bcma/main.c:8: /root/compat-wireless-3.6.2-1-snp/include/linux/bcma/bcma_driver_gmac_cmn.h:95:23: error: expected ‘=’, ‘,’, ‘;’, ‘asm’ or ‘__attribute__’ before ‘bcma_core_gmac_cmn_init’ In file included from /root/compat-wireless-3.6.2-1-snp/drivers/bcma/main.c:8:0: /root/compat-wireless-3.6.2-1-snp/drivers/bcma/bcma_private.h:25:15: error: expected ‘=’, ‘,’, ‘;’, ‘asm’ or ‘__attribute__’ before ‘bcma_bus_register’ /root/compat-wireless-3.6.2-1-snp/drivers/bcma/main.c:152:15: error: expected ‘=’, ‘,’, ‘;’, ‘asm’ or ‘__attribute__’ before ‘bcma_bus_register’ /root/compat-wireless-3.6.2-1-snp/drivers/bcma/main.c:17:21: warning: ‘bcma_bus_next_num’ defined but not used [-Wunused-variable] /root/compat-wireless-3.6.2-1-snp/drivers/bcma/main.c:93:12: warning: ‘bcma_register_cores’ defined but not used [-Wunused-function] make[3]: *** [/root/compat-wireless-3.6.2-1-snp/drivers/bcma/main.o] Error 1 make[2]: *** [/root/compat-wireless-3.6.2-1-snp/drivers/bcma] Error 2 make[1]: *** [_module_/root/compat-wireless-3.6.2-1-snp] Error 2 make[1]: Leaving directory `/usr/src/linux-headers-3.8.0-19-generic' make: *** [modules] Error 2 charles-W55xEU compat-wireless-3.6.2-1-snp # It keeps giving errors, same with other sites, I get the same errors??? I am lost, help needed


  • IIS Strategies for Accessing Secured Network Resources

    - by ErikE
    Problem: A user connects to a service on a machine, such as an IIS web site or a SQL Server database. The site or the database need to gain access to network resources such as file shares (the most common) or a database on a different server. Permission is denied. This is because the user the service is running under doesn't have network permissions in the first place, or if it does, it doesn't have rights to access the remote resource. I keep running into this problem over and over again and am tired of not having a really solid way of handling it. Here are some workarounds I'm aware of: Run IIS as a custom-created domain user who is granted high permissions If permissions are granted one file share at a time, then every time I want to read from a new share, I would have to ask a network admin to add it for me. Eventually, with many web sites reading from many shares, it is going to get really complicated. If permissions are just opened up wide for the user to access any file shares in our domain, then this seems like an unnecessary security surface area to present. This also applies to all the sites running on IIS, rather than just the selected site or virtual directory that needs the access, a further surface area problem. Still use the IUSR account but give it network permissions and set up the same user name on the remote resource (not a domain user, a local user) This also has its problems. For example, there's a file share I am using that I have full rights to for sharing, but I can't log in to the machine. So I have to find the right admin and ask him to do it for me. Any time something has to change, it's another request to an admin. Allow IIS users to connect as anonymous, but set the account used for anonymous access to a high-privilege one This is even worse than giving the IIS IUSR full privileges, because it means my web site can't use any kind of security in the first place. Connect using Kerberos, then delegate This sounds good in principle but has all sorts of problems. First of all, if you're using virtual web sites where the domain name you connect to the site with is not the base machine name (as we do frequently), then you have to set up a Service Principal Name on the webserver using Microsoft's SetSPN utility. It's complicated and apparently prone to errors. Also, you have to ask your network/domain admin to change security policy for both the web server and the domain account so they are "trusted for delegation." If you don't get everything perfectly right, suddenly your intended Kerberos authentication is NTLM instead, and you can only impersonate rather than delegate, and thus no reaching out over the network as the user. Also, this method can be problematic because sometimes you need the web site or database to have permissions that the connecting user doesn't have. Create a service or COM+ application that fetches the resource for the web site Services and COM+ packages are run with their own set of credentials. Running as a high-privilege user is okay since they can do their own security and deny requests that are not legitimate, putting control in the hands of the application developer instead of the network admin. Problems: I am using a COM+ package that does exactly this on Windows Server 2000 to deliver highly sensitive images to a secured web application. I tried moving the web site to Windows Server 2003 and was suddenly denied permission to instantiate the COM+ object, very likely registry permissions. 
I trolled around quite a bit and did not solve the problem, partly because I was reluctant to give the IUSR account full registry permissions. That seems like the same bad practice as just running IIS as a high-privilege user. Note: This is actually really simple. In a programming language of your choice, you create a class with a function that returns an instance of the object you want (an ADODB.Connection, for example), and build a dll, which you register as a COM+ object. In your web server-side code, you create an instance of the class and use the function, and since it is running under a different security context, calls to network resources work. Map drive letters to shares This could theoretically work, but in my mind it's not really a good long-term strategy. Even though mappings can be created with specific credentials, and this can be done by others than a network admin, this also is going to mean that there are either way too many shared drives (small granularity) or too much permission is granted to entire file servers (large granularity). Also, I haven't figured out how to map a drive so that the IUSR gets the drives. Mapping a drive is for the current user, I don't know the IUSR account password to log in as it and create the mappings. Move the resources local to the web server/database There are times when I've done this, especially with Access databases. Does the database have to live out on the file share? Sometimes, it was just easiest to move the database to the web server or to the SQL database server (so the linked server to it would work). But I don't think this is a great all-around solution, either. And it won't work when the resource is a service rather than a file. Move the service to the final web server/database I suppose I could run a web server on my SQL Server database, so the web site can connect to it using impersonation and make me happy. But do we really want random extra web servers on our database servers just so this is possible? No. Virtual directories in IIS I know that virtual directories can help make remote resources look as though they are local, and this supports using custom credentials for each virtual directory. I haven't been able to come up with, yet, how this would solve the problem for system calls. Users could reach file shares directly, but this won't help, say, classic ASP code access resources. I could use a URL instead of a file path to read remote data files in a web page, but this isn't going to help me make a connection to an Access database, a SQL server database, or any other resource that uses a connection library rather than being able to just read all the bytes and work with them. I wish there was some kind of "service tunnel" that I could create. Think about how a VPN makes remote resources look like they are local. With a richer aliasing mechanism, perhaps code-based, why couldn't even database connections occur under a defined security context? Why not a special Windows component that lets you specify, per user, what resources are available and what alternate credentials are used for the connection? File shares, databases, web sites, you name it. I guess I'm almost talking about a specialized local proxy server. Anyway, so there's my list. I may update it if I think of more. Does anyone have any ideas for me? My current problem today is, yet again, I need a web site to connect to an Access database on a file share. Here we go again...

