Search Results

Search found 101795 results on 4072 pages for 'encrypting file system'.

Page 166 of 4072

  • Ubuntu 14.04 crashes after several seconds

    - by AndrewJoyce86
    I just installed Ubuntu on my Compaq desktop and everything seemed to go well. The desktop appeared just like it should... only then this happens, seconds later: Crash. Something tells me that this is not supposed to be happening. I did not partition; I replaced Windows 7 with Ubuntu. When I try to run off the USB stick I used, I get this: USB. So, what do I do now? Should I try to reinstall? Is the ISO bad somehow? System specs: Compaq model CQ5205Y, AMD Sempron LE-1300, 2 GB RAM, 250 GB HDD.

    Read the article

  • System.Web.Services.Protocols.SoapHttpClientProtocol.ReadResponse request failed with HTTP status 404

    - by John Galt
    I am trying to make some enhancements to a production web app. After quite a bit of unit testing on my WinXP IIS 5.1 development machine, everything works on my localhost, so I used the Visual Studio 2008 PUBLISH dialog on my Dev PC to push the following projects to a staging server:
    - the primary web app
    - the "primary" webservice (the home page tries to invoke this WS)
    - a "secondary" webservice (not yet a problem because the home page does not invoke this WS)
    I get the following when I try to browse to the home page of the web app, typing this into my browser: link text

        Server Error in '/zVersion2' Application.
        The request failed with HTTP status 404: Not Found.
        Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
        Exception Details: System.Net.WebException: The request failed with HTTP status 404: Not Found.
        Source Error: An unhandled exception was generated during the execution of the current web request. Information regarding the origin and location of the exception can be identified using the exception stack trace below.
        Stack Trace:
        [WebException: The request failed with HTTP status 404: Not Found.]
           System.Web.Services.Protocols.SoapHttpClientProtocol.ReadResponse(SoapClientMessage message, WebResponse response, Stream responseStream, Boolean asyncCall) +431289
           System.Web.Services.Protocols.SoapHttpClientProtocol.Invoke(String methodName, Object[] parameters) +204
           ProxyZipeeeService.WSZipeee.Zipeee.GetMessageByType(Int32 iMsgType) in C:\Documents and Settings\johna\My Documents\Visual Studio 2008\Projects\ProxyZipeeeService\ProxyZipeeeService\Web References\WSZipeee\Reference.vb:2168
           Zipeee.frmZipeee.LoadMessage() in C:\Documents and Settings\johna\My Documents\Visual Studio 2008\Projects\Zipeee\frmZipeee.aspx.vb:43
           Zipeee.frmZipeee.Page_Load(Object sender, EventArgs e) in C:\Documents and Settings\johna\My Documents\Visual Studio 2008\Projects\Zipeee\frmZipeee.aspx.vb:33
           System.Web.UI.Control.OnLoad(EventArgs e) +99
           System.Web.UI.Control.LoadRecursive() +50
           System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint) +627
        Version Information: Microsoft .NET Framework Version:2.0.50727.3607; ASP.NET Version:2.0.50727.3082

    Here is a bit of the corresponding source code:

        Public wsZipeee As New ProxyZipeeeService.WSZipeee.Zipeee
        Dim dsStandardMsg As DataSet

        Private Sub Page_Load(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles MyBase.Load
            If Not Page.IsPostBack Then
                LoadMessage()
            End If
        End Sub

        Private Sub LoadMessage()
            Dim iCnt As Integer
            Dim iValue As Integer
            dsStandardMsg = wsZipeee.GetMessageByType(BizConstants.MsgType.Standard)
        End Sub

    I suspect I may have configured things incorrectly on the staging server. The staging server is Win Server 2003 ServicePack 2 running IIS 6.0. When I published the primary site and the 2 webservices on the staging server called MOJITO I created the physical directories for each on the D drive. Then, using INETMGR, I configured the following virtual directories: zVersion2, zVersion2wsSQL, zVersion2wsEmergency. All of the above are configured to use a new application pool I setup and named zVersion2aspNet20. The default web site for this machine MOJITO is configured to use ASP.NET 1.1 and the IP address is set to (All Unassigned).
    The production versions of the latter 2 webservices run on the MOJITO machine (named ZipeeeService and EmergencyService respectively). Can my staging versions of the above webservices (named zVersion2wsSQL and zVersion2wsEmergency respectively) co-exist on the same web server with the same IP address? Please note that when I test the zVersion2wsSQL webservice independently (from INETMGR, right-mouse and click Browse) it works as expected (i.e. presenting all the methods of the webservice), like this snippet:

        GetMessageByType MessageName="Get_x0020_Message_x0020_By_x0020_Type"

    I can test this webmethod by clicking on it and it presents the Test dialog (because it takes a simple datatype and I am invoking it on localhost, i.e. MOJITO):

        Get Message By Type
        Test
        To test the operation using the HTTP POST protocol, click the 'Invoke' button.
        Parameter   Value
        iMsgType:   _______   [INVOKE button]
        SOAP 1.1 ....etc.

    I fear I may have rambled with too much information so I will stop, but I hope someone can help me as I cannot understand why this request results in a "not found". Thanks.

    Read the article

  • implementing a download manager that supports resuming

    - by Idan K
    Hi, I intend on writing a small download manager in C++ that supports resuming (and multiple connections per download). From the info I gathered so far, when sending the HTTP request I need to add a header field with a key of "Range" and the value "bytes=startoff-endoff". Then the server returns an HTTP response with the data between those offsets. So roughly what I have in mind is to split the file into the number of allowed connections per file and send an HTTP request per split part with the appropriate "Range". So if I have a 4 MB file and 4 allowed connections, I'd split the file into 4 and have 4 HTTP requests going, each with the appropriate "Range" field. Implementing the resume feature would involve remembering which offsets are already downloaded and simply not requesting those.
    Is this the right way to do this? What if the web server doesn't support resuming? (My guess is it will ignore the "Range" and just send the entire file.) When sending the HTTP requests, should I specify in the range the entire split size, or maybe ask for smaller pieces, say 1024k per request? When reading the data, should I write it immediately to the file or do some kind of buffering? I guess it could be wasteful to write small chunks. Should I use a memory-mapped file? If I remember correctly, it's recommended for frequent reads rather than writes (I could be wrong). Is it wise memory-wise? What if I have several downloads simultaneously? If I'm not using a memory-mapped file, should I open the file per allowed connection, or simply seek when needing to write to the file? (If I did use a memory-mapped file this would be really easy, since I could simply have several pointers.)
    Note: I'll probably be using Qt, but this is a general question so I left code out of it.
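
    Not part of the original question, just an illustration of the Range mechanics described above, sketched in Java rather than C++ (the URL and offsets are invented). The useful detail is the status code: a server that honours the header answers 206 Partial Content, while one that ignores it answers 200 with the whole file, which is how the no-resume case can be detected.

        import java.io.InputStream;
        import java.net.HttpURLConnection;
        import java.net.URL;

        public class RangeProbe {
            public static void main(String[] args) throws Exception {
                // Hypothetical URL and byte range, for illustration only.
                URL url = new URL("http://example.com/file.bin");
                long start = 0, end = 1024 * 1024 - 1;

                HttpURLConnection conn = (HttpURLConnection) url.openConnection();
                conn.setRequestProperty("Range", "bytes=" + start + "-" + end);

                int code = conn.getResponseCode();
                if (code == 206) {
                    // Server honoured the Range header: read just this slice.
                    try (InputStream in = conn.getInputStream()) {
                        byte[] buf = new byte[8192];
                        int n;
                        while ((n = in.read(buf)) != -1) {
                            // write buf[0..n) at offset 'start' of the output file
                        }
                    }
                } else if (code == 200) {
                    // Server ignored Range and is sending the entire file.
                }
                conn.disconnect();
            }
        }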

    Read the article

  • Calculate total batch upload transfer percent with limited information

    - by GONeale
    Hi there, I have a system which uploads to a server file by file and displays a progress bar for file upload progress, and underneath it a second progress bar which I want to indicate the percentage of the batch complete across all files queued to upload. The information and algorithms I can work out are: Bytes Sent / Total Bytes To Send = first progress bar (e.g. 512 KB of 1024 KB (50%)). That works fine.
    However, supposing I have two other files left to upload, but both file sizes are unknown (as the size is only known once the file is about to commence upload, at which point it is compressed and the file size is determined), how would I go about making my third progress bar? I didn't think this would be possible, as I would need "Total Bytes Sent" / "Total Bytes To Send" to replicate the logic of my first progress bar on a larger scale. However, I did get a version working: "Current file number we are on" / "total number of files to send", returning the percentage through the batch; but obviously it will not update incrementally and it's pretty crude.
    So on further thinking I thought that if I could incorporate the current file % with this algorithm I could perhaps get the correct progress percentage of my batch's current point. I tried this algorithm, but alas to no avail (sorry to any math heads, it's probably quite apparent why it won't work): ("Current file number we are on" / "total number of files to send") * ("Bytes Sent" / "Total Bytes To Send"). For example, I thought I was on the right track when testing with this example: 2/3 (2nd of 3 files) = 66% (this is right so far), but then when I added * 0.20 (for indicating that only 20% of the 2nd file has uploaded) we went back to 13%. What I need is only a little over 33%! I did try the inverse at 0.80 and a (2/3 * (2/3 * 0.2)).
    Can this be done without knowing the entire number of bytes in the batch to upload? Please help! Thank you!
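
    One way to get a steadily increasing estimate without knowing the remaining file sizes is to weight every file equally: overall = (files already completed + fraction of the current file) / total files. For the example above that gives (1 + 0.2) / 3 = 40%, i.e. a little over a third. A minimal sketch of that formula (Java, with invented names), not taken from the original thread:

        public final class BatchProgress {
            /**
             * Estimate overall batch progress when only the current file's size is known,
             * by weighting every file in the batch equally.
             *
             * @param filesCompleted files already fully uploaded
             * @param totalFiles     total files in the batch
             * @param bytesSent      bytes sent for the file currently uploading
             * @param totalBytes     total bytes of the file currently uploading
             * @return overall progress in the range 0..100
             */
            public static double overallPercent(int filesCompleted, int totalFiles,
                                                long bytesSent, long totalBytes) {
                double currentFraction = totalBytes > 0 ? (double) bytesSent / totalBytes : 0.0;
                return 100.0 * (filesCompleted + currentFraction) / totalFiles;
            }

            public static void main(String[] args) {
                // Example from the question: 2nd of 3 files, 20% uploaded -> 40%, not 13%.
                System.out.println(overallPercent(1, 3, 20, 100));
            }
        }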

    Read the article

  • Force Download Files Broken: Headers wrong?

    - by Sinan
    if ($_POST['mode'] == "save") {
        $root = $_SERVER['DOCUMENT_ROOT'];
        $path = "/mcboeking/";
        $path = $root . $path;
        $file_path = $_POST['path'];
        $file = $path . $file_path;
        if (!file_exists($file)) {
            die('file not found');
        } else {
            header("Cache-Control: public");
            header("Content-Description: File Transfer");
            header('Content-Type: application/force-download');
            header("Content-Disposition: attachment; filename=\"" . basename($file) . "\";");
            header("Content-Length: " . filesize($file));
            readfile($file);
        }
    }

    As soon as I download the file and open it, I get an error message. When I try to open a .doc I get the message: "file structure is invalid". And when I try to open a jpg: "This file can not be opened. It may be corrupt or a file format that Preview does not recognize." But when I download PDF files, they open without any problem. Can someone help me?
    P.S. I tried different headers, including: header('Content-Type: application/octet-stream');

    Read the article

  • Read/Write Files from the Content Provider

    - by drum
    I want to be able to create a file from the Content Provider; however, I get the following error:

        java.io.FileNotFoundException: /0: open file failed: EROFS (read-only file system)

    What I am trying to do is create a file whenever an application calls the insert method of my Provider. This is the excerpt of the code that does the file creation:

        FileWriter fstream = new FileWriter(valueKey);
        BufferedWriter out = new BufferedWriter(fstream);
        out.write(valueContent);
        out.close();

    Originally I wanted to use openFileOutput(), but the function appears to be undefined. Does anyone have a workaround for this problem?
    EDIT: I found out that I had to specify the directory as well. Here is a more complete snippet of the code:

        File file = new File("/data/data/Project.Package.Structure/files/" + valueKey);
        file.createNewFile();
        FileWriter fstream = new FileWriter(file);
        BufferedWriter out = new BufferedWriter(fstream);
        out.write(valueContent);
        out.close();

    I also enabled the permission:

        <uses-permission android:name="android.permission.WRITE_INTERNAL_STORAGE" />

    This time I got an error saying:

        java.io.IOException: open failed: ENOENT (No such file or directory)
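
    Worth noting: openFileOutput() is a method on Context, not on ContentProvider, which is probably why it appeared to be undefined; inside a provider it is reachable through getContext(), and it creates the file under the app's private files directory without any hard-coded path or extra permission. A hedged sketch of that approach (the provider class and column names are invented):

        import android.content.ContentProvider;
        import android.content.ContentValues;
        import android.content.Context;
        import android.net.Uri;
        import java.io.OutputStreamWriter;

        public class ExampleProvider extends ContentProvider {
            @Override
            public Uri insert(Uri uri, ContentValues values) {
                // Hypothetical column names, for illustration only.
                String valueKey = values.getAsString("key");
                String valueContent = values.getAsString("content");
                try {
                    // Writes to /data/data/<package>/files/<valueKey> and creates the
                    // file if it does not exist yet.
                    OutputStreamWriter out = new OutputStreamWriter(
                            getContext().openFileOutput(valueKey, Context.MODE_PRIVATE));
                    out.write(valueContent);
                    out.close();
                } catch (java.io.IOException e) {
                    throw new RuntimeException(e);
                }
                return uri;
            }

            // Remaining ContentProvider methods stubbed out for brevity.
            @Override public boolean onCreate() { return true; }
            @Override public android.database.Cursor query(Uri uri, String[] p, String s, String[] a, String o) { return null; }
            @Override public String getType(Uri uri) { return null; }
            @Override public int delete(Uri uri, String s, String[] a) { return 0; }
            @Override public int update(Uri uri, ContentValues v, String s, String[] a) { return 0; }
        }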

    Read the article

  • File upload fails when user is authenticated. Using IIS7 Integrated mode.

    - by Nikkelmann
    These are the user identities my website tells me that it uses:
        Logged on: NT AUTHORITY\NETWORK SERVICE (cannot write any files at all)
        Not logged on: WSW32\IUSR_77 (can write files to any folder)
    I have an ASP.NET 4.0 website on a shared-hosting IIS7 web server running in Integrated mode with 32-bit application support enabled, and MSSQL 2008. Using Classic mode is not an option, since I need to secure some static files and I use Routing. In my web.config file I have set the following:

        <system.webServer>
            <modules runAllManagedModulesForAllRequests="true" />
        </system.webServer>

    My hosting company says that impersonation is enabled by default at the machine level, so this is not something I can change. I asked their support and they referred me to this article: http://www.codinghub.net/2010/08/differences-between-integrated-mode-and.html
    Citing this part:

        Different windows identity in Forms authentication
        When Forms Authentication is used by an application and anonymous access is allowed, the Integrated mode identity differs from the Classic mode identity in the following ways:
        * ServerVariables["LOGON_USER"] is filled.
        * Request.LogonUserIdentity uses the credentials of the [NT AUTHORITY\NETWORK SERVICE] account instead of the [NT AUTHORITY\INTERNET USER] account.
        This behavior occurs because authentication is performed in a single stage in Integrated mode. Conversely, in Classic mode, authentication occurs first with IIS 7.0 using anonymous access, and then with ASP.NET using Forms authentication. Thus, the result of the authentication is always a single user -- the Forms authentication user. AUTH_USER/LOGON_USER returns this same user because the Forms authentication user credentials are synchronized between IIS 7.0 and ASP.NET. A side effect is that LOGON_USER, HttpRequest.LogonUserIdentity, and impersonation no longer can access the Anonymous user credentials that IIS 7.0 would have authenticated by using Classic mode.

    How do I set up my website so that it can use the proper identity with the proper permissions? I've looked high and low for any answers regarding this specific problem, but found nil so far... I hope you can help!

    Read the article

  • PHP: parse $_FILES[] data in multidimensional array

    - by superUntitled
    I have been looking around for an answer to this and have not found one anywhere; I am hoping someone has done this before! I have a form that allows for dynamic duplication of the form fields. The form allows for file uploads and text input, so the data is sent in both the $_POST and $_FILES arrays. The initial set of inputs looks like this:

        <input type="text" name="primary[1][text]" />
        <input type="file" name="primary[1][file]" />
        <input type="text" class="a" name="secondary[1][text][]" />
        <input type="file" name="secondary[1][file][]" />

    When duplicated, the fields are incremented; they look like this:

        <input type="text" name="primary[2][text]" />
        <input type="file" name="primary[2][file]" />
        <input type="text" class="a" name="secondary[2][text][]" />
        <input type="file" name="secondary[2][file][]" />

    To complicate matters, the "secondary" form fields can also be duplicated (thus the [] at the end of the secondary name array). How can I parse the posted $_FILES array? I have tried something like this:

        foreach ($_FILES['question'] as $f_num) {
            echo $f['file']['name'];
        }

    but I get an "Undefined index: file... " error.

    Read the article

  • Javascript - why do I sometimes fail to read file content with GDownloadUrl?

    - by Daj pan spokój
    Hi everybody. I try to read a file with Google's GDownloadUrl and it works only from time to time. Failure means fileRows == "blah blah"; success means fileRows == (real file content). I've noticed, however, that when I pause (with Firebug) the execution on line 3 for a couple of seconds, it succeeds more often. Maybe it is some kind of threading bug, then? Do you guys have any tip or idea?

        1 var fileContent = "blah blah";
        2 availabilityFile = "input/available/" + date + ".csv";
        3 GDownloadUrl(availabilityFile, function(fileData) {
        4     fileContent = fileData;
        5 });
        6 fileRows = fileContent.split("\n");

    Read the article

  • Are there cloud network drives that let users lock files or mark them as "in use"?

    - by Brandon Craig Rhodes
    Having spent several hours reading about the features and limitations of services like DropBox and Jungle Disk and the hundreds of competitors they seem to have (as though everyone with an AWS account these days goes ahead and writes a file sharing application just for fun), I have yet to find one that would let a team of people at a small business collaborate without stepping all over each other's toes.
    At a small business there are often many small documents per project — estimates, contracts, project plans, budgets — and team members frequently have to open and edit them, with all sorts of problems happening if two people edit a file at once. Even if a sharing service is smart enough to keep both versions of the file created, most small-business software (like word processors, spreadsheets, estimating software, or billing systems) has no way to compare — much less to merge! — the changes in two rival versions of a file that two people edited at the same time without each other's knowledge.
    So, my question: are there cloud-based file sharing solutions that not only provide a virtual network drive that people can access, but that also let users lock files — even if it's not a real lock but just a flag or indicator — that could possibly prevent remote workers from both editing the same file at once? Having one person wait for another person to finish editing is a very, very small inconvenience compared to the hour or more that it can take to compare two estimates by hand until you find and resolve the rival changes. Given this fact, I am surprised that almost none of the popular file sharing solutions seem to recognize this problem and provide some solution! Does anyone know of a service that does?

    Read the article

  • Parsing JPEG file format: Format of entropy-coded segment (ECS) segments?

    - by me2
    I'm having difficulty understanding the ITU-T T.81 spec for the JPEG file format. Hopefully someone else here has tried to parse JPEG files and/or knows about the details of this file format. The spec indicates that the ECS0 segment starts after the SOS segment, but I can't find where in the spec it actually talks about the format of the ECS0 segment or how to detect its start. Simple JPEG implementations online are of limited help because they assume many things about the JPEGs they parse. Can anyone point me in the right direction?
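
    For anyone stuck on the same reading of T.81: the entropy-coded segment has no marker or length field of its own; it simply begins with the first byte after the SOS header and runs until the next real marker, where 0xFF 0x00 is a stuffed data byte and 0xFF 0xD0..0xD7 (RSTn) stays inside the scan. A rough Java sketch of locating the end of an ECS, assuming the whole file is already in a byte array (class and method names are made up):

        public final class JpegEcs {
            /**
             * Given JPEG data and the offset of the first byte after the SOS header,
             * return the offset of the next real marker, i.e. the end of the ECS.
             * 0xFF 0x00 is a stuffed data byte and 0xFF 0xD0..0xD7 (RSTn) stays
             * inside the entropy-coded data.
             */
            public static int findEndOfEcs(byte[] jpeg, int start) {
                int i = start;
                while (i + 1 < jpeg.length) {
                    if ((jpeg[i] & 0xFF) == 0xFF) {
                        int next = jpeg[i + 1] & 0xFF;
                        boolean stuffed = next == 0x00;
                        boolean restart = next >= 0xD0 && next <= 0xD7;
                        if (!stuffed && !restart) {
                            return i; // a real marker: the ECS ends here
                        }
                        i += 2; // skip the stuffed byte or restart marker
                    } else {
                        i++;
                    }
                }
                return jpeg.length;
            }
        }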

    Read the article

  • Unlock a file with unlocker from a WinForms App?

    - by netadictos
    I am trying to unlock a file from a C# program, using Unlocker. In my UI, I put a button to unlock the file the app couldn't delete. When the user pushes the button, I want Unlocker (the famous app) to be opened. I have read about this on the Unlocker web site, and there are some explanations about the command line to use, but nothing works. I run the following command but nothing happens:

        "c:\Program Files\unlocker\unlocker.exe" -L "PATHFORTHEFILE.doc"

    Nothing happens. I have tried without parameters and with -LU. Any idea? Is there something more efficient than Unlocker to integrate with software?
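
    The question is about C#, but a common pitfall when launching an external tool like this is passing the executable path and its arguments as one big quoted string instead of as separate tokens (in System.Diagnostics.Process terms, FileName and Arguments set separately). As a rough illustration of the idea in Java, with invented paths and the switch order copied from the command line above; check Unlocker's own documentation for the exact switches:

        import java.io.IOException;

        public class LaunchUnlocker {
            public static void main(String[] args) throws IOException, InterruptedException {
                // Pass the executable and each argument as separate tokens so that
                // paths containing spaces need no manual quoting. Paths are examples.
                ProcessBuilder pb = new ProcessBuilder(
                        "C:\\Program Files\\unlocker\\unlocker.exe",
                        "-L",
                        "C:\\path\\to\\PATHFORTHEFILE.doc");
                pb.inheritIO();                 // surface any console output
                Process p = pb.start();
                int exitCode = p.waitFor();
                System.out.println("Unlocker exited with code " + exitCode);
            }
        }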

    Read the article

  • How can I read/write data from a file?

    - by samy
    I'm writing a simple Chrome extension. I need to create the ability to add site URLs to a list, or read from the list. I use the list to open the sites in new tabs. I'm looking for a way to have a data file I can write to and read from. I was thinking of XML. I read there is a problem changing the content of files with JavaScript. Is XML the right choice for this kind of thing? I should add that there is no web server and the app will run locally, so maybe the problems websites are having are not the same as this. Before I wrote this question, I tried one thing, and started to feel insecure because it didn't work. I made an XML file called Site.xml:

        <?xml version="1.0" encoding="utf-8" ?>
        <Sites>
          <site>
            <url>
              http://www.sulamacademy.com/AddMsgForum.asp?FType=273171&SBLang=0&WSUAccess=0&LocSBID=20375
            </url>
          </site>
          <site>
            <url>
              http://www.wow.co.il
            </url>
          </site>
          <site>
            <url>
              http://www.Google.co.il
            </url>
          </site>

    I made this script to read the data from it and put it on the HTML file:

        function LoadXML() {
            var ajaxObj = new XMLHttpRequest();
            ajaxObj.open('GET', 'Sites.xml', false);
            ajaxObj.send();
            var myXML = ajaxObj.responseXML;
            document.write('<table border="2">');
            var prs = myXML.getElementsByTagName("site");
            for (i = 0; i < prs.length; i++) {
                document.write("<tr><td>");
                document.write(prs[i].getElementsByName("url")[0].childNode[0].nodeValue);
                document.write("</td></tr>");
            }
            document.write("</table");
        }

    Read the article

  • SSIS 2005 - How to Import a Fixed Width Flat File?

    - by Greg
    I have a flat file that looks something like this:

        junk I don't care about

        columns names
        val1 val2 val3
        val1 val2 val3
        columns names
        val1 val2 val3

    I only care about the lines with values. These value lines are all in fixed-width format and have the same line length. The other junk lines and column-name lines can have any line width. When I try the flat-file fixed-width option or the ragged-right option, the preview looks all wrong. Any ideas what the easiest way to get this into SSIS is?

    Read the article

  • plupload not working in WordPress theme files

    - by Kedar B
    This is my code for the image upload:

        <a id="aaiu-uploader" class="aaiu_button" href="#"><?php _e('*Select Images (mandatory)','wpestate');?></a>
        <input type="hidden" name="attachid" id="attachid" value="<?php echo $attachid;?>">
        <input type="hidden" name="attachthumb" id="attachthumb" value="<?php echo $thumbid;?>">

    I want the upload functionality more than once on a single page in WordPress. I have added JS code for that, the same as the first upload block, but it's not working.
    This is the JS code for the image upload:

        jQuery(document).ready(function($) {
            "use strict";
            if (typeof(plupload) !== 'undefined') {
                var uploader = new plupload.Uploader(ajax_vars.plupload);
                uploader.init();
                uploader.bind('FilesAdded', function (up, files) {
                    $.each(files, function (i, file) {
                        // console.log('append'+file.id );
                        $('#aaiu-upload-imagelist').append(
                            '<div id="' + file.id + '">' + file.name + ' (' + plupload.formatSize(file.size) + ') <b></b>' + '</div>');
                    });
                    up.refresh(); // Reposition Flash/Silverlight
                    uploader.start();
                });
                uploader.bind('UploadProgress', function (up, file) {
                    $('#' + file.id + " b").html(file.percent + "%");
                });
                // On error occurring
                uploader.bind('Error', function (up, err) {
                    $('#aaiu-upload-imagelist').append("<div>Error: " + err.code + ", Message: " + err.message + (err.file ? ", File: " + err.file.name : "") + "</div>");
                    up.refresh(); // Reposition Flash/Silverlight
                });
                uploader.bind('FileUploaded', function (up, file, response) {
                    var result = $.parseJSON(response.response);
                    // console.log(result);
                    $('#' + file.id).remove();
                    if (result.success) {
                        $('#profile-image').css('background-image','url("'+result.html+'")');
                        $('#profile-image').attr('data-profileurl',result.html);
                        $('#profile-image_id').val(result.attach);
                        var all_id=$('#attachid').val();
                        all_id=all_id+","+result.attach;
                        $('#attachid').val(all_id);
                        $('#imagelist').append('<div class="uploaded_images" data-imageid="'+result.attach+'"><img src="'+result.html+'" alt="thumb" /><i class="fa deleter fa-trash-o"></i></div>');
                        delete_binder();
                        thumb_setter();
                    }
                });
                $('#aaiu-uploader').click(function (e) {
                    uploader.start();
                    e.preventDefault();
                });
                $('#aaiu-uploader2').click(function (e) {
                    uploader.start();
                    e.preventDefault();
                });
            }
        });

    Read the article

  • Filesystem fragmentation on the level of a set of files

    - by trismarck
    A file is stored in blocks by the file system. The block is the smallest amount of data the file system can assign to store a file. The classical definition of a fragmented file is that the file is stored in blocks that are 'scattered' (that are physically non-contiguous) around the hard drive.
    What I want to ask about is a second type of fragmentation I've come up with. Let's suppose we install a program. This program has very many files. When the program starts, it always loads the contents of those files sequentially. Now, even if the hard disk is defragmented, there is still a possibility that the files (but not the blocks making up each file) will be scattered on the disk, and thus the program launch time will be longer. Actually, this time could be longer due to defragmentation of the disk, as the defragmentation process not only glues fragmented files together but also moves some files to optimize free-space chunks.
    The questions: is the type of fragmentation I mentioned relevant for the file system? Is it possible to remedy this kind of fragmentation and, if yes, how would you do it? Also, I'm not sure if this question should belong on Super User or on Server Fault (as I guess filesystem fragmentation is more important in the server environment).

    Read the article

  • Alternate way to create a clone of a UNIX System

    - by Spirit
    THE STORY: (If you don't like to read much, down below is the question :) )
    Where I work we have two HP RP2470 servers: same hardware, same number of hard drives, same everything :). One of them is a production server and runs HP-UX 11.00. The poor ba***rd hasn't been turned off for years, and now I have to make a clone of it on the other server - just in case, for redundancy. The problem is simple (or not so simple), as I have to make the other server exactly the same. However, the old version of the OS (UX 11.00 is history now) and the old software running on it have made my task almost impossible. On the production server there is also a cloning/recovery utility, Ignite-UX. I tried many times to create a recovery tape with it. Then, when I load the tape on the backup server, it succeeds with the loading of the tape (no errors, no warnings), but on the next restart it fails to load the OS :S and drops into HP's ISL prompt.
    --- THE QUESTION: Is there an alternate way to create a clone of the Unix system? The environment is:
    1. 2x HP RP2470 servers (non-Intel), same hardware, same number of HDDs (two in each of them), same everything.
    2. OS running: HP-UX 11.00
    The production server has to be cloned without downtime - sadly :( as I hope that they will reconsider on this one.
    For example (like on Windows platforms), if you try to copy an entire HDD with Windows inside onto another HDD, and then put that HDD into another PC, it will still work, as long as the hardware is the same. Can I do something like that with a Unix system? Can I somehow COPY the contents of the entire HDD, put those on another HDD, and then just load the HDD into the other server? (If you haven't read the story: the servers are exactly the same.) Will it work? Can it be done with ordinary commands like cp or dump or something like that? Does anyone have a similar experience?
    --- UPDATE: 26.01.2012
    NOTE: The update is related to "The Story". If you haven't read that part then you can skip this update. This is just a short update on the recovery logs from the Ignite tape... someone with more experience might notice something:

        ...
        --- READING CONTENTS OF THE IGNITE TAPE ---
        --- OUTPUT OMITTED ---
        ...
        x ./configure3, 413696 bytes, 808 tape blocks
        x ./monitor_bpr, 20480 bytes, 40 tape blocks
        * Download_mini-system: Complete
        * Loading_software: Begin
        * Installing boot area on disk.
        * Enabling swap areas.
        * Backing up LVM configuration for "vg00".
        * Processing the archive source (Recovery Archive).
        * Wed Jan 25 15:27:32 EST 2012: Starting archive load of the source (Recovery Archive).
        * Positioning the tape (/dev/rmt/0mn).
        * Archive extraction from tape is beginning. Please wait.
        * Wed Jan 25 15:39:52 EST 2012: Completed archive load of the source (Recovery Archive).
        * Executing user specified script: "/opt/ignite/data/scripts/os_arch_post_l".
        * Running in recovery mode (os_arch_post_l).
        * Running the ioinit command ("/sbin/ioinit -c")
        * Creating device files via the insf command.
        insf: Installing special files for sdisk instance 0 address 0/0/1/1.15.0
        insf: Installing special files for sdisk instance 1 address 0/0/2/0.1.0
        insf: Installing special files for sdisk instance 2 address 0/0/2/1.15.0
        insf: Installing special files for stape instance 0 address 0/0/1/0.3.0
        insf: Installing special files for btlan instance 0 address 0/0/0/0
        insf: Installing special files for btlan instance 1 address 0/2/0/0
        insf: Installing special files for pseudo driver dlpi
        insf: Installing special files for pseudo driver kepd
        insf: Installing special files for pseudo driver framebuf
        insf: Installing special files for pseudo driver sad
        * Running "/opt/upgrade/bin/tlinstall -v" and correcting transition link permissions.
        * Constructing the bootconf file.
        * Setting primary boot path to "0/0/1/1.15.0".
        * Executing: "/var/adm/sw/products/PHSS_20146/pfiles/iux_postload".
        * Executing: "/var/adm/sw/products/PHSS_25982/pfiles/iux_postload".
        NOTE: tlinstall is searching filesystem - please be patient
        NOTE: Successfully completed
        * Loading_software: Complete
        * Build_Kernel: Begin
        NOTE: Since the /stand/vmunix kernel is already in place, the kernel will not be re-built. Note that no mod_kernel directives will be processed.
        * Build_Kernel: Complete
        * Boot_From_Client_Disk: Begin
        * Rebooting machine as expected.
        NOTE: Rebooting system.
        sync'ing disks (0 buffers to flush): 0 buffers not flushed 0 buffers still dirty
        Closing open logical volumes... Done
        Console reset done.
        Boot device reset done.

        ********** VIRTUAL FRONT PANEL **********
        System Boot detected
        *****************************************
        LEDs: RUN   ATTENTION   FAULT   REMOTE   POWER
              FLASH OFF         OFF     ON       ON
        LED State: Running non-OS code. (i.e. Boot or Diagnostics)
        ...
        --- SERVER IS PERFORMING POST SEQUENCE HERE ---
        --- OUTPUT OMITTED ---
        ...
        *****************************************
        ************ EARLY BOOT VFP *************
        End of early boot detected
        *****************************************

        Firmware Version 43.50
        Duplex Console IO Dependent Code (IODC) revision 1
        ------------------------------------------------------------------------------
        (c) Copyright 1995-2002, Hewlett-Packard Company, All rights reserved
        ------------------------------------------------------------------------------
        Processor Speed State CoProcessor State Cache Size
        Number State Inst Data
        --------- -------- --------------------- ----------------- ------------
        0   650 MHz   Active   Functional   750 KB   1.5 MB
        1   650 MHz   Idle     Functional   750 KB   1.5 MB

        Central Bus Speed (in MHz) : 120
        Available Memory           : 2097152 KB
        Good Memory Required       : 16140 KB

        Primary boot path:   0/0/1/1.15
        Alternate boot path: 0/0/2/1.15
        Console path:        0/0/4/1.643
        Keyboard path:       0/0/4/0.0

        Processor is starting autoboot process.
        To discontinue, press any key within 10 seconds.
        10 seconds expired. Proceeding...

        Trying Primary Boot Path
        ------------------------
        Booting...
        Boot IO Dependent Code (IODC) revision 1
        HARD Booted.
        ISL Revision A.00.38 OCT 26, 1994
        ISL booting hpux
        ISL>

    Read the article

  • How do file references within PHP objects work?

    - by bender
    I'm trying to create a PHP object that can load objects in other files on demand when needed. My problem is that when I reference the files based on the file location of the class definition, it cannot find the files. So, the file structure is:

        /Test.php
        /os/os.php (extends kernel)
        /os/kernel.php
        /os/libraries/lib1.php
        /os/libraries/lib2.php
        /os/libraries/lib3.php

    In kernel.php, the libraries are referenced as 'libraries/lib1.php'. If I create an "os" object in Test.php, the lib files are not found.

    Read the article

  • How to remove the file extension in a zsh completion?

    - by meeselet
    I want to adjust zsh so that I can tab complete:

        myprog <tab>

    using all *.foo files in ~/somedir, but have it display them without the .foo extension. Is there any way to do this? This is what I have so far:

        #compdef myprog
        typeset -A opt_args
        local context state line
        local -a mydirs
        mydirs="(. ~/somedir)"
        _arguments -s -S \
            "*:name:->foos" \
            && return 0
        case $state in
            (foos)
                _files -W ${mydirs} -g '*.foo(:r)' && return 0
                ;;
        esac
        return 1

    However, this displays double the output for every file (that is, each .foo file is listed both with and without its extension). Is there any way around this?

    Read the article

  • How to save Chinese characters to a file with Java?

    - by Frank
    I use the following code to save Chinese characters into a .txt file, but when I opened it with WordPad, I couldn't read it.

        StringBuffer Shanghai_StrBuf = new StringBuffer("\u4E0A\u6D77");
        boolean Append = true;
        FileOutputStream fos;
        fos = new FileOutputStream(FileName, Append);
        for (int i = 0; i < Shanghai_StrBuf.length(); i++)
            fos.write(Shanghai_StrBuf.charAt(i));
        fos.close();

    What can I do? I know that if I cut and paste Chinese characters into WordPad I can save them into a .txt file. How can I do that with Java?
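
    The likely culprit in the loop above is that FileOutputStream.write(int) writes only the low-order byte of each char, so the Chinese characters are truncated before they ever reach the file. A small sketch of the usual alternative, wrapping the stream in a Writer with an explicit charset (the file name is invented, and the editor still has to open the file as UTF-8):

        import java.io.FileOutputStream;
        import java.io.OutputStreamWriter;
        import java.io.Writer;

        public class SaveChinese {
            public static void main(String[] args) throws Exception {
                String shanghai = "\u4E0A\u6D77";   // 上海
                boolean append = true;
                // The Writer encodes whole characters with the given charset instead
                // of truncating each char to a single byte.
                try (Writer out = new OutputStreamWriter(
                        new FileOutputStream("shanghai.txt", append), "UTF-8")) {
                    out.write(shanghai);
                }
            }
        }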

    Read the article

  • Writing a simple batch file to set up a variable?

    - by Sam
    I want to write a simple batch file where I want to set up an environment variable based on the machine architecture. It is as below:

        set ARCH=%PROCESSOR_ARCHITECTURE%
        echo %ARCH%
        if %ARCH%==x86 (
            set JAVA_ROOT=C:\Progra~1\Java\j2re1.4.2_13
        ) else (
            set JAVA_ROOT=C:\Progra~2\Java\j2re1.4.2_13
        )
        echo JAVA_ROOT is %JAVA_ROOT%

    On a 64-bit machine where the architecture is 'AMD64', JAVA_ROOT will be displayed as 'C:\Progra~2\Java\j2re1.4.2_13' at the echo statement. But when I run an application that uses this file, the first value of JAVA_ROOT, 'C:\Progra~1\Java\j2re1.4.2_13', is picked up. I don't have any idea why it goes into the 'if' part even though I am running this on 64-bit Windows 7. When I echoed the

    Read the article

  • Append to a file in Java. Is it a joke?

    - by Roman
    I need to append some data to an existing file. I started to browse the Internet to find out how to do it, and I found this mini (as they say) application to do that: http://www.devdaily.com/java/edu/qanda/pjqa00009.shtml
    Well, I was already annoyed by how complicated things are in Java (in comparison with Python, for example). But this is too much! I just want to append to a file! It should be one line! Not 50! Or am I getting something wrong?
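
    For what it's worth, appending really can be close to a one-liner in plain Java: the FileWriter constructor takes an append flag. A minimal sketch (the file name is invented):

        import java.io.FileWriter;
        import java.io.IOException;
        import java.io.PrintWriter;

        public class AppendDemo {
            public static void main(String[] args) throws IOException {
                // The second FileWriter argument turns on append mode.
                try (PrintWriter out = new PrintWriter(new FileWriter("log.txt", true))) {
                    out.println("one more line");
                }
            }
        }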

    Read the article
