Search Results

Search found 48797 results on 1952 pages for 'read write'.

Page 334/1952

  • Null safe way to get values from an IDataReader

    - by kumar
    Hi all, (LocalVariable)ABC.string(Name) = (IDataReader)datareader.GetString(0); this Name value comes from the database. What happens here is that if the value is null while reading, GetString throws an exception. I am manually adding an if condition for it, but I don't want to write a manual check for every one of my variables. I am doing something like this now: string abc = (IDataReader)datareader.GetValue(0); if (abc == null) // assign null, else assign abc's value. Is there something like an extension method we can write for this? Thanks
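
    A minimal sketch of the kind of extension method the asker is after — the class and method names (DataReaderExtensions, GetStringOrDefault) are my own choices, not anything from the question:

      using System.Data;

      public static class DataReaderExtensions
      {
          // Returns the column value, or null when the column holds DBNull,
          // so callers don't need an explicit IsDBNull check at every read.
          public static string GetStringOrDefault(this IDataReader reader, int ordinal)
          {
              return reader.IsDBNull(ordinal) ? null : reader.GetString(ordinal);
          }
      }

      // usage: string name = datareader.GetStringOrDefault(0);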

    Read the article

  • Norton Ghost EBAB03F1: The specified network name is no longer available

    - by Breck Carter
    After about 15 minutes, a Norton Ghost 14 backup fails with Error EBAB03F1: The specified network name is no longer available. The source computer is a P4 laptop running Windows XP SP3. The target computer is a Core2 Quad desktop running Windows Vista Ultimate 64bit. It does not help to disable Norton 360 on the source computer or Norton Antivirus 2008 on the target computer. The Event Viewer consistently shows the same two VSS-related errors after Norton Ghost starts but before it fails. It makes no difference if the VSS service is started or stopped. The VSS errors do not appear elsewhere in the event log, only after Ghost starts. The MSS event messages, however, are quite common, appearing throughout the log, and they may have nothing to do with the problem. Here is the Norton Ghost error display... -Errors exist. --Unable to write to file. ---Error EBAB03F1: The specified network name is no longer available. ---Unable to set file size. ----Error EBAB03F1: The specified network name is no longer available. ----Unable to write to file. -----Error EBAB03F1: The specified network name is no longer available. -----Unable to set file size. ------Error EBAB03F1: The specified network name is no longer available. Here are the source computer events, with the final error at the top and the "Ghost Starting" message at the bottom: ===== Event Type: Error Event Source: Norton Ghost Event Category: High Priority Event ID: 100 Date: 11/09/2009 Time: 9:40:26 AM User: N/A Computer: PAVILION2 Description: Error EC8F17B7: Cannot create recovery points for job: Drive Backup of (C:\) (3). Error E7D1001F: Unable to write to file. Error EBAB03F1: The specified network name is no longer available. Error E7D10046: Unable to set file size. Error EBAB03F1: The specified network name is no longer available. Error E7D1001F: Unable to write to file. Error EBAB03F1: The specified network name is no longer available. Error E7D10046: Unable to set file size. Error EBAB03F1: The specified network name is no longer available. Details: 0xEBAB0005 Source: Norton Ghost ===== Event Type: Information Event Source: MSSQL$SQLEXPRESS Event Category: Server Event ID: 3421 Date: 11/09/2009 Time: 9:34:06 AM User: NT AUTHORITY\NETWORK SERVICE Computer: PAVILION2 Description: Recovery completed for database ReportServer$SQLEXPRESSTempDB (database ID 6) in 1 second(s) (analysis 205 ms, redo 0 ms, undo 376 ms.) This is an informational message only. No user action is required. For more information, see Help and Support Center at http://go.microsoft.com/fwlink/events.asp. Data: 0000: 5d 0d 00 00 0a 00 00 00 ]....... 0008: 15 00 00 00 50 00 41 00 ....P.A. 0010: 56 00 49 00 4c 00 49 00 V.I.L.I. 0018: 4f 00 4e 00 32 00 5c 00 O.N.2.\. 0020: 53 00 51 00 4c 00 45 00 S.Q.L.E. 0028: 58 00 50 00 52 00 45 00 X.P.R.E. 0030: 53 00 53 00 00 00 18 00 S.S..... 0038: 00 00 52 00 65 00 70 00 ..R.e.p. 0040: 6f 00 72 00 74 00 53 00 o.r.t.S. 0048: 65 00 72 00 76 00 65 00 e.r.v.e. 0050: 72 00 24 00 53 00 51 00 r.$.S.Q. 0058: 4c 00 45 00 58 00 50 00 L.E.X.P. 0060: 52 00 45 00 53 00 53 00 R.E.S.S. 0068: 00 00 .. ===== Event Type: Information Event Source: MSSQL$SQLEXPRESS Event Category: Server Event ID: 17137 Date: 11/09/2009 Time: 9:34:02 AM User: NT AUTHORITY\NETWORK SERVICE Computer: PAVILION2 Description: Starting up database 'ReportServer$SQLEXPRESSTempDB'. For more information, see Help and Support Center at http://go.microsoft.com/fwlink/events.asp. Data: 0000: f1 42 00 00 0a 00 00 00 ñB...... 0008: 15 00 00 00 50 00 41 00 ....P.A. 
0010: 56 00 49 00 4c 00 49 00 V.I.L.I. 0018: 4f 00 4e 00 32 00 5c 00 O.N.2.\. 0020: 53 00 51 00 4c 00 45 00 S.Q.L.E. 0028: 58 00 50 00 52 00 45 00 X.P.R.E. 0030: 53 00 53 00 00 00 18 00 S.S..... 0038: 00 00 52 00 65 00 70 00 ..R.e.p. 0040: 6f 00 72 00 74 00 53 00 o.r.t.S. 0048: 65 00 72 00 76 00 65 00 e.r.v.e. 0050: 72 00 24 00 53 00 51 00 r.$.S.Q. 0058: 4c 00 45 00 58 00 50 00 L.E.X.P. 0060: 52 00 45 00 53 00 53 00 R.E.S.S. 0068: 00 00 .. ===== Event Type: Error Event Source: VSS Event Category: None Event ID: 5013 Date: 11/09/2009 Time: 9:28:32 AM User: N/A Computer: PAVILION2 Description: Volume Shadow Copy Service error: Shadow Copy writer ContentIndexingService called routine RegQueryValueExW which failed with status 0x80070002 (converted to 0x800423f4). For more information, see Help and Support Center at http://go.microsoft.com/fwlink/events.asp. Data: 0000: 57 53 48 43 4f 4d 4e 43 WSHCOMNC 0008: 32 32 39 32 00 00 00 00 2292.... 0010: 57 53 48 43 49 43 00 00 WSHCIC.. 0018: 32 38 37 00 00 00 00 00 287..... ===== Event Type: Error Event Source: VSS Event Category: None Event ID: 5013 Date: 11/09/2009 Time: 9:28:32 AM User: N/A Computer: PAVILION2 Description: Volume Shadow Copy Service error: Shadow Copy writer ContentIndexingService called routine RegQueryValueExW which failed with status 0x80070002 (converted to 0x800423f4). For more information, see Help and Support Center at http://go.microsoft.com/fwlink/events.asp. Data: 0000: 57 53 48 43 4f 4d 4e 43 WSHCOMNC 0008: 32 32 39 32 00 00 00 00 2292.... 0010: 57 53 48 43 49 43 00 00 WSHCIC.. 0018: 32 38 37 00 00 00 00 00 287..... ===== Event Type: Error Event Source: VSS Event Category: None Event ID: 12302 Date: 11/09/2009 Time: 9:28:32 AM User: N/A Computer: PAVILION2 Description: Volume Shadow Copy Service error: An internal inconsistency was detected in trying to contact shadow copy service writers. Please check to see that the Event Service and Volume Shadow Copy Service are operating properly. For more information, see Help and Support Center at http://go.microsoft.com/fwlink/events.asp. Data: 0000: 42 55 45 43 58 4d 4c 43 BUECXMLC 0008: 33 36 33 37 00 00 00 00 3637.... 0010: 42 55 45 43 58 4d 4c 43 BUECXMLC 0018: 33 36 30 37 00 00 00 00 3607.... ===== Event Type: Information Event Source: Norton Ghost Event Category: High Priority Event ID: 100 Date: 11/09/2009 Time: 9:27:57 AM User: N/A Computer: PAVILION2 Description: Info 6C8F1F63: The drive-based backup job, Drive Backup of (C:\) (3), has been started manually. Details: Source: Norton Ghost

    Read the article

  • generate only objectLayer of Entity Framework Model by edmgen tool

    - by loviji
    How can I generate only the object layer with the EdmGen tool, without generating the csdl, ssdl and views? *"%windir%\Microsoft.NET\Framework\v4.0.30319\edmgen.exe" /mode:fullgeneration /c:"Data Source=.\sqlexpress; Initial Catalog=uqs; Integrated Security=SSPI" /project:generateEntityModel /entitycontainer:uqsEntities /namespace:uqsModel /language:CSharp /outobjectlayer:"D:/uqsObjectLayer.cs" * In this command I don't specify a location for the csdl, ssdl and views, but they are still generated in C:\Users\adminUser on Windows Vista, while the object layer goes to D:/uqsObjectLayer.cs. If I use /mode:EntityClassGeneration, that option requires the /incsdl argument and either the /project argument or the /outobjectlayer argument (the /language argument is optional), but I don't want to use a csdl file. As I understand it, EdmGen cannot create the object layer without a csdl file. Is there an alternate way or tool for generating the object layer from the database?

    Read the article

  • processing: convert int to byte

    - by inspectorG4dget
    Hello SO, I'm trying to convert an int into a byte in Processing 1.0.9. This is the snippet of code that I have been working with: byte xByte = byte(mouseX); byte yByte = byte(mouseY); byte setFirst = byte(128); byte resetFirst = byte(127); xByte = xByte | setFirst; yByte = yByte >> 1; port.write(xByte); port.write(yByte); According to the Processing API, this should work, but I keep getting an error at xByte = xByte | setFirst; that says: cannot convert from int to byte. I have tried converting 128 and 127 to their respective hex values (0x80 and 0x7F), but that didn't work either. I have tried everything mentioned in the API as well as some other blogs, but I feel like I'm missing something very trivial. I would appreciate any help. Thank you.

    Read the article

  • C Privilege Escalation (With Password)

    - by AriX
    Hey everyone, I need to write a C program that will allow me to read/write files that are owned by root. However, I can only run the code under another user. I have the root password, but there are no "sudo" or "su" commands on the system, so I have no way of accessing the root account (there are practically no shell commands whatsoever, actually). I don't know a whole lot about UNIX permissions, so I don't know whether or not it is actually possible to do this without exploiting the system in some way or running a program owned by root itself (with +s or whatever). Any advice? Thanks! P.S. No, this isn't anything malicious, this is on an iPhone.

    Read the article

  • Using FFmpeg in .net?

    - by daniel
    So I know it's a fairly big challenge, but I want to write a basic movie player/converter in C# using the FFmpeg library. However, the first obstacle I need to overcome is wrapping the FFmpeg library in C#. I downloaded ffmpeg but couldn't compile it on Windows, so I downloaded a precompiled version. Ok, awesome. Then I started looking for C# wrappers. I have looked around and found a few, such as SharpFFmpeg (http://sourceforge.net/projects/sharpffmpeg/) and ffmpeg-sharp (http://code.google.com/p/ffmpeg-sharp/). First of all I wanted to use ffmpeg-sharp as it's LGPL while SharpFFmpeg is GPL. However, it had quite a few compile errors. It turns out it was written for the Mono compiler; I tried compiling it with Mono but couldn't figure out how. I then started to manually fix the compiler errors myself, but came across a few scary ones and thought I'd better leave those alone. So I gave up on ffmpeg-sharp. Then I looked at SharpFFmpeg and it looks like what I want, all the functions P/Invoked for me. However it's GPL? Both the AVCodec.cs and AVFormat.cs files look like ports of avcodec.c and avformat.c, which I reckon I could port myself and then not have to worry about licensing. But I want to get this right before I go ahead and start coding. Should I: 1) Write my own C++ library for interacting with FFmpeg, then have my C# program talk to the C++ library in order to play/convert videos etc., OR 2) Port avcodec.h and avformat.h (is that all I need?) to C# by using a whole lot of DllImports and write it entirely in C#? First of all, consider that I'm not great at C++ as I rarely use it, but I know enough to get around. The reason I'm thinking #1 might be the better option is that most FFmpeg tutorials are in C++ and I'd also have more control over memory management than if I did it in C#. What do you think? Also, would you happen to have any useful links (perhaps a tutorial) for using FFmpeg?
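
    For option 2, a minimal sketch of what a hand-rolled P/Invoke binding looks like. The function avcodec_version() does exist in libavcodec, but the exact DLL name below depends on the precompiled build you grabbed, so treat that part as an assumption:

      using System;
      using System.Runtime.InteropServices;

      static class FFmpegNative
      {
          // The DLL name is versioned per build (e.g. avcodec-52.dll); adjust to match your binaries.
          [DllImport("avcodec-52.dll", CallingConvention = CallingConvention.Cdecl)]
          public static extern uint avcodec_version();
      }

      class Program
      {
          static void Main()
          {
              // FFmpeg packs the version as major << 16 | minor << 8 | micro.
              uint v = FFmpegNative.avcodec_version();
              Console.WriteLine("libavcodec {0}.{1}.{2}", v >> 16, (v >> 8) & 0xFF, v & 0xFF);
          }
      }

    Wrapping all of avcodec.h and avformat.h this way is essentially what SharpFFmpeg already does, which gives a feel for how much hand work the pure-C# route involves.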

    Read the article

  • How to change Matlab program for solving equation with finite element method?

    - by DSblizzard
    I don't know is this question more related to mathematics or programming and I'm absolute newbie in Matlab. Program FEM_50 applies the finite element method to Laplace's equation -Uxx(x, y) - Uyy(x, y) = F(x, y) in Omega. How to change it to apply FEM to equation -Uxx(x, y) - Uyy(x, y) + U(x, y) = F(x, y)? At this page: http://sc.fsu.edu/~burkardt/m_src/fem_50/fem_50.html additional code files in case you need them. function fem_50 ( ) %% FEM_50 applies the finite element method to Laplace's equation. % % Discussion: % % FEM_50 is a set of MATLAB routines to apply the finite % element method to solving Laplace's equation in an arbitrary % region, using about 50 lines of MATLAB code. % % FEM_50 is partly a demonstration, to show how little it % takes to implement the finite element method (at least using % every possible MATLAB shortcut.) The user supplies datafiles % that specify the geometry of the region and its arrangement % into triangular and quadrilateral elements, and the location % and type of the boundary conditions, which can be any mixture % of Neumann and Dirichlet. % % The unknown state variable U(x,y) is assumed to satisfy % Laplace's equation: % -Uxx(x,y) - Uyy(x,y) = F(x,y) in Omega % with Dirichlet boundary conditions % U(x,y) = U_D(x,y) on Gamma_D % and Neumann boundary conditions on the outward normal derivative: % Un(x,y) = G(x,y) on Gamma_N % If Gamma designates the boundary of the region Omega, % then we presume that % Gamma = Gamma_D + Gamma_N % but the user is free to determine which boundary conditions to % apply. Note, however, that the problem will generally be singular % unless at least one Dirichlet boundary condition is specified. % % The code uses piecewise linear basis functions for triangular elements, % and piecewise isoparametric bilinear basis functions for quadrilateral % elements. % % The user is required to supply a number of data files and MATLAB % functions that specify the location of nodes, the grouping of nodes % into elements, the location and value of boundary conditions, and % the right hand side function in Laplace's equation. Note that the % fact that the geometry is completely up to the user means that % just about any two dimensional region can be handled, with arbitrary % shape, including holes and islands. % clear % % Read the nodal coordinate data file. % load coordinates.dat; % % Read the triangular element data file. % load elements3.dat; % % Read the quadrilateral element data file. % load elements4.dat; % % Read the Neumann boundary condition data file. % I THINK the purpose of the EVAL command is to create an empty NEUMANN array % if no Neumann file is found. % eval ( 'load neumann.dat;', 'neumann=[];' ); % % Read the Dirichlet boundary condition data file. % load dirichlet.dat; A = sparse ( size(coordinates,1), size(coordinates,1) ); b = sparse ( size(coordinates,1), 1 ); % % Assembly. % for j = 1 : size(elements3,1) A(elements3(j,:),elements3(j,:)) = A(elements3(j,:),elements3(j,:)) ... + stima3(coordinates(elements3(j,:),:)); end for j = 1 : size(elements4,1) A(elements4(j,:),elements4(j,:)) = A(elements4(j,:),elements4(j,:)) ... + stima4(coordinates(elements4(j,:),:)); end % % Volume Forces. % for j = 1 : size(elements3,1) b(elements3(j,:)) = b(elements3(j,:)) ... + det( [1,1,1; coordinates(elements3(j,:),:)'] ) * ... f(sum(coordinates(elements3(j,:),:))/3)/6; end for j = 1 : size(elements4,1) b(elements4(j,:)) = b(elements4(j,:)) ... + det([1,1,1; coordinates(elements4(j,1:3),:)'] ) * ... 
f(sum(coordinates(elements4(j,:),:))/4)/4; end % % Neumann conditions. % if ( ~isempty(neumann) ) for j = 1 : size(neumann,1) b(neumann(j,:)) = b(neumann(j,:)) + ... norm(coordinates(neumann(j,1),:) - coordinates(neumann(j,2),:)) * ... g(sum(coordinates(neumann(j,:),:))/2)/2; end end % % Determine which nodes are associated with Dirichlet conditions. % Assign the corresponding entries of U, and adjust the right hand side. % u = sparse ( size(coordinates,1), 1 ); BoundNodes = unique ( dirichlet ); u(BoundNodes) = u_d ( coordinates(BoundNodes,:) ); b = b - A * u; % % Compute the solution by solving A * U = B for the remaining unknown values of U. % FreeNodes = setdiff ( 1:size(coordinates,1), BoundNodes ); u(FreeNodes) = A(FreeNodes,FreeNodes) \ b(FreeNodes); % % Graphic representation. % show ( elements3, elements4, coordinates, full ( u ) ); return end

    Read the article

  • LINQ error when deployed - Security Exception - cannot create DataContext

    - by aximili
    The code below works locally, but when I deploy it to the server it gives the following error. Security Exception Description: The application attempted to perform an operation not allowed by the security policy. To grant this application the required permission please contact your system administrator or change the application's trust level in the configuration file. Exception Details: System.Security.SecurityException: Request for the permission of type 'System.Security.Permissions.FileIOPermission, mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' failed. The code is: protected void Page_Load(object sender, EventArgs e) { DataContext context = new DataContext(Global.ConnectionString); // <-- throws the exception //Table<Group> _kindergartensTable = context.GetTable<Group>(); Response.Write("ok"); } I have set full write permissions on all files and folders on the server. Any suggestions on how to solve this problem? Thanks in advance.
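
    This is the classic symptom of the host running the site at partial trust (LINQ to SQL's DataContext typically needs more than Medium trust on .NET 3.5 RTM). A quick diagnostic sketch to confirm the trust level — not a fix, and the page name TrustCheck is just an illustration; the real remedy is the host raising the application's trust level:

      using System;
      using System.Web;
      using System.Web.UI;

      public partial class TrustCheck : Page
      {
          protected void Page_Load(object sender, EventArgs e)
          {
              try
              {
                  // This Demand succeeds only when the application runs under Full trust.
                  new AspNetHostingPermission(AspNetHostingPermissionLevel.Unrestricted).Demand();
                  Response.Write("Running under Full trust");
              }
              catch (System.Security.SecurityException)
              {
                  Response.Write("Running under partial trust - likely why the DataContext constructor fails");
              }
          }
      }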

    Read the article

  • Ethernet Communication Error

    - by SivaKumar
    Hi, I wrote a program to query the status of the Ethernet printer for that i created a TCP Stream Socket and i send the query command to the printer.In case of Error less condition it returns No error status but in error case its getting hang at recv command.Even i used Non blocking now the recv command returns nothing and error set as Resource temporarily unavailable. code: #include <stdio.h> #include <sys/types.h> #include <sys/socket.h> #include <netinet/in.h> #include <netdb.h> #include <string.h> #include <unistd.h> #include <arpa/inet.h> #include <errno.h> #include <stdlib.h> #include <fcntl.h> #include <sys/ioctl.h> #include <sys/socket.h> #include <signal.h> #include <termios.h> #include <poll.h> #include <netinet/tcp.h> #include <stdarg.h> int main() { int ConnectSocket,ConnectSocket1,select_err,err,nRet,nBytesRead; struct timeval waitTime = {10,30}; fd_set socket_set; unsigned char * dataBuf = NULL; unsigned char tempVar, tempVar1, tempVar2, tempVar3; char reset[] = "\033E 2\r"; char print[] = "\033A 1\r"; char buf[1024]={0}; ConnectSocket = socket(AF_INET, SOCK_STREAM, IPPROTO_TCP); printf("The Socket ID is %d\n",ConnectSocket); if (ConnectSocket < 0) { perror("socket()"); return 0; } struct sockaddr_in clientService; clientService.sin_family = AF_INET; clientService.sin_addr.s_addr = inet_addr("192.168.0.129"); //Printer IP clientService.sin_port = htons( 9100); // Printer Port if ( connect( ConnectSocket, (struct sockaddr*) &clientService, sizeof(clientService) ) == -1) { perror("connect()"); close(ConnectSocket); return -1; } /* if((nRet = ioctl(ConnectSocket , FIONREAD, &nBytesRead) == -1)) { perror("ioctl()"); } perror("ioctl()"); */ FD_ZERO(&socket_set); FD_SET(ConnectSocket, &socket_set); do { errno=0; select_err = select(ConnectSocket+1, NULL, &socket_set, NULL, &waitTime); }while(errno==EINPROGRESS); if (-1 == select_err || 0 == select_err) { int optVal = 0; int optLen = sizeof(optVal); if(select_err == -1) { perror("select() write-side"); } else { //Timeout errno=0; err = getsockopt(ConnectSocket, SOL_SOCKET, SO_ERROR, (char*)&optVal, &optLen); printf("the return of the getsockopt is %d\n",err); printf("the opt val is %s\n",(char*)optVal); perror("getsockopt()"); if(err == -1) { perror("getsockopt() write-side"); } } printf("Select Failed during write - ConnectSocket: %d\n", ConnectSocket); //close(ConnectSocket); return -1; } err = send(ConnectSocket,print,sizeof(print)-1, 0); printf("\n No of Bytes Send is %d\n",err); if(err == -1 || err ==0) { perror("send()"); //close(ConnectSocket); return -1; } FD_ZERO(&socket_set); FD_SET(ConnectSocket, &socket_set); do { errno=0; select_err = select(ConnectSocket+1, NULL, &socket_set, NULL, &waitTime); }while(errno==EINPROGRESS); if (-1 == select_err || 0 == select_err) { printf("Select Failed during write - ConnectSocket: %d\n", ConnectSocket); return -1; } err = send(ConnectSocket,reset,sizeof(reset)-1, 0); printf("\n No of Bytes Send is %d\n",err); if(err == -1 || err ==0) { perror("send()"); //close(ConnectSocket); return -1; } FD_ZERO(&socket_set); FD_SET(ConnectSocket, &socket_set); printf("i am in reading \n"); select_err = select(ConnectSocket+1, &socket_set, NULL, NULL, &waitTime); printf("the retun of the read side select is %d \n",select_err); perror("select()"); if (-1 == select_err|| 0 == select_err) { printf("Read timeout; ConnectSocket: %d\n", ConnectSocket); close(ConnectSocket); perror("close()"); return -1; } printf("Before Recv\n"); nBytesRead = recv(ConnectSocket , buf, 1024, 0); printf("No of Bytes read 
is %d\n",nBytesRead); printf("%s\n",buf); if(nBytesRead == -1) { perror("recv()"); close(ConnectSocket); perror("clode()"); return -1; } close(ConnectSocket); return 1; }

    Read the article

  • Retrieve POST data from FLASH to ASP.Net

    - by Martin Ongtangco
    here's my AS3 code: var jpgEncoder:JPGEncoder = new JPGEncoder(100); var jpgStream:ByteArray = jpgEncoder.encode(bitmapData); var header:URLRequestHeader = new URLRequestHeader("Content-type", "application/octet-stream"); var jpgURLRequest:URLRequest = new URLRequest("/patients/webcam.aspx"); jpgURLRequest.requestHeaders.push(header); jpgURLRequest.method = URLRequestMethod.POST; jpgURLRequest.data = jpgStream; navigateToURL(jpgURLRequest, "_self"); And here's my ASP.Net code: try { string pt = Path.Combine(PathFolder, "test.jpg"); HttpFileCollection fileCol = Request.Files; Response.Write(fileCol.Count.ToString()); foreach (HttpPostedFile hpf in fileCol) { hpf.SaveAs(pt); } } catch (Exception ex) { Response.Write(ex.Message); } I'm getting a weird error; HttpFox mentioned: "NS_ERROR_NET_RESET" Any help would be excellent! Thanks!
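
    Because the Flash side posts the JPEG bytes as a raw application/octet-stream body rather than a multipart form, Request.Files is normally empty on the server. A sketch of reading the raw request body instead — the class name WebcamUpload and the PathFolder value are assumptions for illustration; the question's actual PathFolder would be used in its place:

      using System;
      using System.IO;
      using System.Web.UI;

      public partial class WebcamUpload : Page
      {
          // Stand-in for the PathFolder field used in the question.
          private static readonly string PathFolder = @"C:\uploads";

          protected void Page_Load(object sender, EventArgs e)
          {
              string target = Path.Combine(PathFolder, "test.jpg");
              byte[] buffer = new byte[8192];
              using (FileStream file = File.Create(target))
              {
                  int read;
                  // Request.InputStream holds the raw octet-stream body: the JPEG bytes from JPGEncoder.
                  while ((read = Request.InputStream.Read(buffer, 0, buffer.Length)) > 0)
                  {
                      file.Write(buffer, 0, read);
                  }
              }
              Response.Write("saved");
          }
      }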

    Read the article

  • Is Updating double operation atomic

    - by Yan Cheng CHEOK
    In Java, updating a double or long variable may not be atomic, as double/long are treated as two separate 32-bit variables. http://java.sun.com/docs/books/jls/second_edition/html/memory.doc.html#28733 In C++, if I am using a 32-bit Intel processor + the Microsoft Visual C++ compiler, is updating a double (8 bytes) an atomic operation? I cannot find much mention of this behavior in the specification. When I say "atomic variable", here is what I mean: Thread A tries to write 1 to variable x. Thread B tries to write 2 to variable x. We shall get the value 1 or 2 out of variable x, but never an undefined value.

    Read the article

  • Sending binary data with Indy through TCP\IP, how?

    - by Wodzu
    Hello. How do I send binary data with Indy components? Which of them is most suitable for this task? I've tried to use TIdTcpClient, but it only seems to let me send strings. I've found one response to that problem here, but I don't get it. It mentions the method Write(TIdBytes), but the answer is not clear to me. Does that mean writing to some instance of TIdBytes, and how do I connect that instance with TIdTcpClient? Thanks for any help.

    Read the article

  • What type of git server do you use? or how do you use git?

    - by Johan
    Hi. Let's say we have a small team, 1-5 persons. What type of "git setup" would you use? Would you use gitweb and apache so you could run over http? Or would you use user accounts and ssh in some way? Today I'm used to running Subversion through apache (http), but I'm not sure it is right to set up git the same way... Thanks, Johan. Update: It feels like if we combine the answer that Dietrich Epp gave and the one hallidave gave, I could get a quick and good solution: a common dir in the server's filesystem where all can write, and that dir is also exposed with apache. That way everybody can always get the latest, but only trusted people can write to it...

    Read the article

  • Insert a Script into a iFrame's Header, without clearing out the body of the iFrame

    - by nobosh
    Hello, I'm looking to add a script to an iFrame's header without losing everything contained in the iFrame's body or header... Here is what I have right now. It does update the iFrame with the new script, but it clears everything in the iframe out rather than appending, which is what I'd like. Thanks! B // Find the iFrame var iframe = document.getElementById('hi-world'); // create a string to use as a new document object var val = '<scr' + 'ipt type="text/javascript" src="http://jqueryjs.googlecode.com/files/jquery-1.3.2.min.js"></scr' + 'ipt>'; // get a handle on the <iframe>d document (in a cross-browser way) var doc = iframe.contentWindow || iframe.contentDocument; if (doc.document) { doc = doc.document;} // open, write content to, and close the document doc.open(); doc.write(val); doc.close();

    Read the article

  • Problem with bootstrap loader and kernel

    - by dboarman-FissureStudios
    We are working on a project to learn how to write a kernel and learn the ins and outs. We have a bootstrap loader written and it appears to work. However we are having a problem with the kernel loading. I'll start with the first part: bootloader.asm: [BITS 16] [ORG 0x0000] ; ; all the stuff in between ; ; the bottom of the bootstrap loader datasector dw 0x0000 cluster dw 0x0000 ImageName db "KERNEL SYS" msgLoading db 0x0D, 0x0A, "Loading Kernel Shell", 0x0D, 0x0A, 0x00 msgCRLF db 0x0D, 0x0A, 0x00 msgProgress db ".", 0x00 msgFailure db 0x0D, 0x0A, "ERROR : Press key to reboot", 0x00 TIMES 510-($-$$) DB 0 DW 0xAA55 ;************************************************************************* The bootloader.asm is too long for the editor without causing it to chug and choke. In addition, the bootloader and kernel do work within bochs as we do get the message "Welcome to our OS". Anyway, the following is what we have for a kernel at this point. kernel.asm: [BITS 16] [ORG 0x0000] [SEGMENT .text] ; code segment mov ax, 0x0100 ; location where kernel is loaded mov ds, ax mov es, ax cli mov ss, ax ; stack segment mov sp, 0xFFFF ; stack pointer at 64k limit sti mov si, strWelcomeMsg ; load message call _disp_str mov ah, 0x00 int 0x16 ; interrupt: await keypress int 0x19 ; interrupt: reboot _disp_str: lodsb ; load next character or al, al ; test for NUL character jz .DONE mov ah, 0x0E ; BIOS teletype mov bh, 0x00 ; display page 0 mov bl, 0x07 ; text attribute int 0x10 ; interrupt: invoke BIOS jmp _disp_str .DONE: ret [SEGMENT .data] ; initialized data segment strWelcomeMsg db "Welcome to our OS", 0x00 [SEGMENT .bss] ; uninitialized data segment Using nasm 2.06rc2 I compile as such: nasm bootloader.asm -o bootloader.bin -f bin nasm kernel.asm -o kernel.sys -f bin We write bootloader.bin to the floppy as such: dd if=bootloader.bin bs=512 count=1 of/dev/fd0 We write kernel.sys to the floppy as such: cp kernel.sys /dev/fd0 As I stated, this works in bochs. But booting from the floppy we get output like so: Loading Kernel Shell ........... ERROR : Press key to reboot Other specifics: OpenSUSE 11.2, GNOME desktop, AMD x64 Any other information I may have missed, feel free to ask. I tried to get everything in here that would be needed. If I need to, I can find a way to get the entire bootloader.asm posted somewhere. We are not really interested in using GRUB either for several reasons. This could change, but we want to see this boot successful before we really consider GRUB.

    Read the article

  • Open source and salary

    - by darko petreski
    Hi, we come across a lot of open source software, but someone needs to write that software. How are they paid? Do you know a good article about the politics and economics of open source? Sometimes the big companies themselves release open source because it benefits them; then they sell support, advice, and so on. My question is: what is the real economy behind open software? No professional will work for nothing. This software isn't a couple of classes but thousands or maybe millions of them. If you are really a pro you will write software for money, because you have a life, a wife, kids, taxes; you must earn. Please do not tell me that they are doing this for pleasure or as a hobby. Regards, Darko Peterski

    Read the article

  • XStream <-> Alternative binary formats (e.g. protocol buffers)

    - by sehugg
    We currently use XStream for encoding our web service inputs/outputs in XML. However we are considering switching to a binary format with code generator for multiple languages (protobuf, Thrift, Hessian, etc) to make supporting new clients easier and less reliant on hand-coding (also to better support our message formats which include binary data). However most of our objects on the server are POJOs with XStream handling the serialization via reflection and annotations, and most of these libraries assume they will be generating the POJOs themselves. I can think of a few ways to interface an alternative library: Write an XStream marshaler for the target format. Write custom code to marshal the POJOs to/from the classes generated by the alternative library. Subclass the generated classes to implement the POJO logic. May require some rewriting. (Also did I mention we want to use Terracotta?) Use another library that supports both reflection (like XStream) and code generation. However I'm not sure which serialization library would be best suited to the above techniques.

    Read the article

  • Writing Android apps in C#

    - by JamesBrownIsDead
    I'm a C# programmer and want to write an Android app. I'm a stubborn curmudgeon and refuse to write Java ever again (after switching to C# six years ago). Besides Mono and MonoDroid (and writing Java), are there any options for me? Or should I just feel foolish for refusing to return to my Java roots? (Please refrain from Java vs. C# discussion. I was being rhetorical when I asked about returning to my Java roots.)

    Read the article

  • Bacula windows client could not connect to Bacula director

    - by pr0f-r00t
    I have a Bacula server on my Linux Debian squeeze host (Bacula version 5.0.2) and a Bacula client on Windows XP SP3. On my network each client can see each other, can share files and can ping. On my local server I could run bconsole and the server responds but when I run bconsole or bat on my windows client the server does not respond. Here are my configuration files: bacula-dir.conf: # # Default Bacula Director Configuration file # # The only thing that MUST be changed is to add one or more # file or directory names in the Include directive of the # FileSet resource. # # For Bacula release 5.0.2 (28 April 2010) -- debian squeeze/sid # # You might also want to change the default email address # from root to your address. See the "mail" and "operator" # directives in the Messages resource. # Director { # define myself Name = nima-desktop-dir DIRport = 9101 # where we listen for UA connections QueryFile = "/etc/bacula/scripts/query.sql" WorkingDirectory = "/var/lib/bacula" PidDirectory = "/var/run/bacula" Maximum Concurrent Jobs = 1 Password = "Cv70F6pf1t6pBopT4vQOnigDrR0v3L" # Console password Messages = Daemon DirAddress = 127.0.0.1 # DirAddress = 72.16.208.1 } JobDefs { Name = "DefaultJob" Type = Backup Level = Incremental Client = nima-desktop-fd FileSet = "Full Set" Schedule = "WeeklyCycle" Storage = File Messages = Standard Pool = File Priority = 10 Write Bootstrap = "/var/lib/bacula/%c.bsr" } # # Define the main nightly save backup job # By default, this job will back up to disk in /nonexistant/path/to/file/archive/dir Job { Name = "BackupClient1" JobDefs = "DefaultJob" } #Job { # Name = "BackupClient2" # Client = nima-desktop2-fd # JobDefs = "DefaultJob" #} # Backup the catalog database (after the nightly save) Job { Name = "BackupCatalog" JobDefs = "DefaultJob" Level = Full FileSet="Catalog" Schedule = "WeeklyCycleAfterBackup" # This creates an ASCII copy of the catalog # Arguments to make_catalog_backup.pl are: # make_catalog_backup.pl <catalog-name> RunBeforeJob = "/etc/bacula/scripts/make_catalog_backup.pl MyCatalog" # This deletes the copy of the catalog RunAfterJob = "/etc/bacula/scripts/delete_catalog_backup" Write Bootstrap = "/var/lib/bacula/%n.bsr" Priority = 11 # run after main backup } # # Standard Restore template, to be changed by Console program # Only one such job is needed for all Jobs/Clients/Storage ... # Job { Name = "RestoreFiles" Type = Restore Client=nima-desktop-fd FileSet="Full Set" Storage = File Pool = Default Messages = Standard Where = /nonexistant/path/to/file/archive/dir/bacula-restores } # job for vmware windows host Job { Name = "nimaxp-fd" Type = Backup Client = nimaxp-fd FileSet = "nimaxp-fs" Schedule = "WeeklyCycle" Storage = File Messages = Standard Pool = Default Write Bootstrap = "/var/bacula/working/rsys-win-www-1-fd.bsr" #Change this } # job for vmware windows host Job { Name = "arg-michael-fd" Type = Backup Client = nimaxp-fd FileSet = "arg-michael-fs" Schedule = "WeeklyCycle" Storage = File Messages = Standard Pool = Default Write Bootstrap = "/var/bacula/working/rsys-win-www-1-fd.bsr" #Change this } # List of files to be backed up FileSet { Name = "Full Set" Include { Options { signature = MD5 } # # Put your list of files here, preceded by 'File =', one per line # or include an external list with: # # File = <file-name # # Note: / backs up everything on the root partition. # if you have other partitions such as /usr or /home # you will probably want to add them too. 
# # By default this is defined to point to the Bacula binary # directory to give a reasonable FileSet to backup to # disk storage during initial testing. # File = /usr/sbin } # # If you backup the root directory, the following two excluded # files can be useful # Exclude { File = /var/lib/bacula File = /nonexistant/path/to/file/archive/dir File = /proc File = /tmp File = /.journal File = /.fsck } } # List of files to be backed up FileSet { Name = "nimaxp-fs" Enable VSS = yes Include { Options { signature = MD5 } File = "C:\softwares" File = C:/softwares File = "C:/softwares" } } # List of files to be backed up FileSet { Name = "arg-michael-fs" Enable VSS = yes Include { Options { signature = MD5 } File = "C:\softwares" File = C:/softwares File = "C:/softwares" } } # # When to do the backups, full backup on first sunday of the month, # differential (i.e. incremental since full) every other sunday, # and incremental backups other days Schedule { Name = "WeeklyCycle" Run = Full 1st sun at 23:05 Run = Differential 2nd-5th sun at 23:05 Run = Incremental mon-sat at 23:05 } # This schedule does the catalog. It starts after the WeeklyCycle Schedule { Name = "WeeklyCycleAfterBackup" Run = Full sun-sat at 23:10 } # This is the backup of the catalog FileSet { Name = "Catalog" Include { Options { signature = MD5 } File = "/var/lib/bacula/bacula.sql" } } # Client (File Services) to backup Client { Name = nima-desktop-fd Address = localhost FDPort = 9102 Catalog = MyCatalog Password = "_MOfxEuRzxijc0DIMcBqtyx9iW1tzE7V6" # password for FileDaemon File Retention = 30 days # 30 days Job Retention = 6 months # six months AutoPrune = yes # Prune expired Jobs/Files } # Client file service for vmware windows host Client { Name = nimaxp-fd Address = nimaxp FDPort = 9102 Catalog = MyCatalog Password = "Ku8F1YAhDz5EMUQjiC9CcSw95Aho9XbXailUmjOaAXJP" # password for FileDaemon File Retention = 30 days # 30 days Job Retention = 6 months # six months AutoPrune = yes # Prune expired Jobs/Files } # Client file service for vmware windows host Client { Name = arg-michael-fd Address = 192.168.0.61 FDPort = 9102 Catalog = MyCatalog Password = "b4E9FU6s/9Zm4BVFFnbXVKhlyd/zWxj0oWITKK6CALR/" # password for FileDaemon File Retention = 30 days # 30 days Job Retention = 6 months # six months AutoPrune = yes # Prune expired Jobs/Files } # # Second Client (File Services) to backup # You should change Name, Address, and Password before using # #Client { # Name = nima-desktop2-fd # Address = localhost2 # FDPort = 9102 # Catalog = MyCatalog # Password = "_MOfxEuRzxijc0DIMcBqtyx9iW1tzE7V62" # password for FileDaemon 2 # File Retention = 30 days # 30 days # Job Retention = 6 months # six months # AutoPrune = yes # Prune expired Jobs/Files #} # Definition of file storage device Storage { Name = File # Do not use "localhost" here Address = localhost # N.B. Use a fully qualified name here SDPort = 9103 Password = "Cj-gtxugC4dAymY01VTSlUgMTT5LFMHf9" Device = FileStorage Media Type = File } # Definition of DDS tape storage device #Storage { # Name = DDS-4 # Do not use "localhost" here # Address = localhost # N.B. 
Use a fully qualified name here # SDPort = 9103 # Password = "Cj-gtxugC4dAymY01VTSlUgMTT5LFMHf9" # password for Storage daemon # Device = DDS-4 # must be same as Device in Storage daemon # Media Type = DDS-4 # must be same as MediaType in Storage daemon # Autochanger = yes # enable for autochanger device #} # Definition of 8mm tape storage device #Storage { # Name = "8mmDrive" # Do not use "localhost" here # Address = localhost # N.B. Use a fully qualified name here # SDPort = 9103 # Password = "Cj-gtxugC4dAymY01VTSlUgMTT5LFMHf9" # Device = "Exabyte 8mm" # MediaType = "8mm" #} # Definition of DVD storage device #Storage { # Name = "DVD" # Do not use "localhost" here # Address = localhost # N.B. Use a fully qualified name here # SDPort = 9103 # Password = "Cj-gtxugC4dAymY01VTSlUgMTT5LFMHf9" # Device = "DVD Writer" # MediaType = "DVD" #} # Generic catalog service Catalog { Name = MyCatalog # Uncomment the following line if you want the dbi driver # dbdriver = "dbi:sqlite3"; dbaddress = 127.0.0.1; dbport = dbname = "bacula"; dbuser = ""; dbpassword = "" } # Reasonable message delivery -- send most everything to email address # and to the console Messages { Name = Standard # # NOTE! If you send to two email or more email addresses, you will need # to replace the %r in the from field (-f part) with a single valid # email address in both the mailcommand and the operatorcommand. # What this does is, it sets the email address that emails would display # in the FROM field, which is by default the same email as they're being # sent to. However, if you send email to more than one address, then # you'll have to set the FROM address manually, to a single address. # for example, a '[email protected]', is better since that tends to # tell (most) people that its coming from an automated source. # mailcommand = "/usr/lib/bacula/bsmtp -h localhost -f \"\(Bacula\) \<%r\>\" -s \"Bacula: %t %e of %c %l\" %r" operatorcommand = "/usr/lib/bacula/bsmtp -h localhost -f \"\(Bacula\) \<%r\>\" -s \"Bacula: Intervention needed for %j\" %r" mail = root@localhost = all, !skipped operator = root@localhost = mount console = all, !skipped, !saved # # WARNING! the following will create a file that you must cycle from # time to time as it will grow indefinitely. However, it will # also keep all your messages if they scroll off the console. # append = "/var/lib/bacula/log" = all, !skipped catalog = all } # # Message delivery for daemon messages (no job). 
Messages { Name = Daemon mailcommand = "/usr/lib/bacula/bsmtp -h localhost -f \"\(Bacula\) \<%r\>\" -s \"Bacula daemon message\" %r" mail = root@localhost = all, !skipped console = all, !skipped, !saved append = "/var/lib/bacula/log" = all, !skipped } # Default pool definition Pool { Name = Default Pool Type = Backup Recycle = yes # Bacula can automatically recycle Volumes AutoPrune = yes # Prune expired volumes Volume Retention = 365 days # one year } # File Pool definition Pool { Name = File Pool Type = Backup Recycle = yes # Bacula can automatically recycle Volumes AutoPrune = yes # Prune expired volumes Volume Retention = 365 days # one year Maximum Volume Bytes = 50G # Limit Volume size to something reasonable Maximum Volumes = 100 # Limit number of Volumes in Pool } # Scratch pool definition Pool { Name = Scratch Pool Type = Backup } # # Restricted console used by tray-monitor to get the status of the director # Console { Name = nima-desktop-mon Password = "-T0h6HCXWYNy0wWqOomysMvRGflQ_TA6c" CommandACL = status, .status } bacula-fd.conf on client: # # Default Bacula File Daemon Configuration file # # For Bacula release 5.0.3 (08/05/10) -- Windows MinGW32 # # There is not much to change here except perhaps the # File daemon Name # # # "Global" File daemon configuration specifications # FileDaemon { # this is me Name = nimaxp-fd FDport = 9102 # where we listen for the director WorkingDirectory = "C:\\Program Files\\Bacula\\working" Pid Directory = "C:\\Program Files\\Bacula\\working" # Plugin Directory = "C:\\Program Files\\Bacula\\plugins" Maximum Concurrent Jobs = 10 } # # List Directors who are permitted to contact this File daemon # Director { Name = Nima-desktop-dir Password = "Cv70F6pf1t6pBopT4vQOnigDrR0v3L" } # # Restricted Director, used by tray-monitor to get the # status of the file daemon # Director { Name = nimaxp-mon Password = "q5b5g+LkzDXorMViFwOn1/TUnjUyDlg+gRTBp236GrU3" Monitor = yes } # Send all messages except skipped files back to Director Messages { Name = Standard director = Nima-desktop = all, !skipped, !restored } I have checked my firewall and disabled the firewall but it doesn't work.

    Read the article

  • SQL SERVER – Faster SQL Server Databases and Applications – Power and Control with SafePeak Caching Options

    - by Pinal Dave
    Update: This blog post is written based on the SafePeak, which is available for free download. Today, I’d like to examine more closely one of my preferred technologies for accelerating SQL Server databases, SafePeak. Safepeak’s software provides a variety of advanced data caching options, techniques and tools to accelerate the performance and scalability of SQL Server databases and applications. I’d like to look more closely at some of these options, as some of these capabilities could help you address lagging database and performance on your systems. To better understand the available options, it is best to start by understanding the difference between the usual “Basic Caching” vs. SafePeak’s “Dynamic Caching”. Basic Caching Basic Caching (or the stale and static cache) is an ability to put the results from a query into cache for a certain period of time. It is based on TTL, or Time-to-live, and is designed to stay in cache no matter what happens to the data. For example, although the actual data can be modified due to DML commands (update/insert/delete), the cache will still hold the same obsolete query data. Meaning that with the Basic Caching is really static / stale cache.  As you can tell, this approach has its limitations. Dynamic Caching Dynamic Caching (or the non-stale cache) is an ability to put the results from a query into cache while maintaining the cache transaction awareness looking for possible data modifications. The modifications can come as a result of: DML commands (update/insert/delete), indirect modifications due to triggers on other tables, executions of stored procedures with internal DML commands complex cases of stored procedures with multiple levels of internal stored procedures logic. When data modification commands arrive, the caching system identifies the related cache items and evicts them from cache immediately. In the dynamic caching option the TTL setting still exists, although its importance is reduced, since the main factor for cache invalidation (or cache eviction) become the actual data updates commands. Now that we have a basic understanding of the differences between “basic” and “dynamic” caching, let’s dive in deeper. SafePeak: A comprehensive and versatile caching platform SafePeak comes with a wide range of caching options. Some of SafePeak’s caching options are automated, while others require manual configuration. Together they provide a complete solution for IT and Data managers to reach excellent performance acceleration and application scalability for  a wide range of business cases and applications. Automated caching of SQL Queries: Fully/semi-automated caching of all “read” SQL queries, containing any types of data, including Blobs, XMLs, Texts as well as all other standard data types. SafePeak automatically analyzes the incoming queries, categorizes them into SQL Patterns, identifying directly and indirectly accessed tables, views, functions and stored procedures; Automated caching of Stored Procedures: Fully or semi-automated caching of all read” stored procedures, including procedures with complex sub-procedure logic as well as procedures with complex dynamic SQL code. 
All procedures are analyzed in advance by SafePeak’s  Metadata-Learning process, their SQL schemas are parsed – resulting with a full understanding of the underlying code, objects dependencies (tables, views, functions, sub-procedures) enabling automated or semi-automated (manually review and activate by a mouse-click) cache activation, with full understanding of the transaction logic for cache real-time invalidation; Transaction aware cache: Automated cache awareness for SQL transactions (SQL and in-procs); Dynamic SQL Caching: Procedures with dynamic SQL are pre-parsed, enabling easy cache configuration, eliminating SQL Server load for parsing time and delivering high response time value even in most complicated use-cases; Fully Automated Caching: SQL Patterns (including SQL queries and stored procedures) that are categorized by SafePeak as “read and deterministic” are automatically activated for caching; Semi-Automated Caching: SQL Patterns categorized as “Read and Non deterministic” are patterns of SQL queries and stored procedures that contain reference to non-deterministic functions, like getdate(). Such SQL Patterns are reviewed by the SafePeak administrator and in usually most of them are activated manually for caching (point and click activation); Fully Dynamic Caching: Automated detection of all dependent tables in each SQL Pattern, with automated real-time eviction of the relevant cache items in the event of “write” commands (a DML or a stored procedure) to one of relevant tables. A default setting; Semi Dynamic Caching: A manual cache configuration option enabling reducing the sensitivity of specific SQL Patterns to “write” commands to certain tables/views. An optimization technique relevant for cases when the query data is either known to be static (like archive order details), or when the application sensitivity to fresh data is not critical and can be stale for short period of time (gaining better performance and reduced load); Scheduled Cache Eviction: A manual cache configuration option enabling scheduling SQL Pattern cache eviction based on certain time(s) during a day. A very useful optimization technique when (for example) certain SQL Patterns can be cached but are time sensitive. Example: “select customers that today is their birthday”, an SQL with getdate() function, which can and should be cached, but the data stays relevant only until 00:00 (midnight); Parsing Exceptions Management: Stored procedures that were not fully parsed by SafePeak (due to too complex dynamic SQL or unfamiliar syntax), are signed as “Dynamic Objects” with highest transaction safety settings (such as: Full global cache eviction, DDL Check = lock cache and check for schema changes, and more). The SafePeak solution points the user to the Dynamic Objects that are important for cache effectiveness, provides easy configuration interface, allowing you to improve cache hits and reduce cache global evictions. Usually this is the first configuration in a deployment; Overriding Settings of Stored Procedures: Override the settings of stored procedures (or other object types) for cache optimization. For example, in case a stored procedure SP1 has an “insert” into table T1, it will not be allowed to be cached. However, it is possible that T1 is just a “logging or instrumentation” table left by developers. 
By overriding the settings a user can allow caching of the problematic stored procedure; Advanced Cache Warm-Up: Creating an XML-based list of queries and stored procedure (with lists of parameters) for periodically automated pre-fetching and caching. An advanced tool allowing you to handle more rare but very performance sensitive queries pre-fetch them into cache allowing high performance for users’ data access; Configuration Driven by Deep SQL Analytics: All SQL queries are continuously logged and analyzed, providing users with deep SQL Analytics and Performance Monitoring. Reduce troubleshooting from days to minutes with database objects and SQL Patterns heat-map. The performance driven configuration helps you to focus on the most important settings that bring you the highest performance gains. Use of SafePeak SQL Analytics allows continuous performance monitoring and analysis, easy identification of bottlenecks of both real-time and historical data; Cloud Ready: Available for instant deployment on Amazon Web Services (AWS). As you can see, there are many options to configure SafePeak’s SQL Server database and application acceleration caching technology to best fit a lot of situations. If you’re not familiar with their technology, they offer free-trial software you can download that comes with a free “help session” to help get you started. You can access the free trial here. Also, SafePeak is available to use on Amazon Cloud. Reference: Pinal Dave (http://blog.sqlauthority.com)Filed under: PostADay, SQL, SQL Authority, SQL Performance, SQL Query, SQL Server, SQL Tips and Tricks, T SQL

    Read the article

  • Error connecting to online fossil repository after changing password.

    - by Toby Allen
    I set up a fossil repository on a shared hosting account I have. I created a perl script fossil.pl that points to a cloned repository that I put up on the webspace. I set all the correct permissions (755). When I go to fossil.pl I get the web ui. Everything's cool. However, I'm having a problem with pushes and am hoping someone could point me to a solution. When I clone a repository it sets a new password for me (Toby) in the new cloned repository. If I push to this repository online without changing the password it works fine; I can push up changes from my local machine to the online repository. However, once I change the password for Toby (to something more easily remembered by me) I get the following error. Bytes Cards Artifacts Deltas Send: 1810 9 0 2 1Server Error: not authorized to write fossil: server says: not authorized to write Anyone know why this is happening? Anyone know how to fix it?

    Read the article

  • Large file upload into WSS v3

    - by Rubens Farias
    I've built a WSSv3 application which uploads files in small chunks; when each data piece arrives, I temporarily keep it in a SQL 2005 image data type field for performance reasons**. The problem comes when the upload ends; I need to move the data from my SQL Server to a SharePoint Document Library through the WSSv3 object model. Right now, I can think of two approaches: SPFileCollection.Add(string, (byte[])reader[0]); // OutOfMemoryException and SPFile file = folder.Files.Add("filename", new byte[]{ }); using(Stream stream = file.OpenBinaryStream()) { // ... init vars and stuff ... while ((bytes = reader.GetBytes(0, offset, buffer, 0, BUFFER_SIZE)) > 0) { stream.Write(buffer, 0, (int)bytes); // Timeout issues } file.SaveBinary(stream); } Is there any other way to complete this task successfully? ** Performance reasons: if you try to write every chunk directly to SharePoint, you'll notice performance degradation as the file grows (100Mb).

    Read the article

  • Calling Save on a Configuration obj with a custom sectiongroup of type NameValueSectionHandler

    - by Allen
    I've got aConfiguration object that I can easily read and write settings from and call Save(ConfigurationSaveMode.Minimal, true) on and that typically works just fine. However, I now have a config file that has a custom sectionGroup defined of type System.Configuration.NameValueSectionHandler (I didn't write that part and I can't change it). Now when I call the Save method, it throws a TypeLoadException Message: Type System.Configuration.NameValueSectionHandler does not inherit from System.Configuration.ConfigurationSectionGroup. Callstack: at System.Configuration.TypeUtil.VerifyAssignableType (Type baseType, Type type, Boolean throwOnError) at System.Configuration.TypeUtil.GetConstructorWithReflectionPermission (Type type, Type baseType, Boolean throwOnError) at System.Configuration.MgmtConfigurationRecord.CreateSectionGroupFactory (FactoryRecord factoryRecord) at System.Configuration.MgmtConfigurationRecord.EnsureSectionGroupFactory (FactoryRecord factoryRecord) at System.Configuration.MgmtConfigurationRecord.GetSectionGroup (String configKey) at System.Configuration.ConfigurationSectionGroupCollection.Get (String name) at System.Configuration.ConfigurationSectionGroupCollection. <GetEnumerator>d__0.MoveNext() at System.Configuration.Configuration.ForceGroupsRecursive (ConfigurationSectionGroup group) at System.Configuration.Configuration.SaveAsImpl (String filename, ConfigurationSaveMode saveMode, Boolean forceSaveAll) at System.Configuration.Configuration.Save (ConfigurationSaveMode saveMode, Boolean forceSaveAll) at MyCompany.MyProduct.blah.blah.Save () in C:\\projects\\blah\\blah\\blah.cs:line blah For reference, the configuration goes like so <configuration> <configSections> ... normal sections here that work just fine, followed by ... <sectionGroup name="threadSettings" type="System.Configuration.NameValueSectionHandler"> <section name="T1" type="System.Configuration.DictionarySectionHandler"/> <section name="T2" type="System.Configuration.DictionarySectionHandler"/> <section name="T3" type="System.Configuration.DictionarySectionHandler"/> </sectionGroup> </configSections> <applicationSettings /> <threadSettings> <T1> <add key="key-here" value="value-here"/> </T1> ... T2 and T3 here as well, thats not interesting ... </threadSettings> </configuration> If I remove the actual sections and the definition for the custom section groups, I am able to read / write settings just fine and even call Save just fine. Obviously the custom section groups could be setup a little nicer but I can't fix that so I'm wondering: Is it possible to get the Configuration object to save if it has custom sectionGroups of the NameValueSectionHandler type

    Read the article

  • How to pass a very long string/file into RESTWebservice JAX-RS Jersey

    - by Sashikiran Challa
    Hello all, I am trying to write a webservice that takes in an XML string, parses it using DOM, and extracts the particular things I want. My XML string happens to be very long, so I do not want to pass it as a @QueryParam or @PathParam. Say I write that XML string into a file; how do I go about writing a RESTful service that takes in this file, extracts whatever I want and returns the results? I am actually trying to extract a number of strings, so my output should probably be an ArrayList holding all these strings. Could somebody please shed some light on how I should go about doing this? Thanks in advance.

    Read the article

  • Problem to display a pdf from my JSF Portlet of Liferay

    - by Stefano
    I use liferay 5.2 with jsf-portlet. From the page I want to press a button to generate one PDF. In managedbean i build pdf and I want to show it in response. In a ByteArrayOutputStream named outputStream i have my pdf built with JasperReport. I write: PortletResponse portletResponse = (PortletResponse)externalCtx.getResponse(); HttpServletResponse httpResponse = PortalUtil.getHttpServletResponse(portletResponse); ServletOutputStream out = httpResponse.getOutputStream(); String filename="Pdf" + System.currentTimeMillis()+".pdf"; httpResponse.reset(); httpResponse.setContentType("application/pdf"); httpResponse.setHeader("Content-Disposition", "attachment; filename=\""+ filename + "\""); httpResponse.setContentLength(outputStream.size()); outputStream.writeTo(out); out.flush(); out.close(); I do not see anything output! In jboss log i read: IllegaStateException.... What is wrong? LOG 11:03:19,716 INFO [STDOUT] 11:03:19,716 ERROR [IncludeTag] Current URL /web/organo-di-governo/datawarehouse?p_p_id=1_WAR_Portlet_Datawarehouse_INSTANCE_D7s7&p_p_lifecycle=1&p_p_state=normal&p_p_mode=view&p_p_col_id=column-2&p_p_col_count=1&_1_WAR_Portlet_Datawarehouse_INSTANCE_D7s7_com.sun.faces.portlet.VIEW_ID=%2Fview.xhtml&_1_WAR_Portlet_Datawarehouse_INSTANCE_D7s7_com.sun.faces.portlet.NAME_SPACE=_1_WAR_Portlet_Datawarehouse_INSTANCE_D7s7_ generates exception: null 11:03:19,717 INFO [STDOUT] 11:03:19,717 ERROR [IncludeTag] java.lang.IllegalStateException at com.liferay.portal.servlet.filters.strip.StripResponse.getWriter(StripResponse.java:85) at org.apache.jasper.runtime.JspWriterImpl.initOut(JspWriterImpl.java:125) at org.apache.jasper.runtime.JspWriterImpl.flushBuffer(JspWriterImpl.java:118) at org.apache.jasper.runtime.JspWriterImpl.write(JspWriterImpl.java:326) at org.apache.jasper.runtime.JspWriterImpl.write(JspWriterImpl.java:342) at org.apache.jasper.runtime.JspWriterImpl.print(JspWriterImpl.java:468) at com.liferay.taglib.util.ThemeUtil.includeVM(ThemeUtil.java:208) at com.liferay.taglib.util.ThemeUtil.include(ThemeUtil.java:68) at com.liferay.taglib.util.IncludeTag.doEndTag(IncludeTag.java:59) at org.apache.jsp.html.common.themes.portal_jsp._jspx_meth_liferay_002dtheme_005finclude_005f1(portal_jsp.java:816) at org.apache.jsp.html.common.themes.portal_jsp._jspx_meth_c_005fotherwise_005f0(portal_jsp.java:788) at org.apache.jsp.html.common.themes.portal_jsp._jspService(portal_jsp.java:724) at org.apache.jasper.runtime.HttpJspBase.service(HttpJspBase.java:70) at javax.servlet.http.HttpServlet.service(HttpServlet.java:803) at org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:373) at org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:336) at org.apache.jasper.servlet.JspServlet.service(JspServlet.java:265) at javax.servlet.http.HttpServlet.service(HttpServlet.java:803) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) 11:03:19,718 ERROR [[jsp]] Servlet.service() for servlet jsp threw exception java.lang.IllegalStateException 11:03:19,719 ERROR [[Main Servlet]] Servlet.service() for servlet Main Servlet threw exception java.lang.IllegalStateException at com.liferay.portal.servlet.filters.strip.StripResponse.getWriter(StripResponse.java:85) at org.apache.jasper.runtime.JspWriterImpl.initOut(JspWriterImpl.java:125) at org.apache.jasper.runtime.JspWriterImpl.flushBuffer(JspWriterImpl.java:118) 11:03:19,722 INFO [STDOUT] 
11:03:19,720 ERROR [OpenSSOFilter] org.apache.jasper.JasperException: java.lang.IllegalStateException org.apache.jasper.JasperException: java.lang.IllegalStateException at org.apache.jasper.servlet.JspServletWrapper.handleJspException(JspServletWrapper.java:521) at org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:409) at org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:336) at org.apache.jasper.servlet.JspServlet.service(JspServlet.java:265) 11:03:19,722 INFO [STDOUT] n.internalDoFilter(ApplicationFilterChain.java:235) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) at org.jboss.web.tomcat.filters.ReplyHeaderFilter.doFilter(ReplyHeaderFilter.java:96) Caused by: java.lang.IllegalStateException at com.liferay.portal.servlet.filters.strip.StripResponse.getWriter(StripResponse.java:85) at org.apache.jasper.runtime.JspWriterImpl.initOut(JspWriterImpl.java:125) at org.apache.jasper.runtime.JspWriterImpl.flushBuffer(JspWriterImpl.java:118)

    Read the article
