Search Results

Search found 59273 results on 2371 pages for 'data protection'.


  • What language is to binary, as Perl is to text?

    - by ehdr
    I am looking for a scripting (or higher-level programming) language (or e.g. modules for Python or similar languages) for effortlessly analyzing and manipulating binary data in files (e.g. core dumps), much like Perl allows manipulating text files very smoothly. Things I want to do include presenting arbitrary chunks of the data in various forms (binary, decimal, hex), converting data from one endianness to another, etc. That is, things you normally would use C or assembly for, but I'm looking for a language which allows for writing tiny pieces of code for highly specific, one-time purposes very quickly. Any suggestions?
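
    As an illustration of the kind of one-off binary poking the question describes, here is a minimal Python sketch using the standard struct and binascii modules; the file name and record layout are made-up assumptions, not anything from the question.

        import struct
        import binascii

        # Hypothetical binary file; we only assume it is at least 4 bytes long.
        with open("dump.bin", "rb") as f:
            data = f.read()

        # Show an arbitrary chunk of the data in hex.
        print(binascii.hexlify(data[:16]))

        # Interpret the first 4 bytes as a little-endian and a big-endian unsigned int.
        (le_value,) = struct.unpack("<I", data[:4])
        (be_value,) = struct.unpack(">I", data[:4])
        print(le_value, be_value)

        # Re-pack a value with the opposite endianness.
        swapped = struct.pack(">I", le_value)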


  • Print MS Access data in VB.NET

    - by user225269
    How do I print the MS Access data (.mdb) in VB.NET? Here is the code I'm using to view the data in the form. What I want to do is print what is currently being viewed, or perhaps automatically save a .pdf file so that the PDF viewer installed on the system opens the newly generated file.

        Dim cn As New OleDbConnection("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\search.mdb")
        Dim cmd As OleDbCommand = New OleDbCommand("Select * from GH where NAME= '" & TextBox6.Text & "' ", cn)
        cn.Open()
        Dim rdr As OleDbDataReader
        rdr = cmd.ExecuteReader
        If rdr.HasRows Then
            rdr.Read()
            NoAcc = rdr("NAME")
            If (TextBox6.Text = NoAcc) Then TextBox1.Text = rdr("IDNUMBER")
            If (TextBox6.Text = NoAcc) Then TextBox7.Text = rdr("DEPARTMENT")
            If (TextBox6.Text = NoAcc) Then TextBox8.Text = rdr("COURSE")
        End If

    Some sites for beginners on this topic would also help a lot. :)


  • To Wrap or Not to Wrap: Wrapping Data Access in a Service Facade

    - by PureCognition
    For a while now, my team and I have been wrapping our data access layer in a web service facade (using WCF) and calling it from the business logic layer. Alternatively, we could simply use the repository pattern, where the business logic layer consumes the data access layer locally through an interface, and at any point in time we can switch things out so that it hits a service instead (if necessary). The question is: when is it a good time to wrap the data access layer in a service facade, and when isn't it? Right now, it seems like the main advantage is that other applications can consume the service, but if they are internal applications written in .NET then they can just consume the .NET assembly instead. Are there other advantages to having the DAL wrapped in a service that I am unaware of?


  • PostgreSQL 8.3 data types: xml vs varchar

    - by Sejanus
    There's an xml data type in Postgres. I've never used it before, so I'd like to hear opinions: the downsides and upsides vs. using a regular varchar (or text) column to store XML. The text I'm going to store is well-formed, UTF-8 XML. There is no need to search by it (I've read that searching by xml is slow). This XML is actually data prepared for PDF generation with Apache FOP. The XML can be generated dynamically from data found elsewhere (other Postgres tables); it's stored as-is only so that I won't need to generate it twice - kind of a second backup for already generated PDF documents. Anything else to know? Good practices, performance, maintenance, etc.?


  • How to get group totals in self-referenced data in a DataTable?

    - by Nikhil Vaghela
    I have three columns in my data table: 1) ProductID, 2) ProductParentID, 3) ProductTotal. ProductID and ProductParentID are self-referencing columns where I can set a parent-child relationship and get child rows based on that relationship. Let us say I have the following data:

        Product1
            Product11
            Product12
            Product13
                Product131
                Product132
                Product133
        Product2
            Product21
            Product22
            Product23

    Next to the above hierarchy, what I want in the ProductTotal column is the total of each child row, with the sum of those child rows' totals rolled up to the parent product. E.g. if Product131's total is 10, Product13's total is 15 and Product133's total is 5, then Product13's total should be 30. The logic should work for any number of levels of self-referencing hierarchy. Is there any functionality in the DataTable itself where I can achieve this without iterating through each row and doing it manually? Thanks.
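
    For what it's worth, the roll-up logic itself (independent of the DataTable API the question asks about) boils down to a small recursive aggregation; the column names in this Python sketch mirror the question, but the sample data and the helper function are illustrative assumptions.

        # Rows: (ProductID, ProductParentID, ProductTotal).
        rows = [
            ("Product13", "Product1", 15),
            ("Product131", "Product13", 10),
            ("Product133", "Product13", 5),
        ]

        children = {}
        own_total = {}
        for pid, parent, total in rows:
            own_total[pid] = total
            children.setdefault(parent, []).append(pid)

        def rolled_up_total(pid):
            """A node's total is its own value plus the rolled-up totals of its children."""
            return own_total.get(pid, 0) + sum(rolled_up_total(c) for c in children.get(pid, []))

        print(rolled_up_total("Product13"))  # 15 + 10 + 5 = 30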


  • Reading timecode track data using the C# version of Quicktime

    - by tinmaru
    Hi, I'm using the .NET ActiveX component of QuickTime. I would like to read the timecode track data contained in a QTMovie track. I can already select my timecode track like this:

        QTMovie movie;  // a valid QuickTime movie
        QTUtils qtu = new QTUtils();
        for (int i = 1; i <= movie.Tracks.Count; i++)
        {
            if (movie.Tracks[i].Type == qtu.StringToFourCharCode("tmcd"))
            {
                QTTrack tcTrack = movie.Tracks[i];
                // Timecode data reading?
            }
        }

    Is there a way to extract the timecode data? Thank you for your help!


  • Get length of data sent over network to TcpListener/NetworkStream in VB.NET

    - by Jonathan.
    It seems the most obvious thing, but I just can't work out how to get the length of the bytes sent over a network using a TcpClient and TcpListener. This is my code so far:

        'Must listen on correct port - must be same as port client wants to connect on.
        Const portNumber As Integer = 9999
        Dim tcpListener As New TcpListener(IPAddress.Parse("192.168.2.7"), portNumber)
        tcpListener.Start()
        Console.WriteLine("Waiting for connection...")
        'Accept the pending client connection and return
        'a TcpClient initialized for communication.
        Dim tcpClient As TcpClient = tcpListener.AcceptTcpClient()
        Console.WriteLine("Connection accepted.")
        ' Get the stream
        Dim networkStream As NetworkStream = tcpClient.GetStream()
        '' Read the stream into a byte array

    I need the length of the network stream to set the size of the byte array I'm going to read the data into, but NetworkStream.Length is unsupported and throws a NotSupportedException. The only other way I can think of is to send the size of the data before sending the data itself, but this seems the long way round.
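
    The underlying pattern here is language-agnostic: read in fixed-size chunks and grow a buffer until the sender is done, so the total length is never needed up front. A minimal Python sketch of that receive loop follows; the bind address, port and chunk size are arbitrary assumptions.

        import socket

        listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        listener.bind(("0.0.0.0", 9999))
        listener.listen(1)

        conn, addr = listener.accept()
        received = bytearray()
        while True:
            chunk = conn.recv(4096)   # read whatever is available, up to 4 KB
            if not chunk:             # an empty read means the sender closed the connection
                break
            received.extend(chunk)

        print(len(received), "bytes received")
        conn.close()
        listener.close()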


  • Cross Domain Post - Losing POST Data

    - by Tomas Beblar
    I have 2 servers, both running R2 / IIS7 / ASP Classic sites (can't get around any of that). Server A is making the following call:

        Dim objXMLHTTP, xml
        Set xml = Server.CreateObject("Msxml2.ServerXmlHTTP.6.0")
        xml.Open "POST", templateName, false
        xml.setRequestHeader "Content-Type", "application/xml"
        xml.Send variables

    Here templateName is the URL of Server B (it's an email template) and variables is a name/value pair string like a query string: password=myPassword&customerEmail=Dear+Bob,.... Server B receives the POST, but all the POST data (password=myPassword&customerEmail=Dear+Bob,....) is missing:

        password = Request.Form("templatePassword")
        customerEmail = Request.Form("RackAttackCustomerEmail")

    The above values are all empty. Here's the kicker: this all worked on our old servers (Windows Server 2003, IIS 6), but when we migrated over, it stopped working correctly. My question is: what would cause the POST data to be dropped in IIS 7 when it all worked in IIS 6? I've done about 3 days of research into this, trying many different things, and nothing has worked. The POST data is just gone.


  • Can I write to a data output stream after reading a response from a data input stream?

    - by Sirius
    Hi, I want to do a client-server exchange like this:

    1. First the client sends/writes to the output stream.
    2. The server responds with some data that the client reads from the input stream.
    3. After receiving the data, the client sends/writes to the output stream again to confirm that the data has been received.

    Now, do I have to close the output stream and re-open it again before doing step 3? Also, if someone could provide me with a snippet, it would be really helpful. Thanks.
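
    In general, a single connection's streams can be written and read repeatedly without closing and re-opening them between steps. A minimal Python client sketch of the send / receive / acknowledge round trip is below; the host, port and message framing are illustrative assumptions, not anything from the question.

        import socket

        # Step 1: send the initial request.
        conn = socket.create_connection(("example.com", 9000))
        conn.sendall(b"REQUEST\n")

        # Step 2: read the server's response from the same connection.
        response = conn.recv(4096)
        print("server said:", response)

        # Step 3: write again on the same stream to acknowledge receipt --
        # no close/re-open is needed between steps 2 and 3.
        conn.sendall(b"ACK\n")
        conn.close()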


  • Data Integration/EAI Project Lessons Learned

    - by Greg Harman
    Have you worked on a significant data or application integration project? I'm interested in hearing what worked for you and what didn't, and how that affected the project both during and after implementation (i.e. during ongoing operation, maintenance and expansion). In addition to these lessons learned, please describe the project by including a quick overview of:

    - The data sources and targets. Specifics are not necessary, but I'd like to know general technology categories, e.g. RDBMS table, application accessed via a proprietary socket protocol, web service, reporting tool.
    - The overall architecture of the project as related to data flows.
    - The different human roles in the project (was this all done by one engineer? Did it include analysts with a particular expertise?)
    - Any third-party products utilized, commercial or open source.


  • Parsing specific numeric data from a CSV file using Python

    - by KJ Lim
    Good morning. I have a series of data in a CSV file like below:

        1,,,
        1,137.1,1198,1.6
        2,159,300,0.4
        3,176,253,0.3
        4,197,231,0.3
        5,198,525,0.7
        6,199,326,0.4
        7,215,183,0.2
        8,217.1,178,0.2
        9,244.2,416,0.5
        10,245.1,316,0.4

    I want to extract specific data from the second column, for example 217.1 and 245.1, and have those rows concatenated into a new file like:

        8,217.1,178,0.2
        10,245.1,316,0.4

    I use the csv module to read my CSV file, but I can't extract the specific data I want. Could anyone kindly please help me? Thank you.
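
    A minimal sketch of one way to do this with the csv module follows; the file names and the set of wanted second-column values come from the example above, but treat them as assumptions about the real data.

        import csv

        wanted = {"217.1", "245.1"}  # second-column values to keep

        with open("input.csv", newline="") as src, open("output.csv", "w", newline="") as dst:
            reader = csv.reader(src)
            writer = csv.writer(dst)
            for row in reader:
                # Skip short or blank rows, then compare the second column as text.
                if len(row) >= 2 and row[1] in wanted:
                    writer.writerow(row)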


  • Sort data in the C language

    - by ANIL MANE
    Hello C experts, I need a little help with the following requirement, as I know very little about C syntax. I have data in a file like this:

        73 54 57 [52]
        75 73 65 [23]
        65 54 57 [22]
        22 59 71 [12]
        22 28 54 [2]
        65 22 54 73 [12]
        65 28 54 73 [52]
        22 28 65 73 [42]
        65 54 57 73 [22]
        22 28 54 73 [4]

    where the value in brackets denotes the occurrence of that series. I need to sort this data by occurrence, descending, with the series having the maximum number of elements on top, as follows:

        65 28 54 73 [52]
        22 28 65 73 [42]
        65 54 57 73 [22]
        65 22 54 73 [12]
        22 28 54 73 [4]
        28 59 71 [122]
        73 54 57 [52]
        22 28 65 [26]
        ...

    and so on. Can someone give me quick code for this? Thanks in advance.
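
    The sorting criterion described above (longer series first, then higher occurrence count first) is easier to see in a short Python sketch than in C; the parsing of the bracketed count and the sample lines below are assumptions based on the example, so treat this only as an outline of the comparison logic.

        lines = [
            "73 54 57 [52]",
            "65 22 54 73 [12]",
            "65 28 54 73 [52]",
            "22 28 65 73 [42]",
        ]

        def parse(line):
            parts = line.split()
            count = int(parts[-1].strip("[]"))   # occurrence count in brackets
            values = [int(p) for p in parts[:-1]]
            return values, count

        # Sort: longest series first, then by occurrence count, both descending.
        ordered = sorted(lines, key=lambda l: (len(parse(l)[0]), parse(l)[1]), reverse=True)
        for line in ordered:
            print(line)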


  • EasyXDM passing data issue

    - by Jeff Ryan
    I'm using RPC with easyXDM, and I can send simple data back and forth easily between the child and parent window, but it seems to be limited to simple strings and numbers. The demos on the site only use numbers. When I try to send a JSON-encoded string, I get a cross-domain error. When I use CORS, I can make AJAX requests fine, but I can't display the child page in the iframe, because the data is returned and not rendered. My question is: how can I render an iframe and pass complex data back and forth? Or maybe I am doing something wrong?


  • Importing large datasets on iPhone using CoreData

    - by Matthes
    Hi there, I'm facing a very annoying problem. My iPhone app loads its data from a network server. Data is sent as a plist and, when parsed, it needs to be stored to a SQLite DB using Core Data. The issue is that in some cases those datasets are too big (5000+ records) and the import takes way too long. On top of that, when the iPhone tries to suspend the screen, the watchdog kills the app because it's still processing the import and does not respond for up to 5 seconds, so the import is never finished. I have used all the recommended techniques from the article "Efficiently Importing Data" http://developer.apple.com/mac/library/DOCUMENTATION/Cocoa/Conceptual/CoreData/Articles/cdImporting.html and other docs concerning this, but it's still awfully slow. The solution I'm looking for is to let the app suspend but let the import keep running in the background (the better option), or to prevent attempts to suspend the app at all. Any better idea is welcome too. Any tips on how to overcome these issues are highly appreciated! Thanks


  • Any clever way to fix 'string or binary data would be truncated' warning with LINQ

    - by Simon_Weaver
    Is there a clever way to determine which field is causing 'string or binary data would be truncated' with LINQ? I've always ended up doing it manually by stepping through a debugger, but with a batch using 'SubmitChanges' I have to change my code to insert a single row at a time to find the culprit in a batch of rows. Am I missing something, or in this day and age do I really still have to use a brute-force method to find the problem? Please don't give me advice on avoiding this error in future (unless it's something much cleverer than 'validate your data'). The source data is coming from a different system where I don't have full control anyway - plus I want to be lazy. PS. Does SQL Server 2008 actually tell me the field name? Please tell me it does! I'll upgrade!


  • PowerShell - Splitting a variable into chunks

    - by Andrew
    I have written a query in PowerShell interrogating an F5 BIG-IP box through its iControl API to bring back CPU usage etc. Using the code below I can return the data in CSV format, which is fine. However, the $csvdata variable contains all the data; I need to be able to take this variable and, for each line, split each column of data into a separate variable. The output currently looks like this:

        timestamp,"Utilization"
        1276181160,2.3282800000e+00

    Any advice would be most welcome.

        $SystemStats = (Get-F5.iControl).SystemStatistics

        ### Allocate a new Query Object and add the inputs needed
        $Query = New-Object -TypeName iControl.SystemStatisticsPerformanceStatisticQuery
        $Query.object_name = $i
        $Query.start_time = $startTime
        $Query.end_time = 0
        $Query.interval = $interval
        $Query.maximum_rows = 0

        ### Make method call passing in an array of size one with the specified query
        $ReportData = $SystemStats.get_performance_graph_csv_statistics( (,$Query) )

        ### Allocate a new encoder and turn the byte array into a string
        $ASCII = New-Object -TypeName System.Text.ASCIIEncoding
        $csvdata = $ASCII.GetString($ReportData[0].statistic_data)
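
    Setting the PowerShell specifics aside, the splitting step itself is just "break the string into lines, skip the header, split each line on commas". A Python sketch of that idea is below, using the sample output shown above purely as assumed input; it illustrates the logic rather than the PowerShell syntax.

        import csv
        import io

        csvdata = 'timestamp,"Utilization"\n1276181160,2.3282800000e+00\n'

        reader = csv.reader(io.StringIO(csvdata))
        header = next(reader)                     # ['timestamp', 'Utilization']
        for row in reader:
            if not row:
                continue
            timestamp, utilization = row[0], float(row[1])
            print(timestamp, utilization)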


  • Real-time data on webpage with Django and jQuery

    - by Steven Hepting
    I would like a webpage that constantly updates a graph with new data as it arrives. Normally, all the data you have is passed to a Django view at the beginning of the request; however, I need the page to be able to update itself with fresh information every few seconds to redraw the graph. Background: the webpage will be similar to this: http://www.panic.com/blog/2010/03/the-panic-status-board/. The data coming in will be temperature values to be graphed, measured by an Arduino and saved to the Django database (I've already done this part).
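
    One common approach is to expose a small Django view that returns the latest readings as JSON and have the page poll it every few seconds with jQuery. A rough Python-side sketch is below; the Reading model, its fields and the module path are assumptions rather than anything from the question.

        import json

        from django.http import HttpResponse

        # Hypothetical model with `taken_at` and `celsius` fields, populated by the Arduino.
        from myapp.models import Reading


        def latest_readings(request):
            """Return the most recent temperature readings as JSON for the page to poll."""
            readings = Reading.objects.order_by("-taken_at")[:50]
            payload = [
                {"taken_at": r.taken_at.isoformat(), "celsius": r.celsius}
                for r in readings
            ]
            return HttpResponse(json.dumps(payload), content_type="application/json")

    On the client side, a plain JavaScript setInterval plus jQuery's $.getJSON against this view is enough to fetch fresh data and redraw the graph with whatever charting library is in use.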


  • Is SSIS able to query flat files from another Windows Server?

    - by atricapilla
    I'm a pretty new SQL Server Integration Services (SSIS) user. Is SSIS able to query data from text files located on another Windows server? I mean that when SSIS is installed on Windows Server A, is SSIS able to query data from, e.g., a folder containing text files on Windows Server B (under the same domain)? I have only used the SAP BO Data Integrator ETL tool, and it cannot query flat files from another server: during execution, all files must be located on the Job Server machine that executes the job.


  • Cannot display JSON data returned from jQuery AJAX call

    - by amby
    Hi, can somebody please tell me how I can display the JSON data returned from the AJAX call below? I am new to this.

        $.ajax({
            type: "POST",
            dataType: 'JSON',
            //data: "{'ntid':'john'}",
            //contentType: "application/json; charset=utf-8",
            //processData: false,
            url: "Testing.aspx/SendMessage",
            error: function(XMLHttpRequest, textStatus, errorThrown) {
                alert(textStatus);
            },
            success: function(result, txtStatus, httpRequest) {
                alert(txtStatus);
                the_object = result;
                $('#status').html(concatObject(the_object));
            }
        });

    Above is the JS file. Should I do something in the ASP file directly to display it? If yes, then how? Please reply soon; I'm stuck here and unable to display the data.


  • Streaming support for Flex with Ruby on Rails (working with live data)

    - by Ashine
    Hi friends, I am working on Flex dashboards and charting. Until now I have built them for static data only; now I want to upgrade them to work with real-time data, where new data is continuously sent to the client (SWF file) from the server and the client updates itself. I am using Ruby on Rails (RoR) on the server side. Is there something similar to 'Adobe LiveCycle (Java-Flex)' applicable to an RoR-Flex architecture that can be used here? Please share links for any similar implementation in an RoR-Flex architecture, or if you have suggestions to share, I will really appreciate it. Thanks, friends.


  • How can I programmatically convert SQL data types to .NET data types?

    - by Simon
    Can anyone show me a way of converting SQL Server data types (varchar, for example) to .NET data types (String, for example)? I'm assuming that automatic conversion is not possible. I have an 'EntityProperty' object and would like it to have an appropriate 'Type' property (String, Decimal, Int32, etc.); at the moment this property is just a string - 'int32', for example. A little background: I'm using SQL DMO in an internal code-generation app to query a database and generate a stored-procedure-based DAL from the database. Being an internal app, I can take quite a few shortcuts and make quite a few assumptions. To get the app working, this data-type conversion is currently handled by a Select Case statement which just converts the types to strings and generates a set of properties based on those strings, but I would prefer a little more flexibility in handling the types (use of TypeOf, etc.). Anyone worked on something similar? I know EF, NHibernate, SubSonic etc. could do all this for me, but in this case, for various reasons, I am having to roll my own. :)
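
    The usual alternative to a long Select Case is a simple lookup table from SQL type names to framework type names; the idea is language-agnostic, and the deliberately incomplete mapping in this Python sketch is only an illustration of the approach, not an authoritative conversion table.

        # Partial, illustrative mapping from SQL Server type names to .NET type names.
        SQL_TO_DOTNET = {
            "varchar": "String",
            "nvarchar": "String",
            "char": "String",
            "int": "Int32",
            "bigint": "Int64",
            "bit": "Boolean",
            "decimal": "Decimal",
            "datetime": "DateTime",
            "uniqueidentifier": "Guid",
        }

        def dotnet_type(sql_type):
            """Look up the .NET type name for a SQL type, defaulting to String."""
            return SQL_TO_DOTNET.get(sql_type.lower(), "String")

        print(dotnet_type("VARCHAR"))  # String
        print(dotnet_type("int"))      # Int32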


  • Good conventions for embedding schema of a flat file

    - by Ville Koskinen
    We receive lots of data as flat files: delimited or just fixed-length records. It's sometimes hard to find out what the files actually contain. Are there any well-established practices for embedding the schema of the file at the beginning or the end of a file to make the file self-explanatory? Just to get an idea, imagine something like this:

        <data name=test records=2 type=fixed>
          <field name=foo start=0 length=2 type=numeric>
          <field name=bar start=2 length=4 type=text>
        </data>
        11test
        12ing

    We would parse the XML at the beginning and use it for reading the records.
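
    As a rough illustration of the idea (and only of this made-up header format, not of any established standard), a reader could parse the embedded header first and then slice each record by the declared offsets. A Python sketch under those assumptions:

        import re

        raw = """<data name=test records=2 type=fixed>
        <field name=foo start=0 length=2 type=numeric>
        <field name=bar start=2 length=4 type=text>
        </data>
        11test
        12ing"""

        lines = [line.strip() for line in raw.splitlines()]
        end = lines.index("</data>")

        # Pull the declared fields out of the header block.
        fields = []
        for line in lines[1:end]:
            attrs = dict(re.findall(r"(\w+)=(\w+)", line))
            fields.append((attrs["name"], int(attrs["start"]), int(attrs["length"])))

        # Slice each fixed-length record according to the embedded schema.
        for record in lines[end + 1:]:
            parsed = {name: record[start:start + length] for name, start, length in fields}
            print(parsed)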


  • Repository vs Data Access

    - by vdh_ant
    Hi guys, in the context of an n-tier application, is there a difference between what you would consider your data access classes to be and your repositories? I tend to think yes, but I just wanted to see what others think. My thinking is that the job of the repository is just to contain and execute the raw query itself, whereas the data access class would create the context, execute the repository (passing in the context), handle mapping the data model to the domain model and return the result back up... What do you guys think? Also, do you see any of this changing in a LINQ to XML scenario (assuming that you change the context for the relevant XDocument)? Cheers, Anthony


  • How can I perform a web query in C# similar to Data > Import External Data > New Web Query in Microsoft Excel?

    - by TNT
    I need to pull data from a table on a website. I can easily do this in VBA using a Web Query, but I need to do it in C#. I'm just having some trouble figuring out how to properly convert the code. I got something close, but it's returning HTML along with the data, and I just want the data. Any help would be great. The block of code giving me issues is below:

        Private Function GetData(ByVal theURL As String, ByVal theRow As Integer, ByVal thePosition As Integer, ByVal theColumn As String, ByVal theTable As Integer)
            With ActiveSheet.QueryTables.Add(Connection:="URL;" + theURL, Destination:=Sheet2.Range(theColumn & theRow + 1))
                .Name = "op?s=" + Mid(theURL, thePosition + 1) + "_1"
                .PreserveFormatting = True
                .AdjustColumnWidth = True
                .WebTables = theTable
                .Refresh BackgroundQuery:=False
            End With
        End Function
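
    The general technique (fetch the page, then keep only the text inside the table cells rather than the raw markup) can be sketched briefly; the Python version below uses only the standard library, and the URL is a placeholder assumption rather than the site from the question.

        from html.parser import HTMLParser
        from urllib.request import urlopen


        class TableCellExtractor(HTMLParser):
            """Collect the text content of every <td> and <th> cell on the page."""

            def __init__(self):
                super().__init__()
                self.in_cell = False
                self.cells = []

            def handle_starttag(self, tag, attrs):
                if tag in ("td", "th"):
                    self.in_cell = True
                    self.cells.append("")

            def handle_endtag(self, tag):
                if tag in ("td", "th"):
                    self.in_cell = False

            def handle_data(self, data):
                if self.in_cell:
                    self.cells[-1] += data.strip()


        html = urlopen("http://example.com/quotes").read().decode("utf-8", errors="replace")
        parser = TableCellExtractor()
        parser.feed(html)
        print(parser.cells)  # just the cell data, no markup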


  • Replication - synchronizing most of the data some of the time

    - by uncle brad
    I have some data that isn't properly "partitioned" (for lack of a better word). All inserts, processing and reporting happen on the same table. The bulk of the processing happens not long after the insert and not long after that it becomes immutable (we're talking days). I could do all inserts and processing on a new table that I replicate to the old table. When I detect that the data has become immutable I would delete the data from the new table, but I would edit the delete replication stored procedure so that the delete did not replicate. How bad an idea is this? It seems attractive at the moment (I haven't slept on it yet) because it might mitigate a performance problem with only very small changes to the application. It also seems like it might be a good way to shoot myself in the foot.

