Search Results

Search found 13808 results on 553 pages for 'remote storage'.


  • Memory management with Objective-C Distributed Objects: my temporary instances live forever!

    - by jkp
    I'm playing with Objective-C Distributed Objects and I'm having some problems understanding how memory management works under the system. The example below illustrates my problem:

    Protocol.h

        #import <Foundation/Foundation.h>

        @protocol DOServer
        - (byref id)createTarget;
        @end

    Server.m

        #import <Foundation/Foundation.h>
        #import "Protocol.h"

        @interface DOTarget : NSObject
        @end

        @interface DOServer : NSObject <DOServer>
        @end

        @implementation DOTarget

        - (id)init
        {
            if ((self = [super init])) {
                NSLog(@"Target created");
            }
            return self;
        }

        - (void)dealloc
        {
            NSLog(@"Target destroyed");
            [super dealloc];
        }

        @end

        @implementation DOServer

        // Vend a new autoreleased target by reference; the client receives a proxy to it.
        - (byref id)createTarget
        {
            return [[[DOTarget alloc] init] autorelease];
        }

        @end

        int main()
        {
            NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
            DOServer *server = [[DOServer alloc] init];
            NSConnection *connection = [[NSConnection new] autorelease];
            [connection setRootObject:server];
            if ([connection registerName:@"test-server"] == NO) {
                NSLog(@"Failed to vend server object");
            } else {
                [[NSRunLoop currentRunLoop] run];
            }
            [pool drain];
            return 0;
        }

    Client.m

        #import <Foundation/Foundation.h>
        #import "Protocol.h"

        int main()
        {
            unsigned i = 0;
            for (; i < 3; i++) {
                NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
                id server = [NSConnection rootProxyForConnectionWithRegisteredName:@"test-server" host:nil];
                [server setProtocolForProxy:@protocol(DOServer)];
                // The returned proxy is autoreleased and goes out of scope when the pool drains.
                NSLog(@"Created target: %@", [server createTarget]);
                [[NSRunLoop currentRunLoop] runUntilDate:[NSDate dateWithTimeIntervalSinceNow:1.0]];
                [pool drain];
            }
            return 0;
        }

    The issue is that any remote objects created by the root proxy are not released when their proxy counterparts in the client go out of scope. According to the documentation: "When an object's remote proxy is deallocated, a message is sent back to the receiver to notify it that the local object is no longer shared over the connection." I would therefore expect that as each DOTarget goes out of scope (each time around the loop), its remote counterpart would be deallocated, since there is no other reference to it being held on the remote side of the connection. In reality this does not happen: the temporary objects are only deallocated when the client application quits, or more accurately, when the connection is invalidated. I can force the temporary objects on the remote side to be deallocated by explicitly invalidating the NSConnection object each time around the loop and creating a new one, but somehow this just feels wrong. Is this the correct behaviour from DO? Should all temporary objects live as long as the connection that created them? Are connections therefore to be treated as temporary objects, to be opened and closed with each series of requests against the server? Any insights would be appreciated.

    Read the article

  • Upgraded Ubuntu 12.04 -> 12.10 and Drupal 7 site now gets an error

    - by Paul B
    I do all my Drupal 7 web development on a local WebDev box, and today I upgraded its O/S from Ubuntu 12.04 to 12.10. Now I get the following error for all my D7 projects on localhost (everything was fine on Ubuntu 12.04):

        Error
        The website encountered an unexpected error. Please try again later.

        Error message
        PDOException: SQLSTATE[42000]: Syntax error or access violation: 1286 Unknown storage engine 'InnoDB': SELECT expire, value FROM {semaphore} WHERE name = :name; Array ( [:name] => variable_init ) in lock_may_be_available() (line 167 of /var/www/jobsdaily/includes/lock.inc).

    A little research and a look in phpMyAdmin (3.4.11.1) suggest InnoDB itself is the issue: when I click on a table to see its data I get #1286 - Unknown storage engine 'InnoDB'. I have all my D7 databases backed up as SQL, but I don't really want to go down the whole 'import' route, since it's 10 months of work! Has anyone had this issue, and can anyone suggest a fix?

    Read the article

  • Reading POST data from an HTML form sent to a ServerSocket

    - by user32167
    I'm trying to write the simplest possible server app in Java: it displays an HTML form with a textarea input, and after the form is submitted it should let me parse the XML typed into that textarea. For now I've built a simple ServerSocket-based server like this:

        import java.io.BufferedReader;
        import java.io.InputStreamReader;
        import java.io.PrintWriter;
        import java.net.ServerSocket;
        import java.net.Socket;

        public class WebServer {

            protected void start() {
                ServerSocket s;
                String gets = "";
                System.out.println("Start on port 80");
                try {
                    // create the main server socket
                    s = new ServerSocket(80);
                } catch (Exception e) {
                    System.out.println("Error: " + e);
                    return;
                }
                System.out.println("Waiting for connection");
                for (;;) {
                    try {
                        // wait for a connection
                        Socket remote = s.accept();
                        // remote is now the connected socket
                        System.out.println("Connection, sending data.");
                        BufferedReader in = new BufferedReader(new InputStreamReader(
                                remote.getInputStream()));
                        PrintWriter out = new PrintWriter(remote.getOutputStream());
                        // scan the request headers for the GET request line
                        String str;
                        while ((str = in.readLine()) != null && !str.equals("")) {
                            if (str.contains("GET")) {
                                gets = str;
                                break;
                            }
                        }
                        out.println("HTTP/1.0 200 OK");
                        out.println("Content-Type: text/html");
                        out.println("");
                        // Send the HTML page
                        String method = "get";
                        out.print("<html><form method=" + method + ">");
                        out.print("<textarea name=we></textarea></br>");
                        out.print("<input type=text name=a><input type=submit></form></html>");
                        out.println(gets);
                        out.flush();
                        remote.close();
                    } catch (Exception e) {
                        System.out.println("Error: " + e);
                    }
                }
            }

            public static void main(String args[]) {
                WebServer ws = new WebServer();
                ws.start();
            }
        }

    After the form (a textarea with XML plus one additional text input) is submitted, the String variable 'gets' holds the URL-encoded values of my fields (also displayed on the screen). It looks like this:

        gets = GET /?we=%3Cnetwork+ip_addr%3D%2210.0.0.0%2F8%22+save_ip%3D%22true%22%3E%0D%0A%3Csubnet+interf_used%3D%22200%22+name%3D%22lan1%22+%2F%3E%0D%0A%3Csubnet+interf_used%3D%22254%22+name%3D%22lan2%22+%2F%3E%0D%0A%3C%2Fnetwork%3E&a=fooBar HTTP/1.1

    What can I do to switch the form from GET to POST? If I simply change the method in the form and then match on "POST" instead (if (str.contains("POST")) { ... }), I get a string like gets = POST / HTTP/1.1 with no variables. And after that, how can I use the XML from my textarea field (called 'we')?
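
    With POST, the form fields no longer arrive on the request line: they are sent in the request body, which starts after the blank line that ends the headers and whose length is given by the Content-Length header. As a rough illustration of those mechanics, here is a minimal, simplified sketch (the class and method names are made up for this example, and error handling is omitted) of reading and decoding such a body once the request line has matched "POST":

        import java.io.BufferedReader;
        import java.io.IOException;
        import java.net.URLDecoder;

        class PostReader {

            // Reads the remaining headers (capturing Content-Length), then the body,
            // from a request whose request line ("POST / HTTP/1.1") was already consumed.
            static String readPostBody(BufferedReader in) throws IOException {
                int contentLength = 0;
                String line;
                while ((line = in.readLine()) != null && !line.isEmpty()) {
                    if (line.toLowerCase().startsWith("content-length:")) {
                        contentLength = Integer.parseInt(line.substring("content-length:".length()).trim());
                    }
                }
                // The body follows the blank line; read exactly contentLength characters.
                char[] body = new char[contentLength];
                int read = 0;
                while (read < contentLength) {
                    int n = in.read(body, read, contentLength - read);
                    if (n == -1) break;
                    read += n;
                }
                return new String(body, 0, read); // e.g. "we=%3Cnetwork...&a=fooBar"
            }

            // Splits "name=value&name=value" pairs and URL-decodes each one.
            static void printFields(String params) throws IOException {
                for (String pair : params.split("&")) {
                    String[] kv = pair.split("=", 2);
                    String name = URLDecoder.decode(kv[0], "UTF-8");
                    String value = kv.length > 1 ? URLDecoder.decode(kv[1], "UTF-8") : "";
                    System.out.println(name + " = " + value);
                }
            }
        }

    Under those assumptions, printFields(readPostBody(in)) would print the decoded 'we' field (the raw XML typed into the textarea) and the 'a' field, and the XML string could then be handed to any XML parser. Note that Content-Length counts bytes while a Reader delivers characters, so this simplification only holds cleanly for single-byte encodings.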

    Read the article

  • What is a good way to keep track of strings for dictionary lookups?

    - by Justin
    I am working through the Windows 8 app tutorial. It has some code for saving app data like so:

        private void NameInput_TextChanged(object sender, TextChangedEventArgs e)
        {
            Windows.Storage.ApplicationDataContainer roamingSettings =
                Windows.Storage.ApplicationData.Current.RoamingSettings;
            roamingSettings.Values["userName"] = nameInput.Text;
        }

    I have worked with C# in the past and found that using literal string values (like "userName" in this case) as keys could get messy, because auto-complete did not work and it was easy to forget whether I had already made an entry for a setting and what I had called it. So if I don't touch the code for a while, I end up accidentally creating multiple entries for the same value that are named slightly differently. Surely there is a better way to keep track of the strings that key to those values. What is a good solution to this problem?
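
    A common remedy for this kind of key drift is to declare each key exactly once as a named constant, so auto-complete can find it and a typo becomes a compile error instead of a silent second entry. A minimal sketch of the idea (written in Java purely for illustration; the class and key names are hypothetical, not part of the tutorial):

        import java.util.HashMap;
        import java.util.Map;

        // Hypothetical sketch: every settings key is declared once, in one place.
        final class SettingsKeys {
            static final String USER_NAME = "userName";
            static final String GREETING  = "greeting"; // hypothetical second key

            private SettingsKeys() {} // constants only; no instances
        }

        class SettingsDemo {
            public static void main(String[] args) {
                Map<String, Object> roamingSettings = new HashMap<>();
                // Auto-complete now offers the key, and a misspelled constant fails to compile.
                roamingSettings.put(SettingsKeys.USER_NAME, "Justin");
                System.out.println(roamingSettings.get(SettingsKeys.USER_NAME));
            }
        }

    The C# equivalent would be a static class of const string fields referenced wherever roamingSettings.Values is indexed.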

    Read the article

  • A little gem from MPN – FREE online course on Architectural Guidance for Migrating Applications to Windows Azure Platform

    - by Eric Nelson
    I know a lot of technical people who work in partners (ISVs, System Integrators etc.). I know that virtually none of them would think of going to the Microsoft Partner Network (MPN) learning portal to find some deep and high quality technical content. Instead they would head to MSDN, Channel 9, msdev.com etc. I am one of those people :-) Hence imagine my surprise when I stumbled upon this little gem: Architectural Guidance for Migrating Applications to Windows Azure Platform (your company, and hence your Live ID, needs to be a member of MPN – which is free to join). This is first class stuff – and represents about 4 hours of content, which is really 8 if you stop and ponder :)

    Course Structure

    The course is divided into eight modules. Each module explores a different factor that needs to be considered as part of the migration process.

    Module 1: Introduction: This section provides an introduction to the training course, highlighting the values of the Windows Azure Platform for developers.

    Module 2: Dynamic Environment: This section goes into detail about the dynamic environment of the Windows Azure Platform. This session will explain the difference between current development states and the Windows Azure Platform environment, detail the functions of roles, and highlight development considerations to be aware of when working with the Windows Azure Platform.

    Module 3: Local State: This session details the local state of the Windows Azure Platform. This section details the different types of storage within the Windows Azure Platform (Blobs, Tables, Queues, and SQL Azure). The training will provide technical guidance on local storage usage, how to write to blobs, how to effectively use table storage, and other authorization methods.

    Module 4: Latency and Timeouts: This session goes into detail explaining the considerations surrounding latency and timeouts, and how to assess an IT portfolio.

    Module 5: Transactions and Bandwidth: This session details the performance metrics surrounding transactions and bandwidth in the Windows Azure Platform environment. This session will detail the transaction and bandwidth costs involved with the Windows Azure Platform and mitigation techniques that can be used to properly manage those costs.

    Module 6: Authentication and Authorization: This session details authentication and authorization protocols within the Windows Azure Platform. This session will detail information around web methods of authorization, web identification, Access Control benefits, and a walkthrough of Windows Identity Foundation.

    Module 7: Data Sensitivity: This session details data considerations that users and developers will experience when placing data into the cloud. This section of the training highlights these concerns, and details the strategies that developers can take to increase the security of their data in the cloud.

    Module 8: Summary: Provides an overall review of the course.

    Read the article

  • ArchBeat Link-o-Rama for 10-24-2012

    - by Bob Rhubart
    Play Oracle Vanquisher
    Here's a little respite from whatever it is you normally spend your time on. Oracle Vanquisher is an online diversion that makes a game of data center optimization. According to the description: "Armed with a cool Oracle vacuum pack suit and a strategic IT roadmap, you will thwart threats and optimize your data center to increase your company's stock price and boost your company's position." Mainly you avoid electric shock and killer birds. The current high score belongs to someone identified as "TEN." My score? Never mind.

    Book: DevOps for Developers | The Java Source
    The subject of DevOps has come up in a couple of recent OTN ArchBeat Podcasts, so it's somewhat serendipitous that Tori Wieldt's recent blog post offers an overview of Java Champion Michael Hüttermann's new book, DevOps for Developers, now available from Apress.

    Bring Your Own Device (BYOD): Context is everything… | The ORACLE-BASE Blog
    BYOD is a factor in the evolution of IT, but in what context? "The real IT work in companies is still being done on PCs," says Oracle ACE Director Tim Hall. "Yes, you can use a cloud service on your phone, but look around the office and you will see those cloud services are actually being used by people on PCs."

    Oracle in the Cloud: Oracle EBusiness Suite sizing | Tom Laszewski
    Cloud expert Tom Laszewski shares several technical resources that will be helpful for sizing Oracle EBusiness Suite.

    Setting Up, Configuring, and Using an Oracle WebLogic Server Cluster
    Author and expert Yuli Vasiliev shows you how to take advantage of multiple Oracle WebLogic Server instances grouped into a cluster to maximize scalability and availability.

    Webcast: Reduce Costs with Oracle's Database Storage Management
    Watch this! Join Oracle experts Kevin Jernigan and Margaret Hamburger for an interactive webcast in which you'll learn how Oracle's Database Storage Management can reduce storage costs and management complexity while improving query performance to meet service-level agreements and compliance requirements. Event Date: Tuesday, November 6, 2012. Event Time: 10 a.m. PT / 1 p.m. ET.

    Thought for the Day
    "Most software today is very much like an Egyptian pyramid with millions of bricks piled on top of each other, with no structural integrity, but just done by brute force and thousands of slaves." — Alan Kay
    Source: softwarequotes.com

    Read the article

  • New EMEA Partner Community for Hardware

    - by Julien Haye
    We are delighted to announce the availability of the EMEA HW partner community. The EMEA Partner Community for Hardware is the place where partners in Europe, Middle East and Africa can share experiences and best practices about selling and implementing Servers, Storage and Solaris based projects. You will also receive first-hand information from Oracle on products, training and tools that can help you better market, sell and implement your projects and services based on Oracle Hardware. If you are an individual working for an Oracle partner and your job is selling, implementing or supporting Oracle Servers, Storage and Solaris projects in EMEA, then this community is for you. For further information on the EMEA HW partner community and instructions on how to become a member, please visit: www.oracle.com/partners/goto/hardware-emea

    Read the article

  • Google I/O 2012 - Gaming in the Cloud

    Fred Sauer: Many game developers are finding the easy development and deployment experience of Google App Engine ideal for building cloud-based state storage, matchmaking services and collaboration services. When you have a hit game, the last thing you want to do is worry about your server provisioning. App Engine has an always-free tier to get you started and then scales seamlessly to any size of usage. Game developers also use Google Cloud Storage to easily store and quickly deliver media files to clients around the world. For all I/O 2012 sessions, go to developers.google.com. (Video from GoogleDevelopers, 01:02:17.)

    Read the article

  • Cloud computing cost savings for large enterprise

    - by user13817
    I'm trying to understand whether cloud computing is meant only for small to medium-sized companies, or also for large companies. Imagine a website with a very large user base. The storage and bandwidth demands, as well as the number of database transactions, are incredibly high. The website might be hosting videos, music, images, etc. that keep the demands high. Does it make sense to be in the cloud when you know you need huge volumes of storage, bandwidth, and GET, PUT, etc. requests? (Each of these variables costs money in the cloud.) Or does it make sense to build your own infrastructure? I can see the cost savings of cloud computing if you are a small business, but if you were aiming at the next big thing on the Internet, I can't quite see the benefits.

    Read the article

  • Upgrading a dual-boot system HDD

    - by Jason
    I dual-boot my laptop due to lousy VM performance, and have a new 500GB/7200rpm drive coming in to replace the stock 320GB/5400rpm drive. I have the drive set up in three partitions: one for the Win7 system files, one for storage, and the third as the ext4 Linux file system. The system file and storage partitions are both NTFS. What I'm planning to do is use the system image creator built into Win7, then move that over to the new drive. However, how can I migrate the Ubuntu partition, and how do I make sure that the GRUB bootloader isn't overwritten by the Windows loader?

    Read the article

  • NEW EMEA Hardware Partner Community

    - by Cinzia Mascanzoni
    We are delighted to announce the availability of the EMEA HW partner community. The EMEA Partner Community for Hardware is the place where partners in Europe, Middle East and Africa can share experiences and best practices about selling and implementing Servers, Storage and Solaris based projects. You will also receive first-hand information from Oracle on products, training and tools that can help you better market, sell and implement your projects and services based on Oracle Hardware. If you are an individual working for an Oracle partner or distributor and your job is selling, implementing or supporting Oracle Servers, Storage and Solaris projects in EMEA, then this community is for you. For further information on the EMEA HW partner community and instructions on how to become a member, please visit: www.oracle.com/partners/goto/hardware-emea

    Read the article

  • Welcome to our Friday tips series!

    - by Chris Kawalek
    Today we're starting a brand new blog series. For your Friday afternoon reading, we'll be posting a technical tip or a question and answer on a technical topic. We'll start by introducing ideas on our own, but we'd really like it if you were involved and asked us questions via Twitter! Tag your tweet with #AskOracleVirtualization and we'll consider your question for the blog. Today's tip is on storage and Oracle Virtual Desktop Infrastructure.

    Question: I run Oracle Virtual Desktop 3.4.1 on Solaris and use a local ZFS storage pool. How should I configure my ZFS ARC cache?

    Answer by John Renko, Consulting Developer, Oracle: Oracle recommends about 5 GB of ARC cache per template in use to achieve up to a 90% disk read offload. Set your ARC min equal to max to reserve the maximum amount of your remaining memory for your running VMs. In /etc/system:

        set zfs:zfs_arc_min = 5368709120
        set zfs:zfs_arc_max = 5368709120

    (5368709120 bytes is 5 GB.) The amount you need to reserve will depend on your template, but this has proven to be a great start for a typical Windows 7 VM running productivity applications.

    Read the article

  • TechEd 2014 Day 2

    - by John Paul Cook
    Today people asked me about backing up older versions of SQL Server to Azure. Older versions back to SQL Server 2005 can be easily backed up to Azure Storage by installing the Microsoft SQL Server Backup to Windows Azure Tool. It installs a service of the same name that applies rules to SQL Server backups. You can tell the tool to back up or encrypt your SQL Server backups. You can have it automatically upload your backups to Azure Storage. Even if you don’t want to upload your backups to Azure, you might...(read more)

    Read the article

  • Latest Exadata Quarterly Full Stack Patch

    - by Bethany Lapaglia
    The latest Quarterly Full Stack patch for Exadata was released on October 17th. This patch contains all of the latest recommended patches for the Exadata Database Machine. For more details, refer to the My Oracle Support note titled Database Machine and Exadata Storage Server 11g Release 2 (11.2) Supported Versions [888828.1]. Note that the information in the note applies only to Exadata Storage Server Software 11.2. This note is maintained on a regular basis, so bookmark it and use it as a reference for the latest Exadata Database Machine patching recommendations.

    Read the article
