Search Results

Search found 36922 results on 1477 pages for 'custom post tags'.

Page 472/1477

  • A Look at SQL Server 2008 Change Tracking

    Before SQL Server 2008, you had to build a custom solution if you wanted to keep track of changes to the data in your tables. SQL Server 2008 has a new offering called Change Tracking that records each DML event type and the keys of the rows that were affected.
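
    For a rough idea of what this looks like in practice, here is a minimal sketch (not from the article) that enables Change Tracking and reads the changed keys with pyodbc; the database and table names (TestDb, dbo.Orders) and the connection string are hypothetical:

        import pyodbc

        # Hypothetical connection details; adjust driver/server/database for your environment.
        conn = pyodbc.connect(
            "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;"
            "DATABASE=TestDb;Trusted_Connection=yes", autocommit=True)
        cur = conn.cursor()

        # One-time setup: turn on Change Tracking for the database and for a table.
        cur.execute("ALTER DATABASE TestDb SET CHANGE_TRACKING = ON "
                    "(CHANGE_RETENTION = 2 DAYS, AUTO_CLEANUP = ON)")
        cur.execute("ALTER TABLE dbo.Orders ENABLE CHANGE_TRACKING "
                    "WITH (TRACK_COLUMNS_UPDATED = ON)")

        # Later: ask for everything that changed since the version we synced last time.
        last_sync_version = 0  # normally persisted by the consuming application
        cur.execute(
            "SELECT CT.OrderID, CT.SYS_CHANGE_OPERATION, CT.SYS_CHANGE_VERSION "
            "FROM CHANGETABLE(CHANGES dbo.Orders, ?) AS CT", last_sync_version)
        for row in cur.fetchall():
            # SYS_CHANGE_OPERATION is I, U or D; only the key columns are returned.
            print(row.OrderID, row.SYS_CHANGE_OPERATION, row.SYS_CHANGE_VERSION)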

    Read the article

  • ArchBeat Link-o-Rama for December 4, 2012

    - by Bob Rhubart
    Exalogic 2.0.1 Tea Break Snippets - Creating and using Distribution Groups | The Old Toxophilist
    "Although in many cases we, as Cloud Users, may not be too worried how the Virtualisation Algorithm decides where to place our vServers," says The Old Toxophilist, "there are cases where it is extremely important that vServers run on distinct physical compute nodes." There's plenty more on the subject in his blog post.

    Oracle Endeca (2.3) Record Level Security | Adam Seed
    Adam Seed's blog post covers "the basics of security within Endeca Information Discovery, as these basic security objects are required in order to explain the implementation of record level security."

    ODI Handling DQ | Gurcan Orhan
    Oracle ACE Director Gurcan Orhan suggests you have fun with these scripts for Oracle Data Integrator.

    Parleys Testimonial at GlassFish Community Event - JavaOne 2012
    Video of Parleys webmaster Stephan Janssen's presentation at the GlassFish Community Event at JavaOne 2012, in which he explains why Parleys moved from Tomcat to GlassFish.

    Java Spotlight Episode 109: Pete Muir on CDI 1.1
    This edition of Roger Brinkley's Java Spotlight Podcast features an interview with CDI 1.1 spec lead Pete Muir of JBoss/Red Hat. Muir talks about the features in CDI 1.1 and what to expect in the future.

    Webcast: Java Management Extensions with Oracle WebLogic Server 12c
    Dr. Frank Munz and Dave Cabelus do the talking in this on-demand webcast focused on Oracle WebLogic Server 12c with Java Management Extensions (JMX).

    Using the Coherence API to get Portable Object Format bytes | Bruno Borges
    Bruno Borges shares a code snippet that illustrates how easy it is to use the Coherence API.

    Thought for the Day
    "Experience is something you don't get until just after you need it." — Anonymous
    Source: SoftwareQuotes.com

    Read the article

  • Apache 2 UserDir for only one VirtualHost

    - by dentarg
    Is it possible to enable the UserDir directive for just one VirtualHost, rather than have it on for all and then disable it (with "UserDir disabled") for each VirtualHost you don't want it on? I have tried putting the following inside a <VirtualHost> and commenting out everything in the global config (/etc/apache2/conf.d/userdir.conf). No luck though.

        <IfModule mod_userdir.c>
            UserDir public.www
            UserDir disabled root

            <Directory /home/*/public.www>
                AllowOverride FileInfo AuthConfig Limit Indexes
                Options MultiViews Indexes SymLinksIfOwnerMatch IncludesNoExec
                <Limit GET POST OPTIONS>
                    Order allow,deny
                    Allow from all
                </Limit>
                <LimitExcept GET POST OPTIONS>
                    Order deny,allow
                    Deny from all
                </LimitExcept>
            </Directory>
        </IfModule>

    Read the article

  • Tips For Getting the Best Web Development

    For many people, getting custom web development sounds like a complicated endeavor. It is true that getting a good overall work output from a company or an individual doing web creation can turn sour fast, but you can use a few tips and ask a few questions to make sure that you're not getting swindled.

    Read the article

  • How to write PowerShell code part 2 (Using function)

    - by ybbest
    In the last post, I showed you how to use an external configuration file in your PowerShell script. In this post, I will show you how to create a PowerShell function and call an external PowerShell script. You can download the script here.

    1. In the original script, I create the site directly using the New-SPSite command. I will refactor it so that a new function creates the site using New-SPSite. A PowerShell function is quite similar to a C# method: you put your function parameters in (), separate each parameter with a comma (,), and then put your method body in {}.

        function add ([int] $num1, [int] $num2) {
            $total = $num1 + $num2
            #Return $total
            $total
        }

    2. The differences are that you do not need a semi-colon (;) at the end of each statement, and when calling the function you do not use a comma (,) to separate the parameters.

        function add ([int] $num1, [int] $num2) {
            $total = $num1 + $num2
            #Return $total
            $total
        }

        #Calling the function
        [int] $num1 = 3
        [int] $num2 = 4
        $d = add $num1 $num2
        Write-Host $d

    3. If you would like to return anything from the function, you just type the object you want to return; there is no need to type return. E.g. $ObjectToReturn, not return $ObjectToReturn.

    Read the article

  • Installing Visual Studio 2010 Service Pack 1

    - by Martin Hinshelwood
    As has become customary when the product team releases a new patch, SP or version, I like to document the install. This post seems almost redundant as I had no problems, but I think that is as valuable to others thinking of installing the Service Pack as all the problems that we sometimes get.

    As per Brian's post, I am installing Visual Studio Team Foundation Server Service Pack 1 first, and indeed, as this is a single-server local deployment, I need to install both. If I only install one it will leave the other product broken.

    Figure: Hopefully this will be more uneventful

    It takes a little while for your system to be checked to see what components need updating. On my main computer this was pretty quick, but on the laptop it took some time.

    Figure: There are a lot of components to update

    With this update also comes an update to .NET as well as many other components.

    Figure: I downloaded the full 1.5GB, but you could do a web install

    How long the download takes depends on how good your internet connection is, but as I am now in the US I decided not to trust the internet connection speeds. It took around 30-40 minutes to download the full thing, which is a little slow.

    Figure: I did not need to download, but that would increase the install time

    So on my main computer again this was fast, but again on my netbook this took a little while.

    Figure: The actual install took around 30-40 minutes (2 hours on the netbook)

    I was pretty impressed with the speed of the install, and as Team Explorer is now out of the box with Visual Studio 2010, I don't get the problem of the SP being installed before Team Explorer and having a disjointed experience.

    Figure: As I suspected, no problems with the install

    Figure: Checking in Visual Studio shows that all the servicing points were successful

    This was an easy experience, even if the SP was over 1.5GB to download. Hopefully I will be discovering things that work better for a good while to come, as well as not seeing holes in the product that I had not encountered yet. What were your experiences of installing Visual Studio 2010 Service Pack 1?

    Read the article

  • Should I stay in my degree or take an opportunity for management experience?

    - by Adam
    I've read a couple of other posts along these lines and they've been helpful, but I'm wondering if my case is any different. I've been working towards my CS degree while working part-time in a programming job. I'm now about two years away from getting my degree and was just offered a management position at my job. This would mean that I have to work full-time and can't really work towards my degree in person anymore. My school doesn't really offer CS classes after hours or online. From the other posts I read, it seems that getting a degree is very important. Does having management experience trump that? I'm currently leaning towards taking the job and finding some sort of online degree. Also, my school only offers a business degree online; could I just get that in its place? Does the type of degree really matter? For some jobs it's not the type of degree that matters, just that you have one; is there any merit to this in the programming industry? Thanks :)

    Read the article

  • 'dd' access to a drive

    - by John
    I've been working on custom bootloader and kernel code (not necessarily Linux kernel). I'm putting the images on USB, and was using dd to place them on the sector they needed to be on, and I'm getting tired of accidentally burning the image to /dev/sda instead of /dev/sdb (effectively destroying my hard drive). So I was wondering if I could somehow give my user access to the 'dd' command, but only to the /dev/sdb drive, so that if I accidentally type /dev/sda it won't let me, because I wouldn't be running the command as sudo or root.

    Read the article

  • Setting up an Ubuntu 9.10 chroot for Android OS builds

    Spencer Herzberg Blog: "So the goal of this post will be to install a 32-bit chroot of 9.10 in my fresh install of 10.04 as 9.10 is easier to build and test Cyanogen's custom Android rom. I have modified some guides from here and here. I have also elected to use schroot as it allows for easy chroot access."

    Read the article

  • Render angles of a 3D model into 2D images?

    - by Ricket
    Is there a tool out there that you can give a 3D model file, and it will output 2D renders of it from various angles? For example if you were making a 2D RPG but you want to make your character look nice, you might make the character in 3D and then just render the character from 8 or more angles into images which then are used by the 2D engine to give a pseudo-3D look. Does such a tool exist or will it need to be custom-written or done manually?
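
    One way to do this without a dedicated tool is to script it in Blender's Python API. Here is a rough sketch, not from the original question, assuming a .blend file whose scene already has a camera and lighting set up and an object named "Character" (a hypothetical name); it spins the model in 45-degree steps and renders each angle:

        # Run inside Blender, e.g.: blender model.blend --background --python render_angles.py
        import math
        import bpy

        scene = bpy.context.scene
        obj = bpy.data.objects["Character"]   # hypothetical object name

        angles = 8  # 8 views, 45 degrees apart
        for i in range(angles):
            # Rotate the model around the vertical (Z) axis and render one still per angle.
            obj.rotation_euler[2] = math.radians(i * 360 / angles)
            scene.render.filepath = f"//renders/character_{i * 360 // angles:03d}.png"
            bpy.ops.render.render(write_still=True)

    The resulting PNGs can then be packed into a sprite sheet for the 2D engine.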

    Read the article

  • Another Exchange 2003 to Exchange 2010 mail flow issue

    - by Ryan Roussel
    During a migration recently, we came across another internal mail routing issue. The symptoms were identical to my previous post about Exchange internal mail routing: mail was flowing from 2010 to 2003, and from 2010 to the internet, but not from 2003 to 2010.

    I went through the normal checklist looking at permissions, DNS, and the routing group connectors. I verified that both servers listed in the routing group connectors were the routing masters in their respective routing groups through the 2003 ESM. I also verified that inheritable permissions were enabled for the Exchange 2003 server object in the schema. No luck with either.

    For my previous post about this issue, in which inheritable permissions were the culprit: Exchange 2010, Exchange 2003 Mail Flow issue
    And for Routing Group issues: Exchange 2007 Routing Group Connector Mayhem

    I finally enabled logging on the SMTP virtual server on Exchange 2003 and the Default Receive Connector on 2010 and sent a few test e-mails, where I found 2003 was having issues authenticating to 2010. By default 2003 uses Exchange Server Authentication to communicate with 2010. The exact error, found in the SMTP logs on the Exchange 2003 side, was: 4.7.0 Temporary Authentication Failure.

    After searching based on this error, I found the solution: the "Access this computer from the network" user right in the local computer policy on the Exchange 2010 server had been changed from the default. The network administrator had modified the Default Domain policy and changed this user right assignment to list only Domain Users.

    The fix was to clear this setting in the Default Domain policy, force gpupdate to refresh the group policy settings, then ensure the appropriate users and groups were listed. This immediately fixed the problem and the Exchange 2003 server was able to route mail to the Exchange 2010 mailboxes.

    The default user rights assignments for "Access this computer from the network":
    On workstations and servers: Administrators, Backup Operators, Power Users, Users, Everyone
    On domain controllers: Administrators, Authenticated Users, Everyone

    More can be found here: http://technet.microsoft.com/en-us/library/cc740196(WS.10).aspx

    Read the article

  • wget has a 4 second delay

    - by guisius
    Hello. I have tried to wget a page with Windows/Mac, and the response is instant, while the Linux version needs to wait for 4 seconds before it shows the response. I just hope this can be solved. More information added:

    In Ubuntu:

        wget xxx://192.168.0.135/test.cgi?cmd= -O test.txt
        --2011-03-04 14:21:17-- xxx://192.168.0.135/test.cgi?cmd=
        Connecting to 192.168.0.135:80... connected.
        HTTP request sent, awaiting response... 200 OK
        Length: unspecified [text/html]
        Saving to: `test.txt'
        [ <=> ] 17 --.-K/s in 0s
        2011-03-04 14:21:22 (1.88 MB/s) - `test.txt' saved [17]

    While in Mac OS:

        wget xxx://192.168.0.135/test.cgi?cmd= -O test.txt
        --2011-03-04 14:22:33-- xxx://192.168.0.135/test.cgi?cmd=
        Connecting to 192.168.0.135:80... connected.
        HTTP request sent, awaiting response... 200 OK
        Length: unspecified [text/html]
        Saving to: `test.txt'
        [ <=> ] 17 --.-K/s in 0s
        2011-03-04 14:22:33 (755 KB/s) - `test.txt' saved [17]

    In Ubuntu it delays 4 seconds, while Windows and Mac do not. I believe it may be related to some setting in the network config, such as packet size or window frame, but I have no idea how to set this. PS: because the limits of the post do not allow posting the URL, I have masked it as xxx.

    Read the article

  • Operating System not Found after installing Ubuntu in a Sony Vaio

    - by diego8arock
    I just bought a Sony Vaio SVS15115FLB that came with Windows 7. After enjoying the PC's graphics power for a little while, I decided it was time to install Ubuntu 12.04.

    First I inserted a USB stick, rebooted and pressed F11, but a message appeared saying that no OS was found on the USB, so I used a live CD instead. It booted fine and I installed Ubuntu. Then, when it was time to restart the PC, it didn't boot to GRUB but went straight to Windows, which began a startup-error repair looking for a solution; after it was done, it restarted, booted to Windows again, and went into the same startup-error repair.

    I freaked out, so I booted the Ubuntu live CD again and installed Ubuntu over everything. After it installed I rebooted, and then a message appeared saying Operating System Not Found, and I have no idea why. So I Googled again and found this post on Boot Partition. I did everything exactly as in that post, but it didn't work (by the way, this was the message):

    The boot files of [Ubuntu 12.04 LTS] are far from the start of the disk. Your BIOS may not detect them. You may want to retry after creating a /boot partition (EXT4, >200MB, start of the disk). This can be performed via tools such as gParted. Then select this partition via the [Separate /boot partition:] option of [Boot Repair].

    It appeared the first time, then I did it all again and then it was gone. I rebooted and nothing; the same Operating System Not Found message appeared. So I decided to create a partition for Windows, hoping for something, but the message still appears.

    I really have no idea what to do, but there is something odd: if I insert the USB stick containing Ubuntu 11.10, the message that says there is no OS on the pen drive flashes for a fraction of a second and then it boots straight to Ubuntu 12.04 without problems (and it booted to Windows when I had installed it, ignoring Ubuntu). Right now I'm using it like that, but it's pretty annoying.

    Can anyone advise me how to fix this? I'm no expert on this kind of thing (boot, GRUB, recovery and stuff like that).

    Read the article

  • What is Rainbow (not the CMS)

    - by Jeremy Thompson
    I was reading this excellent blog article regarding speeding up the badge page, and in the last comment the author @waffles (a.k.a. Sam Saffron) mentions these tools: "dapper and a bunch of custom helpers like rainbow, sql builder etc". Dapper and sql builder were easy to look up, but rainbow keeps pointing me to a CMS; can someone please point me to the real source? Thanks. Obviously the architecture of these [SE] sites is uber cool and ultra fast, so no comments on that, thanks.

    Read the article

  • Monitoring Baseline

    - by Grant Fritchey
    Knowing what's happening on your servers is important; that's monitoring. Knowing what happened on your servers is establishing a baseline. You need to do both. I really enjoyed this blog post by Ted Krueger (blog|twitter). It's not enough to know what happened in the last hour or yesterday; you need to compare today to last week, especially if you released software this weekend. You need to compare today to 30 days ago in order to begin to establish future projections. How your data has changed over 30 days is a great indicator of how it's going to change over the next 30. No, it's not perfect, but predicting the future is not exactly a science; just ask your local weatherman.

    Red Gate's SQL Monitor can show you the last week, the last 30 days, the last year, or all the data you've collected (if you choose to keep a year's worth of data or more, please have PLENTY of storage standing by). You have a lot of choice and control here over how much data you store. Here's the configuration window showing how you can set this up:

    This is for version 2.3 of SQL Monitor, so if you're running an older version, you might want to update. The key point is that a baseline simply represents a moment in time on your server. The ability to compare now to then is what you're looking for in order to really have a useful baseline, as Ted lays out so well in his post.
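
    To make the idea concrete, here is a small illustrative Python sketch (not SQL Monitor itself, just the same reasoning) that compares the latest sample against a 30-day baseline and makes a naive linear projection; the CSV of daily database-size samples is hypothetical:

        import pandas as pd

        # Hypothetical input: one row per day with columns 'date' and 'size_gb'.
        df = (pd.read_csv("db_size_samples.csv", parse_dates=["date"])
                .set_index("date")
                .sort_index())

        # Baseline window: the last 30 days of samples.
        baseline = df[df.index >= df.index.max() - pd.Timedelta(days=30)]
        daily_growth = baseline["size_gb"].diff().mean()   # average growth per day

        current = df["size_gb"].iloc[-1]
        projected = current + 30 * daily_growth            # naive 30-day projection

        print(f"Current size: {current:.1f} GB")
        print(f"Average daily growth over baseline: {daily_growth:.2f} GB/day")
        print(f"Projected size 30 days out: {projected:.1f} GB")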

    Read the article

  • Looking for a disk manager that has options for setting allocation sizes in partitions

    - by mango
    I'm looking for a GUI program, compatible with Ubuntu 13.10 Server x86-64, that has all the features of GParted but also allows setting a custom allocation size when creating a partition. E.g.: the ability to create a 4 GB FAT32 partition with a 32-kilobyte allocation size. Please don't suggest a terminal-only application, no matter how awesome it might be, because that's not what I asked. Wow, I come off like a right up prick when I write, eh?

    Read the article

  • Synonyms made easy

    The Custom Search team is always working to provide more relevant results, and improving user queries is a big part of that goal. We've shown you how to...

    Read the article

  • ORAchk version 2.2.5 is now available for download

    - by Gerry Haskins
    Those awfully nice ORAchk folks have asked me to let you know about their latest release... ORAchk version 2.2.5 is now available for download. New features in 2.2.5:

    Running checks for multiple databases in parallel
    Ability to schedule multiple automated runs via the ORAchk daemon
    New "scratch area" for ORAchk temporary files, moved from /tmp to a configurable $HOME directory location
    System health score calculation now ignores skipped checks
    Checks the health of pluggable databases using OS authentication
    New report section listing the top 10 time-consuming checks, to be used for optimizing runtime in the future
    More readable report output for clusterwide checks
    Includes over 50 new Health Checks for the Oracle Stack
    Provides a single dashboard to view collections across your entire enterprise using the Collection Manager, now pre-bundled
    Expands coverage of pre- and post-upgrade checks to include standalone databases, with new profile options to run only these checks
    Expands to additional product areas in E-Business Suite (Workflow and Oracle Purchasing) and in Enterprise Manager Cloud Control

    ORAchk has replaced the popular RACcheck tool, extending the coverage based on prioritization of top issues reported by users, to proactively scan for known problems within the areas of:

    Oracle Database: Standalone Database, Grid Infrastructure & RAC, Maximum Availability Architecture (MAA) Validation, Upgrade Readiness Validation, Golden Gate
    Enterprise Manager Cloud Control Repository
    E-Business Suite: Oracle Payables (R12 only), Oracle Workflow, Oracle Purchasing (R12 only)
    Oracle Sun Systems
    Oracle Solaris

    ORAchk features:

    Proactively scans for the most impactful problems across the various layers of your stack
    Streamlines how to investigate and analyze which known issues present a risk to you
    Executes lightweight checks in your environment, providing immediate results with no configuration data sent to Oracle
    Local reporting capability showing specific problems and their resolutions
    Ability to configure email notifications when problems are detected
    Provides a single dashboard to view collections across your entire enterprise using the Collection Manager

    ORAchk will expand in the future with high-impact checks in existing and additional product areas. If you have particular checks or product areas you would like to see covered, please post suggestions in the ORAchk subspace in My Oracle Support Community. For more details about ORAchk see Document 1268927.2.

    Read the article

  • Introduction to Human Workflow 11g

    - by agiovannetti
    Human Workflow is a component of SOA Suite, just like BPEL, Mediator, Business Rules, etc. The Human Workflow component allows you to incorporate human intervention in a business process. You can use Human Workflow to create a business process that requires a manager to approve purchase orders greater than $10,000, or a business process that handles article reviews in which a group of reviewers needs to vote/approve an article before it gets published. Human Workflow can handle the task assignment and routing as well as the generation of notifications to the participants.

    There are three common patterns or usages of Human Workflow:
    1) Approval scenarios: manage documents and other transactional data through approval chains. For example: expense report approval, vacation approval, hiring approval, etc.
    2) Reviews by multiple users or groups: group collaboration and review of documents or proposals. For example, processing a sales quote which is subject to review by multiple people.
    3) Case management: workflows around work management or case management. For example, processing a service request. This could be routed to various people who all need to modify the task. It may also incorporate ad hoc routing which is unknown at design time.

    SOA 11g Human Workflow includes the following features:
    Assignment and routing of tasks to the correct users or groups.
    Deadlines, escalations, notifications, and other features required for ensuring the timely performance of a task.
    Presentation of tasks to end users through a variety of mechanisms, including a Worklist application.
    Organization, filtering, prioritization and other features required for end users to productively perform their tasks.
    Reports, reassignments, load balancing and other features required by supervisors and business owners to manage the performance of tasks.

    Human Workflow Architecture

    The Human Workflow component is divided into three modules: the service interface, the task definition and the client interface module. The Service Interface handles the interaction with BPEL and other components. The Client Interface handles the presentation of task data through clients like the Worklist application, portals and notification channels. The task definition module is in charge of managing the lifecycle of a task: who should get the task assigned? What should happen next with the task? When must the task be completed? Should the task be escalated? Etc.

    Stages and Participants

    When you create a Human Task you need to specify how the task is assigned and routed. The first step is to define the stages and participants. A stage is just a logical group. A participant can be a user, a group of users or an application role. The participants indicate the type of assignment and routing that will be performed. Stages can be sequential or in parallel, and you can combine them to create any usage you require. See the diagram below:

    Assignment and Routing

    There are different ways a task can be assigned and routed:
    Single Approver: the task is assigned to a single user, group or role. For example, a vacation request is assigned to a manager. When the manager approves or rejects the request, the employee is notified of the decision. If the task is assigned to a group, then once one of the managers acts on it, the task is completed.
    Parallel: the task is assigned to a set of people who must work in parallel. This is commonly used for voting. For example, a task gets approved once 50% of the participants approve it. You can also set it up to be a unanimous vote.
    Serial: participants must work in sequence. The most common scenario for this is management chain escalation.
    FYI (For Your Information): the task is assigned to participants who can view it and add comments and attachments, but cannot modify or complete the task.

    Task Actions

    The following is the list of actions that can be performed on a task:
    Claim: if a task is assigned to a group or multiple users, the task must be claimed first before anyone can act on it.
    Escalate: if the participant is not able to complete a task, he/she can escalate it. The task is reassigned to his/her manager (up one level in a hierarchy).
    Pushback: the task is sent back to the previous assignee.
    Reassign: if the participant is a manager, he/she can delegate a task to his/her reports.
    Release: if a task is assigned to a group or multiple users, it can be released if the user who claimed the task cannot complete it. Any of the other assignees can then claim and complete the task.
    Request Information and Submit Information: used when the participant needs to supply more information or to request more information from the task creator or any of the previous assignees.
    Suspend and Resume: if a task is not relevant, it can be suspended. A suspension is indefinite; it does not expire until Resume is used to resume working on the task.
    Withdraw: if the creator of a task does not want to continue with it, for example he wants to cancel a vacation request, he can withdraw the task. The business process determines what happens next.
    Renew: if a task is about to expire, the participant can renew it. The task expiration date is extended by one week.

    Notifications

    Human Workflow provides a mechanism for sending notifications to participants to alert them of changes on a task. Notifications can be sent via email, telephone voice message, instant messaging (IM) or short message service (SMS). Notifications can be sent when the task status changes to any of the following:
    Assigned/renewed/delegated/reassigned/escalated
    Completed
    Error
    Expired
    Request Info
    Resume
    Suspended
    Added/updated comments and/or attachments
    Updated outcome
    Withdraw
    Other actions (e.g. acquiring a task)

    Here is an example of an email notification:

    Worklist Application

    The Oracle BPM Worklist application is the default user interface included in SOA Suite. It allows users to access and act on tasks that have been assigned to them. For example, from the Worklist application a loan agent can review loan applications, or a manager can approve employee vacation requests. Through the Worklist Application users can:
    Perform authorized actions on tasks, acquire and check out shared tasks, define personal to-do tasks and define subtasks.
    Filter the task view based on various criteria.
    Work with standard work queues, such as high priority tasks, tasks due soon and so on. Work queues allow users to create a custom view to group a subset of tasks in the worklist, for example high priority tasks, tasks due in 24 hours, expense approval tasks and more.
    Define custom work queues.
    Gain proxy access to part of another user's tasks.
    Define custom vacation rules and delegation rules.
    Enable group owners to define task dispatching rules for shared tasks.
    Collect a complete workflow history and audit trail.
    Use digital signatures for tasks.
    Run reports like Unattended tasks, Tasks productivity, etc.

    Here is a screenshot of what the Worklist Application looks like. On the right-hand side you can see the tasks that have been assigned to the user and the task's detail.

    References
    Introduction to SOA Suite 11g Human Workflow Webcast
    Note 1452937.2 Human Workflow Information Center
    Using the Human Workflow Service Component 11.1.1.6
    Human Workflow Samples
    Human Workflow APIs Java Docs

    Read the article

  • What is the "default" software license?

    - by Tesserex
    If I release some code and binaries but I don't include any license at all with it, what are the legal terms that apply by default (in the US, where I am)? I know that I automatically have copyright without doing anything, but what restrictions are there on it? If I upload my code to GitHub and announce it as a free download / contribute at will, then are people allowed to modify and close-source my work? I haven't said that they cannot, as a GPL would, but I don't feel that it would by default be acceptable to steal my work either. So what can and cannot people do with code that is freely available but has absolutely no licensing terms attached? By the way, I know that it would be a good idea for me to pick a license and apply it to my code soon, but I'm still curious about this.

    Edit: Thanks! So it looks like the consensus is that it starts out very restricted, and then my actions imply any further rights. If I just put software on my website with no security, it would be an infringement to download it. If I post a link to that download on a forum, then that would implicitly give permission to use it for free, but not distribute it or its derivatives (but you can modify it for your own use). If I put it on GitHub, then it is conveyed as FOSS. Again, this is probably not codified exactly in law, but may be enough to be defensible in court. It's still a good idea to post a complete license to be safe.

    Read the article

  • SQL Rally Nordic & Amsterdam slides & demos

    - by Davide Mauri
    Last week I had the pleasure to speak at two GREAT conferences (as you can see from the wordclouds I've posted, here for Stockholm and here for Amsterdam; I used two different filtering techniques to produce the wordclouds, which is why they look different. I'm playing a lot with R these days... so I like to experiment with different stuff). The workshop with my friend Thomas Kejser on "Data Warehouse Modeling – Making the Right Choices" and my sessions on "Automating DWH Patterns through Metadata" have been really appreciated by attendees, given the amount of good feedback I had on Twitter and in some blog posts (here and here). Of course many asked for slides & demos to download, so here you are!

    Automating DWH Patterns through Metadata – Stockholm: http://sdrv.ms/1bcRAaW
    Automating DWH Patterns through Metadata – Amsterdam: http://sdrv.ms/1cNDAex

    I'm still trying to understand if and how I can publicly post the slides & demos of the workshop, so for those you will have to wait a couple of days. I will post about it as soon as possible. Anyway, if you were in the workshop and would like to get the slides & demos ASAP, just send me an email and I'll happily send the protected link to my SkyDrive folder to you. Enjoy!

    Read the article

  • Google I/O 2012 - Computing Map Tiles with Go on App Engine

    Google I/O 2012 - Computing Map Tiles with Go on App Engine
    Chris Broadfoot, Andrew Gerrand
    In this talk we use the Maps API and Go on App Engine to build an app that builds custom tile sets for Google Maps. The app demonstrates Go's suitability for computation in the cloud and App Engine's key scalability features, such as Task Queues and Backends. For all I/O 2012 sessions, go to developers.google.com
    From: GoogleDevelopers | Views: 1170 | 21 ratings | Time: 47:22 | More in Science & Technology

    Read the article
