Search Results

Search found 14145 results on 566 pages for 'level of detail'.


  • A Review of From Zero To SSIS Training

    - by andyleonard
    I recently (5-9 Mar 2012) delivered my five-day SSIS training course – From Zero To SSIS! – in London. The class was delivered in collaboration with TechniTrain. I must commend Chris Webb (blog | @Technitrain) and Helen Lau on their leadership, professionalism, and attention to detail. They made the course a breeze for the students and the instructor! It was a pleasure and privilege to work with them. In addition to people just learning data integration, this class contained several experienced...(read more)

    Read the article

  • what's included in a typical computer architecture class? [closed]

    - by sq1020
    Does this description fit what's usually included in a computer architecture class? Computer Organization and Assembly Language – An introduction to the hardware organization and assembly language of the Intel processor. Topics include memory hierarchy and design, CPU design, pipelining, addressing modes, subroutine linkage, polled input/output, interrupts, high-level language interfacing, and macros.

    Read the article

  • Webmaster Tools: root and subdirectories?

    - by nick
    We have all our international sites on our .com domain like this: site.com/uk, site.com/us, etc. When creating the sites in Webmaster Tools I've created separate sites and submitted sitemaps for each directory so that we can appropriately geotarget them. Is it also recommended to add the root .com with its geotargeting set to international? If so, should I also add all the separate sitemaps (like /us/sitemap.xml) even though they have already been added to the directory-level sites?

    Read the article

  • What could be the Java successor Oracle wants to invest in?

    - by deamon
    I've read that Oracle wants to invest in a language other than Java: "On the other hand, Oracle has been particularly supportive of alternative JVM languages. Adam Messinger ( http://www.linkedin.com/in/adammessinger ) was pretty blunt at the JVM Languages Summit this year about Java the language reaching it's logical end and how Oracle is looking for a 'higher level' language to 'put significant investment into.'" But which language could be the one Oracle wants to invest in? Is there any candidate other than Scala?

    Read the article

  • Cannot get Virtualbox to install properly on Ubuntu 12.04

    - by lopac1029
    I cannot get VirtualBox to install properly on my 12.04. I first went with a manual install of the .deb from the old builds section of the VirtualBox page. That .deb opened the Software Center and installed. Then I got this error: "VT-x/AMD-V hardware acceleration is not available on your system. Your 64-bit guest will fail to detect a 64-bit CPU and will not be able to boot." I can only assume this was because my Ubuntu version is 32-bit (System Details - Overview - OS type: 32-bit, right?). So I followed these instructions to remove the .deb manually, restarted my laptop, and then found the actual VirtualBox package in the Software Center and installed from that (assuming it would give me the correct version for my system). After all that (and then some), I'm still getting the same error when I connect to my new job's project in VirtualBox. Can anyone point me in the right direction here? This is the first time I've ever worked with VirtualBox, and no one at this company is using Ubuntu, so I'm on my own. EDIT: Here is the direct output from running the two suggested commands:

    Inspiron-1750-brick:~ $ lscpu
    Architecture:          i686
    CPU op-mode(s):        32-bit, 64-bit
    Byte Order:            Little Endian
    CPU(s):                2
    On-line CPU(s) list:   0,1
    Thread(s) per core:    1
    Core(s) per socket:    2
    Socket(s):             1
    Vendor ID:             GenuineIntel
    CPU family:            6
    Model:                 23
    Stepping:              10
    CPU MHz:               2100.000
    BogoMIPS:              4189.45
    L1d cache:             32K
    L1i cache:             32K
    L2 cache:              2048K

    Inspiron-1750-brick:~ $ cat /proc/cpuinfo
    processor       : 0
    vendor_id       : GenuineIntel
    cpu family      : 6
    model           : 23
    model name      : Intel(R) Core(TM)2 Duo CPU T6500 @ 2.10GHz
    stepping        : 10
    microcode       : 0xa07
    cpu MHz         : 1200.000
    cache size      : 2048 KB
    physical id     : 0
    siblings        : 2
    core id         : 0
    cpu cores       : 2
    apicid          : 0
    initial apicid  : 0
    fdiv_bug        : no
    hlt_bug         : no
    f00f_bug        : no
    coma_bug        : no
    fpu             : yes
    fpu_exception   : yes
    cpuid level     : 13
    wp              : yes
    flags           : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe nx lm constant_tsc arch_perfmon pebs bts aperfmperf pni dtes64 monitor ds_cpl est tm2 ssse3 cx16 xtpr pdcm sse4_1 xsave lahf_lm dtherm
    bogomips        : 4189.80
    clflush size    : 64
    cache_alignment : 64
    address sizes   : 36 bits physical, 48 bits virtual
    power management:

    processor       : 1
    vendor_id       : GenuineIntel
    cpu family      : 6
    model           : 23
    model name      : Intel(R) Core(TM)2 Duo CPU T6500 @ 2.10GHz
    stepping        : 10
    microcode       : 0xa07
    cpu MHz         : 1200.000
    cache size      : 2048 KB
    physical id     : 0
    siblings        : 2
    core id         : 1
    cpu cores       : 2
    apicid          : 1
    initial apicid  : 1
    fdiv_bug        : no
    hlt_bug         : no
    f00f_bug        : no
    coma_bug        : no
    fpu             : yes
    fpu_exception   : yes
    cpuid level     : 13
    wp              : yes
    flags           : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe nx lm constant_tsc arch_perfmon pebs bts aperfmperf pni dtes64 monitor ds_cpl est tm2 ssse3 cx16 xtpr pdcm sse4_1 xsave lahf_lm dtherm
    bogomips        : 4189.45
    clflush size    : 64
    cache_alignment : 64
    address sizes   : 36 bits physical, 48 bits virtual
    power management:

    Read the article

  • Ideas for attack damage algorithm (language irrelevant)

    - by Dillon
    I am working on a game and I need ideas for the damage that will be done to the enemy when the player attacks. The enemy's total health is called enemyHealth and has a value of 1000. You start off with a weapon that does 40 points of damage (may be changed). The player has an attack stat that you can increase, called playerAttack. This value starts off at 1 and has a possible max value of 100 after you level it up many times and make it farther into the game. The damage the weapon does is cut and dried: it subtracts 40 points from the total 1000 points of health every time the enemy is hit. What playerAttack does is add to that value by a percentage. Here is the algorithm I have now (I've taken out all of the GUI, classes, etc. and given the variables straightforward names):

    double totalDamage = weaponDamage + (weaponDamage * (playerAttack * .05));
    enemyHealth -= (int)totalDamage;

    This seemed to work great for the most part, so I started testing some values...

    // enemyHealth ALWAYS starts at 1000
    weaponDamage = 50;
    playerAttack = 30;

    If I set these values, the amount of damage done to the enemy is 125. That seemed like a good number, so I wanted to see what would happen if the player's attack was maxed out but with the weakest starting weapon.

    weaponDamage = 50;
    playerAttack = 100;

    The totalDamage ends up being 300, which would kill an enemy in just a few hits. Even with attack that high, I wouldn't want the weakest weapon to be able to kill the enemy that fast. I thought about adding defense, but I feel the game will lose consistency and become unbalanced in the long run. Possibly a well-designed decrease modifier would work for lower-level weapons, or something like that. I just need a break from trying to figure out the best way to go about this, and maybe someone with experience in games and keeping leveling consistent could give me some ideas/pointers.
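
    One way to keep a maxed-out attack stat from trivializing the starting weapon is to let the attack bonus grow with diminishing returns instead of linearly. Below is a minimal C# sketch using the variable names from the question; the square-root curve and the 1.5x damage cap are illustrative assumptions, not something prescribed in the original post.

    using System;

    class DamageModel
    {
        // Hedged sketch: weaponDamage is the weapon's base damage, playerAttack ranges 1..100.
        public static int ComputeDamage(int weaponDamage, int playerAttack)
        {
            // Square-root scaling flattens the curve: attack 30 gives ~0.27, attack 100 gives 0.5.
            double attackBonus = Math.Sqrt(playerAttack) * 0.05;

            // Cap the bonus so a weapon never does more than 1.5x its base damage (assumed cap).
            attackBonus = Math.Min(attackBonus, 0.5);

            return (int)(weaponDamage + weaponDamage * attackBonus);
        }
    }

    With these numbers the maxed-out attack and the 50-damage starting weapon deal 75 per hit (about 14 hits for a 1000-HP enemy) instead of the 300 the linear formula produces, while low attack values still show visible progress.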

    Read the article

  • APEX 4.2: New Features for Interactive Reports

    - by carstenczarski
    APEX 4.2 has been available for download since October 2012. That the focus of this release is on developing APEX applications for smartphones - based on jQuery Mobile and HTML5 charts - should by now be known almost everywhere. But that is not all: APEX 4.2 brings even more new features. A lot has happened in the area of interactive reports: although only one interactive report per page is still possible, there are nonetheless some interesting innovations, and this tip presents them in detail: formatting interactive report columns with HTML expressions; email subscriptions with a sender address and easy unsubscribe; PL/SQL access to interactive reports via APEX_IR; linguistic search in an interactive report; and further new features.

    Read the article

  • Dell Synaptics touchpad's middle mouse button gets mapped as normal click

    - by Henrik
    How do I make the touchpad's middle button work? xinput --test 11 yields "button press 1" for presses of both the left and the middle button. I have tried xinput set-button-map 11 1 4 2 and so on, but since --test shows button 1 being pressed either way, the issue is probably at a lower level than X11's perception of which mouse button I'm pressing (likewise, assigning button-map 11 1 2 3 and clicking the right button on a link in Firefox wouldn't trigger the middle-click on the link).

    Read the article

  • RiverTrail - JavaScript GPGPU Data Parallelism

    - by JoshReuben
    Where is WebCL? The Khronos WebCL working group is working on a JavaScript binding to the OpenCL standard so that HTML5-compliant browsers can host GPGPU web apps - e.g. for image processing or physics for WebGL games - http://www.khronos.org/webcl/ . While Nokia & Samsung have some prototype WebCL APIs, Intel has one-upped them with a higher level of abstraction: RiverTrail.

    Intro to RiverTrail: Intel Labs' JavaScript RiverTrail provides GPU-accelerated SIMD data-parallelism in web applications via a familiar JavaScript programming paradigm. It extends JavaScript with simple deterministic data-parallel constructs that are translated at runtime into a low-level hardware abstraction layer. With its high-level JS API, programmers do not have to learn a new language or explicitly manage threads, orchestrate shared-data synchronization or scheduling. It has been proposed as a draft specification to ECMA (known as an ECMA strawman). RiverTrail runs in all popular browsers (except I.E., of course). To get started, download a prebuilt version https://github.com/downloads/RiverTrail/RiverTrail/rivertrail-0.17.xpi , install Intel's OpenCL SDK http://www.intel.com/go/opencl and try out the interactive River Trail shell http://rivertrail.github.com/interactive . For a video overview, see http://www.youtube.com/watch?v=jueg6zB5XaM .

    ParallelArray: the ParallelArray type is the central component of this API and is a JS object that contains ordered collections of scalars - i.e. multidimensional uniform arrays. A shape property describes the dimensionality and size - e.g. a 2D RGBA image will have shape [height, width, 4]. ParallelArrays are immutable and fluent - they are manipulated by invoking methods on them which produce new ParallelArray objects. ParallelArray supports several constructors over arrays, functions and even the canvas.

    // Create an empty ParallelArray
    var pa = new ParallelArray(); // pa0 = <>

    // Create a ParallelArray out of a nested JS array.
    // Note that the inner arrays are also ParallelArrays
    var pa = new ParallelArray([ [0,1], [2,3], [4,5] ]); // pa1 = <<0,1>, <2,3>, <4,5>>

    // Create a two-dimensional ParallelArray with shape [3, 2] using the comprehension constructor
    var pa = new ParallelArray([3, 2], function(iv){return iv[0] * iv[1];}); // pa7 = <<0,0>, <0,1>, <0,2>>

    // Create a ParallelArray from canvas. This creates a PA with shape [w, h, 4]
    var pa = new ParallelArray(canvas); // pa8 = CanvasPixelArray

    ParallelArray exposes fluent API functions that take an elemental JS function for data manipulation: map, combine, scan, filter, and scatter, which return a new ParallelArray. Other functions are scalar - reduce returns a scalar value and get returns the value located at a given index. The onus is on the developer to ensure that the elemental function does not defeat data-parallelization optimization (avoid global variable manipulation and recursion). For reduce and scan, order is not guaranteed - the onus is on the dev to provide an elemental function that is commutative and associative so that scan will be deterministic - e.g. Sum is associative, but Avg is not.

    map - Applies a provided elemental function to each element of the source array and stores the result in the corresponding position in the result array. The map method is shape preserving and index free - it cannot inspect neighboring values.

    // Adding one to each element.
    var source = new ParallelArray([1,2,3,4,5]);
    var plusOne = source.map(function inc(v) {
        return v+1;
    }); // <2,3,4,5,6>

    combine - Similar to map, except an index is provided. This allows elemental functions to access elements from the source array relative to the one at the current index position. While the map method operates on the outermost dimension only, combine can choose how deep to traverse - it provides a depth argument to specify the number of dimensions it iterates over. The elemental function of combine accesses the source array and the current index within it - an element is computed by calling the get method of the source ParallelArray object with index i as argument. It requires more code but is more expressive.

    var source = new ParallelArray([1,2,3,4,5]);
    var plusOne = source.combine(function inc(i) {
        return this.get(i)+1;
    });

    reduce - Reduces the elements of an array to a single scalar result - e.g. Sum.

    // Calculate the sum of the elements
    var source = new ParallelArray([1,2,3,4,5]);
    var sum = source.reduce(function plus(a,b) {
        return a+b;
    });

    scan - Like reduce, but stores the intermediate results - returns a ParallelArray whose ith element is the result of using the elemental function to reduce the elements between 0 and i in the original ParallelArray.

    // do a partial sum
    var source = new ParallelArray([1,2,3,4,5]);
    var psum = source.scan(function plus(a,b) {
        return a+b;
    }); // <1, 3, 6, 10, 15>

    scatter - A reordering function - specify for a certain source index where it should be stored in the result array. An optional conflict function can prevent an exception if two source values are assigned the same position of the result:

    var source = new ParallelArray([1,2,3,4,5]);
    var reorder = source.scatter([4,0,3,1,2]); // <2, 4, 5, 3, 1>

    // if there is a conflict use the max; use 33 as a default value.
    var reorder = source.scatter([4,0,3,4,2], 33, function max(a, b) {
        return a>b?a:b;
    }); // <2, 33, 5, 3, 4>

    filter

    // filter out values that are not even
    var source = new ParallelArray([1,2,3,4,5]);
    var even = source.filter(function even(iv) {
        return (this.get(iv) % 2) == 0;
    }); // <2,4>

    Flatten - Used to collapse the outer dimensions of an array into a single dimension.

    pa = new ParallelArray([ [1,2], [3,4] ]); // <<1,2>,<3,4>>
    pa.flatten(); // <1,2,3,4>

    Partition - Used to restore the original shape of the array.

    var pa = new ParallelArray([1,2,3,4]); // <1,2,3,4>
    pa.partition(2); // <<1,2>,<3,4>>

    Get - Returns the value found at the indices, or undefined if no such value exists.

    var pa = new ParallelArray([0,1,2,3,4], [10,11,12,13,14], [20,21,22,23,24])
    pa.get([1,1]); // 11
    pa.get([1]); // <10,11,12,13,14>

    Read the article

  • Installing Oracle Block Browser and Editor tool (bbed)

    What if you could directly read and manipulate data at the block level? Oracle provides such a tool to do exactly that, but you have to build it yourself. The Block Browser and Editor tool, or bbed for short, is your ticket into the contents of data blocks within an Oracle database.

    Read the article

  • How should programmers handle identity theft?

    - by Craige
    I recently signed up for an iTunes account and found that somebody had fraudulently used MY email address to register their iTunes account. Why Apple did not validate the email address, I will never know. Now I am told that I cannot use my email address to register a new iTunes account, as this email address is linked to an existing account. This got me thinking: as developers, database administrators, technical analysts, and everything in between, how should we handle reports of a fraudulent account? Experience teaches us never to re-assign identifying credentials. This can break things and/or cause mass confusion, especially on the web. That is, if we need to reassign an identifying user credential, we can very likely break a user's bookmarks by making a page render data that previously did not exist at that location. So if we have been taught not to re-assign details like these, how should we handle a case where an account is discovered to be fraudulent and the owner of the identity (e-mail or user name) wishes to claim this detail for their own account?

    Read the article

  • Oracle Database 12c is here!

    - by Maria Colgan
    Oracle Database 12c was officially released today and is now available for download. Along with the software release comes a whole new set of collateral that explains in detail all of the new features and functionality you will find in this release. The Optimizer page on Oracle.com has all the juicy details about what you can expect from the Optimizer in Oracle Database 12c. There you will find the following three new white papers: What to expect from the Oracle Optimizer in Oracle Database 12c; SQL Plan Management with Oracle Database 12c; Understanding Optimizer Statistics with Oracle Database 12c. Over the coming months we will also present an in-depth series of blog posts on all of the cool new Optimizer features in 12c, so stay tuned for that and happy reading! +Maria Colgan

    Read the article

  • Jazz up your web forms using jQuery animation effects

    - by bipinjoshi
    In this part I cover how to add jazz to your web forms using jQuery effects. jQuery provides a set of methods that allow you to create animations in your web pages; collectively these methods are called effects. The effects they render include fading in and out, sliding in and out, changing the opacity of elements, hiding and showing elements, and so on. You can, of course, define custom animations. In this part we will use these effects to develop a tooltip, a master-detail listing, and a progress indicator. http://www.bipinjoshi.net/articles/9b1f4a81-ae07-4859-8ff2-067e5887adbd.aspx

    Read the article

  • SQL SERVER – Weekly Series – Memory Lane – #031

    - by Pinal Dave
    Here is the list of selected articles of SQLAuthority.com across all these years. Instead of just listing all the articles, I have selected a few of my favorite articles and listed them here with additional notes below each one. Let me know which of the following is your favorite article from memory lane.

    2007

    Find Table without Clustered Index – Find Table with no Primary Key
    A clustered index is a very important concept for any table; it impacts performance very heavily. Here is a quick script to find tables without a clustered index.

    Replace TEXT with VARCHAR(MAX) – Stop using TEXT, NTEXT, IMAGE Data Types
    Question: "Is VARCHAR(MAX) big enough to store the TEXT field?" Answer: "Yes, VARCHAR(MAX) is big enough to accommodate a TEXT field. The TEXT, NTEXT and IMAGE data types of SQL Server 2000 will be deprecated in a future version of SQL Server. SQL Server 2005 provides backward compatibility to these data types, but it is recommended to use the new data types, which are VARCHAR(MAX), NVARCHAR(MAX) and VARBINARY(MAX)."

    Limiting Result Sets by Using TABLESAMPLE – Examples
    Introduced in SQL Server 2005, TABLESAMPLE allows you to extract a sampling of rows from a table in the FROM clause. The rows retrieved are random and not in any order. This sampling can be based on a percentage or a number of rows. You can use TABLESAMPLE when only a sampling of rows is necessary for the application instead of a full result set.

    User Defined Functions (UDF) Limitations
    UDFs have their own advantages and usage, but in this article we look at their limitations: things UDFs cannot do, and why stored procedures are considered more flexible than UDFs. This blog post is a good read to learn the limitations of UDFs.

    Change Database Compatible Level – Backward Compatibility
    For a long time SQL Server stayed on compatibility level 80, which is that of SQL Server 2000. However, as soon as SQL Server 2005 was introduced, compatibility became quite a major issue. Since then MS has been releasing new versions every 2-3 years, so changing the compatibility level is an ever-popular topic. In this blog post we learn how to do it using T-SQL. We can also do the same using SSMS, and here is the blog post for that: Change Database Compatible Level – Backward Compatibility – Part 2 – Management Studio.

    Constraint on VARCHAR(MAX) Field To Limit It Certain Length
    How can I limit a VARCHAR(MAX) field to a maximum length of 12,500 characters only? The question was valid, as the application allowed 12,500 characters. First of all, the requirement is a bit strange, but if someone wants to do it, they can, as described in this blog post.

    2008

    UNPIVOT Table Example
    Understanding UNPIVOT can be very complicated at times. In this blog post I have attempted to explain the concept in very simple words.

    Create Default Constraint Over Table Column
    A simple, straight-to-script blog post – I still use it quite often for my own reference.

    UDF – Get the Day of the Week Function
    It took me four iterations to find this very simple function, which can immediately get the day of the week in a single line.

    2009

    Find Hostname and Current Logged In User Name
    There are two tricks listed in this blog post with which users can find the hostname and the currently logged-in user name immediately and very easily.

    Interesting Observation of Logon Trigger On All Servers
    When I was doing a project, I made an interesting observation of a logon trigger executing multiple times. It was absolutely unexpected for me! As I was logging in only once, I naturally expected the entry only once. However, it fired multiple times on different threads – indeed an eccentric phenomenon at first sight!

    Difference Between Candidate Keys and Primary Key
    One needs to be very careful in selecting the primary key, as an incorrect selection can adversely impact the database architecture and future normalization. For a candidate key to qualify as a primary key, it should be non-NULL and unique in any domain. I have observed quite often that primary keys are seldom changed. I would like to have your feedback on not changing a primary key.

    Create Multiple Filegroup For Single Database
    Why should one create multiple filegroups for a database, and what are the advantages of doing so? In this blog post I explain the same in detail.

    List All Objects Created on All Filegroups in Database
    In this blog post we discuss the essential question: "How can I find which object belongs to which filegroup? Is there any way to know this?"

    2010

    DATE and TIME in SQL Server 2008
    When DATE is converted to DATETIME it adds the time of midnight. When TIME is converted to DATETIME it adds the date of 1900, and that is something to consider if you are going to run scripts from SQL Server 2008 against an earlier version with CONVERT.

    Disabled Index and Update Statistics
    If you do not need a nonclustered index, I suggest you drop it, as keeping it disabled is an overhead on your system. This is because every time the statistics are updated for the system, the statistics for disabled indexes are also updated.

    Precision of SMALLDATETIME – A 1 Minute Precision
    The precision of the datatype SMALLDATETIME is 1 minute. It discards the seconds by rounding up or rounding down any seconds greater than zero.

    2011

    Getting Columns Headers without Result Data – SET FMTONLY ON
    SET FMTONLY ON returns only metadata to the client. It can be used to test the format of the response without actually running the query. When this setting is ON, the result set has only the headers of the results but no data.

    Copy Database from Instance to Another Instance – Copy Paste in SQL Server
    SQL Server has a feature which copies a database from one instance to another, and it can be automated as well using SSIS. Make sure you have SQL Server Agent turned on, as this feature will create a job.

    Puzzle – SELECT * vs SELECT COUNT(*)
    If you have ever wondered why SELECT * gives an error when executed alone but SELECT COUNT(*) does not, you should read this blog post.

    Creating All New Database with Full Recovery Model
    This blog post is based on a very interesting story where the user wants something to happen by default for every single new database created. The model database is a secret weapon which should be used very carefully and with proper evaluation. If used carefully, it can be very beneficial when we need newly created databases to behave in a certain fashion.

    2012

    In 2012 I ran two interesting series on the blog. If there is no fun in learning, the learning becomes a burden. For that reason, I decided to build a three-part quiz around SEQUENCE. The quiz was to identify the next value of the sequence. I encourage all of you to take part in this fun quiz.

    Guess the Next Value – Puzzle 1
    Guess the Next Value – Puzzle 2
    Guess the Next Value – Puzzle 3

    Can anyone remember their final day of schooling? This is probably a silly question because – of course you can! Many people mark this as the most exciting, happiest day of their life. It marks the end of testing, the end of following rules set by teachers, and the beginning of finally being able to earn money and work in your chosen field. Read the five-part series on the subject of developer training:

    Developer Training - Importance and Significance - Part 1
    Developer Training – Employee Morals and Ethics – Part 2
    Developer Training – Difficult Questions and Alternative Perspective - Part 3
    Developer Training – Various Options for Developer Training – Part 4
    Developer Training – A Conclusive Summary - Part 5

    Reference: Pinal Dave (http://blog.sqlauthority.com)

    Filed under: Memory Lane, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • Training for a Back-end Programmer [closed]

    - by Pius
    I am currently working as an Android developer, but I want to continue my career as a back-end developer. I consider myself to have a relatively good knowledge of networking, databases, writing low-level code, and the other things involved in the back and middle tiers. What would be some good courses, trainings, or other resources to improve as a back-end developer? Not basic ones, but rather more advanced ones (not too advanced - I'm self-taught). What are the main events in this area?

    Read the article

  • Working with Legacy Code #3: Build a safety net.

    - by andrewstopford
    The first port of call in changing legacy code is a safety net; without one, your fingers will get burnt. Make your safety net a high-level functional test over the major areas of the application. Automate the test, plug it into your CI builds, and run it every night. The test should act as a final fail-safe as you work.
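
    As a concrete illustration (not from the original post), the safety net can be as blunt as an end-to-end smoke check that the main pages of the legacy application still respond and still contain the text a user would expect. A minimal NUnit sketch is below; the base URL, paths and expected headings are hypothetical placeholders.

    using System.Net;
    using NUnit.Framework;

    [TestFixture]
    public class LegacyAppSafetyNet
    {
        // Hypothetical CI deployment of the legacy application.
        private const string BaseUrl = "http://legacy-app.example.com";

        [TestCase("/orders", "Order History")]
        [TestCase("/customers", "Customer Search")]
        [TestCase("/reports/daily", "Daily Summary")]
        public void MajorPageStillRenders(string path, string expectedText)
        {
            using (var client = new WebClient())
            {
                // Coarse functional check: the page loads and shows its key heading.
                string html = client.DownloadString(BaseUrl + path);
                StringAssert.Contains(expectedText, html);
            }
        }
    }

    Run nightly from the CI build, a handful of checks like these catch the "I broke a screen three modules away" class of regression before a refactoring goes any further.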

    Read the article

  • Visual Studio 2010: Custom Start Page

    - by Steve Clements
    As the Visual Studio 2010 IDE is mostly written in WPF, extending the start page has become pretty darn easy, and I for one find this quite interesting as I always open with the start page - the more customisation I can have the better! There are a few things you will need to install first to get going: the Visual Studio 2010 SDK, and the Start Page project template, which you can either get from the New Project dialog (in the online gallery section in VS) or download from here. I was going to write a blog post on how to create a custom start page, but decided that MSDN have done such a good job that I was pretty much wasting my time, so take a look here - it has, in detail, everything you need to know to get it done! :) Technorati Tags: Visual Studio 2010, Custom Start Pages

    Read the article

  • X axis collision detection issues in the platformer starter kit

    - by dbomb101
    I've come across a problem with the collision detection code in the platformer starter kit for XNA. It raises the impassable flag on the X axis despite being nowhere near a wall in either direction on the X axis - could someone tell me why this happens? Here is the collision method.

    /// <summary>
    /// Detects and resolves all collisions between the player and his neighboring
    /// tiles. When a collision is detected, the player is pushed away along one
    /// axis to prevent overlapping. There is some special logic for the Y axis to
    /// handle platforms which behave differently depending on direction of movement.
    /// </summary>
    private void HandleCollisions()
    {
        // Get the player's bounding rectangle and find neighboring tiles.
        Rectangle bounds = BoundingRectangle;
        int leftTile = (int)Math.Floor((float)bounds.Left / Tile.Width);
        int rightTile = (int)Math.Ceiling(((float)bounds.Right / Tile.Width)) - 1;
        int topTile = (int)Math.Floor((float)bounds.Top / Tile.Height);
        int bottomTile = (int)Math.Ceiling(((float)bounds.Bottom / Tile.Height)) - 1;

        // Reset flag to search for ground collision.
        isOnGround = false;

        // For each potentially colliding tile,
        for (int y = topTile; y <= bottomTile; ++y)
        {
            for (int x = leftTile; x <= rightTile; ++x)
            {
                // If this tile is collidable,
                TileCollision collision = Level.GetCollision(x, y);
                if (collision != TileCollision.Passable)
                {
                    // Determine collision depth (with direction) and magnitude.
                    Rectangle tileBounds = Level.GetBounds(x, y);
                    Vector2 depth = RectangleExtensions.GetIntersectionDepth(bounds, tileBounds);
                    if (depth != Vector2.Zero)
                    {
                        float absDepthX = Math.Abs(depth.X);
                        float absDepthY = Math.Abs(depth.Y);

                        // Resolve the collision along the shallow axis.
                        if (absDepthY < absDepthX || collision == TileCollision.Platform)
                        {
                            // If we crossed the top of a tile, we are on the ground.
                            if (previousBottom <= tileBounds.Top)
                                isOnGround = true;

                            // Ignore platforms, unless we are on the ground.
                            if (collision == TileCollision.Impassable || IsOnGround)
                            {
                                // Resolve the collision along the Y axis.
                                Position = new Vector2(Position.X, Position.Y + depth.Y);

                                // Perform further collisions with the new bounds.
                                bounds = BoundingRectangle;
                            }
                        }
                        // This is the section which deals with collision on the x-axis
                        else if (collision == TileCollision.Impassable) // Ignore platforms.
                        {
                            // Resolve the collision along the X axis.
                            Position = new Vector2(Position.X + depth.X, Position.Y);

                            // Perform further collisions with the new bounds.
                            bounds = BoundingRectangle;
                        }
                    }
                }
            }
        }

        // Save the new bounds bottom.
        previousBottom = bounds.Bottom;
    }

    Read the article

  • SQL2K8R2: StreamInsight changes at RTM: Hopping Windows

    - by Greg Low
    We've been working on updating our demos and samples for the RTM changes of StreamInsight. I'll detail these as I come across them. The first is that there is a change to the HoppingWindow. The first two parameters are the same in the constructor but the third parameter is now required. It is the HoppingWindowOutputPolicy. Currently, there is only a single option for this which is ClipToWindowEnd. So you can create a HoppingWindow like this: var queryOutput = from w in input.HoppingWindow ( TimeSpan...(read more)
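
    For completeness, here is a hedged sketch of what a full RTM-style query might look like. Only the three-argument HoppingWindow call and the HoppingWindowOutputPolicy.ClipToWindowEnd policy come from the post above; the payload type, window sizes and the Count() aggregate are illustrative assumptions.

    using System;
    using Microsoft.ComplexEventProcessing;
    using Microsoft.ComplexEventProcessing.Linq;

    // Hypothetical payload type used only for this example.
    public class SensorReading
    {
        public int Value { get; set; }
    }

    public static class HoppingWindowSample
    {
        // 'input' is assumed to be an existing CepStream<SensorReading>.
        public static void BuildQuery(CepStream<SensorReading> input)
        {
            // One-minute windows that hop forward every ten seconds; the third
            // argument is mandatory at RTM, and ClipToWindowEnd is currently the
            // only available policy.
            var queryOutput = from w in input.HoppingWindow(
                                  TimeSpan.FromMinutes(1),
                                  TimeSpan.FromSeconds(10),
                                  HoppingWindowOutputPolicy.ClipToWindowEnd)
                              select new { EventCount = w.Count() };

            // Binding queryOutput to adapters and starting the query is omitted here.
        }
    }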

    Read the article

  • Sending credentials to the LinkedIn website and getting oauth_verifier without signing in again

    - by akash kumar
    I am facing a problem with sending credentials to another website, logging in automatically (without clicking sign-in there), and getting the oauth_verifier value. Details are below. I want to send an email address and password via a form (submit button) from my website (a Liferay portal) to another website (say, LinkedIn); it should authorize automatically and return the oauth_verifier to my website. That means I don't want my website's users to submit their email address and password to LinkedIn again. Actually, I want to collect the email address and password on my website and show the user's LinkedIn connections, messages, and job postings within my site itself; I don't want to redirect the user to the LinkedIn website to sign in there and then come back to my website. I have obtained a consumer key and secret key from LinkedIn for my application. I am using the LinkedIn API and getting the oauth_verifier for the access token, but for that I have to send the user to LinkedIn to sign in; actually, it should happen in the backend.

    Read the article

  • Oracle: SQL Developer Data Modeler 3.0 available - the modeling tool opens up to collaborative work

    Oracle: SQL Developer Data Modeler 3.0 available - the modeling tool opens up to collaborative work. Oracle has just released a major new version of "SQL Developer Data Modeler", its free database modeling tool. Version 3.0 gains a collaborative dimension and opens up to version control systems. Several contributors can now work on the same model and track, in detail, which contributor made which changes to the models. For now, only Subversion is supported, but Oracle plans to add support for other version control systems. This tool integrates...

    Read the article

  • Google I/O 2010 - Advanced Android audio techniques

    Google I/O 2010 - Advanced Android audio techniques (Android 301, Dave Sparks). In this session, we will explore advanced techniques that you can employ in your apps when working with media. This includes using Android's low-level audio APIs, selecting the appropriate format for your media files, and what's now possible using new media framework APIs introduced in Android 2.2. For all I/O 2010 sessions, please go to code.google.com. From: GoogleDevelopers, Time: 57:16

    Read the article

  • Real tortoises keep it slow and steady. How about the backups?

    - by Maria Zakourdaev
    … Four tortoises were playing in the backyard when they decided they needed hibiscus flower snacks. They pooled their money and sent the smallest tortoise out to fetch the snacks. Two days passed and there was no sign of the tortoise. "You know, she is taking a lot of time," said one of the tortoises. A little voice from just outside the fence said, "If you are going to talk that way about me, I won't go."

    Is it too much to ask of a quite expensive 3rd party backup tool that it be way faster than the SQL Server native backup? Or at least that it save a respectable amount of storage by producing really smaller backup files? By saying "really smaller", I mean at least getting a file half the size. After Googling the internet in an attempt to understand what other "SQL people" are using for database backups, I see that most people are using one of three tools which are the main players in the SQL backup area: LiteSpeed by Quest, SQL Backup by Red Gate, and SQL Safe by Idera. The feedback about those tools is truly emotional and happy. However, while reading the forums and blogs I have wondered whether many are simply accustomed to using the above tools since SQL 2000 and 2005. This can easily be understood, because a 300GB database backup, for instance, using a regular SQL 2005 backup statement would have run for about 3 hours and produced a ~150GB file (depending on the content, of course). Then you take a 3rd party tool which performs the same backup in 30 minutes, resulting in a 30GB file and leaving you speechless; you run to management persuading them to buy it because it is definitely worth the price. In addition to the increased speed and disk space savings you would also get backup file encryption and virtual restore - features that are still missing from SQL Server. But in case you, like me, don't need these additional features and only want a tool that performs a full backup MUCH faster AND produces a far smaller backup file (like the gain you observed back in the SQL 2005 days), you will be quite disappointed. The SQL Server backup compression feature has totally changed the market picture.

    Medium size database

    Take a look at the table below and check out how my SQL Server 2008 R2 compares to the other tools when backing up a 300GB database. It appears that, when talking about backup speed, SQL 2008 R2 compresses and performs the backup in similar overall times as all three other tools. The 3rd party tools' maximum compression level takes twice as long. The backup file gain is not that impressive, except at the highest compression levels, but the price you pay there is very high CPU load and much longer time. Only SQL Safe by Idera was quite fast with its maximum compression level, but for most of the run time it used 95% CPU on the server. Note that I have used two types of destination storage, SATA (11 disks) and FC (53 disks), and, obviously, on the faster storage the backup was ready in half the time. Looking at the above results, should we spend the money and bother with another layer of complexity and a software middle-man for medium sized databases? I'm definitely not going to do so.

    Very large database

    As the next phase of this benchmark, I moved to a 6 terabyte database, which was actually my main backup target. Note how using multiple files enables the SQL Server backup operation to use parallel I/O and remarkably increases its speed, especially when the backup device is heavily striped. SQL Server supports a maximum of 64 backup devices for a single backup operation, but the most speed is gained when using one file per CPU - in the case above, 8 files for a 2 quad-CPU server. The impact of additional files is minimal. However, SQLsafe doesn't show any speed improvement between 4 files and 8 files. Of course, with such huge databases every half percent of compression translates into noticeable numbers. Saving almost 470GB of space may turn the backup tool into quite a valuable purchase. Still, backup speed and high CPU usage are variables that should be taken into consideration. As for us, backup speed is more critical than storage, and we cannot allow a production server to sustain 95% CPU for such a long time. Bottom line, 3rd party backup tool developers: we are waiting for some breakthrough release. There are a few unanswered questions, like the restore speed comparison between the different tools and the impact of multiple backup files on the restore operation. Stay tuned for the next benchmarks.

    Benchmark server: SQL Server 2008 R2 SP1, 2 quad CPUs
    Database location: NetApp FC 15K aggregate, 53 discs

    Backup statements: No matter how good the UI is, we need to run the backup tasks from inside SQL Server Agent to make sure they are covered by our monitoring systems. I have used extended stored procedures (command line execution is also an option; I haven't noticed any impact on the backup performance).

    SQL backup:
    backup database <DBNAME> to disk= '\\<networkpath>\par1.bak', disk= '\\<networkpath>\par2.bak', disk= '\\<networkpath>\par3.bak' with format, compression

    LiteSpeed:
    EXECUTE master.dbo.xp_backup_database @database = N'<DBName>', @backupname = N'<DBName> full backup', @desc = N'Test', @compressionlevel = 8, @filename = N'\\<networkpath>\par1.bak', @filename = N'\\<networkpath>\par2.bak', @filename = N'\\<networkpath>\par3.bak', @init = 1

    SQL Backup:
    EXECUTE master.dbo.sqlbackup '-SQL "BACKUP DATABASE <DBNAME> TO DISK= ''\\<networkpath>\par1.sqb'', DISK= ''\\<networkpath>\par2.sqb'', DISK= ''\\<networkpath>\par3.sqb'' WITH DISKRETRYINTERVAL = 30, DISKRETRYCOUNT = 10, COMPRESSION = 4, INIT"'

    SQL safe:
    EXECUTE master.dbo.xp_ss_backup @database = 'UCMSDB', @filename = '\\<networkpath>\par1.bak', @backuptype = 'Full', @compressionlevel = 4, @backupfile = '\\<networkpath>\par2.bak', @backupfile = '\\<networkpath>\par3.bak'

    If you still insist on using 3rd party tools for the backups in your production environment with the maximum compression level, you will definitely need to consider limiting CPU usage, which will increase the backup operation time even more:

    RedGate: use the THREADPRIORITY option (values 0 - 6)
    LiteSpeed: use @throttle (a percentage, like 70%)
    SQL safe: the only thing I have found was the @Threads option.

    Yours, Maria

    Read the article

  • music for an arcade game?

    - by user717572
    I'm thinking about music for my brick breaker game, but I don't know how to choose any. If I made a loop out of just a few seconds of music, I think it would get annoying very quickly. I also found some longer tracks (about 2 minutes), but when one of those is over it is going to be repeated anyway - and, just like when you select a new level, you have to listen to the same beginning of the song again. I can't put an hour of music in my application, so what would you recommend I do for the music?
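
    One common workaround when you cannot ship an hour of music is to rotate a small set of longer tracks instead of looping a single short one. The XNA sketch below is only an illustration; the track names and the MediaPlayer-based approach are assumptions, not details from the question.

    using Microsoft.Xna.Framework.Content;
    using Microsoft.Xna.Framework.Media;

    public class MusicRotation
    {
        private Song[] tracks;
        private int current;

        // Assumes a few longer tracks exist in the content project (hypothetical asset names).
        public void LoadContent(ContentManager content)
        {
            tracks = new Song[]
            {
                content.Load<Song>("music/track0"),
                content.Load<Song>("music/track1"),
                content.Load<Song>("music/track2")
            };
            MediaPlayer.IsRepeating = false; // advance manually instead of looping one file
            MediaPlayer.Play(tracks[current]);
        }

        // Call from Game.Update(): when the current track finishes, start the next one.
        public void Update()
        {
            if (MediaPlayer.State == MediaState.Stopped)
            {
                current = (current + 1) % tracks.Length;
                MediaPlayer.Play(tracks[current]);
            }
        }
    }

    Letting the rotation keep playing across level changes (rather than restarting the playlist on each new level) also avoids hearing the same intro every time a level is selected.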

    Read the article

  • Stop Screenlets from minimizing

    - by Capt.Nemo
    I've set up my screenlets exactly the way I want them, but I don't have quick access to any of them. The screenlet config offers me the following: keep above, keep below, sticky, lock, treat as widget. Of these, only "treat as widget" seems to be of any use here. I just looked at it in detail and thought it was what I was looking for. It might have been a workaround for the issue (instead of minimizing I would just press F9), but it means the widget hides itself from the normal desktop, which is not what I want. What I want is that on pressing Ctrl+Alt+D or Super+D, I should see the desktop with my screenlets on it. I don't want them to minimize with the rest of the windows. As a final struggle, I've thought of a solution using Compiz to declare the screenlet windows as non-minimizing, but surely there must be a better way than that. (Instructions for this would be helpful as well - I'm not sure what to enter in the rule matches.)

    Read the article
