Search Results

Search found 12422 results on 497 pages for 'non disclosure agreement'.

Page 136/497 | < Previous Page | 132 133 134 135 136 137 138 139 140 141 142 143  | Next Page >

  • Lighting-Reflectance Models & Licensing Issues

    - by codey
    Generally, or specifically, is there any licensing issue with using any of the well known lighting/reflectance models (i.e. the BRDFs or other distribution or approximation functions): Phong, Blinn–Phong, Cook–Torrance, Blinn-Torrance-Sparrow, Lambert, Minnaert, Oren–Nayar, Ward, Strauss, Ashikhmin-Shirley and common modifications where applicable, such as: Beckmann distribution, Blinn distribution, Schlick's approximation, etc. in your shader code utilised in a commercial product? Or is it a non-issue?
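    A side note for context (not part of the original question): these models are published mathematical formulas, and mathematics as such is not copyrightable, though a specific shader implementation can be. For instance, Schlick's approximation of the Fresnel factor is just

        R(\theta) = R_0 + (1 - R_0)\,(1 - \cos\theta)^5

    where R_0 is the reflectance at normal incidence; writing your own code from the published equation is a different act from copying someone else's source.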

    Read the article

  • As an indie game dev, what processes are the best for soliciting feedback on my design/spec/idea? [closed]

    - by Jess Telford
    Background: I have worked in a professional environment where the process usually goes like the following: 1) brainstorm the idea; 2) solidify the game mechanics/design; 3) iterate on the design/idea to create a more solid experience; 4) spec out the details of the design/idea; 5) build it. Step 3 is generally done with the stakeholders of the game (developers, designers, investors, publishers, etc.) to reach an 'agreement' which meets the goals of all involved. Because this process involves a series of often opposing and unique viewpoints, creative solutions can surface through discussion and iteration. This is backed up by a process for collating the changes and new ideas, as well as structured time for discussion. As a (now) indie developer, I have to play the role of all the stakeholders (developers, designers, investors, publishers, etc.), and often find myself too close to the idea/design to do more than minor changes, which I feel are local maxima (I'm looking for the global maximum, of course). I have read that ideas, game designs and unique mechanics are merely multipliers of execution, and that keeping them secret is just silly. In sharing the idea with others outside the realm of my own thinking, I hope to replicate the influence other stakeholders would have. I am struggling with the collation of changes and new ideas, and with any kind of structured method of receiving feedback. My question: As an indie game developer, how and where can I share my ideas/designs to receive meaningful, constructive feedback? How can I successfully collate the feedback into a new iteration of the design? Are there any specialized websites, etc.?

    Read the article

  • Firefox slowdown and 100% CPU after gnome-settings-daemon update

    - by digitaljail
    I'm on Ubuntu 12.04 (Unity) with Firefox 17.0.1 installed. After the latest update of gnome-settings-daemon (3.4.2-0ubuntu0.5 to 3.4.2-0ubuntu0.6), Firefox started taking 100% of my CPU, periodically. I have tried various things: 1) disabled all the non-standard extensions = no change in CPU usage; 2) disabled the Flash plugins (also updated at the same time) = no change in CPU usage; 3) disabled the "Global Menu Integration" (3.6.4) extension = CPU OK! Any suggestions for getting global menu integration back without the CPU problems?
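    As a first diagnostic step (a suggestion, not from the original post), Firefox's built-in safe mode confirms whether any extension is responsible, and top can watch CPU usage while you test:

        # Start Firefox with all add-ons disabled (built-in safe mode)
        firefox -safe-mode

        # Watch the oldest firefox process's CPU usage (press q to quit)
        top -p "$(pgrep -o firefox)"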

    Read the article

  • Opportunity Nokia's

    - by Andrew Clarke
    Nokia’s alliance with Microsoft is likely to be good news for anyone using Microsoft technologies, and particularly for .NET developers. Before the announcement, the future wasn’t looking so bright for the ‘mobile’ version of Windows, Windows Phone. Microsoft currently has only 3.1% of the Smartphone market, even though it has been involved in it for longer than its main rivals. Windows Phone has now got the basics right, but that is hardly sufficient by itself to change its predicament significantly. With Nokia's help, it is possible. Despite the promise of multi-tasking for third party apps, integration with Microsoft platforms such as Xbox and Office, direct integration of Twitter support, and the introduction of IE 9 “later this year”, there have been frustratingly few signs of urgency on Microsoft’s part in improving the Windows Phone  product. Until this happens, there seems little prospect of reward for third-party developers brave enough to support the platform with applications. This is puzzling when one sees how well SQL Server and Microsoft’s other server technologies have thrived in recent years, under good leadership from a management that understands the technology. The same just hasn’t been true for some of the consumer products. In consequence, iPads and Android tablets have already exposed diehard Windows users, for the first time, to an alternative GUI for consumer Tablet PCs, and the comparisons aren’t always in Windows’ favour. Nokia’s problem is obvious: Android’s meteoric rise. Android now has 33% of the worldwide market for smartphones, while the market share of Nokia’s Symbian has dropped from 44% to 31%. As details of the agreement emerge, it would seem that Nokia will bring a great deal of expertise, such as imaging and Nokia Maps, to Windows Phone that should make it more competitive. It is wrong to assume that Nokia’s decline will continue: the shock of Android’s sudden rise could be enough to sting them back to their previous form, and they have Microsoft’s huge resources and marketing clout to help them. For the sake of the whole Windows stack, I really hope the alliance succeeds.

    Read the article

  • [SSL] Becoming a Root CA

    - by Max13
    Hi everybody, I'm the founder of a small non-profit French organization. Currently, we provide free web and shell hosting. With that in mind, is there a way to become a trusted certificate authority, in order to give free SSL certificates to my customers, while avoiding being an intermediate CA (and paying a lot for that) and/or paying a lot for each certificate? Thank you for your help.
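    For context (an editorial note, not from the original question): inclusion in browser and OS trust stores requires commercial audits (e.g. WebTrust) and per-vendor acceptance, which is generally out of reach for a small non-profit. Creating a private root CA that your users import manually is straightforward, though; a minimal OpenSSL sketch, with illustrative file names and subject:

        # Generate the root key (keep it offline and well protected)
        openssl genrsa -out rootCA.key 4096

        # Self-sign a root certificate valid for about ten years
        openssl req -x509 -new -key rootCA.key -sha256 -days 3650 \
            -subj "/C=FR/O=Example Association/CN=Example Association Root CA" \
            -out rootCA.crt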

    Read the article

  • When SQL Server Nonclustered Indexes Are Faster Than Clustered Indexes

    SQL Server clustered indexes can have enormous implications for the performance of operations on a table. But are there times when a SQL Server non-clustered index would perform better than a clustered index for the same operation? Are there any trade-offs to consider? Check out this tip to learn more.
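    As an illustration of the classic case where a nonclustered index wins -- a narrow covering index that satisfies a query without touching the wide clustered index -- here is a hedged sketch via sqlcmd; server, database, table and column names are all hypothetical:

        # A covering index: the query reads only the narrow index pages
        sqlcmd -S localhost -d SalesDb -Q "
        CREATE NONCLUSTERED INDEX IX_Orders_CustomerId
            ON dbo.Orders (CustomerId)
            INCLUDE (OrderDate, TotalDue);"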

    Read the article

  • New Oracle Database Performance and Tuning Specialization version available to all partners!

    - by Catalin Teodor
    Any partner working with the Oracle database should take advantage of the new Oracle Database Performance and Tuning specialization version. Review the criteria to see what's new in this specialization update. Partners should plan to take the new version, as the Oracle Database 11g Performance Tuning specialization becomes non-qualifying as of August 1, 2015.

    Read the article

  • SQL Server Add Primary Key

    - by Derek D.
    Adding a primary key can be done either after a table is created, or at the same time the table is created. It is important to note that, by default, a primary key is clustered. This may or may not be the preferred method of creation. For more information on clustered vs non [...]
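    To make the two options concrete, a short sketch via sqlcmd (all names hypothetical); note the explicit NONCLUSTERED keyword, which overrides the clustered default the excerpt mentions:

        # At creation time: PRIMARY KEY is clustered unless you say otherwise
        sqlcmd -S localhost -d TestDb -Q "
        CREATE TABLE dbo.Customers (
            CustomerId INT NOT NULL CONSTRAINT PK_Customers PRIMARY KEY CLUSTERED,
            Name NVARCHAR(100) NOT NULL);"

        # After the fact, and explicitly nonclustered this time
        sqlcmd -S localhost -d TestDb -Q "
        ALTER TABLE dbo.Orders
            ADD CONSTRAINT PK_Orders PRIMARY KEY NONCLUSTERED (OrderId);"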

    Read the article

  • Windows Azure Role Instance Limits

    - by kaleidoscope
    Brief overview of the limits imposed on hosted services in Windows Azure:

        Limit                  Token (CTP,       Token (CTP,       Token (non-billing     Paying subscription
                               before Dec 10,    after Dec 10,     country, after         (after Jan 4, 2010)
                               2009)             2009)             Jan 4, 2010)
        Deployment Slots       2                 2                 2                      2
        Hosted Services        1                 1                 20                     20
        Roles per deployment   5                 5                 5                      5
        Instances per Role     2                 2                 no limit               no limit
        VM CPU Cores           no limit          8                 8                      20
        Storage Accounts       2                 2                 5                      5

    More information: http://blog.toddysm.com/2010/01/windows-azure-role-instance-limits-explained.html

    Amit, S.

    Read the article

  • Why do many designs ignore normalization in RDBMS?

    - by Yosi
    I have seen many designs in which normalization was not the first consideration in the decision-making phase. In many cases those designs included more than 30 columns, and the main approach was "to put everything in the same place". From what I remember, normalization is one of the first and most important design principles, so why is it sometimes dropped so easily? Edit: Is it true that good architects and experts choose a denormalized design while non-experienced developers choose the opposite? What are the arguments against starting your design with normalization in mind? A sketch of the two shapes being debated follows below.
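    To ground the discussion, here is that sketch (hypothetical names, T-SQL via sqlcmd):

        # Denormalized: "everything in the same place" -- customer and product
        # facts are repeated on every order row
        sqlcmd -S localhost -d AppDb -Q "
        CREATE TABLE dbo.OrdersWide (
            OrderId       INT PRIMARY KEY,
            CustomerName  NVARCHAR(100),
            CustomerEmail NVARCHAR(100),
            ProductName   NVARCHAR(100),
            ProductPrice  MONEY);"

        # Normalized: each fact lives in exactly one place
        sqlcmd -S localhost -d AppDb -Q "
        CREATE TABLE dbo.Customers (CustomerId INT PRIMARY KEY, Name NVARCHAR(100), Email NVARCHAR(100));
        CREATE TABLE dbo.Products  (ProductId  INT PRIMARY KEY, Name NVARCHAR(100), Price MONEY);
        CREATE TABLE dbo.Orders    (OrderId    INT PRIMARY KEY,
                                    CustomerId INT REFERENCES dbo.Customers,
                                    ProductId  INT REFERENCES dbo.Products);"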

    Read the article

  • Not able to set up Tomboy Web for Ubuntu One

    - by Karthik
    I have been trying to set up Tomboy Web in Ubuntu 12.04, but without much success. I press the "Connect to Server" button in the Preferences dialog, and the expected result is for the browser to open with the authorization page. In my case, Firefox opens, but the authorization page does not open at all. Some details: my default browser is Chrome, but Firefox always opens instead, with a non-default profile. Note: I have already browsed through most of the other articles on Ask Ubuntu regarding Tomboy synchronization, but none of them discuss this particular problem.

    Read the article

  • Correcting color-shifted mirrored i915 driver in 12.04?

    - by Will Martin
    I was called in to fix a friend's malfunctioning HP Pavilion. She's not sure exactly which model, but the sticker on the bottom says "G60". The problem was a failed upgrade to 12.04. I was able to mostly repair it with sudo apt-get -f install, which ran setup and configuration for several hundred packages. The biggest problem at the moment is Xorg. The login screen (lightdm) loads normally, but at a reduced resolution (1024x768 instead of 1366x768). Once you log in, though, the display breaks: the colors of the dock on the left and the bar at the top are normal, but the background is filled with bizarre color-skewed ghost images of the desktop. In all cases, the actual contents of any programs you run are a totally illegible mess, except that the bar at the top of any program window looks and acts normally. And the ghost images are interactive! For example, if you click the icon in the top right corner to get the "shut down" menu, the same menu will appear in the ghost images below. Starting a terminal starts a terminal window in both the real desktop and the ghost images, and moving it around updates both the real and the ghost desktops. I suspect Xorg is using some kind of wrong driver and/or parameters for the graphics hardware. Here is the graphics-relevant portion of the lspci -v output:

        00:00.0 Host bridge: Intel Corporation Mobile 4 Series Chipset Memory Controller Hub (rev 09)
            Subsystem: Hewlett-Packard Company Device 360b
            Flags: bus master, fast devsel, latency 0
            Capabilities: [e0] Vendor Specific Information: Len=0a <?>
            Kernel driver in use: agpgart-intel

        00:02.0 VGA compatible controller: Intel Corporation Mobile 4 Series Chipset Integrated Graphics Controller (rev 09) (prog-if 00 [VGA controller])
            Subsystem: Hewlett-Packard Company Device 360b
            Flags: bus master, fast devsel, latency 0, IRQ 44
            Memory at d0000000 (64-bit, non-prefetchable) [size=4M]
            Memory at c0000000 (64-bit, prefetchable) [size=256M]
            I/O ports at 5110 [size=8]
            Expansion ROM at <unassigned> [disabled]
            Capabilities: [90] MSI: Enable+ Count=1/1 Maskable- 64bit-
            Capabilities: [d0] Power Management version 3
            Kernel driver in use: i915
            Kernel modules: i915

        00:02.1 Display controller: Intel Corporation Mobile 4 Series Chipset Integrated Graphics Controller (rev 09)
            Subsystem: Hewlett-Packard Company Device 360b
            Flags: bus master, fast devsel, latency 0
            Memory at d2500000 (64-bit, non-prefetchable) [size=1M]
            Capabilities: [d0] Power Management version 3

    I'm not sure what to check next. I would ordinarily check xorg.conf to see what it says, but that apparently doesn't exist any more, and my googling has not yielded any useful techniques for getting Xorg to tell me what settings it decided to use. The weird part is that it works fine on the login screen. It's only when you actually log in as a user that the display gets screwed up. Suggestions?
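    An editorial note: since xorg.conf no longer exists by default, the configuration Xorg actually chose has to be read from its log; a sketch of where to look (paths are the Ubuntu 12.04 defaults):

        # Which driver, modes, warnings and errors Xorg reported at startup
        grep -iE 'driver|modeset|\(EE\)|\(WW\)' /var/log/Xorg.0.log

        # The outputs and modes the running session believes it has
        xrandr --query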

    Read the article

  • Are there specific legal issues for web developers working on sex dating sites?

    - by YumYumYum
    Say I have created many ordinary websites which are not related to any dating/sexual content. Are the rules and regulations for a developer the same when making a sex-related dating site? I'm talking about a site where people meet together and get to know each other, with the intent of having a sexual relationship (you know what I mean), also featuring webcam sex, but not explicitly a porno site. Do such sites have any special legal issues for developers compared with non-sexual/dating sites?

    Read the article

  • Software for Managing Subscriptions to Website Content?

    - by an00b
    Can you recommend a package that allows me to manage subscriptions to certain content on my website (not necessarily displayable) based on payment levels? Ideally, the software would allow logging in using both site-specific registration and PayPal/Facebook/Twitter/MyOpenId, etc. Preferably, it would also be open source and LAMP-based. One idea I have in mind is hacking shopping-cart software like Zen Cart, but this may be overkill if a non-shopping, lighter-weight package exists.

    Read the article

  • What tool should I use for drawing 2D OpenGL shapes?

    - by Kenny Winker
    I'm working on a very simple OpenGL ES 2.0 game, and I'm not sure what tool to use to create the vertex data I need. My first thought was Adobe Illustrator, but I can't seem to google up any info on how to convert an .ai file to vertices. I'm only going to be using very simple 2D shapes, so I wonder if I need to use a 3D modelling program? How is this typically done when you are working with 2D non-sprite shapes?

    Read the article

  • Oracle Enterprise Manager Ops Center : Using Operational Profiles to Install Packages and other Content

    - by LeonShaner
    Oracle Enterprise Manager Ops Center provides numerous ways to deploy content, such as through OS Update Profiles, as part of an OS Provisioning plan, or through combinations of those and other "Install Software" capabilities of Deployment Plans. This short "how-to" blog will highlight an alternative way to deploy content using Operational Profiles. Usually we think of Operational Profiles as a way to execute a simple "one-time" script to perform a basic system administration function, which can optionally be based on user input; however, Operational Profiles can be much more powerful than that. There is often more to performing an action than merely running a script -- sometimes configuration files, packages, binaries, other scripts, etc. are needed to perform the action, and sometimes the user would like to leave such content on the system for later use. For shell scripts and other content written to be generic enough to work on any flavor of UNIX, converting the same scripts and configuration files into a Solaris 10 SVR4 package, a Solaris 11 IPS package, and/or a Linux RPM might be seen as three times the work, for little appreciable gain. That is where using an Operational Profile to deploy simple scripts and other generic content can be very helpful. The approach is so powerful that pretty much any kind of content can be deployed using an Operational Profile, provided the files involved are not overly large, and it is not necessary to convert the content into UNIX variant-specific formats. The basic formula for deploying content with an Operational Profile is as follows:

        - Begin with a traditional script header: a UNIX shell script that will be responsible for decoding and extracting content, copying files into the right places, and executing any other scripts and commands needed to install and configure that content.
        - Include steps to make the script platform-aware, to do the right thing for a given UNIX variant, or print a "sorry" message if the operator has somehow tried to run the Operational Profile on a system where the script is not designed to run. Ops Center can constrain execution by target type, so such checks at this level are an added safeguard, but they are also useful with the generic target type of "Operating System", where the admin wants the script to "do the right thing," whatever the UNIX variant.
        - Include helpful output to show script progress, and any other informational messages that can help the admin determine what has gone wrong in the case of a problem in script execution. Such messages will be shown in the job execution log.
        - Include necessary "clean up" steps for normal and error exit conditions.
        - Set non-zero exit codes when appropriate -- a non-zero exit code will cause an Operational Profile job to be marked failed, which is the admin's cue to look into the job details for diagnostic messages in the output from the script.

    That first bullet deserves some explanation. If Operational Profiles are usually simple "one-time" scripts and binary content is not allowed, then how does the actual content -- packages, binaries, and other scripts -- get delivered along with the script? More specifically, how does one include such content without needing to first create some kind of traditional package? All that is required is to simply encode the content and append it to the end of the Operational Profile. The header portion of the Operational Profile will need to contain the commands to decode the embedded content that has been appended to the bottom of the script.
    The header code can do whatever else is needed, and finally clean up any intermediate files that were created during the decoding and extraction of the content. One way to encode binary and other content for inclusion in a script is to use the "uuencode" utility to convert the content into simple base64 ASCII text -- a form that is suitable to be appended to an Operational Profile. The behavior of the "uudecode" utility is such that it will skip over any parts of the input that do not fit the uuencoded "begin" and "end" clauses. For that reason, your header script will be skipped over, and uudecode will find your embedded content, which you will uuencode and paste at the end of the Operational Profile. You can have as many "begin"/"end" clauses as you need -- just separate each embedded file by an empty line between "begin" and "end" clauses.

    Example: Install SUNWsneep and set the system serial number. Script: deploySUNWsneep.sh (<- right-click / save to download). Highlights:

        #!/bin/sh
        # Required variables:
        OC_SERIAL="$OC_SERIAL"  # The user-supplied serial number for the asset
        ...

    The above is good practice, showing right up front what kind of input the Operational Profile will require. The right-hand side, where $OC_SERIAL appears in this example, will be filled in by Ops Center based on the user input at deployment time. The script goes on to restrict the use of the program to the intended OS type (Solaris 10 or older, in this example, but other content might be suitable for Solaris 11, or Linux -- it depends on the content and the script that will handle it). A temporary working directory is created, and then we have the command that decodes the embedded content from "self", which in scripting terms is $0 (a variable that expands to the name of the currently executing script):

        # Pass myself through uudecode, which will extract content to the current dir
        uudecode $0

    At that point, whatever content was appended in uuencoded form at the end of the script has been written out to the current directory. In this example that yields a file, SUNWsneep.7.0.zip, which the rest of the script proceeds to unzip and pkgadd, followed by running "/opt/SUNWsneep/bin/sneep -s $OC_SERIAL", which is the command that stores the system serial for future use by other programs such as Explorer. Don't get hung up on the example having used a pkgadd command. The content started as a zip file, but it could have been a tar.gz, or any other file. This approach simply decodes the file; the header portion of the script has to make sense of the file and do the right thing (i.e., it's up to you). The script goes on to clean up after itself, whether or not the above was successful. Errors are echo'd by the script, and a non-zero exit code is set where appropriate. Second to last, we have:

        # just in case, exit explicitly, so that uuencoded content will not cause error
        OPCleanUP
        exit

        # The rest of the script is ignored, except by uudecode
        #
        # UUencoded content follows
        #
        # e.g. for each file needed,
        #   $ uuencode -m {source} {source} > {target}.uu5
        # then paste the {target}.uu5 files below
        # they will be extracted into the working dir at $TDIR
        #

    The commentary above also describes how to encode the content. Finally, we have the uuencoded content itself:

        begin-base64 444 SUNWsneep.7.0.zip
        UEsDBBQAAAAIAPsRy0Di3vnukAAAAMcAAAAKABUAcmVhZG1lLnR4dFVUCQADOqnVT7up
        ...
        VXgAAFBLBQYAAAAAAgACAJEAAADTNwEAAAA=
        ====

    That last line of "====" is the base64 uuencode equivalent of a blank line followed by "end", and, as mentioned, you can have as many begin/end clauses as you need. Just separate each embedded file by a blank line after each ==== and before each begin-base64. Deploying the example Operational Profile is then just a matter of filling in the required field (the system serial number, in this case) and running the job. When the job runs, the diagnostic messages produced by the example script are shown in the job details. This same general approach could be used to deploy Explorer, and other useful utilities and scripts. Please let us know what you think. Until next time... \Leon -- Leon Shaner | Senior IT/Product Architect, Systems Management | Ops Center Engineering @ Oracle. The views expressed on this blog are my own and do not necessarily reflect the views of Oracle. For more information, please go to the Oracle Enterprise Manager web page or follow us at: Twitter | Facebook | YouTube | Linkedin | Newsletter
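    As a side note (not from the original post), the encode-and-paste step can itself be automated; a minimal sketch, assuming the file names from the article's example:

        # Append the payload to the profile script as base64 text
        uuencode -m SUNWsneep.7.0.zip SUNWsneep.7.0.zip >> deploySUNWsneep.sh

        # Sanity-check the result: uudecode skips the shell header and
        # extracts the embedded payload into the current directory
        uudecode deploySUNWsneep.sh && ls SUNWsneep.7.0.zip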

    Read the article

  • How Did Microsoft Market .NET?

    - by Fendy
    I just read Joel's article about Microsoft's breaking change (non-backwards compatibility) with .NET's introduction. It is interesting and explicitly reflects the conditions of that time, but almost 10 years have passed. The breaking change: the article is mainly about how bad it was for Microsoft to introduce non-backwards-compatible development tools such as .NET, instead of improving the already widely used ASP Classic or VB6. As many know, .NET is not natively embedded in Windows XP (it is in Vista and 7), so in order to use .NET apps you needed to install the .NET Framework, over 300 MB (which was big in those days). However, nowadays many businesses use .NET as their main development tool, with ASP.NET or MVC for their web-based applications. C# is now one of the top programming languages (the most questions on Stack Overflow). The more interesting part is that the Win32 API is still alive (and still widely used) even though newer technology is out there. Imagine if Microsoft had not introduced the breaking change: many corporations would still be using ASP Classic or VB-based applications (some still do, but not that many). Many corporations use additional services such as Azure or SharePoint (despite how expensive they are). Please note that I also know there are many flagship applications (maybe Adobe's and Blizzard's) that still use C-based or older languages and have not been ported to newer high-level languages. The question: how did Microsoft persuade users to migrate their old applications to .NET? As we know, rewriting applications is very hard, gives no immediate value (the Netscape story), and is very risky. I am more interested in Microsoft's approach than in opinions such as "because .NET is OOP, or .NET is DLL-embeddable, etc." This question may be constructive, as technology has been changing vastly lately. As we can see, Microsoft changed ASP.NET WebForms to MVC, WinForms is legacy now, distribution is starting to move to the Windows Store rather than traditional installers, touchscreens are spreading, and later on we will have see-through applications such as Google Glass. Those will be breaking changes. We need to account for portability as an issue nowadays. We will need more than a mere technology choice; we will also need migration plans, maybe even something as critical as a multiplatform language compiler, as approached by Joel's Wasabi. (Hey, I read his articles too much!)

    Read the article

  • FileStream and FileTable in SQL Server 2012

    SQL Server 2012 enhanced the SQL Server 2008 FILESTREAM data type by introducing FileTable, which lets an application integrate its storage and data management components to allow non-transactional access and provide integrated SQL Server services. Arshad Ali explains how.
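    For a taste of the feature, a hedged sketch of creating a FileTable (names hypothetical; assumes a FILESTREAM filegroup is already configured for the database):

        # Allow non-transactional access, then create the FileTable
        sqlcmd -S localhost -Q "
        ALTER DATABASE DocsDb
            SET FILESTREAM (NON_TRANSACTED_ACCESS = FULL, DIRECTORY_NAME = N'DocsDb');
        USE DocsDb;
        CREATE TABLE DocumentStore AS FILETABLE;"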

    Read the article

  • What's better for SEO for many international markets?

    - by Roy Rico
    Right now, we're working to migrate our company sites for international markets to this scheme:

        www.company.com/[2-letter country code]
        www.company.com/uk   # for the United Kingdom
        www.company.com/au   # for Australia
        www.company.com/jp   # for Japan
        www.company.com/     # for the United States, and non-identifiable visitors

    However, in Google Webmaster Tools we can geo-target each directory, but not the root. If we geo-tag the root with US, all the other markets will inherit it. Is it better to move the US market to /us/ or leave it where it is?

    Read the article

  • What expectation should I have of South African web development rates / duration? [closed]

    - by Warren van Rooyen
    I am a developer, but I only intend to do the front-end work for getting Reddit-like upvote/downvote functionality going on an upcoming site I'm building. I have never had to contract a developer for back-end work to implement code for me, so I am quite in the dark on how much I should expect to pay and how long it could take to get the site going. I could be taken for a ride, as the developer could inflate the time required while quoting a seemingly normal rate (hourly or daily), or could otherwise distort the rate. Please could you give me some guidance on this? I know you need some background on the nature of the site, so here it is: I have a Reddit-type template with CSS and PHP included. I then downloaded the Pligg code, which is intended to provide the Reddit upvote/downvote functionality. How long would a developer roughly need to unite the theme and front end with the back-end functionality? I understand it's not a lot of info, but I'm sure you're experienced enough to have an instinct for the size of the project. Also, should I work on an hourly rate, a daily rate, or a per-project payment agreement?

    Read the article

  • blender: 3D model from guide images

    - by Stefan
    In an effort to learn the Blender interface, which is confusing to say the least, I've chosen to build a model from reference pictures easily found on the web. The problem is that I can't (and won't) get perfect "right", "front" and "top" pictures. Blender only allows you to see the background pictures when in orthographic mode, and only from right/front/top, which doesn't help me. How do I proceed to model from non-perfect guide images?

    Read the article

  • SharePoint - force user to accept AUP when first logging in etc

    - by Chris W
    We're looking to move a bespoke intranet across to SharePoint. One query that has come up is whether we can do the following easily: When a user logs in for the first time, they should be forced to read and accept an Acceptable Use Policy for the site. They should also accept a separate agreement that relates to their data being shared with other parties. (Optional) They should upload their profile photo; they can skip this step if they don't have one, but they should be prompted to do it each time they log in subsequently. The above is all nice and easy in a bespoke app, but I can't see how to do this with SharePoint. Can we build a custom workflow that is tied to the user logging in? So far I can only find how to attach workflows to libraries and lists.

    Read the article
