Search Results

Search found 16230 results on 650 pages for 'three js'.


  • backbone.js - Having multiple instances of the same view

    - by TrueWheel
    I am having problems having multiple instances of the same view in different div elements. When I try to initialize them, only the second of the two elements appears, no matter which order I put them in. Here is the code for my view:

        var BodyShapeView = Backbone.View.extend({
          thingiview: null,
          scene: null,
          renderer: null,
          model: null,
          mouseX: 0,
          mouseY: 0,
          events: {
            'click button#front' : 'front',
            'click button#diag' : 'diag',
            'click button#in' : 'zoomIn',
            'click button#out' : 'zoomOut',
            'click button#on' : 'rotateOn',
            'click button#off' : 'rotateOff',
            'click button#wireframeOn' : 'wireOn',
            'click button#wireframeOff' : 'wireOff',
            'click button#distance' : 'dijkstra'
          },
          initialize: function(name){
            _.bindAll(this, 'render', 'animate');
            scene = new THREE.Scene();
            camera = new THREE.PerspectiveCamera( 15, 400 / 700, 1, 4000 );
            camera.position.z = 3;
            scene.add( camera );
            camera.position.y = -5;
            var ambient = new THREE.AmbientLight( 0x202020 );
            scene.add( ambient );
            var directionalLight = new THREE.DirectionalLight( 0xffffff, 0.75 );
            directionalLight.position.set( 0, 0, 1 );
            scene.add( directionalLight );
            var pointLight = new THREE.PointLight( 0xffffff, 5, 29 );
            pointLight.position.set( 0, -25, 10 );
            scene.add( pointLight );
            var loader = new THREE.OBJLoader();
            loader.load( "img/originalMeanModel.obj", function ( object ) {
              object.children[0].geometry.computeFaceNormals();
              var geometry = object.children[0].geometry;
              console.log(geometry);
              THREE.GeometryUtils.center(geometry);
              geometry.dynamic = true;
              var material = new THREE.MeshLambertMaterial({ color: 0xffffff, shading: THREE.FlatShading, vertexColors: THREE.VertexColors });
              mesh = new THREE.Mesh(geometry, material);
              model = mesh;
              // model = object;
              scene.add( model );
            } );
            // RENDERER
            renderer = new THREE.WebGLRenderer();
            renderer.setSize( 400, 700 );
            $(this.el).find('.obj').append( renderer.domElement );
            this.animate();
          },

    Here is how I create the instances:

        var morphableBody = new BodyShapeView({ el: $("#morphable-body") });
        var bodyShapeView = new BodyShapeView({ el: $("#mean-body") });

    Any help would be really appreciated. Thanks in advance.
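
    A likely cause — an inference from the code above, not something the poster confirmed — is that scene, camera, renderer and mesh are assigned inside initialize without var, so they become shared globals; the second view's initialize then overwrites the first view's objects, and only the last-created canvas keeps rendering. A minimal sketch of the per-instance fix:

        var BodyShapeView = Backbone.View.extend({
          initialize: function () {
            _.bindAll(this, 'render', 'animate');
            // Store everything on `this` so each view owns its own
            // scene graph instead of sharing implicit globals.
            this.scene = new THREE.Scene();
            this.camera = new THREE.PerspectiveCamera(15, 400 / 700, 1, 4000);
            this.camera.position.set(0, -5, 3);
            this.scene.add(this.camera);

            this.renderer = new THREE.WebGLRenderer();
            this.renderer.setSize(400, 700);
            $(this.el).find('.obj').append(this.renderer.domElement);
            this.animate();
          },
          animate: function () {
            // Each instance renders its own scene with its own camera.
            this.renderer.render(this.scene, this.camera);
          }
        });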


  • node.js simply does not run

    - by user309641
    I installed and ran node.js just fine on my Mac, but when I do this on Windows:

        chdir c:\testfolder
        node example.js

    I get this error:

        node.js:201
                throw e; // process.nextTick error, or 'error' event on first tick
        Error: Cannot find module 'c:\testfolder\example.js'
            at Function._resolveFilename (module.js:322:11)
            at Function._load (module.js:299:25)
            at Array.0 (module.js:499:10)
            at EventEmitter._tickCallback (node.js:192:40)

    I'm only trying to run the example code from the nodejs website:

        var http = require('http');
        http.createServer(function (req, res) {
          res.writeHead(200, {'Content-Type': 'text/plain'});
          res.end('Hello World\n');
        }).listen(1337, '127.0.0.1');
        console.log('Server running at http://127.0.0.1:1337/');


  • Node.js via DynDNS

    - by Azincourt
    I have never used Node.js, but since I am developing a browser game that needs (almost) "realtime" communication, I am planning on using Node.js for this. To get started, I wanted to use a home server (a normal computer) that is connected to a dynamic IP via DynDNS. Are there disadvantages to such a setup? And what is the best way, in combination with Node.js, to store game status for an online game session?


  • Node.js apps and wordpress on the same vps

    - by Msencenb
    So currently my linode (Ubuntu 11.10) serves up three node.js apps for me using connect's vhost middleware listening on port 80. Here is an example of how vhost sets up a domain:

        var portfolio = require('./bootstrap-portfolio/lib/app.js');
        var server = express();
        server.use(express.vhost('sencedev.com', portfolio));
        server.use(express.vhost('www.sencedev.com', portfolio));
        server.listen(80);

    However, I would now like to add a WordPress installation to my VPS as well. In the past this has meant a traditional Apache installation for me, but I'm a bit unsure of how node.js + a different web server (Apache or nginx) should interact. Any thoughts on how I should approach hosting WordPress + node.js on the same box?
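
    One common arrangement — a suggestion, not something from the question — is to make nginx the only listener on port 80 and dispatch by hostname: WordPress is served via PHP-FPM, while the node apps move to high ports that nginx proxies to. A minimal sketch, assuming the node app is moved from port 80 to 3000 and a stock php5-fpm socket (domain and paths are hypothetical):

        # WordPress vhost
        server {
            listen 80;
            server_name blog.example.com;
            root /var/www/wordpress;
            index index.php;

            location / {
                try_files $uri $uri/ /index.php?$args;
            }
            location ~ \.php$ {
                include fastcgi_params;
                fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
                fastcgi_pass unix:/var/run/php5-fpm.sock;
            }
        }

        # Node vhost -- the express app now listens on 127.0.0.1:3000
        server {
            listen 80;
            server_name sencedev.com www.sencedev.com;
            location / {
                proxy_pass http://127.0.0.1:3000;
                proxy_set_header Host $host;
            }
        }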


  • Connection timed out on Node.js app running under CentOS

    - by ss1271
    I followed this tutorial to create a simple node.js app on my CentOS box. The node.js version is:

        $ node -v
        v0.10.28

    Here's my app.js:

        // Include http module,
        var http = require("http"),
        // And url module, which is very helpful in parsing request parameters.
            url = require("url");

        // show message at console
        console.log('Node.js app is running.');

        // Create the server.
        http.createServer(function (request, response) {
          request.resume();
          // Attach listener on end event.
          request.on("end", function () {
            // Parse the request for arguments and store them in _get variable.
            // This function parses the url from request and returns object representation.
            var _get = url.parse(request.url, true).query;
            // Write headers to the response.
            response.writeHead(200, { 'Content-Type': 'text/plain' });
            // Send data and end response.
            response.end('Here is your data: ' + _get['data']);
          });
        // Listen on the 8080 port.
        }).listen(8080);

    However, when I uploaded this app onto my remote server (assume the address is 123.456.78.9), I couldn't reach it in my browser at http://123.456.78.9:8080/?data=123 — the browser returned ERR_CONNECTION_TIMED_OUT. The same app.js code runs fine on my local machine. Is there anything I am missing? I tried to ping the server and its address was reachable. Thanks.
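
    A timeout (rather than "connection refused") usually means packets are dropped before they reach node, and the default iptables rules on CentOS typically allow little besides SSH. A first check worth trying — an assumption about this server's firewall, not a confirmed diagnosis:

        # See whether anything in the rules mentions 8080
        sudo iptables -L -n

        # Open the port, then persist the rule (CentOS 6-style iptables service)
        sudo iptables -I INPUT -p tcp --dport 8080 -j ACCEPT
        sudo service iptables save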


  • node.js server not running

    - by CMDadabo
    I am trying to learn node.js, but I'm having trouble getting the simple server to run on localhost:8888. Here is the code for server.js:

        var http = require("http");
        http.createServer(function(request, response) {
          response.writeHead(200, {"Content-Type": "text/plain"});
          response.write("Hello World");
          response.end();
        }).listen(8888);

    server.js runs without errors, and trying netstat -an | grep 8888 from the terminal returns:

        tcp4       0      0  *.8888                 *.*                    LISTEN

    However, when I go to localhost:8888 in a browser, it says that the page cannot be found. I've looked at all the related questions, and nothing has worked so far. I've tried different ports, etc. I know that my router blocks incoming traffic on port 8888, but shouldn't that not matter if I'm trying to access it locally? I've run Tomcat servers on this port before, for example. Thanks so much for your help!

    node.js version: v0.6.15
    OS: Mac OS 10.6.8
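
    Since netstat already shows a listener on *.8888, the server side looks healthy; one plausible culprit — an assumption, not confirmed — is name resolution, since on some Mac setups localhost resolves to the IPv6 ::1 while the request only succeeds over IPv4. Binding and testing against 127.0.0.1 explicitly takes the hostname out of the picture:

        var http = require("http");

        http.createServer(function (request, response) {
          response.writeHead(200, { "Content-Type": "text/plain" });
          response.end("Hello World");
        }).listen(8888, "127.0.0.1", function () {
          // Fires once the socket is actually bound.
          console.log("Listening on http://127.0.0.1:8888/");
        });

    Then browse to http://127.0.0.1:8888/ (or curl it) instead of localhost:8888 to bypass the hosts file entirely.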


  • Node.js Adventure - Storage Services and Service Runtime

    - by Shaun
    When I described how to host a Node.js application on Windows Azure, one question that might be raised is how to consume the various Windows Azure services, such as storage, service bus, access control, etc. Interacting with Windows Azure services from Node.js is available through the Windows Azure Node.js SDK, which is a module available on NPM. In this post I would like to describe how to use Windows Azure Storage (a.k.a. WAS) as well as the service runtime.

    Consume Windows Azure Storage

    Let's first have a look at how to consume WAS through Node.js. As we know from the previous post, we can host a Node.js application on Windows Azure Web Site (a.k.a. WAWS) as well as Windows Azure Cloud Service (a.k.a. WACS). In theory, WAWS is also built on top of WACS worker roles with some more features. Hence in this post I will only demonstrate hosting in a WACS worker role. The Node.js code for consuming WAS can also be used when hosting on WAWS; but since there are no roles in WAWS, the code for consuming the service runtime mentioned in the next section cannot be used for a WAWS node application.

    We can use the solution that I created in my last post. Alternatively, we can create a new Windows Azure project in Visual Studio with a worker role, add "node.exe" and "index.js", install the "express" and "node-sqlserver" modules, and mark all files as "Copy always". In order to use Windows Azure services we need the Windows Azure Node.js SDK, known as a module named "azure", which can be installed through NPM. Once we have downloaded and installed it, we need to include it in our worker role project and mark it as "Copy always". You can use my "Copy all always" tool mentioned in my last post to update the current worker role project file. You can also find the source code of this tool here.

    The source code of the Windows Azure SDK for Node.js can be found on its GitHub page. It contains two parts. One is a CLI tool which provides a cross-platform command-line package for Mac and Linux to manage WAWS and Windows Azure Virtual Machines (a.k.a. WAVM). The other is a library for managing and consuming the various Windows Azure services, including tables, blobs, queues, service bus and the service runtime. I will not cover all of them, but will only demonstrate how to use tables and service runtime information in this post. You can find the full documentation of this SDK here.

    Back in Visual Studio, open "index.js" and let's continue our application from the last post, which was working against Windows Azure SQL Database (a.k.a. WASD). The code should look like this:
        var express = require("express");
        var sql = require("node-sqlserver");

        var connectionString = "Driver={SQL Server Native Client 10.0};Server=tcp:ac6271ya9e.database.windows.net,1433;Database=synctile;Uid=shaunxu@ac6271ya9e;Pwd={PASSWORD};Encrypt=yes;Connection Timeout=30;";
        var port = 80;

        var app = express();

        app.configure(function () {
          app.use(express.bodyParser());
        });

        app.get("/", function (req, res) {
          sql.open(connectionString, function (err, conn) {
            if (err) {
              console.log(err);
              res.send(500, "Cannot open connection.");
            }
            else {
              conn.queryRaw("SELECT * FROM [Resource]", function (err, results) {
                if (err) {
                  console.log(err);
                  res.send(500, "Cannot retrieve records.");
                }
                else {
                  res.json(results);
                }
              });
            }
          });
        });

        app.get("/text/:key/:culture", function (req, res) {
          sql.open(connectionString, function (err, conn) {
            if (err) {
              console.log(err);
              res.send(500, "Cannot open connection.");
            }
            else {
              var key = req.params.key;
              var culture = req.params.culture;
              var command = "SELECT * FROM [Resource] WHERE [Key] = '" + key + "' AND [Culture] = '" + culture + "'";
              conn.queryRaw(command, function (err, results) {
                if (err) {
                  console.log(err);
                  res.send(500, "Cannot retrieve records.");
                }
                else {
                  res.json(results);
                }
              });
            }
          });
        });

        app.get("/sproc/:key/:culture", function (req, res) {
          sql.open(connectionString, function (err, conn) {
            if (err) {
              console.log(err);
              res.send(500, "Cannot open connection.");
            }
            else {
              var key = req.params.key;
              var culture = req.params.culture;
              var command = "EXEC GetItem '" + key + "', '" + culture + "'";
              conn.queryRaw(command, function (err, results) {
                if (err) {
                  console.log(err);
                  res.send(500, "Cannot retrieve records.");
                }
                else {
                  res.json(results);
                }
              });
            }
          });
        });

        app.post("/new", function (req, res) {
          var key = req.body.key;
          var culture = req.body.culture;
          var val = req.body.val;

          sql.open(connectionString, function (err, conn) {
            if (err) {
              console.log(err);
              res.send(500, "Cannot open connection.");
            }
            else {
              var command = "INSERT INTO [Resource] VALUES ('" + key + "', '" + culture + "', N'" + val + "')";
              conn.queryRaw(command, function (err, results) {
                if (err) {
                  console.log(err);
                  res.send(500, "Cannot retrieve records.");
                }
                else {
                  res.send(200, "Inserted Successful");
                }
              });
            }
          });
        });

        app.listen(port);

    Now let's create a new function that copies the records from WASD to the table service:

    1. Delete the table named "resource".
    2. Create a new table named "resource". (These two steps ensure that we have an empty table.)
    3. Load all records from the "resource" table in WASD.
    4. For each record loaded from WASD, insert it into the table one by one.
    5. Prompt the user when finished.

    In order to use the table service we need the storage account and key, which can be found in the developer portal: just select the storage account and click the Manage Keys button. Then create two local variables in our Node.js application for the storage account name and key. Since we need to use WAS we need to import the azure module. I also created another variable storing the table name. In order to work with the table service I need to create the storage client for the table service.
    This is very similar to the Windows Azure SDK for .NET. In the code below I created a new variable named "client" and used "createTableService", specifying my storage account name and key:

        var azure = require("azure");
        var storageAccountName = "synctile";
        var storageAccountKey = "/cOy9L7xysXOgPYU9FjDvjrRAhaMX/5tnOpcjqloPNDJYucbgTy7MOrAW7CbUg6PjaDdmyl+6pkwUnKETsPVNw==";
        var tableName = "resource";
        var client = azure.createTableService(storageAccountName, storageAccountKey);

    Now create a new function for the URL "/was/init" so that we can trigger it through the browser. In this function we will first load all records from WASD:

        app.get("/was/init", function (req, res) {
          // load all records from windows azure sql database
          sql.open(connectionString, function (err, conn) {
            if (err) {
              console.log(err);
              res.send(500, "Cannot open connection.");
            }
            else {
              conn.queryRaw("SELECT * FROM [Resource]", function (err, results) {
                if (err) {
                  console.log(err);
                  res.send(500, "Cannot retrieve records.");
                }
                else {
                  if (results.rows.length > 0) {
                    // begin to transform the records into table service
                  }
                }
              });
            }
          });
        });

    When we have successfully loaded all records we can start to transform them into the table service. First I need to recreate the table in the table service. This can be done by deleting and creating the table through the table client I just created:

        app.get("/was/init", function (req, res) {
          // load all records from windows azure sql database
          sql.open(connectionString, function (err, conn) {
            if (err) {
              console.log(err);
              res.send(500, "Cannot open connection.");
            }
            else {
              conn.queryRaw("SELECT * FROM [Resource]", function (err, results) {
                if (err) {
                  console.log(err);
                  res.send(500, "Cannot retrieve records.");
                }
                else {
                  if (results.rows.length > 0) {
                    // begin to transform the records into table service
                    // recreate the table named 'resource'
                    client.deleteTable(tableName, function (error) {
                      client.createTableIfNotExists(tableName, function (error) {
                        if (error) {
                          error["target"] = "createTableIfNotExists";
                          res.send(500, error);
                        }
                        else {
                          // transform the records
                        }
                      });
                    });
                  }
                }
              });
            }
          });
        });

    As you can see, the azure SDK provides its methods in a callback pattern; in fact, almost all modules in Node.js use the callback pattern. For example, when I delete a table I invoke the "deleteTable" method, providing the name of the table and a callback function which will be performed when the table has been deleted or the operation has failed. Under the hood, the azure module performs the table deletion in a POSIX async thread pool, and once it's done the callback function is invoked. This is the reason we need to nest the table creation code inside the deletion callback: if we put the table creation code after the deletion code, the two would be invoked in parallel. Next, for each record in WASD I create an entity and insert it into the table service. Finally I send the response to the browser. Can you find the bug in the code below? I will describe it later in this post.
1: app.get("/was/init", function (req, res) { 2: // load all records from windows azure sql database 3: sql.open(connectionString, function (err, conn) { 4: if (err) { 5: console.log(err); 6: res.send(500, "Cannot open connection."); 7: } 8: else { 9: conn.queryRaw("SELECT * FROM [Resource]", function (err, results) { 10: if (err) { 11: console.log(err); 12: res.send(500, "Cannot retrieve records."); 13: } 14: else { 15: if (results.rows.length > 0) { 16: // begin to transform the records into table service 17: // recreate the table named 'resource' 18: client.deleteTable(tableName, function (error) { 19: client.createTableIfNotExists(tableName, function (error) { 20: if (error) { 21: error["target"] = "createTableIfNotExists"; 22: res.send(500, error); 23: } 24: else { 25: // transform the records 26: for (var i = 0; i < results.rows.length; i++) { 27: var entity = { 28: "PartitionKey": results.rows[i][1], 29: "RowKey": results.rows[i][0], 30: "Value": results.rows[i][2] 31: }; 32: client.insertEntity(tableName, entity, function (error) { 33: if (error) { 34: error["target"] = "insertEntity"; 35: res.send(500, error); 36: } 37: else { 38: console.log("entity inserted"); 39: } 40: }); 41: } 42: // send the 43: console.log("all done"); 44: res.send(200, "All done!"); 45: } 46: }); 47: }); 48: } 49: } 50: }); 51: } 52: }); 53: }); Now we can publish it to the cloud and have a try. But normally we’d better test it at the local emulator first. In Node.js SDK there are three build-in properties which provides the account name, key and host address for local storage emulator. We can use them to initialize our table service client. We also need to change the SQL connection string to let it use my local database. The code will be changed as below. 1: // windows azure sql database 2: //var connectionString = "Driver={SQL Server Native Client 10.0};Server=tcp:ac6271ya9e.database.windows.net,1433;Database=synctile;Uid=shaunxu@ac6271ya9e;Pwd=eszqu94XZY;Encrypt=yes;Connection Timeout=30;"; 3: // sql server 4: var connectionString = "Driver={SQL Server Native Client 11.0};Server={.};Database={Caspar};Trusted_Connection={Yes};"; 5:  6: var azure = require("azure"); 7: var storageAccountName = "synctile"; 8: var storageAccountKey = "/cOy9L7xysXOgPYU9FjDvjrRAhaMX/5tnOpcjqloPNDJYucbgTy7MOrAW7CbUg6PjaDdmyl+6pkwUnKETsPVNw=="; 9: var tableName = "resource"; 10: // windows azure storage 11: //var client = azure.createTableService(storageAccountName, storageAccountKey); 12: // local storage emulator 13: var client = azure.createTableService(azure.ServiceClient.DEVSTORE_STORAGE_ACCOUNT, azure.ServiceClient.DEVSTORE_STORAGE_ACCESS_KEY, azure.ServiceClient.DEVSTORE_TABLE_HOST); Now let’s run the application and navigate to “localhost:12345/was/init” as I hosted it on port 12345. We can find it transformed the data from my local database to local table service. Everything looks fine. But there is a bug in my code. If we have a look on the Node.js command window we will find that it sent response before all records had been inserted, which is not what I expected. The reason is that, as I mentioned before, Node.js perform all IO operations in non-blocking model. When we inserted the records we executed the table service insert method in parallel, and the operation of sending response was also executed in parallel, even though I wrote it at the end of my logic. 
    The correct logic should be: when all entities have been copied to the table service with no errors, send the response to the browser; otherwise send an error message. To do so I need to import another module named "async", which helps us coordinate our asynchronous code. Install the module and import it at the beginning of the code. Then we can use its "forEach" method for the asynchronous code of inserting the table entities. The first argument of "forEach" is the array to iterate over. The second argument is the operation to perform for each item in the array. And the third argument will be invoked when all items have been processed or any error has occurred — here we can send our response to the browser:

        app.get("/was/init", function (req, res) {
          // load all records from windows azure sql database
          sql.open(connectionString, function (err, conn) {
            if (err) {
              console.log(err);
              res.send(500, "Cannot open connection.");
            }
            else {
              conn.queryRaw("SELECT * FROM [Resource]", function (err, results) {
                if (err) {
                  console.log(err);
                  res.send(500, "Cannot retrieve records.");
                }
                else {
                  if (results.rows.length > 0) {
                    // begin to transform the records into table service
                    // recreate the table named 'resource'
                    client.deleteTable(tableName, function (error) {
                      client.createTableIfNotExists(tableName, function (error) {
                        if (error) {
                          error["target"] = "createTableIfNotExists";
                          res.send(500, error);
                        }
                        else {
                          async.forEach(results.rows,
                            // transform the records
                            function (row, callback) {
                              var entity = {
                                "PartitionKey": row[1],
                                "RowKey": row[0],
                                "Value": row[2]
                              };
                              client.insertEntity(tableName, entity, function (error) {
                                if (error) {
                                  callback(error);
                                }
                                else {
                                  console.log("entity inserted.");
                                  callback(null);
                                }
                              });
                            },
                            // send response
                            function (error) {
                              if (error) {
                                error["target"] = "insertEntity";
                                res.send(500, error);
                              }
                              else {
                                console.log("all done");
                                res.send(200, "All done!");
                              }
                            }
                          );
                        }
                      });
                    });
                  }
                }
              });
            }
          });
        });

    Run it locally and now we can see that the response is sent after all entities have been inserted.

    Querying entities against the table service is simple as well: just use the "queryEntity" method of the table service client, providing the partition key and row key. We can also provide more complex query criteria, as in the code here. In the code below I query an entity by partition key and row key, and return the proper localization value in the response:

        app.get("/was/:key/:culture", function (req, res) {
          var key = req.params.key;
          var culture = req.params.culture;
          client.queryEntity(tableName, culture, key, function (error, entity) {
            if (error) {
              res.send(500, error);
            }
            else {
              res.json(entity);
            }
          });
        });

    And then I tested it on the local emulator. Finally, if we want to publish this application to the cloud we should change the database connection string and storage account back. For more information about how to consume the blob and queue services, as well as the service bus, please refer to the MSDN page.

    Consume Service Runtime

    As I mentioned above, before we publish our application to the cloud we need to change the connection string and account information in our code.
    But if you have played with WACS you should know that the service runtime provides the ability to retrieve configuration settings, endpoints and local resource information at runtime. This means we can have these values defined in the CSCFG and CSDEF files, and the runtime will retrieve the proper values for us. For example, we can add some role settings through the property window of the role, specifying the connection string and storage account for both the cloud and local environments. We can also use the endpoint defined in the role environment in our Node.js application.

    In the Node.js SDK we can get an object from "azure.RoleEnvironment" which provides the functionality to retrieve configuration settings, endpoints, etc. In the code below I defined the connection string variables and then used the SDK to retrieve the values and initialize the table client:

        var connectionString = "";
        var storageAccountName = "";
        var storageAccountKey = "";
        var tableName = "";
        var client;

        azure.RoleEnvironment.getConfigurationSettings(function (error, settings) {
          if (error) {
            console.log("ERROR: getConfigurationSettings");
            console.log(JSON.stringify(error));
          }
          else {
            console.log(JSON.stringify(settings));
            connectionString = settings["SqlConnectionString"];
            storageAccountName = settings["StorageAccountName"];
            storageAccountKey = settings["StorageAccountKey"];
            tableName = settings["TableName"];

            console.log("connectionString = %s", connectionString);
            console.log("storageAccountName = %s", storageAccountName);
            console.log("storageAccountKey = %s", storageAccountKey);
            console.log("tableName = %s", tableName);

            client = azure.createTableService(storageAccountName, storageAccountKey);
          }
        });

    In this way we don't need to amend the code for the configuration differences between the local and cloud environments, since the service runtime takes care of it. At the end of the code we also listen on the port retrieved from the SDK:

        azure.RoleEnvironment.getCurrentRoleInstance(function (error, instance) {
          if (error) {
            console.log("ERROR: getCurrentRoleInstance");
            console.log(JSON.stringify(error));
          }
          else {
            console.log(JSON.stringify(instance));
            if (instance["endpoints"] && instance["endpoints"]["nodejs"]) {
              var endpoint = instance["endpoints"]["nodejs"];
              app.listen(endpoint["port"]);
            }
            else {
              app.listen(8080);
            }
          }
        });

    But if we test the application right now we will find that it cannot retrieve any values from the service runtime. This is because, by default, the entry point of this role is the worker role class. In the Windows Azure environment the service runtime opens a named pipe to the entry point instance so that it can connect to the runtime and retrieve values. But in this case, since the entry point is the worker role class and Node.js is launched inside the role, the named pipe is established between our worker role class and the service runtime, so our Node.js application cannot use it. To fix this problem we need to open the CSDEF file under the azure project and add a new element named Runtime, then add an element named EntryPoint which specifies the Node.js command line. This gives the Node.js application the connection to the service runtime, so it's able to read the configuration. Start Node.js in the local emulator and we can see it retrieve the connection string and storage account for the local environment.
    And if we publish our application to azure, it works with WASD and the storage service through the configuration for the cloud.

    Summary

    In this post I demonstrated how to use the Windows Azure SDK for Node.js to interact with the storage service, especially the table service. I also demonstrated how to use the WACS service runtime — how to retrieve the configuration settings and the endpoint information — and that, in order to make the service runtime available to my Node.js application, I needed to create an entry point element in the CSDEF file and set "node.exe" as the entry point.

    I used five posts to introduce and demonstrate how to run a Node.js application on the Windows platform, how to use Windows Azure Web Site and a Windows Azure Cloud Service worker role to host a Node.js application, and how to work with other services provided by the Windows Azure platform through the Windows Azure SDK for Node.js. Node.js is a very new and young network application platform, but since it's very simple and easy to learn and deploy, and utilizes a single-threaded, non-blocking IO model, Node.js has become more and more popular for web application and web service development, especially for IO-intensive projects. And as Node.js is very good at scaling out, it's all the more useful on a cloud computing platform. Using Node.js on the Windows platform is new, too: the modules for SQL database and the Windows Azure SDK are still under development and enhancement — "node-sqlserver" doesn't support SQL parameters yet, and "azure" doesn't yet support using a storage connection string to create the storage client — but Microsoft is working on making them easier to use and on adding more features and functionality.

    PS: you can download the source code here, and the source code of my "Copy all always" tool here.

    Hope this helps,
    Shaun

    All documents and related graphics and code are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.


  • Can't run node.js script on server reboot

    - by webstyle
    I need to listen for events on port 3240 and I'm using node.js for that purpose. I need to execute my script with the forever tool, and I also need to run forever on server reboot. When I run forever glh.js manually, everything works: forever list says there is a running process. But when I try to run forever on server reboot I can't get it working. I've created a file in /etc/init.d with the following content:

        #!/bin/bash
        /var/www/yan/data/gitlabhook/runglh.sh &>/var/www/yan/data/gitlabhook/runglh.log

    When I reboot the server, the output log is the following (the same as when I run it manually from the console):

        info: Forever processing file: glh.js

    But in this case forever doesn't start a process; forever list outputs:

        info: No forever processes running
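
    A common cause — an assumption here, since the contents of runglh.sh aren't shown — is that init.d scripts run without the PATH and HOME of an interactive shell: forever keeps its process registry under $HOME/.forever, so a boot-time run with a different (or empty) HOME writes to a different registry than the one a console session reads, and node may not even be on the PATH. A sketch of runglh.sh with everything made explicit (the paths are hypothetical):

        #!/bin/bash
        # Make the environment explicit: init.d scripts don't inherit yours.
        export PATH=/usr/local/bin:/usr/bin:$PATH
        export HOME=/root    # forever stores its state under $HOME/.forever

        cd /var/www/yan/data/gitlabhook
        /usr/local/bin/forever start \
          -l /var/www/yan/data/gitlabhook/runglh.log -a \
          glh.js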


  • Multiple .obj Models in THREE.js and detecting clicked objects

    - by ZedBee
    I have been following this example to load a .obj model using three.js. Since I needed more than one model to load, I tried this:

        loader.addEventListener('load', function (event) {
          var object = event.content;
          object.position.y = -80;
          scene.add(object);
        });
        loader.load('obj/model1.obj');
        loader.load('obj/model2.obj');

    First: I don't know whether this is the right approach or not, since I searched but didn't find any tutorial loading more than one .obj model. Second: I want to be able to click different models on the screen. I tried this, which does not seem to work for me. Any suggestions?
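
    For the click-detection half, the usual technique is raycasting: convert the click to normalized device coordinates, unproject it into world space, and intersect a ray with the scene graph. A sketch, assuming the camera/scene globals from the question and a three.js revision of that era (the Projector/Raycaster API was renamed more than once, so check the calls against your version):

        var projector = new THREE.Projector();

        function onDocumentMouseDown(event) {
          event.preventDefault();

          // Click position in normalized device coordinates (-1..+1).
          var vector = new THREE.Vector3(
            (event.clientX / window.innerWidth) * 2 - 1,
            -(event.clientY / window.innerHeight) * 2 + 1,
            0.5);
          projector.unprojectVector(vector, camera);

          var raycaster = new THREE.Raycaster(
            camera.position, vector.subSelf(camera.position).normalize());

          // `true` makes the test recursive, which matters because
          // OBJLoader returns a parent object whose meshes are children.
          var intersects = raycaster.intersectObjects(scene.children, true);
          if (intersects.length > 0) {
            console.log('clicked', intersects[0].object);
          }
        }

        document.addEventListener('mousedown', onDocumentMouseDown, false);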


  • What is the correct install process to set up Node.js with the Windows Azure Emulator

    - by PazoozaTest Pazman
    This question is related to this question: Node.js running under IIS Express Keeps Crashing. I need help with reinstalling and getting node.js up and running in the Windows emulator.

    Hello, I am reinstalling my machine: Toshiba laptop, 2 GB RAM, 32-bit processor.

    What is the correct procedure from start to finish to get node.js development working? So far nothing has worked, and the emulator (IIS Express) worker process keeps crashing — no matter how many instances, they all end up crashing. Up until two weeks ago my node development was working fine, but I had to do a reinstall, and since then I haven't been doing any node.js development in the Windows emulator because the latest June 2012 Azure SDK for Node.js is buggy.

    These are the steps I have taken:

    1) Reformat HD
    2) Insert Windows 7 N SP1 CD
    3) Reboot machine into CD installation
    4) Follow and wait until Windows 7 is installed
    5) Run Add/Remove Programs + enable IIS + IIS management tools
    6) Run Windows Update (installed about 53 updates)
    7) Go to http://www.windowsazure.com/en-us/develop/nodejs/
    8) Click Windows Installer June 2012 and install Windows Azure SDK for Node.js - June 2012
    9) Run Azure PowerShell
    10) Navigate to c:\node\testSite\webrole1
    11) Launch the site: start-azureemulator -launch
    12) Play around on the website (then crash!)

    Problem signature:
        Problem Event Name:       APPCRASH
        Application Name:         iisexpress.exe
        Application Version:      8.0.8298.0
        Application Timestamp:    4f620349
        Fault Module Name:        iiscore.dll
        Fault Module Version:     8.0.8298.0
        Fault Module Timestamp:   4f63b65c
        Exception Code:           c0000005
        Exception Offset:         00021767
        OS Version:               6.1.7601.2.1.0.256.28
        Locale ID:                1033
        Additional Information 1: f66d
        Additional Information 2: f66d807b515d6b2dc6f28f66db769a01
        Additional Information 3: 7b2f
        Additional Information 4: 7b2f6797d07ebc2c23f2b227e779722e

    Am I missing a step in my reinstall process? Do I have all the required files to do node.js Windows Azure emulator development? Why is IIS Express crashing all the time? Can I still do node.js Windows Azure emulator development without using IIS Express, and use the local IIS 7.x that ships with my Windows 7 N (SP1) instead?


  • I have an NGINX server configured to work with node.js, but a 1.03 MB js file often fails to load in various browsers on various PCs

    - by Totty
    I'm using this on a local LAN, so it should be quite fast. The nginx server uses the node.js server to serve static files, so requests must pass through node.js to download the files — but that is not a problem when I'm not using nginx. In Chrome with the debugger on I can see that the status is 206 Partial Content and only 31KB of 1.03MB has been downloaded. After 1.1 min the request turns red and the status is "failed". Waiting time: 6 ms. Receiving: 1.1 min.

    The headers in Google Chrome:

        Request URL:    http://192.168.1.16/production/assembly/script/production.js
        Request Method: GET
        Status Code:    206 Partial Content

        Request Headers:
            Accept: */*
            Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3
            Accept-Encoding: gzip,deflate,sdch
            Accept-Language: pt-PT,pt;q=0.8,en-US;q=0.6,en;q=0.4
            Connection: keep-alive
            Cookie: connect.sid=s%3Abls2qobcCaJ%2FyBNZwedtDR9N.0vD4Fi03H1bEdCszGsxIjjK0lZIjJhLnToWKFVxZOiE
            Host: 192.168.1.16
            If-Range: "1081715-1350053827000"
            Range: bytes=16090-16090
            Referer: http://192.168.1.16/production/assembly/
            User-Agent: Mozilla/5.0 (Windows NT 6.0) AppleWebKit/537.4 (KHTML, like Gecko) Chrome/22.0.1229.94 Safari/537.4

        Response Headers:
            Accept-Ranges: bytes
            Cache-Control: public, max-age=0
            Connection: keep-alive
            Content-Length: 1
            Content-Range: bytes 16090-16090/1081715
            Content-Type: application/javascript
            Date: Mon, 15 Oct 2012 09:18:50 GMT
            ETag: "1081715-1350053827000"
            Last-Modified: Fri, 12 Oct 2012 14:57:07 GMT
            Server: nginx/1.1.19
            X-Powered-By: Express

    My nginx configuration — file 1:

        user totty;
        worker_processes 4;
        pid /var/run/nginx.pid;

        events {
            worker_connections 768;
            # multi_accept on;
        }

        http {
            ##
            # Basic Settings
            ##
            sendfile on;
            tcp_nopush on;
            tcp_nodelay on;
            keepalive_timeout 65;
            types_hash_max_size 2048;
            # server_tokens off;
            # server_names_hash_bucket_size 64;
            # server_name_in_redirect off;
            include /etc/nginx/mime.types;
            default_type application/octet-stream;

            ##
            # Logging Settings
            ##
            access_log /home/totty/web/production01_server/node_modules/production/_logs/_NGINX_access.txt;
            error_log /home/totty/web/production01_server/node_modules/production/_logs/_NGINX_error.txt;

            ##
            # Gzip Settings
            ##
            gzip on;
            gzip_disable "msie6";
            # gzip_vary on;
            # gzip_proxied any;
            # gzip_comp_level 6;
            # gzip_buffers 16 8k;
            # gzip_http_version 1.1;
            # gzip_types text/plain text/css application/json application/x-javascript text/xml application/xml application/xml+rss text/javascript;

            ##
            # nginx-naxsi config
            ##
            # Uncomment it if you installed nginx-naxsi
            ##
            #include /etc/nginx/naxsi_core.rules;

            ##
            # nginx-passenger config
            ##
            # Uncomment it if you installed nginx-passenger
            ##
            #passenger_root /usr;
            #passenger_ruby /usr/bin/ruby;

            ##
            # Virtual Host Configs
            ##
            autoindex on;
            include /home/totty/web/production01_server/_deployment/nginxConfigs/server/*;
        }

    File that is included by the previous file:

        server {
            # custom location for entry
            # using only "/" instead of "/production/assembly" it
            # would allow you to go to "thatip/". In this way
            # we are limiting to "thatip/production/assembly/"
            location /production/assembly/ {
                # ip and port used in node.js
                proxy_pass http://127.0.0.1:3000/;
            }

            location /production/assembly.mongo/ {
                proxy_pass http://127.0.0.1:9000/;
                proxy_redirect off;
            }

            location /production/assembly.logs/ {
                autoindex on;
                alias /home/totty/web/production01_server/node_modules/production/_logs/;
            }
        }
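
    The Range: bytes=16090-16090 request and the 206 reply show Chrome retrying mid-file, which suggests the proxied transfer keeps stalling. One way to take node out of the loop for static assets — a sketch, and the alias path is only a guess at where the app's files actually live — is to let nginx serve them from disk and only proxy dynamic requests:

        location /production/assembly/script/ {
            # Serve JS straight from disk instead of proxying through node.
            alias /home/totty/web/production01_server/node_modules/production/script/;
            expires 1h;
        }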


  • Nodejs for processing js and Nginx for handling everything else

    - by Kevin Parker
    I have nodejs running on port 8000 and nginx on port 80 on the same server. I want nginx to handle all requests (images, css, etc.) and forward only the js requests to the nodejs server on port 8000. Is it possible to achieve this? I have configured nginx as a reverse proxy, but it forwards every request to nodejs; I want nginx to process everything except js. From nginx/sites-enabled/default:

        upstream nodejs {
            server localhost:8000;   # nodejs
        }

        location / {
            proxy_pass http://192.168.2.21:8000;
            proxy_next_upstream error timeout invalid_header http_500 http_502 http_503 http_504;
            proxy_redirect off;
            proxy_buffering off;
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        }
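
    nginx picks the most specific matching location per request, so the split can be expressed with a regex location for .js files plus a catch-all. A sketch — the document root is hypothetical:

        # JS goes to the node backend defined in the upstream block.
        location ~* \.js$ {
            proxy_pass http://nodejs;
            proxy_set_header Host $host;
        }

        # Everything else (images, css, html, ...) is served by nginx from disk.
        location / {
            root /var/www/mysite;        # hypothetical document root
            try_files $uri $uri/ =404;
        }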


  • What does a node.js web application's setup look like on a real production server?

    - by joe
    Being new to the node.js magic world, I'm wondering what a node.js web application's setup looks like on a real production server. So far all the tutorials create the js file, start it from a console... and that's it. Has anyone created a real-world web app that uses node.js on the back end? Can you please describe how it is set up, and how reliable this infrastructure is? I'm coming from the asp.net and php world, which requires heavy web servers, and I can't form a clear idea about the node stuff.


  • 250k connections for comet with node.js

    - by Nenad
    How should node.js be set up to handle 250k connections as a comet server (on the client side we use socket.io)? Would using nginx as a proxy/load balancer be the right solution, or would HAProxy be the better way? Does anyone have real-world experience with 100k+ connections and can share their setup? Would a setup like this be the right one (quad-core CPU per server — start 4 instances of node.js per server)?

                  nginx (as proxy / load balancing server)
                 /                    |                    \
        node server #1        node server #2        node server #3
         4 instances           4 instances           4 instances


  • Node.js installation on Debian 6

    - by pvorb
    I used to use this method for node.js installation on Debian, since it was easy and everything worked fine, even with multiple users. But since version 0.6.18~dfsg1-1 of the sid package, installation removes openssh-server — and I need OpenSSH to connect to my server. Is there any possibility to install Node.js via APT, or do I have to compile it manually? This is my APT preferences file:

        Package: *
        Pin: release a=stable
        Pin-Priority: 800

        Package: *
        Pin: release a=testing
        Pin-Priority: 650

        Package: *
        Pin: release a=unstable
        Pin-Priority: 600
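
    If the packaging conflict can't be resolved, compiling manually is straightforward on Debian 6 — a sketch, assuming the build prerequisites come from the repositories; the version number is only an example:

        sudo apt-get install build-essential libssl-dev pkg-config python
        wget http://nodejs.org/dist/v0.6.18/node-v0.6.18.tar.gz
        tar xzf node-v0.6.18.tar.gz
        cd node-v0.6.18
        ./configure --prefix=/usr/local
        make
        sudo make install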


  • Learning AngularJS by Example – The Customer Manager Application

    - by dwahlin
    I’m always tinkering around with different ideas and toward the beginning of 2013 decided to build a sample application using AngularJS that I call Customer Manager. It’s not exactly the most creative name or concept, but I wanted to build something that highlighted a lot of the different features offered by AngularJS and how they could be used together to build a full-featured app. One of the goals of the application was to ensure that it was approachable by people new to Angular since I’ve never found overly complex applications great for learning new concepts. The application initially started out small and was used in my AngularJS in 60-ish Minutes video on YouTube but has gradually had more and more features added to it and will continue to be enhanced over time. It’ll be used in a new “end-to-end” training course my company is working on for AngularjS as well as in some video courses that will be coming out. Here’s a quick look at what the application home page looks like: In this post I’m going to provide an overview about how the application is organized, back-end options that are available, and some of the features it demonstrates. I’ve already written about some of the features so if you’re interested check out the following posts: Building an AngularJS Modal Service Building a Custom AngularJS Unique Value Directive Using an AngularJS Factory to Interact with a RESTful Service Application Structure The structure of the application is shown to the right. The  homepage is index.html and is located at the root of the application folder. It defines where application views will be loaded using the ng-view directive and includes script references to AngularJS, AngularJS routing and animation scripts, plus a few others located in the Scripts folder and to custom application scripts located in the app folder. The app folder contains all of the key scripts used in the application. There are several techniques that can be used for organizing script files but after experimenting with several of them I decided that I prefer things in folders such as controllers, views, services, etc. Doing that helps me find things a lot faster and allows me to categorize files (such as controllers) by functionality. My recommendation is to go with whatever works best for you. Anyone who says, “You’re doing it wrong!” should be ignored. Contrary to what some people think, there is no “one right way” to organize scripts and other files. As long as the scripts make it down to the client properly (you’ll likely minify and concatenate them anyway to reduce bandwidth and minimize HTTP calls), the way you organize them is completely up to you. Here’s what I ended up doing for this application: Animation code for some custom animations is located in the animations folder. In addition to AngularJS animations (which are defined using CSS in Content/animations.css), it also animates the initial customer data load using a 3rd party script called GreenSock. Controllers are located in the controllers folder. Some of the controllers are placed in subfolders based upon the their functionality while others are placed at the root of the controllers folder since they’re more generic:   The directives folder contains the custom directives created for the application. The filters folder contains the custom filters created for the application that filter city/state and product information. The partials folder contains partial views. This includes things like modal dialogs used in the application. 
    - The services folder contains AngularJS factories and services used for various purposes in the application. Most of the scripts in this folder provide data functionality.
    - The views folder contains the different views used in the application. Like the controllers folder, the views are organized into subfolders based on their functionality.

    Back-End Services

    The Customer Manager application (grab it from Github) provides two different options on the back-end: ASP.NET Web API and Node.js. The ASP.NET Web API back-end uses Entity Framework for data access and stores data in SQL Server (LocalDb). The other option on the back-end is Node.js, Express, and MongoDB.

    Using the ASP.NET Web API Back-End

    To run the application using the ASP.NET Web API/SQL Server back-end, open the .sln file at the root of the project in Visual Studio 2012 or higher (the free Express 2013 for Web version is fine). Press F5 and a browser will automatically launch and display the application.

    Using the Node.js Back-End

    To run the application using the Node.js/MongoDB back-end, follow these steps:

    1. In the CustomerManager directory execute 'npm install' to install Express, MongoDB and Mongoose (package.json).
    2. Load sample data into MongoDB by performing the following steps:
       - Execute 'mongod' to start the MongoDB daemon.
       - Navigate to the CustomerManager directory (the one that has initMongoCustData.js in it), then execute 'mongo' to start the MongoDB shell.
       - Enter the following in the mongo shell to load the seed files that handle seeding the database with initial data:

             use custmgr
             load("initMongoCustData.js")
             load("initMongoSettingsData.js")
             load("initMongoStateData.js")

    3. Start the Node/Express server by navigating to the CustomerManager/server directory and executing 'node app.js'.
    4. View the application at http://localhost:3000 in your browser.

    Key Features

    The Customer Manager application certainly doesn't cover every feature provided by AngularJS (as mentioned, the intent was to keep it as simple as possible), but it does provide insight into several key areas:

    - Using factories and services as re-useable data services (see the app/services folder)
    - Creating custom directives (see the app/directives folder)
    - Custom paging (see app/views/customers/customers.html and app/controllers/customers/customersController.js)
    - Custom filters (see app/filters)
    - Showing custom modal dialogs with a re-useable service (see app/services/modalService.js)
    - Making Ajax calls using a factory (see app/services/customersService.js)
    - Using Breeze to retrieve and work with data (see app/services/customersBreezeService.js). Switch the application to use the Breeze factory by opening app/services.config.js and changing the useBreeze property to true.
    - Intercepting HTTP requests to display a custom overlay during Ajax calls (see app/directives/wcOverlay.js)
    - Custom animations using the GreenSock library (see app/animations/listAnimations.js)
    - Creating custom AngularJS animations using CSS (see Content/animations.css)
    - JavaScript patterns for defining controllers, services/factories, directives, filters, and more (see any JavaScript file in the app folder)
    - Card View and List View display of data (see app/views/customers/customers.html and app/controllers/customers/customersController.js)
    - Using AngularJS validation functionality (see app/views/customerEdit.html, app/controllers/customerEditController.js, and app/directives/wcUnique.js)
    - More...

    Conclusion

    I'll be enhancing the application even more over time and welcome contributions as well.
Tony Quinn contributed the initial Node.js/MongoDB code which is very cool to have as a back-end option. Access the standard application here and a version that has custom routing in it here. Additional information about the custom routing can be found in this post.


  • Add JS file on a certain page on Drupal

    - by Asaf
    I've got a JS file that I want to add on Admin > Add Content > [a certain content type]. After looking at template.php and checking out the function theme_preprocess_node, I tried to add the JS through drupal_add_js(...), but no go. Now, I know that there's a similar question, however my case is about adding a JS file to one certain page and nothing else (better separation of JS files). Thanks. (Drupal 6)
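
    One Drupal 6 approach — a sketch, with hypothetical module and content-type names — is to target the node form for that one content type from a small custom module, which keeps the file off every other page:

        <?php
        // mymodule.module -- hypothetical module name.
        function mymodule_form_alter(&$form, $form_state, $form_id) {
          // Node add/edit forms have form ids like '<type>_node_form'.
          if ($form_id == 'story_node_form') {  // 'story' is a placeholder type
            drupal_add_js(drupal_get_path('module', 'mymodule') . '/mymodule.js');
          }
        }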


  • node.js / socket.io, cookies only working locally

    - by Ben Griffiths
    I'm trying to use cookie-based sessions; however, it'll only work on the local machine, not over the network. If I remove the session-related stuff, it works just fine over the network... You'll have to forgive the lack of quality code here; I'm just starting out with node/socket etc., and finding any clear guides is tough going, so I'm in n00b territory right now. Basically this is so far hacked together from various snippets with about 10% understanding of what I'm actually doing...

    The error I see in Chrome is:

        socket.io.js:1632 GET http://192.168.0.6:8080/socket.io/1/?t=1334431940273 500 (Internal Server Error)
            Socket.handshake        socket.io.js:1632
            Socket.connect          socket.io.js:1671
            Socket                  socket.io.js:1530
            io.connect              socket.io.js:91
            (anonymous function)    /socket-test/:9
            jQuery.extend.ready     jquery.js:438

    And in the console for the server I see:

        debug - served static content /socket.io.js
        debug - authorized
        warn  - handshake error No cookie

    My server is:

        var express = require('express')
          , app = express.createServer()
          , io = require('socket.io').listen(app)
          , connect = require('express/node_modules/connect')
          , parseCookie = connect.utils.parseCookie
          , RedisStore = require('connect-redis')(express)
          , sessionStore = new RedisStore();

        app.listen(8080, '192.168.0.6');

        app.configure(function() {
          app.use(express.cookieParser());
          app.use(express.session({
            secret: 'YOURSOOPERSEKRITKEY',
            store: sessionStore
          }));
        });

        io.configure(function() {
          io.set('authorization', function(data, callback) {
            if (data.headers.cookie) {
              var cookie = parseCookie(data.headers.cookie);
              sessionStore.get(cookie['connect.sid'], function(err, session) {
                if (err || !session) {
                  callback('Error', false);
                } else {
                  data.session = session;
                  callback(null, true);
                }
              });
            } else {
              callback('No cookie', false);
            }
          });
        });

        var users_count = 0;
        io.sockets.on('connection', function (socket) {
          console.log('New Connection');
          var session = socket.handshake.session;
          ++users_count;
          io.sockets.emit('users_count', users_count);
          socket.on('something', function(data) {
            io.sockets.emit('doing_something', data['data']);
          });
          socket.on('disconnect', function() {
            --users_count;
            io.sockets.emit('users_count', users_count);
          });
        });

    My page JS is:

        jQuery(function($){
          var socket = io.connect('http://192.168.0.6', { port: 8080 });
          socket.on('users_count', function(data) {
            $('#client_count').text(data);
          });
          socket.on('doing_something', function(data) {
            if (data == '') {
              window.setTimeout(function() {
                $('#target').text(data);
              }, 3000);
            } else {
              $('#target').text(data);
            }
          });
          $('#textbox').keydown(function() {
            socket.emit('something', { data: 'typing' });
          });
          $('#textbox').keyup(function() {
            socket.emit('something', { data: '' });
          });
        });


  • How to create a stand alone command line application with Node.js

    - by Fab
    I'm trying to find a way to run a command-line nodejs application that I created on a computer without node.js installed. In other words, how do I package my application with node.js inside, in order to avoid requiring users to have node.js already installed? The typical use case is: I run the application and it works using the node core that is provided with the application (or the application checks whether node.js is installed, and if not it downloads and installs it automatically). Do you have any ideas?


  • Configure nginx for multiple node.js apps with own domains

    - by udo
    I have a node webapp up and running with nginx on my Debian Squeeze box. Now I want to add another one with its own domain, but when I do so only the first app is served: even when I go to the second domain I simply get redirected to the first webapp. I hope you can see what I did wrong here.

    example1.conf:

        upstream example1.com {
            server 127.0.0.1:3000;
        }

        server {
            listen 80;
            server_name www.example1.com;
            rewrite ^/(.*) http://example1.com/$1 permanent;
        }

        # the nginx server instance
        server {
            listen 80;
            server_name example1.com;
            access_log /var/log/nginx/example1.com/access.log;

            # pass the request to the node.js server with the correct headers;
            # much more can be added, see nginx config options
            location / {
                proxy_set_header X-Real-IP $remote_addr;
                proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
                proxy_set_header Host $http_host;
                proxy_set_header X-NginX-Proxy true;
                proxy_pass http://example1.com;
                proxy_redirect off;
            }
        }

    example2.conf:

        upstream example2.com {
            server 127.0.0.1:1111;
        }

        server {
            listen 80;
            server_name www.example2.com;
            rewrite ^/(.*) http://example2.com/$1 permanent;
        }

        # the nginx server instance
        server {
            listen 80;
            server_name example2.com;
            access_log /var/log/nginx/example2.com/access.log;

            # pass the request to the node.js server with the correct headers;
            # much more can be added, see nginx config options
            location / {
                proxy_set_header X-Real-IP $remote_addr;
                proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
                proxy_set_header Host $http_host;
                proxy_set_header X-NginX-Proxy true;
                proxy_pass http://example2.com;
                proxy_redirect off;
            }
        }

    curl simply does this:

        zazzl:Desktop udo$ curl -I http://example2.com/
        HTTP/1.1 301 Moved Permanently
        Server: nginx/1.2.2
        Date: Sat, 04 Aug 2012 13:46:30 GMT
        Content-Type: text/html
        Content-Length: 184
        Connection: keep-alive
        Location: http://example1.com/

    Thanks :)
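
    One thing worth ruling out — an assumption, since the 301 could equally come from a stale config that was never enabled or reloaded — is the upstream naming: calling an upstream block example1.com makes proxy_pass http://example1.com; resolve to the upstream rather than the public domain, which is legal but makes copy/paste mistakes between the two files easy to miss. Internal names keep the mapping explicit:

        upstream app_example2 {
            server 127.0.0.1:1111;
        }

        server {
            listen 80;
            server_name example2.com;
            location / {
                proxy_set_header Host $http_host;
                proxy_pass http://app_example2;   # internal name, not the public domain
            }
        }

        # then verify and reload:
        #   nginx -t && /etc/init.d/nginx reload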


  • With NVD3.js (nv.models.lineWithFocusChart), how do you set specific ticks on X-axis, when x values are dates?

    - by Panagiotis Panagi
    I'm using nv.models.lineWithFocusChart to show some hourly measurements, so the x domain is dates. What I need is to show one tick per hour on the X axis: 00:00 01:00 02:00 ... 24:00. I've tried everything I could find, but nothing works. It seems easy to set specific ticks when the values are not dates, but I can't figure out how to do it for dates. Here's the (CoffeeScript) code that creates the graph, if it helps:

        nv.addGraph ->
          chart = nv.models.lineWithFocusChart(view)
          chart.yAxis.tickFormat(d3.format(".02f"))
          d3.select(graphSelector).datum([]).transition().duration(500).call(chart)
          nv.utils.windowResize -> d3.select(graphSelector).call(chart)
          chart.xAxis.tickFormat (d) -> d3.time.format(ticksFormat)(new Date(d))
          chart.x2Axis.tickFormat (d) -> d3.time.format(ticksFormat)(new Date(d))
          chart.xAxis.tickSubdivide(9)
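
    nvd3 axes proxy most calls through to the underlying d3 axis, so explicit hourly ticks can be handed over as an array of timestamps via tickValues — assuming your nvd3 build exposes it, which builds of this period do. A sketch in plain JavaScript, with a placeholder day:

        // One tick per hour for the day being displayed.
        var start = new Date(2012, 0, 1);        // hypothetical day in the data
        var end   = new Date(2012, 0, 2);
        var hours = d3.time.hours(start, end);   // [00:00, 01:00, ..., 23:00]

        chart.xAxis
          .tickValues(hours.map(function (d) { return d.getTime(); }))
          .tickFormat(function (d) { return d3.time.format('%H:%M')(new Date(d)); });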


  • different small js files per page VS. 1x site-wide js file?

    - by Haroldo
    Different pages of my site have different js needs (plugins mainly): some need a lightbox, some don't; some need a carousel, some don't; etc. With regards to page-loading speed, should I:

    Option 1 — reference each js file only where it is needed, so one page might have:

        <script type="text/javascript" src="js/carousel/scrollable.js"></script>
        <script type="text/javascript" src="js/jquery.easydrag.js"></script>
        <script type="text/javascript" src="js/colorbox/jquery.colorbox-min.js"></script>

    and another:

        <script type="text/javascript" src="st_wd_assets/js/carousel/scrollable.js"></script>
        <script type="text/javascript" src="st_wd_assets/js/typewatch.js"></script>

    Option 2 — combine and compress everything into one site_wide.js file, so each page would reference:

        <script type="text/javascript" src="js/site_wide.js"></script>

    There would be unused selectors/event listeners though; how bad is this? I would include any plugin notes/accreditations at the top of the site_wide.js file.


  • node.js ./configure warnings

    - by microspino
    I'm trying to install node.js and I get this list of warnings when I run ./configure. I don't know which libraries/packages I'm missing on my Debian installation (Linux 2.6.32.6 ppc GNU/Linux on a Bubba2 server):

        Checking for gnutls >= 2.5.0              : fail
        --- libeio ---
        Checking for pread(2) and pwrite(2)       : fail
        Checking for sync_file_range(2)           : fail
        --- libev ---
        Checking for header sys/inotify.h         : not found
        Checking for header port.h                : not found
        Checking for header sys/event.h           : not found
        Checking for function kqueue              : not found
        Checking for header sys/eventfd.h         : not found

    Can you tell me what I need to install?
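
    Several of these probes target interfaces that simply don't exist on this platform — port.h is Solaris, while sys/event.h and kqueue are BSD — so those will always be "not found" on Linux and are harmless warnings rather than errors. What may be worth installing before retrying (a suggestion, assuming a from-source build of a 0.x node) is the basic toolchain plus the TLS development headers the first check looks for:

        sudo apt-get install build-essential python libssl-dev libgnutls-dev
        ./configure
        make
        sudo make install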


  • History.js not working in Internet Explorer

    - by Wilcoholic
    I am trying to get History.js to work in Internet Explorer because I need History.pushState() to work. I have read over the instructions on GitHub (https://github.com/balupton/History.js/) and have tried implementing it, but haven't had any success. Here's what I have:

        <!DOCTYPE html>
        <html>
        <head>
            <!-- jQuery -->
            <script src="http://ajax.googleapis.com/ajax/libs/jquery/1.6.2/jquery.min.js"></script>
            <!-- History.js -->
            <script defer src="http://balupton.github.com/history.js/scripts/bundled/html4+html5/jquery.history.js"></script>
            <script type="text/javascript">
                function addHistory(){
                    // Prepare
                    var History = window.History; // Note: We are using a capital H instead of a lower h
                    // Change our States
                    History.pushState(null, null, "?mylink.html");
                }
            </script>
        </head>
        <body>
            <a href="mylink.html">My Link</a>
            <a href="otherlink.html">Other Link</a>
            <button onclick="addHistory()" type="button">Add History</button>
        </body>
        </html>

    Not sure what I'm doing wrong, but it's definitely not working in IE. Any help is appreciated.


  • Passing ViewModel for backbone.js from MVC3 Server-Side

    - by Roman
    In ASP.NET MVC there is Model, View and Controller. The MODEL represents entities which are stored in the database — essentially all the data used in an application (for example, generated by Entity Framework, "DB First" approach). Not all data from the model is something you want to show in the view (for example, hashes of passwords). So you create a VIEW MODEL, one for every strongly-typed razor view you have in the application. Like this:

        using System;
        using System.Collections.Generic;
        using System.Linq;
        using System.Web;

        namespace MyProject.ViewModels.SomeController.SomeAction
        {
            public class ViewModel
            {
                public ViewModel()
                {
                    Entities1 = new List<ViewEntity1>();
                    Entities2 = new List<ViewEntity2>();
                }

                public List<ViewEntity1> Entities1 { get; set; }
                public List<ViewEntity2> Entities2 { get; set; }
            }

            public class ViewEntity1
            {
                // some properties from the original DB entity you want to show
            }

            public class ViewEntity2
            {
            }
        }

    When you create complex client-side interfaces (I do), you use some pattern for the javascript on the client, MVC or MVVM (I know only these). So, with MVC on the client you have another model (Backbone.Model for example), which is the third model in the application. It is a bit much. Why don't we use the same ViewModel on the client (in backbone.js or another framework)? Is there a way to transfer the C#-coded model to a JS-coded one?

    In the MVVM pattern, with knockout.js, you can do this in SomeAction.cshtml:

        <div style="display: none;" id="view_model">@Json.Encode(Model)</div>

    and after that in the JavaScript code:

        var ViewModel = ko.mapping.fromJSON($("#view_model").get(0).innerHTML);

    Now you can extend your ViewModel with some actions, event handlers, etc.:

        ko.utils.extend(ViewModel, {
            some_function: function () {
                // some code
            }
        });

    So we are not building the same view model on the client again; we are transferring the existing view model from the server — at least the data. But knockout.js is not suitable for me: you can't build a complex UI with it, it is just data-binding. I need something more structural, like backbone.js. The only way to build a ViewModel for backbone.js I can see now is re-writing the same ViewModel in JS by hand. Is there any way to transfer it from the server? To reuse the same viewmodel in the server view and the client view?
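
    The same JSON-embedding trick shown for knockout works for backbone too, since Backbone.Model and Backbone.Collection accept plain attribute objects — a sketch, with placeholder variable names, emitted straight from the Razor view:

        <script type="text/javascript">
            // Serialize the server-side view model into the page...
            var rawViewModel = @Html.Raw(Json.Encode(Model));

            // ...and hydrate backbone objects from it instead of
            // re-declaring the same shape by hand on the client.
            var entities1 = new Backbone.Collection(rawViewModel.Entities1);
            var entities2 = new Backbone.Collection(rawViewModel.Entities2);
        </script>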

