The Office 2016 Mac Preview is here!


Have you been waiting for the new Office 2016 for Mac? Your wait is over!

A preview of the new Office 2016 for Mac is now officially available. Click here to give it a try.

Office 2016 for Mac is powered by the cloud, so you can access your documents on OneDrive, OneDrive for Business, and SharePoint anytime, anywhere, and on any device.

Office 2016 for Mac delivers an unmistakably Office experience, but it is also thoughtfully designed to take advantage of the unique features of the Mac. The new apps offer full Retina display support with thousands of Retina-optimized graphics, full-screen view for native immersive experiences, and even little Mac affordances like scroll bounce.


Cloud 9 Infosystems Inc. gets a gold lining


We have won Gold! Cloud 9 Infosystems Inc. has earned Microsoft Gold Partner status in the Cloud Platform competency, recognizing our cloud skills and expertise. We are very pleased to reach this coveted milestone. This competency places us in the top 5 percent of Microsoft partners for our expertise.

Microsoft’s Gold Certified Status recognizes both our technical expertise and our commitment to Microsoft. We are committed to staying deeply informed about the latest Microsoft products and technologies so we can create innovative solutions to help give our customers a competitive advantage.

Microsoft Gold Certified Partners receive a rich set of benefits, including access, training and support, giving them a competitive advantage. We are excited to extend this advantage to our customers.

See what our Microsoft competency means for you.


Preview Available for Latest Version of Azure SQL Database

A new preview, previously announced in November, is now available; it introduces near-complete SQL Server engine compatibility and more Premium performance, representing the next generation of the SQL Database service. Internal tests on over 600 million rows of data show Premium query performance improvements of around 5x in the new preview relative to today's Premium SQL Database, and up to 100x when applying the in-memory columnstore technology. The features in today's preview mark the first step toward delivering exciting new capabilities for customers on this new service architecture.

This preview continues the momentum of the general availability of the new service tiers (Basic, Standard, and Premium), which deliver scalable performance and built-in geo-replication and recovery features, and introduced the first 99.99% uptime SLA for a cloud database. Since September, we have also made Auditing generally available and introduced Elastic Scale technologies that significantly streamline high-end scale-out. Today, customers are using the scale-out approach to prove the power of the service for their mission-critical applications: one customer has almost 90 TB of data across ~2,000 databases, processing ~380,000 logins per day, while another is running a multi-tenant application with over 20 TB of data across ~225,000 databases, processing over 2 million logins per day.

Today’s preview continues our journey to bring industry-leading in-memory technologies to the cloud as well as unlock new functionality that streamlines the movement of SQL Server database applications to Azure and also dramatically improves the ability for customers to work with heavier database workloads. Weichert, one of the nation’s leading full-service real estate providers, is actively using the new preview.

“We are deeply driven by ROI and committed to migrating our mission-critical apps to Azure—last year we saw a 60% cost savings after moving our real estate search engine, weichert.com, to Azure. From a strategy perspective, these SQL Database service updates are our answer to migrating and working with large data types by leveraging features such as online index rebuild and partitioning,” stated Joe Testa, Vice President of Systems Development, Weichert. “Simply put, the results so far have been fantastic—we’re seeing >2x better performance, and the advanced features that were previously available only in SQL Server now make it easier to work with our applications as we continue to migrate our mission-critical apps to Azure.”

Detailed enhancements of this preview are available on the preview getting started page; key highlights are as follows:

  • Easier management of large databases to support heavier workloads, with parallel queries (Premium only), table partitioning, online indexing, worry-free large index rebuilds (the 2 GB size limit is removed), and the ALTER DATABASE command.
  • Support for key programmability functions to drive more robust application design, with CLR, T-SQL window functions, XML indexes, and change tracking.
  • Up to 100x performance improvements with support for in-memory columnstore queries (Premium only) for data mart and smaller analytic workloads.
  • Improved monitoring and troubleshooting: Extended Events (XEvents) and visibility into over 100 new table views via an expanded set of Dynamic Management Views (DMVs).
  • New S3 performance level: offers more pricing flexibility between Standard and Premium. S3 delivers 100 DTU and all the features available in the Standard tier for $0.2016/hour ($0.1008/hour during preview).
To try this preview, please sign up via the Azure Preview Features page or via the new Azure Portal. Pricing for databases enrolled in this preview is discounted 50% from Basic, Standard and Premium general availability pricing. Only SQL Database servers that have a mix of one or more Basic, Standard, or Premium databases (no Web or Business) will be eligible to upgrade to this new preview.
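As a quick sanity check, the quoted S3 rates and the 50 percent preview discount line up. A small illustrative sketch (Python; the 730-hour billing month is an assumption for illustration, not something stated in this post):

```python
# S3 rates quoted above: $0.2016/hour at general availability,
# discounted 50% during the preview.
ga_hourly = 0.2016
preview_hourly = round(ga_hourly * 0.5, 4)
print(preview_hourly)  # 0.1008, the preview rate quoted above

# Rough monthly cost, assuming the common 730-hour billing month
# (an assumption, not stated in the post).
print(round(preview_hourly * 730, 2))
```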

I am incredibly excited for you to try this preview, which will help you more easily build or migrate robust SQL-based applications on Azure while gaining the near-zero administration benefits of database-as-a-service.



Automated Everything with SQL Server on IaaS VMs

Director of Program Management, Azure

Today, we are excited to announce support for automated backup and automated patching, available directly in the portal for SQL Server Azure Virtual Machines. Both features are built on the new SQL Server IaaS Agent, an Azure VM extension that combines the power and management ease of SQL Server with the agility of extensions on Azure Virtual Machines, enabling single-click configuration and management of backup and patching.

Using automated backup on SQL Virtual Machines deployed in Azure, you can configure a scheduled backup on SQL Server 2014 Enterprise. With a few clicks in the portal, you can control the retention period, the storage account for the backup, and the security/encryption policies of the database.

In addition to automated backup, we are also announcing automated patching for SQL Server VMs. This new solution allows you to define the maintenance window directly from the portal. The SQL Server IaaS Agent will configure Windows on your Virtual Machine with your preferred maintenance settings, including the day for maintenance, the start time of the window, and the proposed duration.

This exciting set of new capabilities continues to showcase the integrated experience of running SQL Server on fast scale-out and scale-up Azure Virtual Machines. We will continue to focus on these integrated experiences over the next few months.

For more details, check out the SQL Server blog post here. Go ahead, try these features out for yourself at https://portal.azure.com. Scale-out a bit.


Running Critical Application Workloads on Microsoft Azure D-Series Virtual Machines

On the Azure Customer Advisory Team (AzureCAT), we’ve been testing the performance of the latest generation of hardware now being introduced into our public cloud: the D-Series. What’s especially cool is how these VMs can boost performance significantly for critical workload applications compared to earlier VM series. This is extremely important for solutions based on Microsoft SQL Server, as we described in a previous white paper. Our new findings extend those tests.

Customers told us that they wanted a straightforward way to transition their applications from traditional data centers to Microsoft Azure virtual machines (VMs), but performance is key with their critical workloads, and they weren’t always getting it.

The D-Series offers two key features related to performance, neither of which requires you to make any particular application changes:

  • Local storage (temporary) based on solid-state drives (SSDs)
  • Higher number of attached data disks (up to 32 for D14 VMs)

In our performance tests, we used these new features to tune applications and saw gains in performance. For example:

  • Placing the TempDB file on local SSD storage on a D13 VM gave approximately 4.5 times the throughput of an A7 VM with attached data disks, at a fraction of the previous latency for the same SQL Server-generated IO patterns.
  • D14 VMs with 32 attached disks can provide up to 85 percent more write IOPS and bandwidth compared to an A7 VM with 16 attached disks.

We documented four scenarios in which the D-Series made a significant difference for our customers in the white paper, Running Critical Application Workloads on Microsoft Azure Virtual Machine.

It describes:

  • How persistent disk latency can directly impact application response times.
  • How limited throughput from persistent disks can impact application performance when SQL Server tempdb use is significant.
  • How to use SSD-based fast storage in the application tier to speed temporary file processing.
  • How to reduce compile and startup time for a large ASP.NET web application by moving the %temp% folder on a temporary drive in a D-Series VM.

In essence, new D-Series VMs in Azure can help run performance-critical workloads on both the data tier and application tier, offering better performance overall for CPU, storage, and networking, with a price performance ratio that can be favorably compared to other VM series.

Certain application scenarios such as OLTP database servers benefit mainly from local SSD-based temporary storage for extending buffer pools and hosting temporary operations. Application servers benefit from faster, lower-latency local storage and from the increased CPU performance provided by this new generation of VMs.
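The %temp% relocation described above is easy to sketch in any language. The snippet below is an illustrative Python stand-in (on a Windows D-Series VM the fast drive is typically D:\, but here a local folder stands in for it so the sketch runs anywhere):

```python
import os
import tempfile

# Stand-in for the SSD-backed temporary drive; on a Windows D-Series VM
# this would be a folder such as D:\Temp (the local, temporary SSD).
fast_tmp = os.path.join(os.getcwd(), "fast_tmp")
os.makedirs(fast_tmp, exist_ok=True)

# Redirect temp-file APIs, analogous to moving the %temp% folder.
tempfile.tempdir = fast_tmp

with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"scratch data")
    path = f.name

print(path.startswith(fast_tmp))  # True: temp files now land on the fast drive
```

For an ASP.NET application, the equivalent step the white paper describes is pointing the process's %temp% environment variable at the D-Series VM's temporary drive.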

 

Principal Program Manager, AzureCAT


What’s new in Office Online—featuring Insights for Office

December 11, 2014

The Office Online team has been busy adding integration across Microsoft Cloud services and extending the capabilities of what you can do with Office in your favorite browser.  A number of powerful features have either been rolled out recently or are beginning to roll out now. If you haven’t used Office Online in a while, here are a few updates we’ve built and implemented based on your feedback.

Insights for Office

Imagine you came across something new in a document that you are working on and want to find out more. What if you could find the information you want without disrupting your flow and without having to open a new tab in your browser to do a search? With Insights for Office, you can bring the information you want right into Word Online, in a seamlessly integrated experience that lets you learn and explore as you are working on your documents in your OneDrive.

When you right-click on a word and select Insights, Office Online understands what you are looking for and then uses the power of Bing to bring in relevant information from a variety of sources like Bing Snapshot, Wikipedia, Bing Image Search, Oxford English Dictionary and the web—all neatly presented alongside your content. You can spend as much time as you want in the Insights experience—take a quick glance or explore details and interesting articles related to your selection, and do all this while you are working in Word Online.

For example, here is a document that mentions both the Missouri River and the state of Missouri. Depending on which “Missouri” you select, Insights for Office gets it right.

Office Online updates 1

Insights on the state of Missouri.

Office Online updates 2

Insights on the Missouri River.

To get Insights, right-click a word (or select one or more words) you want to know more about and select Insights from the menu (or select Insights under the Review tab on the ribbon). We’ve also seen that our users type into the Tell Me box to find more information, so naturally, we added Insights through Tell Me as well—just type what you are looking for and Office Online will get you the right insights. Please note that Insights for Office is available in Word Online in the Editing View (click View to ensure that you have Editing View selected).

Office Online updates 3

Make sure you are in the Editing View.

Office Online updates 3a

Different ways of getting Insights for Office.

To read more about Insights for Office Online, check out the post on First look at Insights for Office Online.

Enhanced PDF support

Have you ever opened a PDF document with text embedded in images and then tried to select or copy that text? In that situation, you would typically copy the image containing the text or, even worse, manually transcribe the text to make it editable. Now we have a solution.

The example below is actually a photo taken with a phone, but it could have been any scanned file or image with typed text. Now you can copy the text straight from the image and even search the text by clicking the FIND button!

Office Online updates 4

You can also convert the PDF into an editable Word document by clicking the EDIT IN WORD button. From there you can edit the document in your web browser or in any other Word client you prefer. When you convert a PDF into a Word document, the layout and formatting are reconstructed. That means you get to keep all of your tables, lists, headers, font size and other properties. And the original PDF will be left intact, because a new Word document is created for you.

Office Online updates 5

This feature works best with documents that are mostly text, such as legal, business and scientific documents. If a PDF contains mostly graphics and diagrams, like a presentation or a brochure, the converter might have a little trouble with the layout or formatting. Do you think you have a file that is too complex for us? Give it a try and let us know how well we did via Help Improve Office!

Pagination

Pages have always been an integral part of the Word editing experience, not only in polishing or printing documents but also in how we author and structure them. Most people refer to the length of a document in pages, or point to a specific piece using pages as an anchor by saying something like “it’s in the second paragraph on page four.” We know that you take pride in creating polished, well-structured documents that make an impact. We’ve been making some improvements in Word Online to help you make the best of your authoring experience. You can now see where pages end in the document.

Office Online updates 6

Whether it’s moving the two lines that slipped to the page below or fixing that table that spans across pages, editing in Word Online is now a much more powerful experience with visible page boundaries. Work with pages like you always have before!

We also put the number of pages in your document, and the current page you’re on, in the status bar. That way you can keep tabs on how long your document is and find that paragraph on a particular page. Meeting Professor Smith’s page guideline is now simpler, and with the added power of pages, creating great-looking documents is easier than ever.

Insert symbols

Have you ever been writing a document and had the need to insert a symbol that isn’t part of your standard keyboard layout? Maybe you’re writing about another company and you really need the copyright © symbol or perhaps an international currency symbol like £ or ¥. We heard stories from people who would get creative and find Alt key combinations or use a search engine to copy and paste the single special character from some other content. Wouldn’t it be easier to have Symbols in Office Online?

Well, we’ve been hearing that feedback loud and clear and we’re in the process of rolling out the Symbols gallery under the Insert tab so you can easily add symbols into your documents, presentations and notes.

Office Online updates 7

With this gallery you can insert some of the most popular symbols in Word. Just go to the Insert tab, click the Symbols gallery, and then click the symbol you like; it will be inserted in your document with the current font. And this feature is not limited to Word Online—you can also insert these symbols in PowerPoint Online and OneNote Online!

Not seeing the symbol you need? We totally understand—unfortunately, we weren’t able to include all of the symbols in the gallery in its first version. We’re always listening, though, and we’re counting on you to tell us which symbols you still need. If there are any symbols you needed but couldn’t find, definitely click that Request a New Symbol button at the bottom of the gallery to let us know.

Tell Me

Tell Me has been around in Office Online for quite a while now and people frequently use it to get things done more efficiently and quickly—you actually might be one of them! But we are not done yet—we are continuously listening to feedback to figure out more ways to make Tell Me even more helpful. Here are a few things we have recently added to Tell Me.

You already heard about Insights—you can now type a topic you are interested in into the Tell Me box and access rich information about it. Besides that, we added two more features to Tell Me.

The first one is Word Count. Of course, Word Online has long shown the number of words in your document (the number is displayed in the lower left-hand corner of the browser window). But we also saw a lot of users turning to Tell Me to ask about the word count—so we decided to add it to Tell Me. You can now type “word count” or “number of words” or “how many words” (remember, Tell Me lets you use the language you are comfortable with) and the result will be the new Word Count command. Click it—et voilà, the number of words is shown right there.

The second change is more far-reaching in that it affects a number of different commands. We enabled Tell Me not only to show commands that are available directly on the ribbon, but also to surface commands that are hidden away in sub-menus. For example, if you want to change your document’s paper size to A4, you now only need to type something like “size a4” and Tell Me will present the “A4 Page Size” command directly in its drop-down—no need to click into the “Page Size” sub-menu anymore. This makes it even easier to get to the command that fits your intent quickly and efficiently! Of course, Tell Me still also shows the “Page Size” command, so you can easily get to other options.

Office Online updates 9

Tell us what you think about Office Online

On the Office Online team, we value our users and what you have to say about our product. We are continuously listening to you through our various feedback channels, looking for improvements. Traditionally, you would visit the Microsoft Answers Community Forums for support questions and privately submit feedback through our Help Improve Office mechanism from within Word, Excel or PowerPoint Online. Today we’re excited to introduce a new and more effective way to provide us with ideas and suggestions.

Please visit the Word Online, Excel Online and PowerPoint Online Feedback Forums and immediately see suggestions provided by other members in the Office Online community. You can type your own suggestions in the “Enter your idea” box along with a description as shown in the illustration below. As you type, other suggestions that match yours will be shown. You can vote for suggestions, including your own. You have 10 votes and can use up to 3 votes per suggestion; you get your votes back once the ideas you’ve voted for have been fulfilled by us.

Office Online updates 11

While we cannot guarantee a response to every post, we promise to listen to each and every suggestion. To help us more effectively address your needs, please provide detailed and constructive feedback telling us what you want and why you want it. The more information you can provide us the better. So you want your voice to be heard? Get started and visit Word Online UserVoice, Excel Online UserVoice and PowerPoint Online UserVoice today (for OneNote feedback, visit OneNote UserVoice).


Working with Dates in Azure DocumentDB

JSON (www.json.org) is a lightweight data-interchange format that is easy for humans to read and write yet also easy for machines to parse and generate. JSON is at the heart of DocumentDB: we transmit JSON over the wire, we store JSON as JSON, and we index the JSON tree, allowing queries on the full JSON document. It is not something we bolt onto the product; it is core to the service.
It should therefore come as no surprise that DocumentDB natively supports the same data types as JSON: String, Number, Boolean, Array, Object, and Null.


A .NET developer might notice the omission of certain types they are used to. Most notable is probably the .NET DateTime type. So, how do we store, retrieve, and query using DateTime if the database doesn’t support these types natively?

Keep reading, as that is the purpose of this post…

For the purposes of this post let us assume we’re working with the following sample POCO objects.

public class Order
{
    [JsonProperty(PropertyName="id")]
    public string OrderNumber { get; set; }
    public DateTime OrderDate { get; set; }      
    public DateTime ShipDate { get; set; }                
    public double Total { get; set; }
}

This is a simple demonstration of how an Order could be represented in .NET.
The order has two DateTime properties, OrderDate and ShipDate. For the purposes of this post we will be focusing primarily on these two properties.
The default behavior of the DocumentDB SDK when handling DateTime objects is to convert them into the ISO 8601 string format. Therefore, if we do nothing but pass our Order object to DocumentDB, as shown in the code snippet below:

var doc1 = client.CreateDocumentAsync(colLink, new Order { 
      OrderNumber = "09152014101",
      OrderDate = DateTime.UtcNow.AddDays(-30),
      ShipDate = DateTime.UtcNow.AddDays(-14), 
      Total = 113.39
});

The two .NET DateTime properties would be stored as strings similar to:

{
    "OrderDate": "2014-09-15T23:14:25.7251173Z",
    "ShipDate": "2014-09-30T23:14:25.7251173Z"
}

That string looks nice and readable like that, so why is this a problem?

DocumentDB supports range-based indexes on numeric fields, allowing you to do range queries (e.g. where field > 10 and field < 20). To avoid costly scans when doing range queries on dates (records older than yesterday, orders placed last week, orders shipped today), we need to convert the string representation of a date to a number, so that we can use range indexes on these fields.
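To make that concrete, here is an illustrative sketch (Python, purely for illustration) of the dates-as-numbers idea: once each date is an epoch number, "orders placed last week" becomes an ordinary numeric comparison that a range index can serve.

```python
from datetime import datetime, timedelta

EPOCH = datetime(1970, 1, 1)

def to_epoch(dt):
    """Seconds since 1 January 1970."""
    return int((dt - EPOCH).total_seconds())

orders = [
    {"order": "A", "placed": datetime(2014, 9, 1)},
    {"order": "B", "placed": datetime(2014, 9, 29)},
]

# "orders placed last week", relative to a fixed 'now' for the example
now = datetime(2014, 9, 30)
cutoff = to_epoch(now - timedelta(days=7))
recent = [o["order"] for o in orders if to_epoch(o["placed"]) > cutoff]
print(recent)  # ['B']
```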

DocumentDB and JSON are neutral to how they represent DateTimes and you can use this neutrality to powerful effect to best suit your application’s needs.

This post introduces two methods of dealing with DateTime properties efficiently, but there are many more implementations that would treat Dates and Times efficiently.

For the rest of the post we will be treating DateTimes as epoch values: the number of seconds since a particular date. We are using 1 January 1970 00:00 in this post; you can use a different starting point depending on your data. For example, if your system only needs to deal with orders from today, then pick today as your starting point. Our fictitious system has lots of historical orders, so we need to go back in time a little.

Here is a simple .NET extension method for DateTime that will do this conversion for us;

public static class Extensions
{
    public static int ToEpoch(this DateTime date)
    {
        // DateTime is a non-nullable struct; guard on the default MinValue instead of null
        if (date == DateTime.MinValue) return int.MinValue;
        DateTime epoch = new DateTime(1970, 1, 1);
        TimeSpan epochTimeSpan = date - epoch;
        return (int)epochTimeSpan.TotalSeconds;
    }
}

Now how do you use this in your application?

Well there are two ways to proceed, and you can choose the way that best suits the needs of your application.

The first way is to store a number field that represents the DateTime instead of the DateTime itself.

The easiest way to do this is to implement a custom serializer and deserializer for dealing with JSON. I’m going to show how to do this with JSON.NET by implementing a custom JsonConverter that changes the default behavior of a DateTime property.

To do this we define a class that extends the JsonConverter abstract class and overrides the ReadJson and WriteJson methods.

public class EpochDateTimeConverter : JsonConverter
{
    ...
}

Below is the WriteJson implementation that takes in a .NET DateTime and outputs a number using the same ToEpoch() extension method we created above.

public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
{
    int seconds;
    if (value is DateTime)
    {
        DateTime dt = (DateTime)value;
        if (!dt.Equals(DateTime.MinValue))
            seconds = dt.ToEpoch();
        else
            seconds = int.MinValue;
    }
    else
    {
        throw new Exception("Expected date object value.");
    }

    writer.WriteValue(seconds);
}

We also have the reverse, the ReadJson method, which will be used during deserialization from JSON back into .NET. This method takes in a number representing the number of seconds since 1 January 1970 UTC and returns the .NET DateTime equivalent.

public override object ReadJson(JsonReader reader, Type type, object value, JsonSerializer serializer)
{
    if (reader.TokenType == JsonToken.None || reader.TokenType == JsonToken.Null) 
        return null;

    if (reader.TokenType != JsonToken.Integer)
    {
        throw new Exception(
            String.Format("Unexpected token parsing date. Expected Integer, got {0}.",
            reader.TokenType));
    }

    int seconds = (int)reader.Value;
    return new DateTime(1970, 1, 1).AddSeconds(seconds);
}

To use this in an application we need to decorate our DateTime properties with the JsonConverter attribute. Now, when these properties are serialized / deserialized Json.NET knows not to work with its default but to use our custom code instead.

    public class Order
    {
        [JsonProperty(PropertyName="id")]
        public string OrderNumber { get; set; }

        [JsonConverter(typeof(EpochDateTimeConverter))]
        public DateTime OrderDate { get; set; }

        [JsonConverter(typeof(EpochDateTimeConverter))]
        public DateTime ShipDate { get; set; }

        public double Total { get; set; }
    }

The result after serialization is now a number in JSON and a number being stored in DocumentDB.

{
    "OrderDate": 1408318702,
    "ShipDate": 1408318702 
}

To do efficient range queries on a numeric field in DocumentDB, we have to define a range index on the path containing our numeric field when we create the DocumentCollection.

The example below shows creating a DocumentCollection with a custom IndexingPolicy. We use a numeric precision of 7 bytes for this range index because we are dealing with numbers in the billions. For a smaller range of numbers, smaller precision levels would be sufficient.

var collection = new DocumentCollection
{
    Id = id
};

//set the default IncludePath to set all properties in the document to have a Hash index
collection.IndexingPolicy.IncludedPaths.Add(new IndexingPath
{
    IndexType = IndexType.Hash,
    Path = "/",
});

//now define two additional paths in Order that we know we want to do Range based queries on
collection.IndexingPolicy.IncludedPaths.Add(new IndexingPath
{
    IndexType = IndexType.Range,
    Path = "/\"OrderDate\"/?",
    NumericPrecision = 7
});

collection.IndexingPolicy.IncludedPaths.Add(new IndexingPath
{
    IndexType = IndexType.Range,
    Path = "/\"ShipDate\"/?",
    NumericPrecision = 7
});

collection = client.CreateDocumentCollectionAsync(dbLink, collection).Result;

Once this is in place, doing a query on a DateTime property is as easy as;

//convert "7 days ago" to an epoch number using our ToEpoch extension method for DateTime
int epochDateTime = DateTime.UtcNow.AddDays(-7).ToEpoch();

//build up the query string
string sql = string.Format("SELECT * FROM Collection c WHERE c.OrderDate > {0}", epochDateTime);

//execute the query and get the results in a List
var orders = client.CreateDocumentQuery<Order>(col.SelfLink, sql).ToList();

That’s one approach, and it works very efficiently. The downside, however, is that you lose the human-readable date string in your database. Should another application connect to our database, the custom deserializer won’t necessarily run, and the raw value is difficult to deal with because not many humans can do an epoch conversion in their heads (I certainly can’t).
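If you do end up staring at a raw epoch value (when inspecting documents by hand, for instance), the reverse conversion is a one-liner. An illustrative Python sketch:

```python
from datetime import datetime, timedelta

def from_epoch(seconds):
    """Invert the epoch conversion: seconds since 1 January 1970 back to a date."""
    return datetime(1970, 1, 1) + timedelta(seconds=seconds)

# The sample epoch value stored in the documents above
print(from_epoch(1408318702))  # 2014-08-17 23:38:22
```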

The second way to implement this is to preserve the readable DateTime field but add an additional field to your document which stores the numeric representation of the DateTime in addition to the string representation.

To do this, we can create a new custom type that looks like this;

public class DateEpoch
{
    public DateTime Date { get; set; }
    public int Epoch
    {
        get
        {
            // DateTime is a non-nullable struct, so only the MinValue default needs guarding
            return this.Date.Equals(DateTime.MinValue)
                ? int.MinValue
                : this.Date.ToEpoch();
        }
    }
}

Now define Order2, a version of the Order object with the two DateTime properties changed to this new type;

public class Order2
{
    [JsonProperty(PropertyName = "id")]
    public string OrderNumber { get; set; }

    public DateEpoch OrderDate { get; set; }

    public DateEpoch ShipDate { get; set; }

    public double Total { get; set; }
}

With this in place, when you now pass your object to DocumentDB you will end up with JSON that looks something like this;

{
    "OrderDate": {
        "Date": "2014-09-15T23:14:25.7251173Z",
        "Epoch": 1408318702
    },
    "ShipDate": {
        "Date": "2014-09-30T23:14:25.7251173Z",
        "Epoch": 1408318702
    }
}

As with the first technique, you have to define a custom IndexingPolicy on your DocumentCollection, except this time we add the range path on the Epoch value only. You could exclude the date string path from the index if you like, but keeping the default hash index still allows for equality operations if you want them.

var collection = new DocumentCollection
{
    Id = id
};

//set the default IncludePath to set all properties in the document to have a Hash index
collection.IndexingPolicy.IncludedPaths.Add(new IndexingPath
{
    IndexType = IndexType.Hash,
    Path = "/",
});

//now define two additional paths in Order2 that we know we want to do Range based queries on
collection.IndexingPolicy.IncludedPaths.Add(new IndexingPath
{
    IndexType = IndexType.Range,
    Path = "/\"OrderDate\"/\"Epoch\"/?",
    NumericPrecision = 7
});

collection.IndexingPolicy.IncludedPaths.Add(new IndexingPath
{
    IndexType = IndexType.Range,
    Path = "/\"ShipDate\"/\"Epoch\"/?",
    NumericPrecision = 7
});

//could also exclude the Date portion of the dates in Order2 as we're never going to 
//index on these, but will leave these there because they're still indexed with a hash 
//so you could do equality operations on them.

collection = client.CreateDocumentCollectionAsync(dbLink, collection).Result;

Now, to query with this approach, you could execute the following LINQ query. I chose to demonstrate LINQ here rather than SQL as in the previous example, because LINQ support is another advantage of this approach.

var orders = from o in client.CreateDocumentQuery<Order2>(col.SelfLink)
    where o.OrderDate.Epoch >= DateTime.Now.AddDays(-7).ToEpoch()
    select o;

Or, you can choose the following LINQ Lambda syntax if you favor that;

var orders2 = client.CreateDocumentQuery<Order2>(col.SelfLink)
    .Where(o => o.OrderDate.Epoch >= DateTime.UtcNow.AddDays(-7).ToEpoch())
    .ToList();

And of course the SQL syntax we used earlier is also valid, gives the same result, and is what you would use when LINQ is not available to you;

string sql = String.Format("SELECT * FROM c WHERE c.OrderDate.Epoch >= {0}", 
                 DateTime.UtcNow.AddDays(-7).ToEpoch());

This second approach has two main advantages over the first. First, it does not rely on a custom serialization technique for a specific tool like JSON.NET, so it can be used with other JSON serializers or with other languages besides .NET; this technique would work equally well in, say, Node.js or Python. Second, because we keep the human-readable DateTime value as a string, if another application queries your data, humans will still have a readable version of the date to work with.
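To back up that portability claim, here is roughly what the dual representation looks like outside .NET: an illustrative Python sketch (not code from this post, and it treats the naive datetime as UTC for simplicity):

```python
from datetime import datetime

EPOCH = datetime(1970, 1, 1)

def date_epoch(dt):
    """Dual representation: human-readable ISO string plus queryable epoch number."""
    return {
        "Date": dt.isoformat() + "Z",  # naive datetime treated as UTC here
        "Epoch": int((dt - EPOCH).total_seconds()),
    }

order = {
    "id": "09152014101",
    "OrderDate": date_epoch(datetime(2014, 9, 15, 23, 14, 25)),
}
print(order["OrderDate"]["Date"])  # 2014-09-15T23:14:25Z
```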

This approach requires more storage and results in slightly larger documents, but if you are not sensitive to the storage costs or the size of the documents, I favor it over the first for the benefits listed above.

And that’s it: DateTime properties in DocumentDB. Done!

If you would like the sample code for this blog post, you can download it from here.

To get started with Azure DocumentDB, head over to the service documentation page on azure.com (http://azure.microsoft.com/en-us/documentation/services/documentdb/), where you can find everything you need to get up and running.

Principal Director, SQL- DocumentDB
