Archive for January, 2016

28
Jan
XGBoost
  • Synergic Partners
  • 5197 Views
  • 0 Comments
  • Kaggle · Machine Learning · XGBoost

In Kaggle machine learning competitions, two techniques tend to dominate: ensembles of decision trees for structured data, and neural networks when the data includes images or sound. Traditionally, Random Forest (RF) dominated competitions on structured data, but another algorithm has recently overtaken it: Gradient Boosted Trees (GBT). Like RF, GBT classifies examples with an ensemble of decision trees, but its trees are built sequentially, each iteration adding the tree that best compensates for the errors of the trees already built. The method is called "gradient" boosting because the model descends, tree by tree, toward a minimum of the error. The tool of choice in these competitions is XGBoost, an…
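The sequential, error-correcting idea behind GBT can be sketched in a few lines. This is a minimal from-scratch illustration with one-feature decision stumps and squared-error loss (where the negative gradient is simply the residual), not XGBoost's actual implementation; all names here are illustrative.

```python
# Gradient boosting sketch: each new stump is fitted to the
# residuals (errors) of the ensemble built so far.

def fit_stump(xs, residuals):
    """Find the threshold and leaf values minimizing squared
    error on the residuals."""
    best = None
    for t in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        if not left or not right:
            continue
        lv, rv = sum(left) / len(left), sum(right) / len(right)
        err = (sum((r - lv) ** 2 for r in left)
               + sum((r - rv) ** 2 for r in right))
        if best is None or err < best[0]:
            best = (err, t, lv, rv)
    _, t, lv, rv = best
    return lambda x: lv if x <= t else rv

def gradient_boost(xs, ys, n_trees=50, lr=0.1):
    pred = [0.0] * len(xs)
    trees = []
    for _ in range(n_trees):
        # for squared error, the negative gradient is the residual
        residuals = [y - p for y, p in zip(ys, pred)]
        stump = fit_stump(xs, residuals)
        pred = [p + lr * stump(x) for p, x in zip(pred, xs)]
        trees.append(stump)
    return lambda x: sum(lr * t(x) for t in trees)

# toy data: y is a step function of x
xs = [1, 2, 3, 4, 5, 6]
ys = [0, 0, 0, 1, 1, 1]
model = gradient_boost(xs, ys)
```

After 50 rounds the ensemble's prediction approaches the true step function, with each small step (scaled by the learning rate) correcting what the previous trees got wrong.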

26
Jan
Neural Style
  • Synergic Partners
  • 4302 Views
  • 0 Comments
  • No tags

In recent years, neural networks have been applied in a variety of fields through Deep Learning techniques, and have been especially successful at recognizing and classifying images. Examples of these applications include the automatic tagging of videos according to the objects recorded, facial recognition, and the processing of text found in images. Convolutional Neural Networks (CNNs) are a kind of neural network specifically adapted to image recognition. They consist of a series of layers of nodes that filter and hierarchically store information about an image, from the finest level of detail to the coarsest. The information stored in those layers can be used not only to recognize similar images, but also to produce approximate reconstructions of the original…
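The filtering step at the heart of a convolutional layer can be shown with a plain 2D convolution. In a real CNN the kernel weights are learned during training; here we hand-pick a vertical-edge kernel as an illustrative assumption, to show the kind of information a single filter extracts from an image.

```python
# 2D convolution sketch: slide a small kernel over an image
# and compute a weighted sum at each position (no padding).

def convolve2d(image, kernel):
    kh, kw = len(kernel), len(kernel[0])
    h, w = len(image), len(image[0])
    out = []
    for i in range(h - kh + 1):
        row = []
        for j in range(w - kw + 1):
            s = sum(kernel[a][b] * image[i + a][j + b]
                    for a in range(kh) for b in range(kw))
            row.append(s)
        out.append(row)
    return out

# 5x5 image: dark left third, bright right
image = [[0, 0, 1, 1, 1] for _ in range(5)]
# hand-picked vertical-edge kernel (Sobel-like)
kernel = [[-1, 0, 1],
          [-1, 0, 1],
          [-1, 0, 1]]
feature_map = convolve2d(image, kernel)
# the feature map responds strongly where brightness changes
# and is zero in uniform regions
```

Stacking many such filters, interleaved with pooling, is what lets a CNN build its hierarchy from fine detail (edges, textures) up to whole objects.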
