The Internet of Things (IoT) is a technological development that has the potential to change how we live and work by connecting any device to the Internet. Consequently, a vast number of novel applications will enhance our lives. The Internet Engineering Task Force (IETF) standardized the Constrained Application Protocol (CoAP) to accommodate the application-layer and network-congestion needs of such IoT networks. CoAP is designed to be very simple; it employs a basic congestion control (CC) mechanism, referred to as default CoAP CC, which relies on binary exponential backoff. Although efficient, default CoAP CC does not always adapt well to network dynamics. As a remedy, CoCoA has been proposed to better utilize IoT networks. Although CoCoA takes network dynamics into account, its retransmission timeout (RTO) calculation is based on constant coefficient values. However, our experiments show that these constant values, in general, do not achieve the best throughput. Motivated by these observations, we propose a new machine learning-based CC mechanism called mlCoCoA, a variation of CoCoA. In particular, mlCoCoA sets the RTO estimation parameters of CoCoA adaptively by using a machine learning method. In this study, we applied support vector machines to a self-generated dataset to develop new models that improve the throughput of an IoT network through dynamic selection of the CoCoA coefficient values. We carried out extensive simulations in the Cooja environment coupled with Californium. Our results indicate that, compared to the default CoAP CC and CoCoA mechanisms, mlCoCoA has merit in terms of improving the throughput of CoAP applications.