
Added AdaBoostRegressor #89

Merged: 5 commits, Oct 6, 2021

Conversation

26tanishabanik (Contributor)

@Thilakraj1998 (Contributor)

@26tanishabanik

Avoid \ line continuations in the configuration, for consistency:

       "base_estimator":{'object':[tree.DecisionTreeRegressor(max_depth=3),tree.DecisionTreeRegressor(max_depth=5),\
                                                                                    tree.DecisionTreeRegressor(max_depth=7),\
                                                                                     tree.DecisionTreeRegressor(max_depth=10))]},

keep it like:

          "base_estimator":{'object':[
                          tree.DecisionTreeRegressor(max_depth=3),
                          tree.DecisionTreeRegressor(max_depth=5),
                          tree.DecisionTreeRegressor(max_depth=7),
                          tree.DecisionTreeRegressor(max_depth=10))
            ]},

@26tanishabanik (Contributor, Author)


Sure

@26tanishabanik (Contributor, Author)


It's done

@Thilakraj1998 (Contributor)

@26tanishabanik For integer and float parameters, specify only the min and max values to consider.

For example:

    "n_estimators":{'int':[10, 5000]},
    "learning_rate":{'float':[1e-3, 0.1]},

@26tanishabanik (Contributor, Author)


Okay

@26tanishabanik (Contributor, Author)


Done

Thilakraj1998 added the hacktoberfest-accepted label (for accepting PRs from Hacktoberfest participants) on Oct 6, 2021.
Thilakraj1998 merged commit 40ee1ab into blobcity:main on Oct 6, 2021.